WorldWideScience

Sample records for light-element r-matrix analyses

  1. Contribution to the analysis of light elements using x fluorescence excited by radio-elements; Contribution a l'analyse des elements legers par fluorescence x excitee au moyen de radioelements

    Energy Technology Data Exchange (ETDEWEB)

Robert, A. [Commissariat a l'Energie Atomique, Saclay (France). Centre d'Etudes Nucleaires]

    1964-07-01

In order to study the possibilities of using radioactive sources for the X-ray fluorescence analysis of light elements, the principle is given, after a brief description of X-ray fluorescence, of the excitation of this phenomenon by X, β and α emission from radio-elements. The operation and use of the gas proportional counter for X-ray detection is described. A device has been studied for analysing the elements of the 2nd and 3rd periods of the Mendeleev table. It makes it possible to excite the fluorescence with a radioactive source emitting X-rays or α particles; the fluorescence X-rays penetrate into a window-less proportional counter, this being made possible by the use of an auxiliary electric field in the neighbourhood of the sample. The detection-gas pressure leading to the maximum detection yield is given. Spectra are given for the Kα lines of 3rd-period elements excited by 55Fe, 3H/Zr and 210Po sources and, for the 2nd period, the Kα spectra of carbon and of fluorine excited by the α particles of 210Po. (author)

  3. Light element thermodynamics related to actinide separations

    International Nuclear Information System (INIS)

    Johnson, I.; Johnson, C.E.

    1997-01-01

The accumulation of waste from the last five decades of nuclear reactor development has resulted in large quantities of materials of very diverse chemical composition. An electrometallurgical (EM) method is being developed to separate the components of the waste into several unique streams suitable for permanent disposal and an actinide stream suitable for retrievable storage. The principal types of nuclear waste are spent oxide or metallic fuel. Since the EM module requires a metallic feed, and oxygen interferes with its operation, oxide fuel has to be reduced prior to EM treatment. Further, the wastes contain, in addition to oxygen, other light elements (first- and second-row elements) that may also interfere with the operation of the EM module. The extent to which these light elements interfere with the operation of the EM module has been determined by chemical thermodynamic calculations. (orig.)

  4. Nuclear synergism of the light elements

    International Nuclear Information System (INIS)

    Harms, A.A.

    1983-05-01

    Some basic issues concerning accelerator initiated and fusion sustained nuclear energy systems are examined. For this purpose we identify selected nuclear fusion reactions characterized by a variable ion-to-neutron content and explore their intrinsic couplings and regenerative features. These are then related to particular systems concepts which emphasize fusion physics and accelerator technology. It is concluded that several light-element reaction systems possess appealing and interesting properties and can further be associated with selected advanced nuclear technologies. Their eventual implementation as nuclear energy systems requires further research in fusion physics, accelerator technology and mathematical physics. Because of the substantial potential benefits of such nuclear energy systems, it is concluded that research in this area should be pursued with much vigour. (orig.)

  5. Analysis of light elements by PIGE

    International Nuclear Information System (INIS)

    Kim, Y. S.; Choi, H. W.; Kim, D. K.; Woo, H. J.; Kim, N. B.; Park, K. S.

    2000-01-01

The PIGE (Proton Induced Gamma ray Emission) method was applied to the measurement of light elements from Li to K. A test measurement was performed on geological, biological, environmental and material samples, using a standard sample for each element. The measurement was performed at two proton energies, 2.4 and 3.4 MeV, and 3.4 MeV was found to yield better results for multielemental analysis. The results show fair agreement, within 15% of the standard values, for all elements. The detection limits of Li, B, F and Na are less than 100 ppm, while those of the other elements range from a few hundred ppm to a few percent. (author)

  6. Cosmological implications of light element abundances: theory.

    Science.gov (United States)

    Schramm, D N

    1993-06-01

Primordial nucleosynthesis provides (with the microwave background radiation) one of the two quantitative experimental tests of the hot Big Bang cosmological model (versus alternative explanations for the observed Hubble expansion). The standard homogeneous-isotropic calculation fits the light element abundances ranging from 1H at 76% and 4He at 24% by mass, through 2H and 3He at parts in 10^5, down to 7Li at parts in 10^10. It is also noted how the recent Large Electron Positron Collider (and Stanford Linear Collider) results on the number of neutrinos (N_nu) are a positive laboratory test of this standard Big Bang scenario. The possible alternate scenario of quark-hadron-induced inhomogeneities is also discussed. It is shown that when this alternative scenario is made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density (Omega_b) remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the standard model and the conclusion that Omega_b ≈ 0.06. This latter point is the driving force behind the need for nonbaryonic dark matter (assuming total density Omega_total = 1) and the need for dark baryonic matter, since the density of visible matter Omega_visible < Omega_b. The recent Population II B and Be observations are also discussed and shown to be a consequence of cosmic ray spallation processes rather than primordial nucleosynthesis. The light elements and N_nu successfully probe the cosmological model at times as early as 1 s and a temperature (T) of approximately 10^10 K (approximately 1 MeV). Thus, they provided the first quantitative arguments that led to the connections of cosmology to nuclear and particle physics.

  7. Making Mercury's Core with Light Elements

    Science.gov (United States)

    Vander Kaaden, Kathleen E.; McCubbin, Francis M.; Ross, D. Kent

    2016-01-01

Recent results obtained from the MErcury Surface, Space ENvironment, GEochemistry, and Ranging spacecraft showed that the surface of Mercury has low FeO abundances (less than 2 wt%) and high S abundances (approximately 4 wt%), suggesting the oxygen fugacity of Mercury's surface materials is somewhere between 3 and 7 log10 units below the IW buffer. The highly reducing nature of Mercury has resulted in a relatively thin mantle and a large core that has the potential to exhibit an exotic composition in comparison to the other terrestrial planets. This exotic composition may extend to include light elements (e.g., Si, C, S). Furthermore, it has been argued that Mercury may have a primary flotation crust composed of graphite, which may require a core that is C-saturated. In order to investigate mercurian core compositions, we conducted piston cylinder experiments at 1 GPa, from 1300 °C to 1700 °C, using a range of starting compositions consisting of various Si-Fe metal mixtures (Si5Fe95, Si10Fe90, Si22Fe78, and Si35Fe65). All metals were loaded into graphite capsules to ensure C-saturation throughout each experimental run. Our experiments show that Fe-Si metallic alloys exclude carbon relative to more Fe-rich metal. This exclusion of carbon commences within the range of 5 to 10 wt% Si. These results indicate that if Mercury has a Si-rich core (more than approximately 5 wt% Si), it would have saturated in carbon at low C abundances, allowing for the possible formation of a graphite flotation crust as suggested. These results have important implications for the thermal and magmatic evolution of Mercury.

  8. Big bang photosynthesis and pregalactic nucleosynthesis of light elements

    International Nuclear Information System (INIS)

Audouze, J.; Lindley, D.; Silk, J. (and Laboratoire Rene Bernas, Orsay, France)

    1985-01-01

Two nonstandard scenarios for pregalactic synthesis of the light elements (2H, 3He, 4He, and 7Li) are developed. Big bang photosynthesis occurs if energetic photons, produced by the decay of massive neutrinos or gravitinos, partially photodisintegrate 4He (formed in the standard hot big bang) to produce 2H and 3He. In this case, primordial nucleosynthesis no longer constrains the baryon density of the universe, or the number of neutrino species. Alternatively, one may dispense partially or completely with the hot big bang and produce the light elements by bombardment of primordial gas, provided that 4He is synthesized by a later generation of massive stars.

  9. Big bang photosynthesis and pregalactic nucleosynthesis of light elements

    Science.gov (United States)

    Audouze, J.; Lindley, D.; Silk, J.

    1985-01-01

    Two nonstandard scenarios for pregalactic synthesis of the light elements (H-2, He-3, He-4, and Li-7) are developed. Big bang photosynthesis occurs if energetic photons, produced by the decay of massive neutrinos or gravitinos, partially photodisintegrate He-4 (formed in the standard hot big bang) to produce H-2 and He-3. In this case, primordial nucleosynthesis no longer constrains the baryon density of the universe, or the number of neutrino species. Alternatively, one may dispense partially or completely with the hot big bang and produce the light elements by bombardment of primordial gas, provided that He-4 is synthesized by a later generation of massive stars.

  10. Quantitative analysis of light elements in thick samples by PIGE

    International Nuclear Information System (INIS)

    Mateus, R.; Jesus, A.P.; Ribeiro, J.P.

    2004-01-01

PIGE analysis of thick and intermediate samples is usually performed with the help of standards, but this method gives good results only when the standard is very similar to the sample to be analysed. In this work, we present an alternative method for PIGE analysis of light elements in thick samples. The method is based on a code that integrates the nuclear reaction excitation function along the depth of the sample. For the integration procedure, the sample is divided into sublayers defined by the energy steps that were used to measure the excitation function accurately; this function is used as input. Within each sublayer, the stopping power cross-sections may be assumed constant. Under these two conditions, calculating the contribution of each sublayer to the total yield becomes an easy task. This work presents results for the analysis of lithium, boron, fluorine and sodium in thick samples. For this purpose, excitation functions of the reactions 7Li(p,p'γ)7Li, 19F(p,p'γ)19F, 10B(p,αγ)7Be and 23Na(p,p'γ)23Na were employed. Calculated γ-ray yields were compared, at several proton energy values, with experimental yields for thick samples made of inorganic compounds of the referred elements. The agreement is better than 7.5%. Taking into consideration the experimental uncertainty of the measured yields and the errors related to the stopping power values used, this agreement shows that effects such as beam energy straggling, ignored in the calculation, seem to play a minor role
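    The sublayer integration this abstract describes can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' code: the function name, the measurement grid, and the piecewise-constant treatment of the excitation function and stopping power within each sublayer are ours.

    ```python
    def thick_target_yield(energies, sigma, stopping, e_beam, atomic_fraction):
        """Integrate a gamma-ray excitation function over sample depth.

        energies        -- ascending proton energies (keV) at which the
                           excitation function was measured
        sigma           -- excitation-function values at those energies
        stopping        -- stopping-power cross sections (keV cm^2/atom)
                           at the same energies
        e_beam          -- incident proton energy (keV)
        atomic_fraction -- atomic fraction of the analysed light element
        """
        total = 0.0
        # Walk down in energy through the sublayers defined by the grid;
        # within each sublayer sigma and the stopping power are constant,
        # so its contribution is n * sigma * dE / S (dE/S plays the role
        # of the areal thickness traversed while losing dE).
        for i in range(len(energies) - 1, 0, -1):
            e_hi = energies[i]
            if e_hi > e_beam:
                continue  # the beam never has this energy inside the sample
            de = e_hi - energies[i - 1]
            total += atomic_fraction * sigma[i] * de / stopping[i]
        return total
    ```

    A yield computed this way can be compared directly with the measured γ-ray yield at each incident proton energy, which is the kind of comparison behind the 7.5% agreement quoted in the abstract.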

  11. Accurate determination of light elements by charged particle activation analysis

    International Nuclear Information System (INIS)

    Shikano, K.; Shigematsu, T.

    1989-01-01

To develop accurate determination of light elements by CPAA, accurate and practical standardization methods and uniform chemical etching are studied, based on the determination of carbon in gallium arsenide using the 12C(d,n)13N reaction. The following results are obtained: (1) the average stopping power method with thick-target yield is useful as an accurate and practical standardization method; (2) the front surface of the sample has to be etched for an accurate estimate of the incident energy; (3) CPAA can be utilized for calibration of light element analysis by physical methods; (4) the calibration factor for carbon analysis in gallium arsenide using the IR method is determined to be (9.2±0.3) x 10^15 cm^-1. (author)

  12. Discontinuous functions in correction procedure for x-ray microanalysis of light elements in inorganic materials

    International Nuclear Information System (INIS)

    Kaminska, M.; Missol, W.

    2002-01-01

A formula for absorption correction was developed and verified by multiplying it by the Love-Cox-Scott atomic number expression using the program NEWKOR and comparing the product with experimental and literature data. The correction error was calculated with reference to measured intensity ratios for 409 analyses of light elements (beryllium, boron, carbon, nitrogen, oxygen, fluorine) as well as 193 analyses of heavy elements (from sodium to uranium). Another computer program (MARCON) has been developed for iterative determination of elemental concentrations in the materials. (author)

  13. Twenty years of analysis of light elements at the LARN

    International Nuclear Information System (INIS)

    Demortier, G.

    1992-01-01

We review the applications of ion beam analysis of light elements performed at the LARN during the last twenty years. The work mainly concerns: helium bubbles in aluminum foils, Li in aluminum alloys, carbon in high-purity MgO crystals and in olivines, nitrogen bubbles in glass and implanted nitrogen in iron and aluminum, oxygen in YBaCuO superconductors, and fluorine in tooth enamel and implanted fluorine in metals. (orig.)

  14. Electronic structure of ternary hydrides based on light elements

    Energy Technology Data Exchange (ETDEWEB)

Orgaz, E.; Membrillo, A.; Castaneda, R. [Departamento de Fisica y Quimica Teorica, Facultad de Quimica, Universidad Nacional Autonoma de Mexico, CP 04510 Coyoacan, Mexico, D.F. (Mexico); E-mail: orgaz@eros.pquim.unam.mx]; Aburto, A. [Departamento de Fisica, Facultad de Ciencias, Universidad Nacional Autonoma de Mexico, CP 04510 Coyoacan, Mexico, D.F. (Mexico)]

    2005-12-08

Ternary hydrides based on light elements are interesting owing to their high available energy density. In this work we focus on the electronic structure of a series of known systems having the general formula AMH4 (A = Li, Na; M = B, Al). We computed the energy bands and the total and partial densities of states using the linear-augmented-plane-wave method. In this report, we discuss the chemical bonding in this series of complex hydrides.

  15. Determination of the Light Element Fraction in MSL APXS Spectra

    Science.gov (United States)

    Perrett, G. M.; Pradler, I.; Campbell, J. L.; Gellert, R.; Leshin, L. A.; Schmidt, M. E.; Team, M.

    2013-12-01

Additional light invisible components (ALICs), measured using the alpha particle X-ray spectrometer (APXS), represent all light elements (e.g. CO3, OH, H2O) present in a sample below Na, excluding bound oxygen. The method for quantifying ALICs was originally developed for the Mars Exploration Rover (MER) APXS (Mallet et al., 2006; Campbell et al., 2008). This method has been applied to data collected by the Mars Science Laboratory (MSL) APXS up to sol 269, using a new terrestrial calibration. ALICs are investigated using the intensity ratio of the Pu L-alpha Compton and Rayleigh scatter peaks (C/R). Peak areas of the scattered X-rays are determined by the GUAPX fitting program. This experimental C/R is compared to a Monte Carlo simulated C/R; the ratio of the simulated and experimental C/R values is called the K-value. ALIC concentrations are calculated by comparing the K-value to the fraction of all invisibles present; the invisible fraction is produced from the spectrum fit by GUAPX. This method is applied to MSL spectra with long integration duration (greater than 3 hours) and with energy resolution better than 180 eV at 5.9 keV. These overnight spectra encompass a variety of geologic materials examined by the Curiosity rover, including volcanic and sedimentary lithologies. Transfer of the K-value calibration produced in the lab to the flight APXS has been completed, and temperature, geometry and spectrum-duration effects have been thoroughly examined. A typical limit of detection for ALICs is around 5 wt%, with uncertainties of approximately 5 wt%. Accurate elemental concentrations are required as input to the Monte Carlo program (Mallet et al., 2006; Lee, 2010). Elemental concentrations are obtained from the GUAPX code using the same long-duration, good-resolution spectra used for determining the experimental C/R ratios (Campbell et al., 2012). Special attention was given to the assessment of Rb, Sr, and Y, as these element peaks overlap the scatter peaks. Mineral effects
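    The C/R and K-value bookkeeping described in this abstract reduces to simple ratios, sketched below as an illustration only. The function names and the spectrum-selection helper are our assumptions; in the real pipeline the peak areas come from GUAPX fits and the simulated C/R from Monte Carlo.

    ```python
    def usable_for_alic(duration_hours, fwhm_ev_at_5p9_kev):
        """Spectrum-selection criteria quoted in the abstract:
        integration longer than 3 h, resolution under 180 eV at 5.9 keV."""
        return duration_hours > 3.0 and fwhm_ev_at_5p9_kev < 180.0

    def k_value(compton_area, rayleigh_area, simulated_cr):
        """K-value: the Monte Carlo simulated C/R divided by the
        experimental C/R formed from the fitted scatter-peak areas."""
        experimental_cr = compton_area / rayleigh_area
        return simulated_cr / experimental_cr
    ```

    A K-value near 1 would indicate the simulated composition already accounts for the scattering; deviations are what the method maps onto the ALIC fraction.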

  16. Contribution to the study of copper and copper-arsenic archaeo-metallurgy using light element analysis and experimental fusion; Contribution a l'etude de la paleometallurgie du cuivre et du cuivre-arsenic a partir de l'analyse des elements legers et de fusions experimentales

    Energy Technology Data Exchange (ETDEWEB)

    Papillon, F

    1997-12-31

The objective of this study is to attempt a direct reconstruction, from ancient artefacts, of the elaboration technology used in the dawning copper metallurgy. This work is based both on light element analysis and on the application of the principles of physical metallurgy. However, the study of an archaeological artefact necessitates the use of non-destructive methods. A main aspect of this work consists in developing the most adequate metallographic technique and the methods for the determination of oxygen and carbon by ion beam analysis. Additionally, experimental melting of copper and copper-arsenic alloys was carried out in the laboratory, under various temperature and atmosphere conditions, and 'in the field' at the Archeodrome de Beaune, in order to reconstruct part of the prehistoric craftsmanship. The measurement results are consistent with our general knowledge of oxido-reduction phenomena, and the behaviour of copper and copper-arsenic alloys is in agreement with the predictions of thermodynamics. The nuclear analysis of three ancient artefacts showed that the oxygen and carbon contents were closer to those of the Archeodrome than to those of the laboratory. Further studies in the field should consider all parameters controlling the physical chemistry of the charcoal fire. (author) 96 refs.

  17. Process to determine light elements content of steel and alloys

    Energy Technology Data Exchange (ETDEWEB)

    Quintella, Cristina M.A.L.T.M.H.; Castro, Martha T.P.O. [Universidade Federal da Bahia (IQ/UFBA), Salvador, BA (Brazil). Inst. de Quimica. LabLaser; Mac-Culloch, Joao N.L.M. [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2009-07-01

The present work reports a process to determine, qualitatively and quantitatively, elements of molar mass below 23 within materials, using X-ray spectra associated with multivariate data analysis (chemometric analysis). The spectra are acquired between 5 keV and 22 keV while the materials are exposed to X radiation. Here the direct determination of carbon content in steel and metallic alloys is reported. The process is more effective when using spectral regions that are not usually exploited. From the analysis of these previously disregarded spectral regions, it was possible to detect light elements with molar mass lower than 23, which have low capacity for absorbing and emitting radiation but high capacity for scattering it. The process reported here has the advantage that the X-ray spectra obtained are calibrated multivariately, showing high potential for development into portable field equipment. (author)

  18. The Effect of Neutrino Oscillations on Supernova Light Element Synthesis

    International Nuclear Information System (INIS)

    Yoshida, Takashi; Kajino, Toshitaka; Yokomakura, Hidekazu; Kimura, Keiichi; Takamura, Akira; Hartmann, Dieter H.

    2006-01-01

We investigate light element synthesis through the ν-process during supernova explosions, taking neutrino oscillations into account, and study the dependence of the 7Li and 11B yields on the neutrino oscillation parameters: the mass hierarchy and θ13. The adopted supernova explosion model for explosive nucleosynthesis corresponds to SN 1987A. The 7Li and 11B yields increase by factors of about 1.9 and 1.3, respectively, in the case of normal mass hierarchy and an adiabatic 13-mixing resonance, compared with the case without neutrino oscillations. In the case of inverted mass hierarchy or a nonadiabatic 13-mixing resonance, the increase in the 7Li and 11B yields is much smaller. Astronomical observations of the 7Li/11B ratio in stars formed in regions strongly affected by prior generations of supernovae would constrain the mass hierarchy and the range of θ13

  19. Pumping characteristics of roots blower pumps for light element gases

    International Nuclear Information System (INIS)

    Hiroki, Seiji; Abe, Tetsuya; Tanzawa, Sadamitsu; Nakamura, Jun-ichi; Ohbayashi, Tetsuro

    2002-07-01

The pumping speed and compression ratio of a two-stage roots blower pumping system were measured for light element gases (H2, D2 and He) and for N2, in order to assess the validity of the ITER torus roughing system as an ITER R&D task (T234). The pumping system, an Edwards EH1200 (nominal pumping speed of 1200 m3/h), two EH250s (ibid. 250 m3/h) and a backing pump (ibid. 100 m3/h) in series connection, was tested under PNEUROP standards. The maximum pumping speeds of the two-stage system for D2 and N2 were 1200 and 1300 m3/h, respectively, at 60 Hz, which satisfied the nominal pumping speed. These experimental data support the design validity of the ITER torus roughing system. (author)

  20. Quantitative analysis of light elements in aerosol samples by PIGE

    International Nuclear Information System (INIS)

    Mateus, R.; Reis, M.A.; Jesus, A.P.; Ribeiro, J.P.

    2006-01-01

Quantitative PIGE analysis of aerosol samples collected on Nuclepore polycarbonate filters was performed by a method that avoids the use of comparative standards. Nuclear cross-sections and calibration parameters established earlier in an extensive work on thick and intermediate samples were employed. For those samples, the excitation functions of nuclear reactions induced by the incident protons on the target's light elements were used as input for a code that evaluates the gamma-ray yield by integrating along the depth of the sample. In the present work we apply the same code to validate the use of an effective energy for thin-sample analysis. Results pertaining to boron, fluorine and sodium concentrations are presented. In order to establish a correlation with the sodium values, PIXE results for chlorine are also presented, supporting the reliability of this PIGE method for thin-film analysis

  1. Automatic measurement system for light element isotope analysis

    International Nuclear Information System (INIS)

    Satake, Hiroshi; Ikegami, Kouichi.

    1990-01-01

An automatic measurement system for light element isotope analysis was developed by installing a specially designed, computer-controlled inlet system. The microcomputer system contains specific interface boards for the inlet system and the mass spectrometer, a Micromass 602 E. All the components of the inlet and computer system are easily available in Japan. A maximum of ten samples can be measured automatically. About 160 minutes are required for 10 measurements of δ18O values of CO2; thus about four samples can be measured per hour using this system, compared with about three per hour with manual operation. The automated analysis system clearly has an advantage over the conventional method. This paper describes the details of the automated system, such as the apparatus used, the control procedure and the corrections applied for reliable measurement. (author)

  2. Neutron absorption spectroscopy for identification of light elements in actinides

    Energy Technology Data Exchange (ETDEWEB)

Hau, I.D. [Lawrence Livermore National Laboratory, Advanced Detector Group, 7000 East Ave., L-270, Livermore, CA 94550 (United States) and Department of Nuclear Engineering, University of California Berkeley, Berkeley, CA 94720 (United States); E-mail: hau2@llnl.gov]; Niedermayr, T.R.; Drury, O.B. [Lawrence Livermore National Laboratory, Advanced Detector Group (United States)]; Burger, A. [Fisk University, 1000 17th Ave. North, Nashville, TN 37208 (United States)]; Bell, Z. [Oak Ridge National Laboratory, 1 Bethel Valley Road, Oak Ridge, TN 37831 (United States)]; Friedrich, S. [Lawrence Livermore National Laboratory, Advanced Detector Group (United States); E-mail: friedrich1@llnl.gov]

    2006-04-15

We are developing cryogenic high-energy-resolution fast-neutron spectrometers using superconducting transition-edge sensors (TES) for nuclear science and non-proliferation applications. Fast neutrons are absorbed in 94% enriched 6LiF single crystals with volumes of ~1 cm3 in the exothermic 6Li(n,α)3H capture reaction. The neutron energy is measured from the subsequent temperature rise with a Mo/Cu multilayer TES. Fast-neutron spectra from a 252Cf source show an energy resolution of 55 keV. Here, we discuss the instrument performance, with emphasis on the identification of light elements in actinide matrices.

  3. Photon Activation Analysis Of Light Elements Using 'Non-Gamma' Radiation Spectroscopy - The Instrumental Determination Of Phosphorus

    International Nuclear Information System (INIS)

    Segebade, Christian; Goerner, Wolf

    2011-01-01

Unlike metal determinations, the analysis of light elements (e.g., carbon, oxygen, phosphorus) is frequently problematic, in particular if performed instrumentally. In photon activation analysis (PAA) the respective activation products do not emit gamma radiation in most cases. Usually, annihilation-quanta counting and subsequent decay-curve analysis have been used for determinations of C, N, O, and F; however, radiochemical separation of the respective radioisotopes is usually indispensable. For several reasons, some of the light elements, e.g. phosphorus, cannot be analysed following this procedure. In this contribution, the instrumental PAA of phosphorus in an organic matrix, by activation with bremsstrahlung from an electron linear accelerator and subsequent beta spectroscopy, is described. The accuracy of the results, as checked by analysis of a BCR Reference Material, was excellent.

  4. IR laser enrichment of light elements isotopes - challenges and prospects

    International Nuclear Information System (INIS)

    Parthasarathy, V.

    2002-01-01

Full text: Infrared multiple photon dissociation (IR MPD) of polyatomic molecules has made considerable progress since its discovery in the early seventies. Since the process was found to be isotopically selective, the possibility of laser isotope separation (LIS) created a lot of initial excitement. While the early investigations were concerned with the fundamental dynamics and potential applications of the phenomenon, serious efforts towards an isotope enrichment process have been made only during the last decade. These efforts have focussed on improving both the enrichment factor and the throughput in various systems. Many research groups have achieved a good measure of success in scaling up the process for light elements such as carbon, oxygen, silicon and sulphur, whose isotopes are quite important in medicine and technology. Significant results have been reported especially for the separation of carbon isotopes, for which macroscopic operating scales have already been realised. This talk will give a summary of our work carried out at BARC and highlight the current efforts to scale up the process for carbon isotope enrichment. This includes the design aspects of a large photochemical reactor with multi-pass, refocusing optics for efficient photon utilization. It will also cover the development of a cryogenic distillation set-up and a preparative gas chromatograph for large-scale separation and collection of the isotopically enriched photoproduct in the post-irradiation stage. Based on the experience gained and the infrastructure developed, plans are afoot to separate oxygen and sulphur isotopes using a similar approach

  5. Standard Cosmic Ray Energetics and Light Element Production

    CERN Document Server

    Fields, B D; Cassé, M; Vangioni-Flam, E; Fields, Brian D.; Olive, Keith A.; Casse, Michel; Vangioni-Flam, Elisabeth

    2001-01-01

    The recent observations of Be and B in metal-poor stars have led to a reassessment of the origin of the light elements in the early Galaxy. At low metallicity ([O/H] < -1.75), it is necessary to introduce a production mechanism which is independent of the interstellar metallicity (primary). At higher metallicities, existing data might indicate that secondary production is dominant. In this paper, we focus on the secondary process, related to the standard Galactic cosmic rays, and we examine the cosmic ray energy requirements for both present and past epochs. We find the power input to maintain the present-day Galactic cosmic ray flux is about 1.5e41 erg/s = 5e50 erg/century. This implies that, if supernovae are the sites of cosmic ray acceleration, the fraction of explosion energy going to accelerated particles is about 30%, a value which we obtain consistently both from considering the present cosmic ray flux and confinement and from the present 9Be and 6Li abundances. Using the abundances of 9Be (an...
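The two power figures quoted in the abstract are the same number in different units, which a one-line conversion confirms (the 1.5e41 erg/s value is taken from the abstract above):

```python
# Unit-conversion check of the quoted Galactic cosmic-ray power:
# 1.5e41 erg/s should correspond to roughly 5e50 erg/century.
ERG_PER_S = 1.5e41
SECONDS_PER_CENTURY = 100 * 365.25 * 24 * 3600  # ~3.16e9 s

erg_per_century = ERG_PER_S * SECONDS_PER_CENTURY
print(f"{erg_per_century:.1e} erg/century")
```

The product is about 4.7e50 erg/century, consistent with the rounded 5e50 figure in the text.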

  6. Light element opacities of astrophysical interest from ATOMIC

    Energy Technology Data Exchange (ETDEWEB)

    Colgan, J.; Kilcrease, D. P.; Magee, N. H. Jr.; Armstrong, G. S. J.; Abdallah, J. Jr.; Sherrill, M. E. [Theoretical Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Fontes, C. J.; Zhang, H. L.; Hakel, P. [Computational Physics Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States)

    2013-07-11

    We present new calculations of local-thermodynamic-equilibrium (LTE) light element opacities from the Los Alamos ATOMIC code for systems of astrophysical interest. ATOMIC is a multi-purpose code that can generate LTE or non-LTE quantities of interest at various levels of approximation. Our calculations, which include fine-structure detail, represent a systematic improvement over previous Los Alamos opacity calculations using the LEDCOP legacy code. The ATOMIC code uses ab-initio atomic structure data computed from the CATS code, which is based on Cowan's atomic structure codes, and photoionization cross section data computed from the Los Alamos ionization code GIPPER. ATOMIC also incorporates a new equation-of-state (EOS) model based on the chemical picture. ATOMIC incorporates some physics packages from LEDCOP and also includes additional physical processes, such as improved free-free cross sections and additional scattering mechanisms. Our new calculations are made for elements of astrophysical interest and for a wide range of temperatures and densities.

  7. A system for on-line monitoring of light element concentration distributions in thin samples

    Energy Technology Data Exchange (ETDEWEB)

    Brands, P.J.M. E-mail: p.j.m.brands@tue.nl; Mutsaers, P.H.A.; Voigt, M.J.A. de

    1999-09-02

    At the Cyclotron Laboratory, a scanning proton microprobe is used to determine concentration distributions in biomedical samples. The data acquired in these measurements used to be analysed in a time-consuming off-line analysis. To avoid the loss of valuable measurement and analysis time, DYANA, an on-line method for the analysis of data from biomedical measurements, was developed. By using a database of background shapes, light elements such as Na and Mg can be fitted even more precisely than with conventional fitting procedures. The entire analysis takes only a few seconds and is performed while the acquisition system is gathering a new subset of data. Data acquisition must be guaranteed and may not be interfered with by other parallel processes. Therefore, the analysis, the data acquisition and the experiment control are performed on a PCI-based Pentium personal computer (PC) running a real-time operating system. A second PC is added to run a graphical user interface for interaction with the experimenter and for monitoring of the analysed results. The system is illustrated here using atherosclerotic tissue but is applicable to all kinds of thin samples.

  8. Evaporation Loss of Light Elements as a Function of Cooling Rate: Logarithmic Law

    Science.gov (United States)

    Xiong, Yong-Liang; Hewins, Roger H.

    2003-01-01

    Knowledge of the evaporation loss of light elements is important to our understanding of chondrule formation processes. The evaporative loss of light elements (such as B and Li) as a function of cooling rate is of special interest because recent investigations of the distribution of Li, Be and B in meteoritic chondrules have revealed that Li varies by a factor of 25, and B and Be vary by about a factor of 10. Therefore, if we could extrapolate and interpolate with confidence the evaporation loss of B and Li (and other light elements such as K and Na) over the wide range of cooling rates of interest based upon limited experimental data, we would be able to assess the full range of scenarios relating to chondrule formation processes. Here, we propose that the evaporation loss of light elements as a function of cooling rate should obey a logarithmic law.
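A logarithmic law of this kind can be sketched as follows. The functional form (retained fraction linear in log cooling rate, clipped to physical bounds) and the coefficients `a` and `b` are purely illustrative assumptions of ours, not values or notation from the abstract:

```python
import numpy as np

def retained_fraction(cooling_rate, a, b):
    """Hypothetical logarithmic law: fraction of a volatile light element
    retained as a function of cooling rate (e.g. K/h). The constants a and b
    would have to be calibrated against evaporation experiments."""
    return np.clip(a + b * np.log10(cooling_rate), 0.0, 1.0)

# Illustrative parameters only: a slowly cooled chondrule (10 K/h) loses
# more of a volatile element than one quenched at 1000 K/h.
rates = np.array([10.0, 100.0, 1000.0])
vals = retained_fraction(rates, 0.2, 0.25)
```

Under such a law, interpolation and extrapolation across cooling rates reduce to a straight line in log space, which is exactly what makes it convenient for bracketing chondrule formation scenarios from limited data.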

  9. Elastic recoil atomic spectroscopy of light elements with sub-nanometer depth resolution

    International Nuclear Information System (INIS)

    Kosmata, Marcel

    2011-01-01

    heavy ion irradiation. It is shown that at the energies used both electronic sputtering and electronically induced interface mixing occur. Electronic sputtering is minimised by using optimised beam parameters. For most samples the effect is below the detection limit for a fluence sufficient for the analysis. However, the influence of interface mixing is so strong that it has to be included in the analysis of the depth profiles of the layers. It is concluded from these studies that at the Rossendorf 5 MV tandem accelerator chlorine ions with an energy of 20 MeV deliver the best results. In some cases, such as the analysis of boron, the energy must be reduced to 6.5 MeV in order to keep the electronic sputtering below the detection limit. The fourth focus is the study of the influence of specific sample properties, such as surface roughness, on the shape of the measured energy spectra and, correspondingly, on the analysed depth profile. It is shown that knowledge of the roughness of a sample at the surface and at the interfaces is needed for the analysis. In addition, the parameters limiting the depth resolution are calculated and compared with conventional ion beam analysis. Finally, a comparison is made between the high-resolution ion beam analysis and complementary methods published by other research groups. The fifth and last focus is the analysis of light elements in ultra-thin layers. All models presented in this thesis to reduce the influence of beam damage are taken into account. The dynamic non-equilibrium charge state is also included in the quantification of elements. Depth profiling of multilayer systems is demonstrated for systems consisting of SiO2-Si3N4Ox-SiO2 on silicon, boron implantation profiles for ultra-shallow junctions, and ultra-thin oxide layers such as those used as high-k materials.

  10. Contribution to the analysis of light elements using x fluorescence excited by radio-elements

    International Nuclear Information System (INIS)

    Robert, A.

    1964-01-01

    In order to study the possibilities of using radioactive sources for the X-fluorescence analysis of light elements, the principle is given, after a brief description of X-fluorescence, of the excitation of this phenomenon by X, β and α emission from radio-elements. The operation and use of the proportional gas counter for X-ray detection is described. A device has been studied for analysing the elements of the 2nd and 3rd periods of the Mendeleev table. It makes it possible to excite the fluorescence with a radioactive source emitting X-rays or α particles; the X-ray fluorescence penetrates into a window-less proportional counter, this being made possible by the use of an auxiliary electric field in the neighbourhood of the sample. The gas detection pressure leading to the maximum detection yield is given. The spectra are given for the Kα lines of 3rd-period elements excited by 55Fe, 3H/Zr and 210Po sources, and, for the 2nd period, the Kα spectra of carbon and of fluorine excited by the α particles of 210Po. (author) [fr

  11. Summary report of the consultants' meeting on improvement of the standard cross sections for light elements

    International Nuclear Information System (INIS)

    Carlson, A.D.; Muir, D.W.; Pronyaev, V.G.

    2001-06-01

    This report summarizes the results of the Consultants' Meeting on Improvement of the Standard Cross Sections for Light Elements. The approaches and computer programs used for evaluation of neutron standard cross sections and their uncertainties were presented by the participants. Special attention was paid to the reasons for strong uncertainty reduction observed in the model fits. The meeting participants discussed the plan of the INDC recommended Co-ordinated Research Project (CRP) on 'Improvement of the Standard Cross Sections for Light Elements'. This CRP will address the problem of uncertainty reduction along with other methodological improvements needed in order to produce a new, and internationally accepted, evaluation of neutron standard cross sections for light elements. (author)

  12. Experimental constraints on light elements in the Earth’s outer core

    OpenAIRE

    Youjun Zhang; Toshimori Sekine; Hongliang He; Yin Yu; Fusheng Liu; Mingjian Zhang

    2016-01-01

    Earth's outer core is liquid and dominantly composed of iron and nickel (~5-10 wt%). Its density, however, is ~8% lower than that of liquid iron, and requires the presence of a significant amount of light element(s). A good way to specify the light element(s) is a direct comparison of density and sound velocity measurements between seismological data and those of possible candidate compositions at the core conditions. We report the sound velocity measurements of a model core composition in th...

  13. The light element formation: a signature of high energy nuclear astrophysics

    International Nuclear Information System (INIS)

    Audouze, J.; Meneguzzi, M.; Reeves, H.

    1976-01-01

    Light elements D, 6Li, 9Be, 10B and 11B (and possibly also 7Li) are not produced by the general nucleosynthetic processes occurring in stars. They appear to be synthesized by high-energy processes occurring either during the interaction of galactic cosmic rays with the interstellar medium or in supernova envelopes. These formation processes are discussed. It is emphasized that the most coherent scenario for the formation of the light elements is obtained by also taking into account the nuclear processes which may have occurred during hot phases of the early Universe (Big Bang). Implications for the chemical evolution of galaxies and for cosmology are briefly recalled. (Auth.)

  14. Testing the Big Bang: Light elements, neutrinos, dark matter and large-scale structure

    Science.gov (United States)

    Schramm, David N.

    1991-01-01

    Several experimental and observational tests of the standard cosmological model are examined. In particular, a detailed discussion is presented regarding: (1) nucleosynthesis, the light element abundances, and neutrino counting; (2) the dark matter problems; and (3) the formation of galaxies and large-scale structure. Comments are made on the possible implications of the recent solar neutrino experimental results for cosmology. An appendix briefly discusses the 17 keV thing and the cosmological and astrophysical constraints on it.

  15. Some studies on light-element analysis with particle accelerators in South Africa

    International Nuclear Information System (INIS)

    Peisach, M.; Pillay, A.E.

    1993-01-01

    The analysis of elements in the range 1 ≤ Z ≤ is often difficult with most contemporary methods. The practical utility of various ion-beam techniques for light-element analysis is examined, and a comprehensive evaluation of the major analytical aspects of determinations of this nature, with particular reference to work done in South Africa, is provided. 40 refs., 8 figs., 3 tabs

  16. PIXE and light element analysis (C,N) in glass inclusions trapped in meteorites with the nuclear microprobe

    International Nuclear Information System (INIS)

    Varela, M.E.; Mosbah, M.; Metrich, N.; Duraud, J.P.; Kurat, G.

    1999-01-01

    Proton-induced X-ray emission (PIXE) and light element analyses have been performed with the nuclear microprobe at the Laboratoire Pierre Süe (Saclay, France) in glass inclusions of the carbonaceous chondrites Allende, Kaba and Renazzo, and in the achondrite meteorite Chassigny. Carbon contents in olivine of chondrules are below the nuclear reaction analysis (NRA) detection limit; however, glasses from glass inclusions hosted by these grains contain appreciable and highly variable quantities of carbon (200-1600 ppm). This could indicate variable amounts of C trapped during glass inclusion formation. On the other hand, nitrogen is present in highly variable amounts in glasses of both chondrite and achondrite minerals. Its abundance correlates with depth from the section surface, which suggests loss of N during analyses and therefore the possible existence of a very mobile (volatile?) species. Chondritic Rb/Sr and K/Rb ratios obtained by PIXE analyses in the glass-bearing inclusions of the Chassigny meteorite point towards a primitive source for the glass precursor of Chassigny inclusions

  17. Testing the big bang: Light elements, neutrinos, dark matter and large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, D.N. (Chicago Univ., IL (United States) Fermi National Accelerator Lab., Batavia, IL (United States))

    1991-06-01

    In this series of lectures, several experimental and observational tests of the standard cosmological model are examined. In particular, detailed discussion is presented regarding nucleosynthesis, the light element abundances and neutrino counting; the dark matter problems; and the formation of galaxies and large-scale structure. Comments will also be made on the possible implications of the recent solar neutrino experimental results for cosmology. An appendix briefly discusses "the 17 keV thing" and the cosmological and astrophysical constraints on it. 126 refs., 8 figs., 2 tabs.

  18. Study on the state of a surface of compounds of vanadium with light elements

    International Nuclear Information System (INIS)

    Povstugar, V.I.; Mikhajlova, S.S.; Trapeznikov, V.A.

    1976-01-01

    A roentgenoelectron (X-ray photoelectron) study of powder-like compounds of vanadium with light elements (C, N, O, S) was carried out in the temperature range of 70-500 deg C. The results were obtained with an electron magnetic spectrometer. Spectra of the inner levels O 1s and V 2p and of the valence bands are presented. The experimental results can be employed in studying problems of the synthesis of the given class of compounds. Owing to the high surface activity, the study of the catalytic properties of finely dispersed vanadium compounds by the roentgenoelectron spectroscopy method gives much information about surface processes.

  19. Influence of precipitating light elements on stable stratification below the core/mantle boundary

    Science.gov (United States)

    O'Rourke, J. G.; Stevenson, D. J.

    2017-12-01

    Stable stratification below the core/mantle boundary is often invoked to explain anomalously low seismic velocities in this region. Diffusion of light elements like oxygen or, more slowly, silicon could create a stabilizing chemical gradient in the outermost core. Heat flow less than that conducted along the adiabatic gradient may also produce thermal stratification. However, reconciling either origin with the apparent longevity (>3.45 billion years) of Earth's magnetic field remains difficult. Sub-isentropic heat flow would not drive a dynamo by thermal convection before the nucleation of the inner core, which likely occurred less than one billion years ago and did not instantly change the heat flow. Moreover, an oxygen-enriched layer below the core/mantle boundary—the source of thermal buoyancy—could establish double-diffusive convection where motion in the bulk fluid is suppressed below a slowly advancing interface. Here we present new models that explain both stable stratification and a long-lived dynamo by considering ongoing precipitation of magnesium oxide and/or silicon dioxide from the core. Lithophile elements may partition into iron alloys under extreme pressure and temperature during Earth's formation, especially after giant impacts. Modest core/mantle heat flow then drives compositional convection—regardless of thermal conductivity—since their solubility is strongly temperature-dependent. Our models begin with bulk abundances for the mantle and core determined by the redox conditions during accretion. We then track equilibration between the core and a primordial basal magma ocean followed by downward diffusion of light elements. Precipitation begins at a depth that is most sensitive to temperature and oxygen abundance and then creates feedbacks with the radial thermal and chemical profiles. Successful models feature a stable layer with low seismic velocity (which mandates multi-component evolution since a single light element typically

  20. Transportation of natural radionuclides and rare earth light elements in the lagoon system of Buena, RJ

    International Nuclear Information System (INIS)

    Lauria, Dejanira da Costa

    1999-03-01

    The transport of the natural-series radionuclides and of the rare earth light elements in a coastal lagoon system, located in a monazite-rich region on the northern coast of Rio de Janeiro state, was investigated. The lagoon water showed anomalous concentrations of radium isotopes and of the rare earth light elements (ERLEs). The longitudinal gradients of the Ra, ERLE and major-ion concentrations, whose data were obtained during two and a half years of research at the site, together with statistical analysis, pointed to two main sources as responsible for the lagoon water composition: marine water and groundwater. The groundwater supplies the radionuclides and ERLEs, possibly originating from monazite leaching. Based on water speciation modelling, the results of laboratory experiments on adsorption onto sediment, and the sediment characterization, the behaviour of the radium isotopes, the ERLEs, U, Th and Pb-210 along the lagoon is discussed. The role of the aquatic macrophyte Typha domingensis Pers. in nuclide uptake and subsequent release is also discussed. (author)

  1. Excitation Functions for Charged Particle Induced Reactions in Light Elements at Low Projectile Energies

    International Nuclear Information System (INIS)

    Lorenzen, J.; Brune, D.

    1973-01-01

    The present chapter has been formulated with the aim of making it useful in various fields of nuclear applications, with emphasis on charged particle activation analysis. Activation analysis of light elements using charged particles has proved to be an important tool in solving various problems in analytical chemistry, e.g. those associated with metal surfaces. Scientists desiring to evaluate the distribution of light elements in the surface of various matrices using charged particle reactions require accurate data on cross sections in the MeV region. A knowledge of cross-section data and yield functions is of great interest in many applied fields involving work with charged particles, such as radiological protection and health physics, materials research, semiconductor material investigations and corrosion chemistry. The authors therefore decided to collect a limited number of data which find use in these fields. Although the compilation is far from complete, it is expected to be of assistance in devising measurements of charged particle reactions in Van de Graaff or other low-energy accelerators

  2. Light element abundances in a matter-antimatter model of the universe

    International Nuclear Information System (INIS)

    Aly, J.J.

    1978-01-01

    This paper is devoted to the problem of light element synthesis in a baryon-symmetric Big Bang cosmology, in which the universe is constituted at the end of the leptonic era by a nucleon-antinucleon emulsion. If the initial typical size of the matter or antimatter regions is sufficiently large to avoid significant neutron annihilation, nucleosynthesis can proceed in this kind of model in the same way as in the conventional Big Bang. But the abundances of the created light elements can be modified at a later time by interaction of the nuclei with the high-energy particles and photons resulting from annihilation. In this article, we consider two specific mechanisms able to change the abundances: a 4He 'nucleodisruption' process (proposed by Combes et al., 1975), which leads to deuterium production, and 4He photodisintegration by annihilation γ-rays, which leads to an increase of the 3He and D production. General relations are established which allow one to compute the abundances of the elements so created when the size l of the matter or antimatter regions and the annihilation rate are given as functions of time. These relations are applied to the Omnes model, in which the size l grows by a coalescence mechanism. It is shown that in this model the D and 3He abundances are much greater than the limits on primordial abundances deduced from present observations. (orig.) [de

  3. Excitation Functions for Charged Particle Induced Reactions in Light Elements at Low Projectile Energies

    Energy Technology Data Exchange (ETDEWEB)

    Lorenzen, J; Brune, D

    1973-07-01

    The present chapter has been formulated with the aim of making it useful in various fields of nuclear applications, with emphasis on charged particle activation analysis. Activation analysis of light elements using charged particles has proved to be an important tool in solving various problems in analytical chemistry, e.g. those associated with metal surfaces. Scientists desiring to evaluate the distribution of light elements in the surface of various matrices using charged particle reactions require accurate data on cross sections in the MeV region. A knowledge of cross-section data and yield functions is of great interest in many applied fields involving work with charged particles, such as radiological protection and health physics, materials research, semiconductor material investigations and corrosion chemistry. The authors therefore decided to collect a limited number of data which find use in these fields. Although the compilation is far from complete, it is expected to be of assistance in devising measurements of charged particle reactions in Van de Graaff or other low-energy accelerators

  4. Identification of light elements in silicon nitride by aberration-corrected scanning transmission electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Idrobo, Juan C., E-mail: idrobojc@ornl.gov [Materials Science and Technology Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Walkosz, Weronika [Materials Science Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Department of Physics, University of Illinois at Chicago, Chicago, IL 60607 (United States); Klie, Robert F.; Oeguet, Serdar [Department of Physics, University of Illinois at Chicago, Chicago, IL 60607 (United States)

    2012-12-15

    In silicon nitride structural ceramics, the overall mechanical and thermal properties are controlled by the atomic and electronic structures at the interface between the ceramic grains and the amorphous intergranular films (IGFs) formed by various sintering additives. In the last ten years the atomic arrangements of heavy elements (rare earths) at the Si3N4/IGF interfaces have been resolved. However, the atomic positions of light elements, without which it is not possible to obtain a complete description of the interfaces, have been lacking. This review article details the authors' efforts to identify the atomic arrangement of light elements such as nitrogen and oxygen at the Si3N4/SiO2 interface and in bulk Si3N4 using aberration-corrected scanning transmission electron microscopy. Highlights: • Revealing the atomic structure of the α-Si3N4/SiO2 interface. • Identification and lattice location of oxygen impurities in bulk α-Si3N4. • Short-range ordering of nitrogen and oxygen at the β-Si3N4/SiO2 interface.

  5. PREFACE: Light element atom, molecule and radical behaviour in the divertor and edge plasma regions

    Science.gov (United States)

    Braams, Bastiaan J.; Chung, Hyun-Kung

    2015-01-01

    This volume of Journal of Physics: Conference Series contains contributions by participants in an International Atomic Energy Agency (IAEA) Coordinated Research Project (CRP) on "Light element atom, molecule and radical behaviour in the divertor and edge plasma regions" (in magnetic fusion devices). Light elements are the dominant impurity species in fusion experiments and in the near-wall plasma they occur as atoms or ions and also as hydrides and other molecules and molecular ions. Hydrogen (H or D, and T in a reactor) is the dominant species in fusion experiments, but all light elements from He to O, as well as Ne, are of interest for various reasons. Helium is a product of the D+T fusion reaction and is introduced in experiments for transport studies. Lithium is used for wall coating and also as a beam diagnostic material. Beryllium is foreseen as a wall material for the ITER experiment and is used on the Joint European Torus (JET) experiment. Boron may be used as a coating material for the vessel walls. Carbon (graphite or carbon-fiber composite) is often used as the target material for wall regions subject to high heat load. Nitrogen may be used as a buffer gas for edge plasma cooling. Oxygen is a common impurity in experiments due to residual water vapor. Finally, neon is another choice as a buffer gas. Data for collisional and radiative processes involving these species are important for plasma modelling and for diagnostics. The participants in the CRP met three times over the years 2009-2013 for a research coordination meeting. Reports and presentation materials for these meetings are available through the web page on coordinated research projects of the (IAEA) Atomic and Molecular Data Unit [1]. Some of the numerical data generated in the course of the CRP are available through the ALADDIN database [2]. The IAEA takes the opportunity to thank the participants in the CRP for their dedicated efforts in the course of the CRP and for their contributions to this volume.

  6. Desorption of large molecules with light-element clusters: Effects of cluster size and substrate nature

    Energy Technology Data Exchange (ETDEWEB)

    Delcorte, Arnaud, E-mail: arnaud.delcorte@uclouvain.be [Institute of Condensed Matter and Nanosciences - Bio and Soft Matter, Universite catholique de Louvain, Croix du Sud, 1 bte 3, B-1348 Louvain-la-Neuve (Belgium); Garrison, Barbara J. [Department of Chemistry, Penn State University, University Park, PA 16802 (United States)

    2011-07-15

    This contribution focuses on the conditions required to desorb a large hydrocarbon molecule using light-element clusters. The test molecule is a 7.5 kDa coil of polystyrene (PS61). Several projectiles are compared, from C60 to 110 kDa organic droplets, and two substrates are used, amorphous polyethylene and mono-crystalline gold. Different aiming points and incidence angles are examined. Under specific conditions, 10 keV nanodrops can desorb PS61 intact from a gold substrate and from a soft polyethylene substrate. The prevalent mechanism for the desorption of intact and 'cold' molecules is one in which the molecules are washed away by the projectile constituents and entrained in their flux, with an emission angle close to ~70°. The effects of the different parameters on the dynamics and the underlying physics are discussed in detail and the predictions of the model are compared with other published studies.

  7. Desorption of large molecules with light-element clusters: Effects of cluster size and substrate nature

    International Nuclear Information System (INIS)

    Delcorte, Arnaud; Garrison, Barbara J.

    2011-01-01

    This contribution focuses on the conditions required to desorb a large hydrocarbon molecule using light-element clusters. The test molecule is a 7.5 kDa coil of polystyrene (PS61). Several projectiles are compared, from C 60 to 110 kDa organic droplets and two substrates are used, amorphous polyethylene and mono-crystalline gold. Different aiming points and incidence angles are examined. Under specific conditions, 10 keV nanodrops can desorb PS61 intact from a gold substrate and from a soft polyethylene substrate. The prevalent mechanism for the desorption of intact and 'cold' molecules is one in which the molecules are washed away by the projectile constituents and entrained in their flux, with an emission angle close to ∼70 deg. The effects of the different parameters on the dynamics and the underlying physics are discussed in detail and the predictions of the model are compared with other published studies.

  8. Development of SEM/STEM-WDX for highly sensitive detection of light elements

    Science.gov (United States)

    Anan, Y.; Koguchi, M.; Kimura, T.; Sekiguchi, T.

    2018-02-01

    In this study, to detect the light element lithium (Li) and low-dose boron (B) in local areas at the nm scale, we developed an analytical electron microscope equipped with an improved serial (S)-type WDX (wavelength-dispersive X-ray spectroscopy) system. In detail, to detect Li, we developed a high-conductivity multi-capillary X-ray (MCX) lens and a diffractor with a lattice spacing (d) of 15 nm and a spacing variation (δd) of 0.8 nm. Moreover, to detect low-dose boron, we designed a high-conductivity MCX lens based on the soft X-ray reflectivity in the capillary and on calculation. We developed a large-solid-angle MCX lens whose conductivity for the characteristic X-rays of B became 20 times higher than that of an MCX lens with a 30-mm focal length. The developed analytical electron microscope was applied to a LiAl specimen and to a low-B-doped Si substrate specimen, and its performance was evaluated. As a result, it could detect the characteristic X-rays of Li with a minimum mass fraction (MMF) of 8.4 atomic % (at.%). The energy resolution was 1 eV at 55 eV. From measurements of the line profile of B for the unpatterned B-implantation area on a B-doped Si substrate specimen, the measured line profile data were in good agreement with secondary ion mass spectrometry data up to a depth of 100 nm at a B concentration of 0.05 at.%.
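The need for such a large-spacing diffractor follows directly from the Bragg condition. A minimal sketch (the ~55 eV line energy and the 15 nm spacing are taken from the abstract; the helper function and the conventional-crystal comparison are our own illustration):

```python
import math

HC_EV_NM = 1239.842  # photon energy-wavelength product hc, in eV·nm

def bragg_angle_deg(energy_ev, d_nm, order=1):
    """First-order Bragg angle for a diffractor of spacing d.
    Returns None when the wavelength exceeds 2d (no diffraction possible)."""
    wavelength_nm = HC_EV_NM / energy_ev
    s = order * wavelength_nm / (2.0 * d_nm)
    if s > 1.0:
        return None
    return math.degrees(math.asin(s))

# Li K X-rays near 55 eV have a wavelength of ~23 nm, so a conventional
# crystal (d of order 1 nm, i.e. 2d << 23 nm) cannot diffract them,
# while the d = 15 nm synthetic multilayer described above can.
theta = bragg_angle_deg(55.0, 15.0)
```

With d = 15 nm the 55 eV radiation diffracts at roughly 49°, comfortably inside the mechanical range of a WDX spectrometer, which is the design point of the large-d diffractor.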

  9. Massive black holes and light-element nucleosynthesis in a baryonic universe

    Science.gov (United States)

    Gnedin, Nickolay Y.; Ostriker, Jeremiah P.; Rees, Martin J.

    1995-01-01

    We reexamine the model proposed by Gnedin & Ostriker (1992) in which Jeans-mass black holes (M_BH ≈ 10^6 solar masses) form shortly after decoupling. There is no nonbaryonic dark matter in this model, but we examine the possibility that Ω_b is considerably larger than given by normal nucleosynthesis. Here we allow for the fact that much of the high baryon-to-photon-ratio material will collapse, leaving the universe of remaining material with light-element abundances more in accord with the residual baryonic density (≈ 10^-2) than with Ω_0 and the initial baryonic density (≈ 10^-1). We find that no reasonable model can be made with random-phase density fluctuations, if the power on scales smaller than 10^6 solar masses is as large as expected. However, phase-correlated models of the type that might occur in connection with topological singularities can be made with Ω_b h^2 = 0.013 ± 0.001, 0.15 ≲ Ω_0 ≲ 0.4, which are either flat (Ω_Λ = 1 - Ω_0) or open (Ω_Λ = 0) and which satisfy all the observational constraints which we apply, including the large baryon-to-total-mass ratio found in the X-ray clusters. The remnant baryon density is thus close to that obtained in the standard picture (Ω_b h^2 = 0.0125 ± 0.0025; Walker et al. 1991). The spectral index implied for fluctuations in the baryonic isocurvature scenario, -1 < m < 0, is in the range expected from other arguments based on large-scale structure and microwave fluctuation constraints. The dark matter in this picture is in the form of massive black holes. Accretion onto them at early epochs releases high-energy photons which significantly heat and reionize the universe. But photodissociation does not materially change light-element abundances.
A typical model gives ȳ ≈ 1 × 10^-5

  10. LIGHT-ELEMENT ABUNDANCE VARIATIONS AT LOW METALLICITY: THE GLOBULAR CLUSTER NGC 5466

    International Nuclear Information System (INIS)

    Shetrone, Matthew; Martell, Sarah L.; Wilkerson, Rachel; Adams, Joshua; Siegel, Michael H.; Smith, Graeme H.; Bond, Howard E.

    2010-01-01

    We present low-resolution (R ≈ 850) spectra for 67 asymptotic giant branch (AGB), horizontal branch, and red giant branch (RGB) stars in the low-metallicity globular cluster NGC 5466, taken with the VIRUS-P integral-field spectrograph at the 2.7 m Harlan J. Smith telescope at McDonald Observatory. Sixty-six stars are confirmed, and one rejected, as cluster members based on radial velocity, which we measure to an accuracy of 16 km s⁻¹ via template-matching techniques. CN and CH band strengths have been measured for 29 RGB and AGB stars in NGC 5466, and the band-strength indices measured from VIRUS-P data show close agreement with those measured from Keck/LRIS spectra previously taken for five of our target stars. We also determine carbon abundances from comparisons with synthetic spectra. The RGB stars in our data set cover a range in absolute V magnitude from +2 to -3, which permits us to study the rate of carbon depletion on the giant branch as well as the point of its onset. The data show a clear decline in carbon abundance with rising luminosity above the luminosity function 'bump' on the giant branch, and also a subdued range in CN band strength, suggesting ongoing internal mixing in individual stars but minor or no primordial star-to-star variation in light-element abundances.
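The radial velocities above come from template matching. As a hedged illustration of the core idea (synthetic single-line spectra with invented wavelengths and velocity, not the NGC 5466 data or the authors' pipeline): on a logarithmic wavelength grid a Doppler shift becomes a uniform translation, so the cross-correlation lag against a template gives v = c·Δln λ.

```python
import numpy as np

# Template-matching radial velocity in miniature.  All numbers are synthetic
# and illustrative; the grid, line depth, and true velocity are assumptions.
C_KMS = 299_792.458
loglam = np.linspace(np.log(5000.0), np.log(5200.0), 4000)
dloglam = loglam[1] - loglam[0]

# A single Gaussian absorption line on a unit continuum.
line = lambda center: 1.0 - 0.5 * np.exp(-0.5 * ((loglam - center) / 2e-4) ** 2)
template = line(np.log(5100.0))
v_true = 120.0                                   # km/s, synthetic input
observed = line(np.log(5100.0) + v_true / C_KMS)

# Cross-correlate continuum-subtracted spectra over integer pixel lags.
lags = np.arange(-50, 51)
cc = [np.dot(observed - 1.0, np.roll(template - 1.0, lag)) for lag in lags]
best = int(lags[int(np.argmax(cc))])
v_measured = best * dloglam * C_KMS              # quantized to the pixel grid
```

The one-pixel quantization of the lag (about 3 km s⁻¹ on this grid) illustrates why the achievable velocity accuracy depends on grid resolution and line content.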

  11. Simulation of bonding effects in HRTEM images of light element materials

    Directory of Open Access Journals (Sweden)

    Simon Kurasch

    2011-07-01

    Full Text Available The accuracy of multislice high-resolution transmission electron microscopy (HRTEM simulation can be improved by calculating the scattering potential using density functional theory (DFT. This approach accounts for the fact that electrons in the specimen are redistributed according to their local chemical environment. This influences the scattering process and alters the absolute and relative contrast in the final image. For light element materials with well defined geometry, such as graphene and hexagonal boron nitride monolayers, the DFT based simulation scheme turned out to be necessary to prevent misinterpretation of weak signals, such as the identification of nitrogen substitutions in a graphene network. Furthermore, this implies that the HRTEM image does not only contain structural information (atom positions and atomic numbers. Instead, information on the electron charge distribution can be gained in addition.In order to produce meaningful results, the new input parameters need to be chosen carefully. Here we present details of the simulation process and discuss the influence of the main parameters on the final result. Furthermore we apply the simulation scheme to three model systems: A single atom boron and a single atom oxygen substitution in graphene and an oxygen adatom on graphene.

  12. Light element production by cosmological cosmic rays and the gamma-ray background

    International Nuclear Information System (INIS)

    Montmerle, T.

    1977-01-01

    This paper examines the view that the 1-100 MeV γ-ray background is of cosmological origin, and is produced by high-energy collisions in a burst at high redshifts (approximately 100) between cosmic rays and the ambient gas, as suggested by Stecker (1969). To test this 'cosmological cosmic-ray (CCR) hypothesis', use is made of the fact that, simultaneously, low-energy interactions give birth to the light elements D, ³He, ⁶Li, ⁷Li and ⁷Be. Their resulting abundances are calculated by normalizing the CCR flux to the observed γ-ray background. Since this process can reproduce the correct (observed) ⁷Li abundance, which is otherwise as yet unexplained, it is of interest to discuss the various uncertainties involved in the calculations. Among these, the spread of the present γ-ray data, especially between 1 and approximately 10 MeV, is a major uncertainty, and emphasis is put on its influence on the results and, as a consequence, on the validity of the CCR hypothesis.

  13. The α-induced thick-target γ-ray yield from light elements

    Energy Technology Data Exchange (ETDEWEB)

    Heaton, R. K. [Queen's Univ., Kingston, ON (Canada). Dept. of Physics

    1994-10-01

    The α-induced thick-target γ-ray yield from light elements has been measured in the energy range 5.6 MeV ≤ Eα ≤ 10 MeV. The γ-ray yields above 2.1 MeV from thick targets of beryllium, boron nitride, sodium fluoride, magnesium, aluminum and silicon were measured using the α-particle beam from the Lawrence Berkeley Laboratory 88-inch cyclotron. The elemental yields from this experiment were used to construct the α-induced direct-production γ-ray spectrum from materials in the SNO detector, a large-volume ultra-low-background neutrino detector located in the Creighton mine near Sudbury, Canada. This background source was an order of magnitude lower than predicted by previous calculations. These measurements are in good agreement with theoretical calculations of this spectrum based on a statistical nuclear model of the reaction, with the gross high-energy spectral structure being reproduced to within a factor of two. Detailed comparisons of the experimental and theoretical excitation population distributions of several residual nuclei indicate the same level of agreement within experimental uncertainties.

  14. Neutron energy spectra produced by α-bombardment of light elements in thick targets

    International Nuclear Information System (INIS)

    Jacobs, G.J.H.

    1982-01-01

    The aim of the work, presented in this thesis, is to determine energy spectra of neutrons produced by α-particle bombardment of thick targets containing light elements. These spectra are required for nuclear waste management. The set-up of the neutron spectrometer is described, and its calibration discussed. Absolute efficiencies were determined at various neutron energies, using monoenergetic neutrons produced with the Van de Graaff accelerator in pulsed mode. The additional calibration of the neutron spectrometer as proton-recoil spectrometer was carried out primarily for future applications in measurements where no pulsed neutron source is available or the neutron flux density is too low. The basis for an accurate uncertainty analysis is made by the determination of the covariance matrix for the uncertainties in the efficiencies. The determination of the neutron energy spectra from time-of-flight and from proton-recoil measurements is described. A comparison of the results obtained from the two different types of measurements is made. The experimentally determined spectra were compared with spectra calculated from stopping powers and theoretically determined cross sections. These cross sections were calculated from optical model parameters and level parameters using the Hauser-Feshbach formalism. Measurements were carried out on thick targets of silicon, aluminium, magnesium, carbon, boron nitride, calcium fluoride, aluminium oxide, silicon oxide and uranium oxide at four different α-particle energies. (Auth.)
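The time-of-flight spectra described above rest on a simple kinematic conversion from flight time to neutron energy. A minimal non-relativistic sketch; the flight path and timing values are illustrative, not taken from the thesis:

```python
# Non-relativistic time-of-flight to neutron kinetic energy: E = (1/2) m (L/t)^2.
# Flight path and flight time below are invented example numbers.
M_N_KG = 1.674927e-27          # neutron mass, kg
EV_PER_J = 1.0 / 1.602177e-19  # joules -> electronvolts

def tof_to_energy_mev(flight_path_m: float, tof_s: float) -> float:
    """Neutron kinetic energy in MeV from flight path (m) and flight time (s)."""
    v = flight_path_m / tof_s
    return 0.5 * M_N_KG * v * v * EV_PER_J / 1e6

# Example: a 3 m flight path and a 100 ns flight time (v = 3e7 m/s).
energy = tof_to_energy_mev(3.0, 100e-9)   # ~4.7 MeV
```

At these velocities (v/c ≈ 0.1) the relativistic correction is below one percent, which is why the non-relativistic formula is the usual starting point.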

  15. A code for quantitative analysis of light elements in thick samples by PIGE

    International Nuclear Information System (INIS)

    Mateus, R.; Jesus, A.P.; Ribeiro, J.P.

    2005-01-01

    This work presents a code developed for the quantitative analysis of light elements in thick samples by PIGE. The new method avoids the use of standards in the analysis, using a formalism similar to the one used for PIXE analysis, where the excitation function of the nuclear reaction related to the gamma-ray emission is integrated along the depth of the sample. In order to check the validity of the code, we present results for the analysis of lithium, boron, fluorine and sodium in thick samples. For this purpose, the experimental values of the excitation functions of the reactions ⁷Li(p,p'γ)⁷Li, ¹⁰B(p,αγ)⁷Be, ¹⁹F(p,p'γ)¹⁹F and ²³Na(p,p'γ)²³Na were used as input. For stopping-power cross-section calculations, the semi-empirical equations of Ziegler et al. and Bragg's rule were used. Agreement between the experimental and the calculated gamma-ray yields was always better than 7.5%.
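The standard-free formalism described above amounts to integrating the excitation function over projectile energy, weighted by the inverse stopping power, Y ∝ N ∫₀^E0 σ(E)/S(E) dE. A hedged sketch with a purely hypothetical Lorentzian excitation function and stopping power (placeholder shapes, not the experimental inputs used by the authors):

```python
import numpy as np

def thick_target_yield(e0_kev, sigma, stopping_power, n_steps=2000):
    """Relative gamma yield per incident proton for beam energy e0_kev:
    trapezoidal integration of sigma(E)/S(E) from ~0 up to e0_kev."""
    e = np.linspace(1e-3, e0_kev, n_steps)
    f = sigma(e) / stopping_power(e)
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(e)))

# Toy resonance near 441 keV and a slowly varying stopping power --
# purely illustrative numbers in arbitrary units.
sigma = lambda e: 1.0 / (1.0 + ((e - 441.0) / 50.0) ** 2)
s_power = lambda e: 1.0 + 1e-4 * e

y_low = thick_target_yield(400.0, sigma, s_power)
y_high = thick_target_yield(1000.0, sigma, s_power)
```

Because the integrand is positive, the yield grows monotonically with beam energy, which is what lets a measured yield curve be inverted for concentration once σ(E) and S(E) are known.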

  16. Nonempirical Calculation of Superconducting Transition Temperatures in Light-Element Superconductors.

    Science.gov (United States)

    Arita, Ryotaro; Koretsune, Takashi; Sakai, Shiro; Akashi, Ryosuke; Nomura, Yusuke; Sano, Wataru

    2017-07-01

    Recent progress in the fully nonempirical calculation of the superconducting transition temperature (T_c) is reviewed. In particular, the review focuses on three representative light-element high-T_c superconductors: elemental Li, sulfur hydrides, and alkali-doped fullerides, and discusses how crucial it is to go beyond Migdal-Eliashberg (ME) methods. For Li, a scheme of superconducting density functional theory for the plasmon mechanism is formulated, and T_c is found to be dramatically enhanced by considering the frequency dependence of the screened Coulomb interaction. For sulfur hydrides, it is essential to go beyond not only the static approximation for the screened Coulomb interaction, but also the constant density-of-states approximation for electrons, the harmonic approximation for phonons, and the Migdal approximation for the electron-phonon vertex, all of which are employed in the standard ME calculation. It is also shown that the feedback effect in the self-consistent calculation of the self-energy and the zero-point motion considerably affect the calculated T_c. For alkali-doped fullerides, the interplay between electron-phonon coupling and electron correlations becomes more nontrivial. It has been demonstrated that the combination of density functional theory and dynamical mean-field theory with an ab initio downfolding scheme for electron-phonon coupled systems works successfully: it not only reproduces the experimental phase diagram but also yields a unified view of the high-T_c superconductivity and the Mott-Hubbard transition in the fullerides. The results for these high-T_c superconductors provide a firm ground for future materials design of new superconductors. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
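For orientation, the conventional ME-level baseline that the review argues must be improved upon is often summarized by the Allen-Dynes modified McMillan formula. The sketch below evaluates it with illustrative, hydride-like parameters; ω_log, λ, and μ* here are assumptions for demonstration, not values from the review:

```python
import math

def mcmillan_allen_dynes_tc(omega_log_k: float, lam: float, mu_star: float) -> float:
    """T_c in kelvin from the Allen-Dynes modified McMillan formula:
    T_c = (omega_log/1.2) * exp(-1.04(1+lam) / (lam - mu*(1+0.62 lam)))."""
    denom = lam - mu_star * (1.0 + 0.62 * lam)
    if denom <= 0.0:
        return 0.0  # no superconductivity predicted at this level of theory
    return (omega_log_k / 1.2) * math.exp(-1.04 * (1.0 + lam) / denom)

# Illustrative inputs for a stiff, strongly coupled hydride-like spectrum
# (assumed values: omega_log = 1300 K, lambda = 2.0, mu* = 0.10).
tc = mcmillan_allen_dynes_tc(omega_log_k=1300.0, lam=2.0, mu_star=0.10)
```

Even this simple estimate lands in the ~200 K range for hydride-like inputs; the review's point is that anharmonicity, vertex corrections, and the energy dependence of the density of states shift such estimates substantially.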

  17. Methods used in evaluating data for the interaction of neutrons with light elements (A < 19)

    International Nuclear Information System (INIS)

    Stewart, L.

    1980-01-01

    In the interaction of neutrons with light nuclei, many anomalies are observed. In particular, the probability for gamma-ray production is generally small over most of the neutron energy range. On the other hand, ⁶Li, ³He, ¹⁰B, and ⁷Be have thermal absorption cross sections which range from 940 to 48,000 barns. ¹⁰B is the only isotope that has a positive Q for a 3-body reaction, the (n,t2α). As the neutron energy increases, however, 3- and 4-particle direct breakup and sequential formation cross sections dominate the nonelastic cross section for D, T, ⁶Li, ⁷Be, ¹⁰B, and ¹²C above a few MeV. For higher-mass isotopes, particle emission (protons and α's) is often the preferred mode for deexcitation of levels excited via (n,n') reactions, where energetically possible. Very few of these partial cross sections have been measured with the necessary precision. Problems are particularly inherent in experiments on negative-Q reactions near the 3-body threshold. The many-body problem must be treated as several two-body sequential steps in a theoretical analysis; the emitted-particle angular distribution is required as input, but is rarely known. Precise knowledge of individual partial cross sections is often important, especially when neutron multiplication, breeding of fusion fuel, radioactive contamination, depletion or buildup of the target, energy transfer, or time-dependent parameters are required. Specific examples are described for the evaluation of neutron interactions with light elements which employ isotopic spin, inverse reactions, charge-conjugate reactions, and the elastic scattering of charged particles (with Wick's limit). 18 figures, 1 table
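The negative-Q threshold behavior mentioned above follows from two-body kinematics: in the laboratory frame the threshold energy is E_th = −Q·(m_n + M)/M. A minimal non-relativistic sketch using the well-known D(n,2n)p breakup as the example, where Q equals minus the deuteron binding energy:

```python
# Laboratory threshold for an endothermic neutron-induced reaction,
# non-relativistic kinematics: E_th = -Q * (m_projectile + m_target) / m_target.
# Masses are standard tabulated values in atomic mass units.
M_NEUTRON_U = 1.008665
M_DEUTERON_U = 2.014102

def lab_threshold_mev(q_mev: float, target_mass_u: float,
                      projectile_mass_u: float = M_NEUTRON_U) -> float:
    if q_mev >= 0.0:
        return 0.0  # exothermic: no threshold
    return -q_mev * (projectile_mass_u + target_mass_u) / target_mass_u

# D(n,2n)p: Q is minus the deuteron binding energy, about -2.2246 MeV.
e_th = lab_threshold_mev(-2.2246, M_DEUTERON_U)   # ~3.34 MeV
```

The center-of-mass factor (m_n + M)/M is largest for the lightest targets, which is one reason threshold regions are experimentally delicate for the light elements discussed here.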

  18. The Detailed Chemical Properties of M31 Star Clusters. I. Fe, Alpha and Light Elements

    Science.gov (United States)

    Colucci, Janet E.; Bernstein, Rebecca A.; Cohen, Judith G.

    2014-12-01

    We present ages, [Fe/H] and abundances of the α elements Ca I, Si I, Ti I, Ti II, and light elements Mg I, Na I, and Al I for 31 globular clusters (GCs) in M31, which were obtained from high-resolution, high signal-to-noise ratio (S/N > 60) echelle spectra of their integrated light (IL). All abundances and ages are obtained using our original technique for high-resolution IL abundance analysis of GCs. This sample provides a never before seen picture of the chemical history of M31. The GCs are dispersed throughout the inner and outer halo, from 2.5 kpc < R_M31 < 117 kpc. We find a range of [Fe/H] within 20 kpc of the center of M31, and a constant [Fe/H] ~ -1.6 for the outer halo clusters. We find evidence for at least one massive GC in M31 with an age between 1 and 5 Gyr. The α-element ratios are generally similar to the Milky Way GC and field star ratios. We also find chemical evidence for a late-time accretion origin for at least one cluster, which has a different abundance pattern than other clusters at similar metallicity. We find evidence for star-to-star abundance variations in Mg, Na, and Al in the GCs in our sample, and find correlations of Ca, Mg, Na, and possibly Al abundance ratios with cluster luminosity and velocity dispersion, which can potentially be used to constrain GC self-enrichment scenarios. Data presented here were obtained with the HIRES echelle spectrograph on the Keck I telescope. The data presented herein were obtained at the W. M. Keck Observatory, which is operated as a scientific partnership among the California Institute of Technology, the University of California and the National Aeronautics and Space Administration. The Observatory was made possible by the generous financial support of the W. M. Keck Foundation.

  20. Examining the Possibility of Carbon as a Light Element in the Core of Mercury

    Science.gov (United States)

    Vander Kaaden, Kathleen; McCubbin, Francis M.; Turner, Amber; Ross, D. Kent

    2017-01-01

    Results from the MErcury Surface, Space ENvironment, GEochemistry and Ranging (MESSENGER) spacecraft have shown elevated abundances of C on the surface of Mercury. Peplowski et al. used GRS data from MESSENGER to show an average northern hemisphere abundance of C on the planet of 0 to 4.1 wt% C at the three-sigma detection limit. Confirmation of C on the planet prompts many questions regarding the role of C during the differentiation and evolution of Mercury. The elevated abundances of both S and C on Mercury's surface, coupled with the low abundances of iron, suggest that the oxygen fugacity of the planet is several log₁₀ units below the iron-wüstite buffer. These observations spark questions about the bulk composition of Mercury's core. This experimental study seeks to understand the impact of C as a light element on potential mercurian core compositions. In order to address this question, experiments were conducted at 1 GPa and a variety of temperatures (700-1500 °C) on metal compositions ranging from Si₅Fe₉₅ to Si₂₂Fe₇₈, possibly representative of the mercurian core. All starting metals were completely enclosed in a graphite capsule to ensure C saturation at a given set of run conditions. All elements, including C, were analyzed using electron probe microanalysis. Precautions were taken to ensure accurate measurements of C with this technique, including using the LDE2 crystal, using the cold finger on the microprobe to minimize contamination and improve the vacuum, and using an instrument with no oil-based pumps. Based on the superliquidus experimental results in the present study, as Fe-rich cores become more Si-rich, the C content of that core composition will decrease. Furthermore, although C concentration at graphite saturation (CCGS) varies from a liquid to a solid, temperature does not seem to play a substantial role in CCGS, at least at 1 GPa.

  1. EXPLORING ANTICORRELATIONS AND LIGHT ELEMENT VARIATIONS IN NORTHERN GLOBULAR CLUSTERS OBSERVED BY THE APOGEE SURVEY

    Energy Technology Data Exchange (ETDEWEB)

    Mészáros, Szabolcs [ELTE Gothard Astrophysical Observatory, H-9704 Szombathely, Szent Imre Herceg st. 112 (Hungary); Martell, Sarah L. [Department of Astrophysics, School of Physics, University of New South Wales, Sydney, NSW 2052 (Australia); Shetrone, Matthew [University of Texas at Austin, McDonald Observatory, Fort Davis, TX 79734 (United States); Lucatello, Sara [INAF-Osservatorio Astronomico di Padova, vicolo dell Osservatorio 5, I-35122 Padova (Italy); Troup, Nicholas W.; Pérez, Ana E. García; Majewski, Steven R. [Department of Astronomy, University of Virginia, Charlottesville, VA 22904-4325 (United States); Bovy, Jo [Institute for Advanced Study, Einstein Drive, Princeton, NJ 08540 (United States); Cunha, Katia [University of Arizona, Tucson, AZ 85719 (United States); García-Hernández, Domingo A.; Prieto, Carlos Allende [Instituto de Astrofísica de Canarias (IAC), E-38200 La Laguna, Tenerife (Spain); Overbeek, Jamie C. [Department of Astronomy, Indiana University, Bloomington, IN 47405 (United States); Beers, Timothy C. [Department of Physics and JINA Center for the Evolution of the Elements, University of Notre Dame, Notre Dame, IN 46556 (United States); Frinchaboy, Peter M. [Texas Christian University, Fort Worth, TX 76129 (United States); Hearty, Fred R.; Schneider, Donald P. [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States); Holtzman, Jon [New Mexico State University, Las Cruces, NM 88003 (United States); Nidever, David L. [Department of Astronomy, University of Michigan, Ann Arbor, MI 48109 (United States); Schiavon, Ricardo P. [Astrophysics Research Institute, IC2, Liverpool Science Park, Liverpool John Moores University, 146 Brownlow Hill, Liverpool, L3 5RF (United Kingdom); and others

    2015-05-15

    We investigate the light-element behavior of red giant stars in northern globular clusters (GCs) observed by the SDSS-III Apache Point Observatory Galactic Evolution Experiment. We derive abundances of 9 elements (Fe, C, N, O, Mg, Al, Si, Ca, and Ti) for 428 red giant stars in 10 GCs. The intrinsic abundance range relative to measurement errors is examined, and the well-known C–N and Mg–Al anticorrelations are explored using an extreme-deconvolution code for the first time in a consistent way. We find that Mg and Al drive the population membership in most clusters, except in M107 and M71, the two most metal-rich clusters in our study, where the grouping is most sensitive to N. We also find a diversity in the abundance distributions, with some clusters exhibiting clear abundance bimodalities (for example M3 and M53) while others show extended distributions. The spread of Al abundances increases significantly as cluster average metallicity decreases as previously found by other works, which we take as evidence that low metallicity, intermediate mass AGB polluters were more common in the more metal-poor clusters. The statistically significant correlation of [Al/Fe] with [Si/Fe] in M15 suggests that {sup 28}Si leakage has occurred in this cluster. We also present C, N, and O abundances for stars cooler than 4500 K and examine the behavior of A(C+N+O) in each cluster as a function of temperature and [Al/Fe]. The scatter of A(C+N+O) is close to its estimated uncertainty in all clusters and independent of stellar temperature. A(C+N+O) exhibits small correlations and anticorrelations with [Al/Fe] in M3 and M13, but we cannot be certain about these relations given the size of our abundance uncertainties. Star-to-star variations of α-element (Si, Ca, Ti) abundances are comparable to our estimated errors in all clusters.
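The population separation performed above with an extreme-deconvolution code can be illustrated in simplified form: when per-star measurement errors are set to zero, extreme deconvolution reduces to ordinary Gaussian-mixture EM. The sketch below fits a two-component 1-D mixture to synthetic, [Al/Fe]-like values; all numbers are invented for illustration and are not APOGEE data:

```python
import numpy as np

# Synthetic bimodal "abundance" sample: two invented stellar populations.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-0.1, 0.08, 120),   # first population
                    rng.normal(0.6, 0.10, 80)])    # second population

# Initialize a two-component 1-D Gaussian mixture.
mu = np.array([x.min(), x.max()])
var = np.array([x.var(), x.var()])
pi = np.array([0.5, 0.5])

for _ in range(200):
    # E step: responsibility of each component for each star.
    dens = (pi / np.sqrt(2 * np.pi * var) *
            np.exp(-0.5 * (x[:, None] - mu) ** 2 / var))
    r = dens / dens.sum(axis=1, keepdims=True)
    # M step: update weights, means, and variances.
    nk = r.sum(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    pi = nk / len(x)
# mu now holds the two recovered population means.
```

The full extreme-deconvolution formalism additionally deconvolves each star's individual measurement covariance, which is what makes the population assignment robust at APOGEE-like abundance uncertainties.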

  2. Deuterides of light elements: low-temperature thermonuclear burn-up and applications to thermonuclear fusion problems

    International Nuclear Information System (INIS)

    Frolov, A.M.; Smith, V.H.; Smith, G.T.

    2002-01-01

    Thermonuclear burn-up and thermonuclear applications are discussed for a number of deuterides and DT hydrides of light elements. These deuterides and the corresponding DT hydrides are often used as thermonuclear fuels or as components of such fuels. In fact, only for these substances does the thermonuclear energy gain exceed (at some densities and temperatures) the bremsstrahlung loss and other high-temperature losses, i.e., thermonuclear burn-up is possible. Herein, thermonuclear burn-up in these deuterides and DT hydrides is considered in detail. In particular, a simple method is proposed to determine the critical values of the burn-up parameter x_c for these substances and their mixtures at different temperatures and densities. The results for equimolar DT mixtures coincide quite well with the results of previous calculations. Also, the natural or Z limit is determined for low-temperature thermonuclear burn-up in the deuterides of light elements. (author)
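The "gain exceeds bremsstrahlung loss" criterion invoked above can be illustrated with a single-temperature power balance for an equimolar DT plasma. This is a rough sketch: the reactivity value and the bremsstrahlung coefficient are approximate textbook-level numbers, and the fixed reactivity makes the function meaningful only near the chosen temperature.

```python
# Compare DT fusion power density with an approximate bremsstrahlung loss.
# Both scale as n^2, so the ratio is density independent (assumed Z = 1 plasma).
SIGMAV_DT_20KEV = 4.2e-16    # <sigma v> for DT near 20 keV, cm^3/s (approx.)
E_DT_J = 17.6e6 * 1.602e-19  # energy released per DT fusion, joules
C_BREMS = 1.69e-32           # bremsstrahlung coefficient, W cm^3 eV^-1/2 (approx.)

def power_ratio(n_cm3: float, t_ev: float) -> float:
    """Fusion power density over bremsstrahlung loss, 50:50 DT plasma."""
    p_fus = 0.25 * n_cm3 ** 2 * SIGMAV_DT_20KEV * E_DT_J   # W/cm^3
    p_brems = C_BREMS * n_cm3 ** 2 * t_ev ** 0.5           # W/cm^3
    return p_fus / p_brems

ratio = power_ratio(1.0e21, 20_000.0)   # comfortably above 1 at ~20 keV for DT
```

That the ratio is far above unity for DT at tens of keV, while dropping rapidly for higher-Z fuels (bremsstrahlung scales with Z²), is the physics behind the "Z limit" determined in the paper.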

  3. Investigating Planetesimal Evolution by Experiments with Fe-Ni Metallic Melts: Light Element Composition Effects on Trace Element Partitioning Behavior

    Science.gov (United States)

    Chabot, N. L.

    2017-12-01

    As planetesimals were heated up in the early Solar System, the formation of Fe-Ni metallic melts was a common occurrence. During planetesimal differentiation, the denser Fe-Ni metallic melts separated from the less dense silicate components, though some meteorites suggest that their parent bodies only experienced partial differentiation. If the Fe-Ni metallic melts did form a central metallic core, the core eventually crystallized to a solid, some of which we sample as iron meteorites. In all of these planetesimal evolution processes, the composition of the Fe-Ni metallic melt influenced the process and the resulting trace element chemical signatures. In particular, the metallic melt's "light element" composition, those elements present in the metallic melt in a significant concentration but with lower atomic masses than Fe, can strongly affect trace element partitioning. Experimental studies have provided critical data to determine the effects of light elements in Fe-Ni metallic melts on trace element partitioning behavior. Here I focus on combining numerous experimental results to identify trace elements that provide unique insight into constraining the light element composition of early Solar System Fe-Ni metallic melts. Experimental studies have been conducted at 1 atm in a variety of Fe-Ni systems to investigate the effects of light elements on trace element partitioning behavior. A frequent experimental examination of the effects of light elements in metallic systems involves producing run products with coexisting solid metal and liquid metal phases. Such solid-metal-liquid-metal experiments have been conducted in the Fe-Ni binary system as well as Fe-Ni systems with S, P, and C. Experiments with O-bearing or Si-bearing Fe-Ni metallic melts do not lend themselves to experiments with coexisting solid metal and liquid metal phases, due to the phase diagrams of these elements, but experiments with two immiscible Fe-Ni metallic melts have provided insight into

  4. Production of light elements by cascades from energetic antiprotons in the early Universe and problem of nuclear cosmoarcheology

    International Nuclear Information System (INIS)

    Levitan, Yu.L.; Sobol', I.M.; Khlopov, M.Yu.; Chechetkin, V.M.

    1988-01-01

    A mathematical model is suggested for the process of light-element (D and ³He) production due to disintegration of ⁴He nuclei, induced by nonequilibrium production of energetic antiprotons in the early Universe. Numerical calculations show that formation of the nucleon cascade induced by antiproton slowing-down increases the D and ³He yield, owing to the increased probability of disintegration of several ⁴He nuclei by a single antiproton and to disintegration of such nuclei by cascade protons. Constraints on the concentration of possible sources of energetic antiprotons in the early Universe are correspondingly strengthened.

  5. Elastic forward analysis using ⁷Li ions: A useful tool for H and light element determination

    CERN Document Server

    Romero, S; Murillo, G; Berdejo, H M

    2002-01-01

    Films of CNₓ/Si, TiNₓ/AISI 304 and AlOₓ/Si were analyzed with ⁷Li ions from 4.0 to 4.5 MeV and an experimental arrangement that, through detection of scattered projectiles and recoils by a single detector, allows quantification of H, light elements and heavier ones. A discussion is presented of the capabilities of Rutherford backscattering spectrometry (RBS) and conventional elastic recoil detection analysis (ERDA) compared to elastic forward analysis.
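The mass separation exploited by this forward geometry follows from elastic two-body kinematics: a recoil emitted at angle φ carries E_r = E₀ · 4M₁M₂ cos²φ / (M₁ + M₂)². A minimal sketch; the beam energy is taken from the range quoted above, while the recoil angle is an illustrative assumption:

```python
import math

# Elastic recoil energy behind ERDA/forward analysis:
# E_r = E0 * 4*M1*M2*cos^2(phi) / (M1 + M2)^2  (non-relativistic).
def recoil_energy_mev(e0_mev: float, m_projectile_u: float,
                      m_recoil_u: float, phi_deg: float) -> float:
    k = (4.0 * m_projectile_u * m_recoil_u
         * math.cos(math.radians(phi_deg)) ** 2
         / (m_projectile_u + m_recoil_u) ** 2)
    return e0_mev * k

# Hydrogen recoil from a 4.5 MeV 7Li beam at an assumed 30 degree recoil angle.
e_h = recoil_energy_mev(4.5, 7.016, 1.008, 30.0)
```

Because the kinematic factor differs strongly between H recoils and scattered ⁷Li projectiles, a single detector can separate them in energy, which is the basis of the quantification scheme described in the abstract.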

  6. A system for on-line monitoring of light element concentration distributions in thin samples

    NARCIS (Netherlands)

    Brands, P.J.M.; Mutsaers, P.H.A.; Voigt, de M.J.A.

    1999-01-01

    At the Cyclotron Laboratory, a scanning proton microprobe is used to determine concentration distributions in biomedical samples. The data acquired in these measurements used to be analysed in a time-consuming off-line analysis. To avoid the loss of valuable measurement and analysis time, DYANA was

  7. Summary report of the second research co-ordination meeting on improvement of the standard cross sections for light elements

    International Nuclear Information System (INIS)

    Carlson, A.D.; Hale, G.M.; Pronyaev, V.G.

    2004-03-01

    Results are presented following one and a half years of work under the Coordinated Research Project (CRP) on Improvement of the Standard Cross Sections for Light Elements. They include the use of the refined resonating group model for the theoretical prediction of the R-matrix poles, and preliminary R-matrix model fits of the full experimental database for the ⁶Li+n system obtained with different codes. Significant attention was paid to the exclusion of the bias in the evaluated data caused by the possible presence of Peelle's Pertinent Puzzle effect in the experimental data. Updates were also presented of the experimental database for light and heavy element standards, including fission cross sections up to 200 MeV. First results and observed trends for all standard reactions are given, including the preliminary results of combining the model (for light elements) and non-model fits. The timetable for further work was agreed, which should lead to new reaction cross section standards for light and heavy elements by the end of 2004. (author)
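The Peelle's Pertinent Puzzle bias mentioned above can be reproduced numerically in miniature using Peelle's classic example: two measurements of the same quantity with independent statistical errors plus a fully correlated normalization uncertainty. The generalized-least-squares average then falls below both data points:

```python
import numpy as np

# Peelle's Pertinent Puzzle: measurements 1.0 and 1.5 of the same quantity,
# each with a 10% statistical error and a common 20% normalization error.
y = np.array([1.0, 1.5])
stat = 0.10 * y
cov = np.diag(stat ** 2) + 0.20 ** 2 * np.outer(y, y)  # correlated part added

# Generalized least squares for a single common value:
# x = (1^T C^-1 y) / (1^T C^-1 1).
ones = np.ones(2)
w = np.linalg.solve(cov, ones)
gls_average = (w @ y) / (w @ ones)   # ~0.88, below BOTH measurements
```

This counterintuitive result arises when relative (multiplicative) uncertainties are treated as if they were additive in the covariance matrix, which is exactly the pitfall the CRP sought to exclude from its evaluated standards.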

  8. Possible existence of cosmological cosmic rays I. the framework for light-element and gamma-ray production

    International Nuclear Information System (INIS)

    Montmerle, T.

    1977-01-01

    This paper examines the possibility of the existence of cosmological cosmic rays (CCR), in the framework of big-bang cosmology. The model assumes a total energy spectrum similar to that observed at Earth at high energies, and a composition of protons and α-particles only, with α/p = 0.1. Following Stecker, the CCR are assumed to be born in a burst at some (high) redshift z_s. Gamma-rays originate from π⁰ decay resulting from interactions of the high-energy part of the CCR, and light elements are produced via (pα)+(pα) reactions by the low-energy part, both by collisions with the ambient matter (of density corresponding to a deceleration parameter q₀). The 1-100 MeV γ-ray background spectrum and the lithium abundance are considered as observational constraints on the possible CCR flux intensity. To this end, a theoretical framework is set up for simultaneous γ-ray and light-element production by solving a system of coupled time-dependent transport equations, taking ionization and expansion losses into account. The absolute lithium abundance is calculated by normalizing the CCR flux to the observed γ-ray background spectrum; numerical results will be given and discussed in a separate paper, as a function of q₀ and z_s.

  9. Characterization of light element impurities in ultrathin silicon-on-insulator layers by luminescence activation using electron irradiation

    International Nuclear Information System (INIS)

    Nakagawa-Toyota, Satoko; Tajima, Michio; Hirose, Kazuyuki; Ohshima, Takeshi; Itoh, Hisayoshi

    2009-01-01

    We analyzed light element impurities in ultrathin top Si layers of silicon-on-insulator (SOI) wafers by luminescence activation using electron irradiation. Photoluminescence (PL) analysis under ultraviolet (UV) light excitation was performed on various commercial SOI wafers after the irradiation. We detected the C-line related to a complex of interstitial carbon and oxygen impurities and the G-line related to a complex of interstitial and substitutional carbon impurities in the top Si layer with a thickness down to 62 nm after electron irradiation. We showed that there were differences in the impurity concentration depending on the wafer fabrication methods and also that there were variations in these concentrations in the respective wafers. Xenon ion implantation was used to activate top Si layers selectively so that we could confirm that the PL signal under the UV light excitation comes not from substrates but from top Si layers. The present method is a very promising tool to evaluate the light element impurities in top Si layers. (author)

  10. Elastic recoil atomic spectroscopy of light elements with sub-nanometer depth resolution; Elastische Rueckstossatomspektrometrie leichter Elemente mit Subnanometer-Tiefenaufloesung

    Energy Technology Data Exchange (ETDEWEB)

    Kosmata, Marcel

    2011-06-30

    heavy ion irradiation. It is shown that at the energies used both electronic sputtering and electronically induced interface mixing occur. Electronic sputtering is minimised by using optimised beam parameters; for most samples the effect is below the detection limit at a fluence sufficient for the analysis. However, the influence of interface mixing is so strong that it has to be included in the analysis of the depth profiles of the layers. It is concluded from these studies that at the Rossendorf 5 MV tandem accelerator chlorine ions with an energy of 20 MeV deliver the best results. In some cases, such as the analysis of boron, the energy must be reduced to 6.5 MeV in order to keep the electronic sputtering below the detection limit. The fourth focus is the study of the influence of specific sample properties, such as surface roughness, on the shape of the measured energy spectra and hence on the analysed depth profile. It is shown that knowledge of the roughness of a sample at the surface and at the interfaces is needed for the analysis. In addition, the contributions of the parameters limiting the depth resolution are calculated and compared with conventional ion beam analysis. Finally, a comparison is made between the high-resolution ion beam analysis and complementary methods published by other research groups. The fifth and last focus is the analysis of light elements in ultra-thin layers. All models presented in this thesis to reduce the influence of beam damage are taken into account. The dynamic non-equilibrium charge state is also included in the quantification of elements. Depth profiling of multilayer systems is demonstrated for systems consisting of SiO{sub 2}-Si{sub 3}N{sub 4}O{sub x}-SiO{sub 2} on silicon, boron implantation profiles for ultra-shallow junctions, and ultra-thin oxide layers such as those used as high-k materials.

  11. X-ray fluorescence diffractionless analyzer for determining light element content in iron ore mixtures

    International Nuclear Information System (INIS)

    Yuksa, L.K.; Kochmola, N.M.; Bondarenko, V.P.; Bogdanov, V.K.

    1986-01-01

    A diffractionless X-ray fluorescence analyzer for determining calcium oxide and silicon dioxide contents in dry iron ore materials has been developed. The analyzer comprises a charging unit, a sample-conveying device, spectrometric units for detecting calcium and silicon, computing racks and a sample-removing device. Results of calcium oxide and silicon dioxide analyses in iron ore mixtures are presented and the errors are evaluated. It is shown that the analyzer provides high accuracy in single determinations, as well as reading stability over long periods

  12. Summary report of the first research co-ordination meeting on improvement of the standard cross sections for light elements

    International Nuclear Information System (INIS)

    Carlson, A.D.; Hale, G.M.; Pronyaev, V.G.

    2003-01-01

    Results obtained during the first six months of the Coordinated Research Project (CRP) on Improvement of the Standard Cross Sections for Light Elements were presented. Attention focused on studies of the reduction in uncertainty for model and non-model least-squares fits, and on the intercomparison and testing of different computer codes based on nuclear-model, non-model general least-squares and Bayesian approaches to the evaluation of standard reaction cross sections and the covariance matrix of their uncertainties. The reasons leading to the underestimation of uncertainties and bias in the evaluated values were discussed, and solutions to these problems were outlined. A coordinated working plan was prepared which will result in the preparation of new reaction cross section standards for light and heavy elements by 2004. (author)
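The non-model generalized least-squares idea referred to above can be illustrated with a minimal sketch. The numbers and the one-parameter design matrix below are invented for the example, not CRP data: two correlated measurements of the same cross section are combined into one evaluated value with reduced uncertainty.

```python
import numpy as np

# Hypothetical example: two correlated measurements of one cross-section
# value (barns), combined by generalized least squares (GLS).
y = np.array([2.35, 2.41])            # measured values
V = np.array([[0.010, 0.004],         # covariance matrix of the
              [0.004, 0.020]])        # measurement uncertainties
A = np.ones((2, 1))                   # design matrix: one common parameter

Vinv = np.linalg.inv(V)
cov_hat = np.linalg.inv(A.T @ Vinv @ A)   # covariance of the evaluated value
x_hat = cov_hat @ A.T @ Vinv @ y          # GLS estimate

print(x_hat[0], np.sqrt(cov_hat[0, 0]))
```

The evaluated value lies between the two measurements, and its variance is smaller than either input variance, which is the point of the combined evaluation.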

  13. Contribution to the study of copper and copper-arsenic archaeo-metallurgy using light element analysis and experimental fusion

    International Nuclear Information System (INIS)

    Papillon, F.

    1997-01-01

    The objective of this study is to attempt a direct reconstruction, from ancient artefacts, of the elaboration technology used in the earliest copper metallurgy. This work is based both on light element analysis and on the application of the principles of physical metallurgy. However, the study of an archaeological artefact requires non-destructive methods. A main aspect of this work consists in developing the most suitable metallographic technique and methods for the determination of oxygen and carbon by ion beam analysis. Additionally, experimental meltings of copper and copper-arsenic alloys were carried out in the laboratory, under various temperature and atmosphere conditions, and 'in the field' at the Archeodrome de Beaune, in order to reconstruct part of the prehistoric craftsmanship. The results of the measurements are consistent with our general knowledge of oxido-reduction phenomena, and the behaviour of copper and copper-arsenic alloys is in agreement with the predictions of thermodynamics. The nuclear analysis of three ancient artefacts showed that their oxygen and carbon contents were closer to those of the Archeodrome samples than to those of the laboratory. Further field studies should consider all parameters controlling the physical chemistry of a charcoal fire. (author)

  14. Light element atom, molecule and radical behaviour in the divertor and edge plasma regions. Summary report of the 1. research coordination meeting

    International Nuclear Information System (INIS)

    Braams, B.J.

    2010-01-01

    The first research coordination meeting of the Coordinated Research Project (CRP) on Light Element Atom, Molecule and Radical Behaviour in the Divertor and Edge Plasma Regions was held 18-20 November 2009 at IAEA headquarters, bringing together experts representing 14 institutions. Participants summarized their recent and ongoing work pertinent to the research project. The specific objectives of the CRP and a detailed work plan were formulated. The discussions, conclusions and recommendations of the meeting are summarized in this report. (author)

  15. Analytical electron microscope based on scanning transmission electron microscope with wavelength dispersive x-ray spectroscopy to realize highly sensitive elemental imaging especially for light elements

    International Nuclear Information System (INIS)

    Koguchi, Masanari; Tsuneta, Ruriko; Anan, Yoshihiro; Nakamae, Koji

    2017-01-01

    An analytical electron microscope based on a scanning transmission electron microscope with wavelength dispersive x-ray spectroscopy (STEM-WDX), realizing highly sensitive elemental imaging especially for light elements, has been developed. In this study, a large-solid-angle multi-capillary x-ray lens with a focal length of 5 mm, long-time data acquisition (e.g. longer than 26 h), and a drift-free system made it possible to visualize boron-dopant images in a Si substrate at a detection limit of 0.2 atomic percent. (paper)

  16. Ion beam techniques for the analysis of light elements in thin films, including depth profiling. Final report of a co-ordinated research project 2000-2003

    International Nuclear Information System (INIS)

    2004-10-01

    This publication highlights the achievements of a Coordinated Research Project (CRP) to promote the potential of accelerator-based nuclear techniques of analysis for light elements in thin films. The objectives of this CRP were to develop a coordinated research effort between accelerator laboratories and materials science research groups in order to assist and promote the development of quality assurance methods, to evaluate databases of parameters needed for quantitative analysis, and to develop and apply techniques to selected problems concerning the surface modification of materials and production of thin films. Through various case studies, this publication assesses and demonstrates the effectiveness of accelerator-based nuclear techniques for analysis to provide valuable data and knowledge not readily accessible using other methods

  17. Transport of natural radionuclides and light rare earth elements in the lagoon system of Buena, RJ; Transporte de radionuclideos naturais e elementos das terras raras leves no sistema lagunar de Buena, RJ

    Energy Technology Data Exchange (ETDEWEB)

    Lauria, Dejanira da Costa

    1999-03-15

    The transport of natural-series radionuclides and light rare earth elements (LREEs) was investigated in a coastal lagoon system located in a monazite-rich region on the northern coast of Rio de Janeiro state. The lagoon water showed anomalous concentrations of radium isotopes and of the LREEs. The longitudinal gradients of Ra, of the LREEs and of the major ion concentrations, whose data were obtained during two and a half years of research at the site, together with statistical analysis, pointed to two main sources as responsible for the lagoon water composition: marine water and groundwater. The groundwater supplies the radionuclides and LREEs, possibly originating from monazite leaching. Based on water speciation modelling, the results of laboratory experiments on adsorption onto sediment, and the sediment characterization, the behaviour of the Ra isotopes, the LREEs, U, Th and Pb-210 along the lagoon is discussed. The role of the aquatic macrophyte Typha domingensis Pers. in nuclide uptake and subsequent release is also discussed. (author)

  18. Total CMB analysis of streaker aerosol samples by PIXE, PIGE, beta- and optical-absorption analyses

    International Nuclear Information System (INIS)

    Annegarn, H.J.; Przybylowicz, W.J.

    1993-01-01

    Multielemental analyses of aerosol samples are widely used in air pollution receptor modelling. Specifically, the chemical mass balance (CMB) model has become a powerful tool in urban air quality studies. Input data required for the CMB include not only the traditional X-ray fluorescence (and hence PIXE) detected elements, but also total mass, organic and inorganic carbon, and other light elements including Mg, Na and F. The circular streaker sampler, in combination with PIXE analysis, has developed into a powerful tool for obtaining time-resolved, multielemental aerosol data. However, its application in CMB modelling has been limited by the absence of total mass and complementary light element data. This study reports on progress in using techniques complementary to PIXE to obtain additional data from circular streaker samples, maintaining the nondestructive, instrumental approach inherent in PIXE: beta-gauging using a 147 Pm source for total mass; optical absorption for inorganic carbon; and PIGE to measure the lighter elements. (orig.)
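The CMB receptor model itself reduces to a constrained least-squares problem: ambient species concentrations are modelled as a nonnegative combination of source profiles. A minimal sketch with invented profiles and a synthetic sample (not data from this study):

```python
import numpy as np
from scipy.optimize import nnls

# Illustrative chemical mass balance (CMB): ambient concentrations are a
# nonnegative mix of source profiles. All numbers are made up for the sketch.
species = ["Si", "Fe", "BC", "Na"]
profiles = np.array([      # columns: soil dust, traffic, sea salt
    [0.30, 0.02, 0.01],    # Si mass fraction in each source
    [0.10, 0.05, 0.00],    # Fe
    [0.01, 0.40, 0.00],    # BC (black carbon)
    [0.01, 0.01, 0.30],    # Na
])
ambient = profiles @ np.array([10.0, 5.0, 2.0])  # synthetic sample (ug/m3)

# Nonnegative least squares recovers the source contributions.
contrib, residual = nnls(profiles, ambient)
print(dict(zip(["soil", "traffic", "sea salt"], contrib.round(2))))
```

Because the synthetic sample is an exact mix of the profiles, the recovered contributions match the mixing strengths; with real streaker data the residual quantifies the unexplained mass.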

  19. A view of the H-band light-element chemical patterns in globular clusters under the AGB self-enrichment scenario

    Science.gov (United States)

    Dell'Agli, F.; García-Hernández, D. A.; Ventura, P.; Mészáros, Sz; Masseron, T.; Fernández-Trincado, J. G.; Tang, B.; Shetrone, M.; Zamora, O.; Lucatello, S.

    2018-04-01

    We discuss the self-enrichment scenario by asymptotic giant branch (AGB) stars for the formation of multiple populations in globular clusters (GCs) by analysing a data set of giant stars observed in nine Galactic GCs, covering a wide range of metallicities, for which simultaneous measurements of C, N, O, Mg, Al, and Si are available. To this aim, we calculated six sets of AGB models with the same chemical composition as the stars belonging to the first generation of each GC. We find that the AGB yields can reproduce the available observations, not only in terms of the degree of contamination shown by stars in each GC but, more importantly, also the observed trend with metallicity, which agrees well with the predictions from AGB evolution modelling. While further observational evidence is required to definitively fix the main actors in the pollution of the interstellar medium from which new generations of stars formed in GCs, the present results confirm that the gas ejected by stars of mass in the range 4 M_{⊙} ≤ M ≤ 8 M_{⊙} during the AGB phase shares the same chemical patterns traced by stars in GCs.

  20. Adaptation of a radiofrequency glow discharge optical emission spectrometer (RF-GD-OES) to the analysis of light elements (carbon, nitrogen, oxygen and hydrogen) in solids: glove box integration for the analysis of nuclear samples

    International Nuclear Information System (INIS)

    Hubinois, J.-C.

    2001-01-01

    The purpose of this work is to use radiofrequency glow discharge optical emission spectrometry to quantitatively determine carbon, nitrogen, oxygen and hydrogen at low concentrations (in the ppm range) in nuclear materials. In this study, and before the definitive contamination of the system, work was carried out on non-radioactive materials (steel, pure iron, copper and titanium). As the initial apparatus could not deliver an RF power inducing a reproducible discharge and was not adapted to the analysis of light elements: 1- the radiofrequency system had to be changed; 2- the systems controlling gaseous atmospheres had to be improved in order to obtain analytical signals stemming strictly from the sample; 3- three discharge lamps had to be tested and compared in terms of performance; 4- the system for collecting the light had to be optimized. The modifications brought to the initial system improved the intensities and stabilities of the signals, which allowed lower detection limits (1000 times lower for carbon). The latter are in the ppm range for carbon and about a few tens of ppm for nitrogen and oxygen in pure irons. Calibration curves were plotted for materials presenting very different sputtering rates in order to check the existence of a 'function of analytical transfer', with the purpose of compensating for the lack of reference materials certified for light elements at low concentration. The transposition of this type of function to other matrices remains to be checked. Concerning hydrogen, since no reference material usable with our technique is available, materials certified for deuterium (chosen as a surrogate for hydrogen) were studied in order to demonstrate the feasibility of hydrogen analysis. In parallel with this work, results obtained by modelling an RF discharge show that the performance of the lamp can be improved and that the optical system must be strictly adapted to the glow discharge. (author) [fr

  1. Hydrides and Borohydrides of Light Elements

    Science.gov (United States)

    1947-12-04


  2. Scanning electron microscopy and micro-analyses

    International Nuclear Information System (INIS)

    Brisset, F.; Repoux, L.; Ruste, J.; Grillon, F.; Robaut, F.

    2008-01-01

    Scanning electron microscopy (SEM) and the related micro-analyses are involved in extremely varied domains, from academic environments to industrial ones. The overall theoretical bases, the main technical characteristics, and some complementary information about practical usage and maintenance are developed in this book. High-vacuum and controlled-vacuum electron microscopes are thoroughly presented, as well as the latest generation of EDS (energy dispersive spectrometer) and WDS (wavelength dispersive spectrometer) micro-analysers. Besides these main topics, other analysis and observation techniques are covered, such as EBSD (electron backscattering diffraction), 3-D imaging, FIB (focussed ion beams), Monte-Carlo simulations, in-situ tests, etc. This book, in French, is the only one that treats this subject in such an exhaustive way. It is the fully updated version of a previous edition of 1979. It gathers the lectures given in 2006 at the summer school of Saint Martin d'Heres (France).
Content: 1 - electron-matter interactions; 2 - characteristic X-radiation, Bremsstrahlung; 3 - electron guns in SEM; 4 - elements of electronic optics; 5 - vacuum techniques; 6 - detectors used in SEM; 7 - image formation and optimization in SEM; 7a - SEM practical instructions for use; 8 - controlled pressure microscopy; 8a - applications; 9 - energy selection X-spectrometers (energy dispersive spectrometers - EDS); 9a - EDS analysis; 9b - X-EDS mapping; 10 - technological aspects of WDS; 11 - processing of EDS and WDS spectra; 12 - X-microanalysis quantifying methods; 12a - quantitative WDS microanalysis of very light elements; 13 - statistics: precision and detection limits in microanalysis; 14 - analysis of stratified samples; 15 - crystallography applied to EBSD; 16 - EBSD: history, principle and applications; 16a - EBSD analysis; 17 - Monte Carlo simulation; 18 - insulating samples in SEM and X-ray microanalysis; 18a - insulating

  3. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

    The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...

  4. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

    importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach. Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections, and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...

  5. Risk analysis of fuel pontoons; Risico-analyse brandstofpontons

    NARCIS (Netherlands)

    Uijt de Haag P; Post J; LSO

    2001-01-01

    A generic risk analysis was carried out to determine the risks of fuel pontoons in a marina. A reference system was defined, consisting of a concrete fuel pontoon with a relatively large capacity and throughput. It is assumed that the pontoon is located in a

  6. Fast multichannel analyser

    Energy Technology Data Exchange (ETDEWEB)

    Berry, A; Przybylski, M M; Sumner, I [Science Research Council, Daresbury (UK). Daresbury Lab.

    1982-10-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10^7 s^-1 has been developed. The instrument is based on an 8-bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format.
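The histogramming principle behind such an MCA, one counter per ADC code incremented on every sample, can be sketched in a few lines (the samples below are synthetic, not instrument data):

```python
import numpy as np

# Minimal sketch of MCA histogramming: each 8-bit ADC sample selects one of
# 256 channels, whose count is incremented in RAM.
rng = np.random.default_rng(0)
samples = rng.integers(0, 256, size=100_000, dtype=np.uint8)  # fake ADC reads

spectrum = np.zeros(256, dtype=np.int64)   # one counter per channel
np.add.at(spectrum, samples, 1)            # increment the channel per sample

assert spectrum.sum() == samples.size      # every sample lands in one channel
print(int(spectrum.argmax()), int(spectrum.max()))
```

The 64 k count capacity per channel in the record corresponds to a 16-bit counter word; the sketch uses 64-bit integers purely for convenience.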

  7. A fast multichannel analyser

    International Nuclear Information System (INIS)

    Berry, A.; Przybylski, M.M.; Sumner, I.

    1982-01-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10^7 s^-1 has been developed. The instrument is based on an 8-bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format. (orig.)

  8. Possible future HERA analyses

    International Nuclear Information System (INIS)

    Geiser, Achim

    2015-12-01

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented on. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures, the manpower and expertise needed for a particular analysis are often much smaller than those needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is no longer available, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.

  9. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

    The overall objectives of the project `Feasibility of electricity production from biomass by pressurized gasification systems` within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feedstocks. This report was prepared as part of this R and D project. The objective of this task was to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feedstocks. The analyses of 15 Scandinavian and European biomass feedstocks included density, proximate and ultimate analyses, trace compounds, and ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. The difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and they are expected to behave to a great extent like straw in gasification. No direct relation between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content was found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)

  10. AMS analyses at ANSTO

    Energy Technology Data Exchange (ETDEWEB)

    Lawson, E.M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia). Physics Division

    1998-03-01

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with {sup 14}C being the most commonly analysed radioisotope - presently about 35 % of the available beam time on ANTARES is used for {sup 14}C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent`s indigenous Aboriginal peoples. (author)

  11. AMS analyses at ANSTO

    International Nuclear Information System (INIS)

    Lawson, E.M.

    1998-01-01

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS), with 14C being the most commonly analysed radioisotope - presently about 35% of the available beam time on ANTARES is used for 14C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  12. Analyses of MHD instabilities

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki

    1985-01-01

    In this article, analyses of the MHD stabilities which govern the global behaviour of a fusion plasma are described from the viewpoint of numerical computation. First, we describe the high-accuracy calculation of the MHD equilibrium, and then the analysis of the linear MHD instability. The former is the basis of the stability analysis, and the latter is closely related to the limiting beta value, which is a very important theoretical issue of tokamak research. To attain a stable tokamak plasma with good confinement properties it is necessary to control or suppress disruptive instabilities. Next, we describe the nonlinear MHD instabilities which relate to the disruption phenomena. Lastly, we describe the vectorization of the MHD codes. The above MHD codes for fusion plasma analyses are relatively simple though very time-consuming; the parts of the codes which need a lot of CPU time are concentrated in a small portion of the codes, and moreover the codes are usually used by their developers themselves, which makes it comparatively easy to attain a high performance ratio on a vector processor. (author)

  13. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways, using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called "conservative" assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and the uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the "reasonable assurance" approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  14. A simple beam analyser

    International Nuclear Information System (INIS)

    Lemarchand, G.

    1977-01-01

    (ee'p) experiments allow measurement of the missing energy distribution, as well as the momentum distribution of the proton extracted from the nucleus versus the missing energy. Such experiments are presently conducted on SACLAY's A.L.S. 300 Linac. Electrons and protons are analysed by two spectrometers and detected in their respective focal planes. Counting rates are usually low and include both true coincidences and accidentals. Since the signal-to-noise ratio depends on the physics of the experiment and the resolution of the coincidence, it is mandatory to obtain a beam current distribution that is as flat as possible. Using new technologies has made it possible to monitor the behaviour of the beam pulse in real time and to determine when the duty cycle can be considered good with respect to a numerical basis

  15. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
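SOBI separates sources by jointly diagonalizing several time-lagged covariance matrices of the whitened data. A simplified single-lag variant (essentially the AMUSE algorithm, used here only to convey the idea, not the project's actual pipeline) on synthetic mixed oscillations:

```python
import numpy as np

# Two synthetic sources with different autocorrelation structure are mixed
# by an unknown matrix and recovered from second-order statistics alone.
rng = np.random.default_rng(1)
t = np.arange(5000)
S = np.vstack([np.sin(2 * np.pi * t / 50),        # slow oscillation
               np.sin(2 * np.pi * t / 7)])        # faster oscillation
X = np.array([[1.0, 0.5], [0.3, 1.0]]) @ S        # unknown mixing

# Step 1: whiten the observations (zero mean, identity covariance).
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / X.shape[1])
W = E @ np.diag(d ** -0.5) @ E.T                  # whitening matrix
Z = W @ X

# Step 2: diagonalize one symmetrized time-lagged covariance of Z.
tau = 2
C_tau = Z[:, :-tau] @ Z[:, tau:].T / (Z.shape[1] - tau)
C_tau = (C_tau + C_tau.T) / 2
_, V = np.linalg.eigh(C_tau)
recovered = V.T @ Z                               # estimated sources
```

Full SOBI replaces step 2 by an approximate joint diagonalization over many lags, which makes the separation robust when a single lag gives near-degenerate eigenvalues.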

  16. Pathway-based analyses.

    Science.gov (United States)

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of the multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate the power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.

  17. Analysing Access Control Specifications

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2009-01-01

    When prosecuting crimes, the main question to answer is often who had a motive and the possibility to commit the crime. When investigating cyber crimes, the question of possibility is often hard to answer, as in a networked system almost any location can be accessed from almost anywhere. The most...... common tool to answer this question, analysis of log files, faces the problem that the amount of logged data may be overwhelming. This problem gets even worse in the case of insider attacks, where the attacker’s actions usually will be logged as permissible, standard actions—if they are logged at all....... Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set...

  18. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from Boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology Boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
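For intuition, the transition-matrix representation can be sketched for a single small Boolean network (the update rules below are invented for illustration, not the yeast cell-cycle model). A point attractor is a state that maps to itself, so fixed points sit on the diagonal of T:

```python
import numpy as np
from itertools import product

# Illustrative 3-node Boolean network; each rule is an arbitrary assumption.
def update(state):
    a, b, c = state
    return (int(b and not c), int(a or c), int(not a))

n = 3
T = np.zeros((2 ** n, 2 ** n))
for s in product((0, 1), repeat=n):
    i = int("".join(map(str, s)), 2)          # current state as a row index
    j = int("".join(map(str, update(s))), 2)  # successor state as a column index
    T[i, j] = 1.0                             # deterministic network: one-hot rows

# T[s, s] = 1 exactly for fixed points, so their count is the trace of T.
point_attractors = int(np.trace(T))
```

For a network class, each member contributes its own one-hot rows and the superposed T becomes row-stochastic; its trace then gives the expected number of point attractors over the class, which is the kind of estimate the paper reports.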

  19. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

    In the last two decades an increasing number of probabilistic seismic risk assessments have been performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) can also be used for normal industrial and residential buildings, dams or other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini, et al., 1984. The probability of failure of a structure for an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure, β_E, determined by the relation: β_E = ∫ [dβ(x)/dx] P(f|x) dx. Here β(x) is the annual frequency of exceedance of load level x (for example, the variable x may be peak ground acceleration), and P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem leads to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is described probabilistically and its capacity and the associated uncertainties are assessed. Finally, the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures can relatively easily be adapted to ordinary structures. The key problems are the seismic hazard definition and the fragility analyses. The fragility can be derived either from scaling procedures or by generation. Both approaches have been presented in the paper. After the seismic risk (in terms of failure probability) is assessed there are several approaches for risk reduction. Generally the methods could be classified in two groups. The
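The risk integral above can be evaluated numerically once β(x) and P(f|x) are specified. The sketch below assumes a power-law hazard and a lognormal fragility; all parameter values are illustrative, not taken from the paper:

```python
import numpy as np
from math import erf, sqrt

Phi = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0))))  # standard normal CDF

# assumed hazard curve: annual frequency of exceeding PGA x (in g)
k0, k = 1e-4, 2.5
beta = lambda x: k0 * x ** (-k)

# assumed lognormal fragility: median capacity Am (g), log-standard deviation bc
Am, bc = 0.6, 0.4
pf = lambda x: Phi(np.log(x / Am) / bc)       # P(f|x)

x = np.linspace(0.05, 3.0, 4000)
dens = -np.gradient(beta(x), x)               # -dβ/dx: frequency density of load levels
f = dens * pf(x)
beta_E = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))  # trapezoid rule

p_fail_50yr = 1.0 - np.exp(-beta_E * 50.0)    # failure probability over a 50-year life
```

For this particular hazard/fragility pair the integral also has the well-known closed form β(Am)·exp(k²·bc²/2) ≈ 5.9 × 10⁻⁴ per year, which the quadrature reproduces to within the truncation of the integration range.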

  20. Website-analyse

    DEFF Research Database (Denmark)

    Thorlacius, Lisbeth

    2009-01-01

    or dead ends when he/she visits the site. However, studies of the design and analysis of the visual and aesthetic aspects of planning and using websites have only to a limited extent received reflective treatment. That is the background for this chapter, which opens with a review of aesthetics'......The website is increasingly the preferred medium for information retrieval, company presentation, e-commerce, entertainment, education and social contact. In step with this growing diversity of communication activities on the web, more focus has been placed on optimising the design and...... planning of the functional and content-related aspects of websites. There is a large body of theory and method books specialising in the technical issues of interaction and navigation, as well as the linguistic content of websites. The Danish HCI (Human Computer Interaction...

  1. A channel profile analyser

    International Nuclear Information System (INIS)

    Gobbur, S.G.

    1983-01-01

    It is well understood that, due to the wide-band noise present in a nuclear analog-to-digital converter, events at the boundaries of adjacent channels are shared. It is a difficult and laborious process to find out exactly the shape of the channels at the boundaries. A simple scheme has been developed for the direct display of the channel shape of any type of ADC on a cathode ray oscilloscope display. This has been accomplished by sequentially incrementing the reference voltage of a precision pulse generator by a fraction of a channel and storing the ADC data in alternate memory locations of a multichannel pulse height analyser. Alternate channels are needed because of the sharing at the boundaries of channels. In the flat region of the profile, alternate memory locations are channels with zero counts and channels with the full-scale counts. At the boundaries, all memory locations will have counts. The resulting shape is a direct display of the channel boundaries. (orig.)

  2. NOAA's National Snow Analyses

    Science.gov (United States)

    Carroll, T. R.; Cline, D. W.; Olheiser, C. M.; Rost, A. A.; Nilsson, A. O.; Fall, G. M.; Li, L.; Bovitz, C. T.

    2005-12-01

    NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC) routinely ingests all of the electronically available, real-time, ground-based snow data; airborne snow water equivalent data; satellite areal extent of snow cover information; and numerical weather prediction (NWP) model forcings for the coterminous U.S. The NWP model forcings are physically downscaled from their native 13 km2 spatial resolution to a 1 km2 resolution for the CONUS. The downscaled NWP forcings drive an energy-and-mass-balance snow accumulation and ablation model at a 1 km2 spatial resolution and a 1 hour temporal resolution for the country. The ground-based, airborne, and satellite snow observations are assimilated into the snow model's simulated state variables using a Newtonian nudging technique. The principal advantages of the assimilation technique are: (1) approximate balance is maintained in the snow model, (2) physical processes are easily accommodated in the model, and (3) asynoptic data are incorporated at the appropriate times. The snow model is reinitialized with the assimilated snow observations to generate a variety of snow products that combine to form NOAA's NOHRSC National Snow Analyses (NSA). The NOHRSC NSA incorporate all of the information necessary and available to produce a "best estimate" of real-time snow cover conditions at 1 km2 spatial resolution and 1 hour temporal resolution for the country. The NOHRSC NSA consist of a variety of daily operational products that characterize real-time snowpack conditions including: snow water equivalent, snow depth, surface and internal snowpack temperatures, surface and blowing snow sublimation, and snowmelt for the CONUS.
The products are generated and distributed in a variety of formats including: interactive maps, time-series, alphanumeric products (e.g., mean areal snow water equivalent on a hydrologic basin-by-basin basis), text and map discussions, map animations, and quantitative gridded products
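The Newtonian nudging step can be illustrated with a toy one-variable snow-water-equivalent model: the state is relaxed toward each observation by a fraction of the innovation (observation minus state). The gain and forcing values below are arbitrary assumptions, not NOHRSC parameters:

```python
def step(swe, precip=0.0, melt=0.0, obs=None, gain=0.3, dt=1.0):
    """One model step: mass balance, then an optional nudge toward an observation."""
    swe = max(swe + (precip - melt) * dt, 0.0)   # physical model update
    if obs is not None:
        swe += gain * (obs - swe)                # Newtonian nudging term
    return swe

# A biased free run versus a nudged run pulled toward sparse observations.
truth = 100.0
swe_free = swe_nudged = 80.0
for t in range(20):
    obs = truth if t % 5 == 0 else None          # an observation every 5th step
    swe_free = step(swe_free)
    swe_nudged = step(swe_nudged, obs=obs)
```

Because the correction is only a fraction of the innovation at each observation time, the model state is adjusted gradually rather than reset, which is how approximate balance is maintained between observations.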

  3. Compton scattering of photons from electrons bound in light elements

    International Nuclear Information System (INIS)

    Bergstrom, P.M. Jr.

    1994-01-01

    A brief introduction to the topic of Compton scattering from bound electrons is presented. The fundamental nature of this process in understanding quantum phenomena is reviewed. Methods for accurate theoretical evaluation of the Compton scattering cross section are presented. Examples are presented for scattering of several keV photons from helium

  4. Light element geochemistry and spallogenesis in lunar rocks

    International Nuclear Information System (INIS)

    Des Marais, D.J.

    1983-01-01

    The abundances and isotopic compositions of carbon, nitrogen and sulfur were measured in eleven lunar rocks. Samples were combusted sequentially at three temperatures to resolve terrestrial contamination from indigenous volatiles. Sulfur abundances in Apollo 16 highland rocks range from 73 to 1165 μg/g, whereas sulfur contents in Apollo 15 and 17 basalts range from 719 to 1455 μg/g and correlate with TiO₂ content. Lunar rocks as a group have a remarkably uniform sulfur isotopic composition, which may reflect the low oxygen fugacity of the basaltic magmas. Much of the range of reported δ³⁴S_CD values is caused by systematic analytical discrepancies between laboratories. Lunar rocks very likely contain less than 0.1 μg/g of nitrogen. The measured spallogenic production rate, 4.1 × 10⁻⁶ μg ¹⁵N/g sample/m.y., agrees remarkably closely with previous estimates. An estimate which includes all available data is 3.7 × 10⁻⁶ μg ¹⁵N/g sample/m.y. Lunar basalts may contain no indigenous lunar carbon in excess of procedural blank levels. Highland rocks consistently release about 1 to 5 μg/g of carbon in excess of blank levels, but this carbon might either derive from ancient meteoritic debris or be a mineralogic product of terrestrial weathering. The average measured spallogenic ¹³C production rate is 4.1 × 10⁻⁶ μg ¹³C/g sample/m.y. (author)

  5. Light element nucleosynthesis and estimates of the universal baryon density

    International Nuclear Information System (INIS)

    Mathews, G.J.; Viola, V.E.

    1978-01-01

    The present mean universal baryon density, ρ_b, is of interest because it and the Hubble constant determine the curvature of the Universe. The available indicators of ρ_b come from the present deuterium abundance, if it is assumed that 'big-bang' nucleosynthesis must produce enough D to at least match the abundance of this nuclide in the interstellar medium. An alternative method utilizing the ⁷Li/D ratio is used to evaluate ρ_b. With this method the difficulty associated with the astration process can be essentially canceled from the problem. The results obtained indicate an open Universe with a best guess for ρ_b of 7.1 × 10⁻³¹ g/cm³. 1 figure, 1 table
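The abstract's best guess can be put in context by comparing it with the closure (critical) density 3H²/8πG. The Hubble constant below is an assumed round modern value, not necessarily the one used in 1978:

```python
from math import pi

G = 6.674e-8                 # gravitational constant, cm^3 g^-1 s^-2
Mpc = 3.086e24               # cm per megaparsec
H0 = 70e5 / Mpc              # assumed Hubble constant, 70 km/s/Mpc, in s^-1

rho_crit = 3 * H0 ** 2 / (8 * pi * G)   # closure (critical) density, g/cm^3
rho_b = 7.1e-31                         # the abstract's best-guess baryon density
Omega_b = rho_b / rho_crit              # < 1: baryons alone cannot close the Universe
```

With these numbers ρ_crit is of order 10⁻²⁹ g/cm³ and Ω_b comes out well below 1, which is the sense in which the quoted ρ_b implies an open Universe.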

  6. Big Bang nucleosynthesis and abundances of light elements

    International Nuclear Information System (INIS)

    Pagel, B.E.J.

    1991-01-01

    Big Bang nucleosynthesis (BBNS) theory is sketched, indicating the dependence of primordial abundances of D, ³He, ⁴He and ⁷Li on the mean baryonic density of the universe and the dependence of ⁴He on the number of neutrino families and the neutron half-life. Observational data and inferred primordial abundances of these elements are reviewed and shown to be consistent (within errors) either with standard BBNS in a homogeneous universe about 100 seconds after the Big Bang or with moderately inhomogeneous BBNS models resulting from earlier phase transitions like the quark-hadron transition, if this is first order. However, models with closure density supplied by baryons are apparently ruled out. Finally, implications for the existence of baryonic and non-baryonic dark matter are briefly discussed. (orig.)

  7. Study of deuterons induced nuclear reactions on light elements (N, Al and Si): Application to containment materials of radioactive wastes; Etude des reactions nucleaires induites par des deuterons sur des elements legers (N, Al, Si): application aux materiaux de confinement des dechets radioactifs

    Energy Technology Data Exchange (ETDEWEB)

    Pellegrino, St

    2004-03-01

    Nuclear reaction analysis is well adapted to the quantification of light elements. Concentration profiles can be measured in order to follow the migration of elements into materials. This technique is used to study the behavior of future matrices for nuclear waste containment. The technique is isotopic, characterized by a good signal-to-background ratio and a very low detection limit. The probability of a nuclear reaction is linked to a parameter called the 'cross section', which must be known in order to carry out quantitative analyses. We have determined excitation curves for nitrogen, aluminium and silicon. These experiments were done with deuterons from 0.5 to 2 MeV. Two methods for the cross section characterization are presented and are in agreement with each other; the second one reduces the uncertainty. The data are incorporated in the simulation software SIMNRA. We have compared the results obtained on different samples when using data from the literature or data from this study, and have noticed a marked improvement in the fit with the data of this study. The new cross sections of this work will be integrated in the general database SIGMABASE. Applications to materials such as Si₃N₄, nano-metric powders, WCN and the nuclear glass YLaMgSiAlON studied for radioactive waste containment are also presented. (author)

  8. Examples of analysis by activation; Exemples d'analyse par activation

    Energy Technology Data Exchange (ETDEWEB)

    Leveque, P [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1955-07-01

    We used various nuclear reactions to perform analyses by neutron or X-ray activation. We used the French reactor Zoé as a neutron source and an Allis-Chalmers betatron as an X-ray source for the determination of the light elements. The processes described proved fast and particularly useful for the determination of traces. The fact that most of them do not require any chemical operations is especially valuable when the solubilization of the sample is difficult. (M.B.) [French] Nous avons utilise des reactions nucleaires diverses pour effectuer des analyses par activation neutronique ou par rayon X. Nous avons utilise la pile francaise Zoe comme sources de neutrons et un betatron Allis-Chalmers comme sources de rayons X pour le dosage des elements legers. Les procedes decrits se sont reveles rapides et particulierement utiles dans la determination des traces. Le fait que, pour la plupart, ils n'exigent pas d'operations chimiques, est particulierement appreciable lorsque la solubilisation des echantillons est difficile. (M.B.)

  9. Sample preparation in foodomic analyses.

    Science.gov (United States)

    Martinović, Tamara; Šrajer Gajdošik, Martina; Josić, Djuro

    2018-04-16

    Representative sampling and adequate sample preparation are key factors for the successful performance of further steps in foodomic analyses, as well as for correct data interpretation. Incorrect sampling and improper sample preparation can be sources of severe bias in foodomic analyses, and it is well known that errors in sampling and sample treatment cannot be corrected afterwards. These facts, frequently neglected in the past, are now taken into consideration, and the progress in sampling and sample preparation in foodomics is reviewed here. We report the use of highly sophisticated instruments for both high-performance and high-throughput analyses, as well as miniaturization and the use of laboratory robotics in metabolomics, proteomics, peptidomics and genomics. This article is protected by copyright. All rights reserved.

  10. Descriptive Analyses of Mechanical Systems

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Hansen, Claus Thorp

    2003-01-01

    Preface: Product analysis and technology analysis can be carried out with a broad socio-technical aim, with a view to understanding cultural, sociological, design-related, business-related and many other conditions. One sub-area within this is the systemic analysis and description of products and systems. The present compend...

  11. Analysing and Comparing Encodability Criteria

    Directory of Open Access Journals (Sweden)

    Kirstin Peters

    2015-08-01

    Full Text Available Encodings, or the proof of their absence, are the main way to compare process calculi. To analyse the quality of encodings and to rule out trivial or meaningless encodings, they are augmented with quality criteria. There exist many different criteria, and different variants of these criteria, for reasoning in different settings. This leads to incomparable results. Moreover, it is not always clear whether the criteria used to obtain a result in a particular setting do indeed fit that setting. We show how to formally reason about and compare encodability criteria by mapping them onto requirements on a relation between source and target terms that is induced by the encoding function. In particular we analyse the common criteria full abstraction, operational correspondence, divergence reflection, success sensitiveness, and respect of barbs; e.g. we analyse the exact nature of the simulation relation (coupled simulation versus bisimulation) that is induced by different variants of operational correspondence. This way we reduce the problem of analysing or comparing encodability criteria to the better understood problem of comparing relations on processes.

  12. Analysing Children's Drawings: Applied Imagination

    Science.gov (United States)

    Bland, Derek

    2012-01-01

    This article centres on a research project in which freehand drawings provided a richly creative and colourful data source of children's imagined, ideal learning environments. Issues concerning the analysis of the visual data are discussed, in particular, how imaginative content was analysed and how the analytical process was dependent on an…

  13. Impact analyses after pipe rupture

    International Nuclear Information System (INIS)

    Chun, R.C.; Chuang, T.Y.

    1983-01-01

    Two of the French pipe whip experiments are reproduced with the computer code WIPS. The WIPS results are in good agreement with the experimental data and the French computer code TEDEL. This justifies the use of its pipe element in conjunction with its U-bar element in a simplified method of impact analyses

  14. Millifluidic droplet analyser for microbiology

    NARCIS (Netherlands)

    Baraban, L.; Bertholle, F.; Salverda, M.L.M.; Bremond, N.; Panizza, P.; Baudry, J.; Visser, de J.A.G.M.; Bibette, J.

    2011-01-01

    We present a novel millifluidic droplet analyser (MDA) for precisely monitoring the dynamics of microbial populations over multiple generations in numerous (≈10³) aqueous emulsion droplets (100 nL). As a first application, we measure the growth rate of a bacterial strain and determine the minimal
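Extracting a growth rate from one droplet's time series reduces to a log-linear fit during exponential growth. The counts below are synthetic stand-ins for whatever per-droplet signal (e.g. fluorescence or turbidity) is actually recorded; n0 and mu are assumed values:

```python
import numpy as np

# assumed example: bacterial counts in one droplet sampled hourly
t = np.arange(0.0, 8.0)                  # hours
n0, mu = 50.0, 0.69                      # initial count, true growth rate (1/h)
counts = n0 * np.exp(mu * t)

# exponential growth is linear in log space, so fit log(counts) vs. time
slope, intercept = np.polyfit(t, np.log(counts), 1)
growth_rate = slope                      # estimate of mu
doubling_time = np.log(2) / growth_rate  # hours per doubling
```

With real, noisy droplet data the same fit would be restricted to the exponential phase and the residuals would give an uncertainty on the rate.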

  15. Analyser of sweeping electron beam

    International Nuclear Information System (INIS)

    Strasser, A.

    1993-01-01

    The electron beam analyser has an array of conductors that can be positioned in the field of the sweeping beam, an electronic signal treatment system for the analysis of the signals generated in the conductors by the incident electrons and a display for the different characteristics of the electron beam

  16. Workload analyse of assembling process

    Science.gov (United States)

    Ghenghea, L. D.

    2015-11-01

    Workload is the most important indicator for managers responsible for industrial technological processes, whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of workload measurements. The paper deals with workload analyses of a largely manual assembling technology for a roller bearing assembling process, executed in a big company with integrated bearing manufacturing processes. In these analyses the delay-sample (work-sampling) technique was used to identify and divide all the bearing assemblers' activities and to obtain information about how much of the 480-minute working day the workers devote to each activity. The study shows some ways to increase the process productivity without supplementary investments, and also indicates that process automation could be the solution for gaining maximum productivity.
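The delay-sample (work-sampling) arithmetic is simple: each activity's share of random observation instants, scaled to the 480-minute day, estimates its daily time, with a binomial confidence interval on the proportion. The tallies below are invented for illustration, not the paper's data:

```python
from math import sqrt

day_minutes = 480
# invented tallies from, say, 500 random observation instants of one assembler
observations = {"assembling": 310, "part handling": 90, "waiting": 60, "other": 40}
total = sum(observations.values())

estimates = {}
for activity, n in observations.items():
    p = n / total                                 # observed share of instants
    minutes = p * day_minutes                     # estimated minutes per day
    half = 1.96 * sqrt(p * (1 - p) / total) * day_minutes  # 95% CI half-width (min)
    estimates[activity] = (round(minutes, 1), round(half, 1))
```

The confidence half-width also answers the planning question in reverse: it tells how many observation instants are needed before a difference between two activities is statistically meaningful.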

  17. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L. A.; Gilbert, Tom; Hofreiter, Michael

    2013-01-01

    The analysis of ancient DNA is playing an increasingly important role in conservation genetic, phylogenetic and population genetic analyses, as it allows incorporating extinct species into DNA sequence trees and adds time depth to population genetics studies. For many years, these types of DNA...... analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences...... yielded major progress with regard to both the phylogenetic positions of extinct species, as well as resolving population genetics questions in both extinct and extant species....

  18. Recriticality analyses for CAPRA cores

    International Nuclear Information System (INIS)

    Maschek, W.; Thiem, D.

    1995-01-01

    The first scoping calculations performed show that the energetics levels from recriticalities in CAPRA cores are in the same range as in conventional cores. However, considerable uncertainties exist and further analyses are necessary. Additional investigations are performed for the separation scenarios of fuel/steel/inert and matrix material, as a large influence of these processes on possible ramp rates and kinetics parameters was detected in the calculations. (orig./HP)

  19. Recriticality analyses for CAPRA cores

    Energy Technology Data Exchange (ETDEWEB)

    Maschek, W.; Thiem, D.

    1995-08-01

    The first scoping calculations performed show that the energetics levels from recriticalities in CAPRA cores are in the same range as in conventional cores. However, considerable uncertainties exist and further analyses are necessary. Additional investigations are performed for the separation scenarios of fuel/steel/inert and matrix material, as a large influence of these processes on possible ramp rates and kinetics parameters was detected in the calculations. (orig./HP)

  20. Technical center for transportation analyses

    International Nuclear Information System (INIS)

    Foley, J.T.

    1978-01-01

    A description is presented of an information search/retrieval/research activity of Sandia Laboratories which provides technical environmental information which may be used in transportation risk analyses, environmental impact statements, development of design and test criteria for packaging of energy materials, and transportation mode research studies. General activities described are: (1) history of center development; (2) environmental information storage/retrieval system; (3) information searches; (4) data needs identification; and (5) field data acquisition system and applications

  1. Methodology of cost benefit analyses

    International Nuclear Information System (INIS)

    Patrik, M.; Babic, P.

    2000-10-01

    The report addresses financial aspects of proposed investments and other steps which are intended to contribute to nuclear safety. The aim is to provide introductory insight into the procedures and potential of cost-benefit analyses as a routine guide when making decisions on costly provisions as one of the tools to assess whether a particular provision is reasonable. The topic is applied to the nuclear power sector. (P.A.)

  2. Chapter No.4. Safety analyses

    International Nuclear Information System (INIS)

    2002-01-01

    In 2001 the activity in the field of safety analyses was focused on verification of the safety analyses reports for NPP V-2 Bohunice and NPP Mochovce concerning the new profiled fuel and probabilistic safety assessment study for NPP Mochovce. The calculation safety analyses were performed and expert reviews for the internal UJD needs were elaborated. An important part of work was performed also in solving of scientific and technical tasks appointed within bilateral projects of co-operation between UJD and its international partnership organisations as well as within international projects ordered and financed by the European Commission. All these activities served as an independent support for UJD in its deterministic and probabilistic safety assessment of nuclear installations. A special attention was paid to a review of probabilistic safety assessment study of level 1 for NPP Mochovce. The probabilistic safety analysis of NPP related to the full power operation was elaborated in the study and a contribution of the technical and operational improvements to the risk decreasing was quantified. A core damage frequency of the reactor was calculated and the dominant initiating events and accident sequences with the major contribution to the risk were determined. The target of the review was to determine the acceptance of the sources of input information, assumptions, models, data, analyses and obtained results, so that the probabilistic model could give a real picture of the NPP. The review of the study was performed in co-operation of UJD with the IAEA (IPSART mission) as well as with other external organisations, which were not involved in the elaboration of the reviewed document and probabilistic model of NPP. The review was made in accordance with the IAEA guidelines and methodical documents of UJD and US NRC. 
In the field of calculation safety analyses the UJD activity was focused on the analysis of an operational event, analyses of the selected accident scenarios

  3. Analysing the Wrongness of Killing

    DEFF Research Database (Denmark)

    Di Nucci, Ezio

    2014-01-01

    This article provides an in-depth analysis of the wrongness of killing by comparing different versions of three influential views: the traditional view that killing is always wrong; the liberal view that killing is wrong if and only if the victim does not want to be killed; and Don Marquis' future...... of value account of the wrongness of killing. In particular, I illustrate the advantages that a basic version of the liberal view and a basic version of the future of value account have over competing alternatives. Still, ultimately none of the views analysed here are satisfactory; but the different...

  4. Methodological challenges in carbohydrate analyses

    Directory of Open Access Journals (Sweden)

    Mary Beth Hall

    2007-07-01

    Full Text Available Carbohydrates can provide up to 80% of the dry matter in animal diets, yet their specific evaluation for research and diet formulation is only now becoming a focus in the animal sciences. Partitioning of dietary carbohydrates for nutritional purposes should reflect differences in digestion and fermentation characteristics and effects on animal performance. Key challenges to designating nutritionally important carbohydrate fractions include classifying the carbohydrates in terms of nutritional characteristics, and selecting analytical methods that describe the desired fraction. The relative lack of information on digestion characteristics of various carbohydrates and their interactions with other fractions in diets means that fractions will not soon be perfectly established. Developing a system of carbohydrate analysis that could be used across animal species could enhance the utility of analyses and amount of data we can obtain on dietary effects of carbohydrates. Based on quantities present in diets and apparent effects on animal performance, some nutritionally important classes of carbohydrates that may be valuable to measure include sugars, starch, fructans, insoluble fiber, and soluble fiber. Essential to selection of methods for these fractions is agreement on precisely what carbohydrates should be included in each. Each of these fractions has analyses that could potentially be used to measure them, but most of the available methods have weaknesses that must be evaluated to see if they are fatal and the assay is unusable, or if the assay still may be made workable. Factors we must consider as we seek to analyze carbohydrates to describe diets: Does the assay accurately measure the desired fraction? Is the assay for research, regulatory, or field use (this affects considerations of acceptable costs and throughput)? What are acceptable accuracy and variability of measures? Is the assay robust (enhancing accuracy of values)?
For some carbohydrates, we

  5. Theorising and Analysing Academic Labour

    Directory of Open Access Journals (Sweden)

    Thomas Allmer

    2018-01-01

    Full Text Available The aim of this article is to contextualise universities historically within capitalism and to analyse academic labour and the deployment of digital media theoretically and critically. It argues that the post-war expansion of the university can be considered as medium and outcome of informational capitalism and as a dialectical development of social achievement and advanced commodification. The article strives to identify the class position of academic workers, introduces the distinction between academic work and labour, discusses the connection between academic, information and cultural work, and suggests a broad definition of university labour. It presents a theoretical model of working conditions that helps to systematically analyse the academic labour process and to provide an overview of working conditions at universities. The paper furthermore argues for the need to consider the development of education technologies as a dialectics of continuity and discontinuity, discusses the changing nature of the forces and relations of production, and the impact on the working conditions of academics in the digital university. Based on Erik Olin Wright’s inclusive approach of social transformation, the article concludes with the need to bring together anarchist, social democratic and revolutionary strategies for establishing a socialist university in a commons-based information society.

  6. CFD analyses in regulatory practice

    International Nuclear Information System (INIS)

    Bloemeling, F.; Pandazis, P.; Schaffrath, A.

    2012-01-01

    Numerical software is used in nuclear regulatory procedures for many problems in the fields of neutron physics, structural mechanics, thermal hydraulics etc. Among other things, the software is employed in dimensioning and designing systems and components and in simulating transients and accidents. In nuclear technology, analyses of this kind must meet strict requirements. Computational Fluid Dynamics (CFD) codes were developed for computing multidimensional flow processes of the type occurring in reactor cooling systems or in containments. Extensive experience has been accumulated by now in selected single-phase flow phenomena. At the present time, there is a need for development and validation with respect to the simulation of multi-phase and multi-component flows. As insufficient input by the user can lead to faulty results, the validity of the results and an assessment of uncertainties are guaranteed only through consistent application of so-called Best Practice Guidelines. The authors present the possibilities now available to CFD analyses in nuclear regulatory practice. This includes a discussion of the fundamental requirements to be met by numerical software, especially the demands upon computational analysis made by nuclear rules and regulations. In conclusion, 2 examples are presented of applications of CFD analysis to nuclear problems: Determining deboration in the condenser reflux mode of operation, and protection of the reactor pressure vessel (RPV) against brittle failure. (orig.)

  7. Severe accident recriticality analyses (SARA)

    DEFF Research Database (Denmark)

    Frid, W.; Højerup, C.F.; Lindholm, I.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario, in order to assess the impact of recriticality on reactor safety, including accident management strategies. The approach was to use three computer codes: SIMULATE-3K, APROS and RECRIT. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality, both super-prompt power bursts and quasi steady-state power generation, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g(-1), was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s(-1). In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure.

  8. Hydrogen Analyses in the EPR

    International Nuclear Information System (INIS)

    Worapittayaporn, S.; Eyink, J.; Movahed, M.

    2008-01-01

    In severe accidents with core melting large amounts of hydrogen may be released into the containment. The EPR provides a combustible gas control system to prevent hydrogen combustion modes with the potential to challenge the containment integrity due to excessive pressure and temperature loads. This paper outlines the approach for the verification of the effectiveness and efficiency of this system. Specifically, the justification is a multi-step approach. It involves the deployment of integral codes, lumped parameter containment codes and CFD codes and the use of the sigma criterion, which provides the link to the broad experimental data base for flame acceleration (FA) and deflagration to detonation transition (DDT). The procedure is illustrated with an example. The performed analyses show that hydrogen combustion at any time does not lead to pressure or temperature loads that threaten the containment integrity of the EPR. (authors)
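The sigma criterion mentioned above compares the combustion expansion ratio with a mixture-dependent critical value to judge flame-acceleration potential. A minimal numerical sketch follows; the gas densities and the critical value 3.75 are illustrative assumptions, not EPR design data.

```python
# Illustrative check of the sigma (expansion-ratio) criterion for flame
# acceleration (FA). All numeric values below are invented placeholders.

def expansion_ratio(rho_unburned: float, rho_burned: float) -> float:
    """sigma = density of unburned mixture / density of burned mixture."""
    return rho_unburned / rho_burned

def flame_acceleration_possible(sigma: float, sigma_critical: float = 3.75) -> bool:
    """FA is considered possible when sigma exceeds the mixture-dependent
    critical value (3.75 is a commonly quoted example, assumed here)."""
    return sigma > sigma_critical

sigma = expansion_ratio(rho_unburned=1.0, rho_burned=0.2)  # sigma = 5.0
print(sigma, flame_acceleration_possible(sigma))
```

In a real assessment the critical sigma is taken from the experimental database for the local mixture composition and temperature, as the abstract notes.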

  9. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project
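The uncertainty and sensitivity analysis process described above can be sketched with a simple Monte Carlo propagation. The model y = 3·x1 + 0.5·x2 and the input distributions below are invented stand-ins, not HEDR models; the sensitivity measure (squared correlation of output with each input) is one common choice.

```python
import random

# Monte Carlo uncertainty propagation through a toy model, plus a
# correlation-based sensitivity ranking of the uncertain inputs.
random.seed(0)

def model(x1, x2):
    return 3.0 * x1 + 0.5 * x2

x1s = [random.gauss(1.0, 0.1) for _ in range(20000)]
x2s = [random.gauss(1.0, 0.5) for _ in range(20000)]
ys = [model(a, b) for a, b in zip(x1s, x2s)]

def mean(v):
    return sum(v) / len(v)

def corr(u, v):
    mu, mv = mean(u), mean(v)
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return cov / (su * sv)

# Uncertainty: spread of the output; sensitivity: squared correlation of
# the output with each input (x1 dominates here despite x2's wider spread).
y_mean = mean(ys)
print(y_mean)
print(corr(x1s, ys) ** 2, corr(x2s, ys) ** 2)  # approx. variance shares
```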

  10. The hemispherical deflector analyser revisited

    Energy Technology Data Exchange (ETDEWEB)

    Benis, E.P. [Institute of Electronic Structure and Laser, P.O. Box 1385, 71110 Heraklion, Crete (Greece)], E-mail: benis@iesl.forth.gr; Zouros, T.J.M. [Institute of Electronic Structure and Laser, P.O. Box 1385, 71110 Heraklion, Crete (Greece); Department of Physics, University of Crete, P.O. Box 2208, 71003 Heraklion, Crete (Greece)

    2008-04-15

    Using the basic spectrometer trajectory equation for motion in an ideal 1/r potential derived in Eq. (101) of part I [T.J.M. Zouros, E.P. Benis, J. Electron Spectrosc. Relat. Phenom. 125 (2002) 221], the operational characteristics of a hemispherical deflector analyser (HDA) such as dispersion, energy resolution, energy calibration, input lens magnification and energy acceptance window are investigated from first principles. These characteristics are studied as a function of the entry point R{sub 0} and the nominal value of the potential V(R{sub 0}) at entry. Electron-optics simulations and actual laboratory measurements are compared to our theoretical results for an ideal biased paracentric HDA using a four-element zoom lens and a two-dimensional position sensitive detector (2D-PSD). These results should be of particular interest to users of modern HDAs utilizing a PSD.
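The ideal 1/r potential underlying the HDA analysis above has the textbook property that a particle entering tangentially at the nominal radius with the matching pass energy stays on a circular orbit. The following toy integration (unit constants, not spectrometer values) checks that property numerically; it is an illustration of the ideal-field physics, not of the paper's trajectory equation.

```python
import math

# Velocity-Verlet integration of motion in an attractive V(r) = -k/r
# potential, launched on the circular orbit at radius r0.
k, m, r0 = 1.0, 1.0, 1.0
v0 = math.sqrt(k / (m * r0))          # circular-orbit speed for V(r) = -k/r

x, y = r0, 0.0
vx, vy = 0.0, v0
dt = 1e-4

def accel(x, y):
    r = math.hypot(x, y)
    a = -k / (m * r ** 2)             # radial acceleration, directed inward
    return a * x / r, a * y / r

ax, ay = accel(x, y)
for _ in range(int(2 * math.pi * r0 / v0 / dt)):   # ~one orbital period
    x += vx * dt + 0.5 * ax * dt * dt
    y += vy * dt + 0.5 * ay * dt * dt
    ax_new, ay_new = accel(x, y)
    vx += 0.5 * (ax + ax_new) * dt
    vy += 0.5 * (ay + ay_new) * dt
    ax, ay = ax_new, ay_new

print(math.hypot(x, y))  # radius stays close to r0
```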

  11. The hemispherical deflector analyser revisited

    International Nuclear Information System (INIS)

    Benis, E.P.; Zouros, T.J.M.

    2008-01-01

    Using the basic spectrometer trajectory equation for motion in an ideal 1/r potential derived in Eq. (101) of part I [T.J.M. Zouros, E.P. Benis, J. Electron Spectrosc. Relat. Phenom. 125 (2002) 221], the operational characteristics of a hemispherical deflector analyser (HDA) such as dispersion, energy resolution, energy calibration, input lens magnification and energy acceptance window are investigated from first principles. These characteristics are studied as a function of the entry point R 0 and the nominal value of the potential V(R 0 ) at entry. Electron-optics simulations and actual laboratory measurements are compared to our theoretical results for an ideal biased paracentric HDA using a four-element zoom lens and a two-dimensional position sensitive detector (2D-PSD). These results should be of particular interest to users of modern HDAs utilizing a PSD

  12. Analysing Protocol Stacks for Services

    DEFF Research Database (Denmark)

    Gao, Han; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    We show an approach, CaPiTo, to model service-oriented applications using process algebras such that, on the one hand, we can achieve a certain level of abstraction without being overwhelmed by the underlying implementation details and, on the other hand, we respect the concrete industrial standards used for implementing the service-oriented applications. By doing so, we will be able to not only reason about applications at different levels of abstractions, but also to build a bridge between the views of researchers on formal methods and developers in industry. We apply our approach to the financial case study taken from Chapter 0-3. Finally, we develop a static analysis to analyse the security properties as they emerge at the level of concrete industrial protocols.

  13. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

    Full Text Available This paper draws a parallel between measuring financial performance in 2 variants: the first one using data offered by accounting, which lays emphasis on maximizing profit, and the second one which aims to create value. The traditional approach to performance is based on some indicators from accounting data: ROI, ROE, EPS. The traditional management, based on analysing the data from accounting, has shown its limits, and a new approach is needed, based on creating value. The evaluation of value based performance tries to avoid the errors due to accounting data, by using other specific indicators: EVA, MVA, TSR, CVA. The main objective is shifted from maximizing the income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators which evaluate this concept.
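Two of the value-based indicators named above have simple defining arithmetic: EVA is operating profit after tax less a charge for the capital employed, and MVA is market value over capital invested. A short sketch with invented inputs:

```python
# Value-based performance indicators from the abstract (inputs invented).

def eva(nopat: float, wacc: float, invested_capital: float) -> float:
    """Economic Value Added: residual profit after the cost of capital."""
    return nopat - wacc * invested_capital

def mva(market_value: float, invested_capital: float) -> float:
    """Market Value Added: market value in excess of capital invested."""
    return market_value - invested_capital

print(eva(nopat=120.0, wacc=0.10, invested_capital=800.0))  # 40.0
print(mva(market_value=1500.0, invested_capital=800.0))     # 700.0
```

A positive EVA means the firm earned more than its cost of capital, which is the shift in objective the paper describes: from maximizing income to maximizing value created for shareholders.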

  14. Proteins analysed as virtual knots

    Science.gov (United States)

    Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.

    2017-02-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important.

  15. Digital image analyser for autoradiography

    International Nuclear Information System (INIS)

    Muth, R.A.; Plotnick, J.

    1985-01-01

    The most critical parameter in quantitative autoradiography for assay of tissue concentrations of tracers is the ability to obtain precise and accurate measurements of optical density of the images. Existing high precision systems for image analysis, rotating drum densitometers, are expensive, suffer from mechanical problems and are slow. More moderately priced and reliable video camera based systems are available, but their outputs generally do not have the uniformity and stability necessary for high resolution quantitative autoradiography. The authors have designed and constructed an image analyser optimized for quantitative single and multiple tracer autoradiography which the authors refer to as a memory-mapped charged-coupled device scanner (MM-CCD). The input is from a linear array of CCD's which is used to optically scan the autoradiograph. Images are digitized into 512 x 512 picture elements with 256 gray levels and the data is stored in buffer video memory in less than two seconds. Images can then be transferred to RAM memory by direct memory-mapping for further processing. Arterial blood curve data and optical density-calibrated standards data can be entered and the optical density images can be converted automatically to tracer concentration or functional images. In double tracer studies, images produced from both exposures can be stored and processed in RAM to yield ''pure'' individual tracer concentration or functional images. Any processed image can be transmitted back to the buffer memory to be viewed on a monitor and processed for region of interest analysis
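The conversion step described above, from optical density to tracer concentration via co-exposed calibrated standards, can be sketched as a linear calibration fit applied pixel-wise. All values below are invented, and a real system would fit the appropriate (often non-linear) response curve.

```python
# Hypothetical OD-to-concentration calibration for autoradiography.
# Standards of known concentration give a line od = a*conc + b, which is
# inverted and applied to each pixel of the digitized image.
known_conc = [0.0, 5.0, 10.0, 20.0]      # standards, e.g. nCi/g (invented)
measured_od = [0.02, 0.27, 0.52, 1.02]   # optical densities of standards

n = len(known_conc)
mx = sum(known_conc) / n
my = sum(measured_od) / n
a = (sum((x - mx) * (y - my) for x, y in zip(known_conc, measured_od))
     / sum((x - mx) ** 2 for x in known_conc))
b = my - a * mx

def od_to_conc(od: float) -> float:
    """Invert the fitted calibration line."""
    return (od - b) / a

image_od = [[0.02, 0.52], [1.02, 0.27]]  # tiny stand-in for a 512x512 image
conc_image = [[round(od_to_conc(v), 2) for v in row] for row in image_od]
print(conc_image)
```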

  16. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teoliisuuden Voima Oy (Finland)

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B{sub 4}C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. 
The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg/s.

  17. Severe accident recriticality analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. E-mail: wiktor.frid@ski.se; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Nilsson, L.; Puska, E.K.; Sjoevall, H

    2001-11-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality--both super-prompt power bursts and quasi steady-state power generation--for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg s{sup -1} injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g{sup -1}, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s{sup -1}. In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel. 
The highest calculated
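The 418 cal/g peak quoted in these records converts directly to SI units; a quick sanity check follows. The 280 cal/g comparison value is an assumption for illustration (a historical licensing limit for reactivity-initiated accidents), not a figure taken from the SARA reports.

```python
# Unit check on the peak fuel energy deposition quoted above.
CAL_TO_J = 4.184  # thermochemical calorie

def cal_per_g_to_j_per_kg(e_cal_per_g: float) -> float:
    return e_cal_per_g * CAL_TO_J * 1000.0

peak = 418.0                               # cal/g, from the abstracts
print(cal_per_g_to_j_per_kg(peak))         # ~1.75 MJ/kg
print(peak > 280.0)                        # vs. an assumed 280 cal/g limit
```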

  18. Severe accident recriticality analyses (SARA)

    International Nuclear Information System (INIS)

    Frid, W.; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Nilsson, L.; Puska, E.K.; Sjoevall, H.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality--both super-prompt power bursts and quasi steady-state power generation--for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg s -1 injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g -1 , was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s -1 . In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel. 
The highest calculated quasi steady

  19. Severe Accident Recriticality Analyses (SARA)

    International Nuclear Information System (INIS)

    Frid, W.; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Puska, E.K.; Nilsson, Lars; Sjoevall, H.

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B 4 C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  20. Pawnee Nation Energy Option Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses. In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed: The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations: Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  1. Improving word coverage using unsupervised morphological analyser

    Indian Academy of Sciences (India)

    To enable a computer to process information in human languages, ... an unsupervised morphological analyser (UMA) would learn how to analyse a language just by looking ... result for English, but they did remarkably worse for Finnish and Turkish.
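The distributional cue such analysers exploit, frequent word endings shared across many stems, can be shown with a deliberately tiny toy (this is an illustration, not the paper's method; the corpus and the suffix list are invented).

```python
from collections import Counter

# Count candidate suffixes in a toy corpus, then segment words on the
# suffixes an unsupervised learner might have induced from those counts.
words = ["walks", "walked", "talks", "talked", "jumps", "jumped"]
suffixes = Counter()
for w in words:
    for i in range(1, 4):            # candidate suffixes of length 1-3
        suffixes[w[-i:]] += 1

def segment(word, known=("ed", "s")):
    """Split off the first known suffix, keeping a minimum stem length."""
    for suf in known:
        if word.endswith(suf) and len(word) > len(suf) + 2:
            return word[:-len(suf)], suf
    return word, ""

print(suffixes["s"], suffixes["ed"])   # both endings recur across stems
print(segment("walked"))               # ('walk', 'ed')
print(segment("talks"))                # ('talk', 's')
```

Real systems (e.g. Morfessor-style models) choose segmentations by optimizing a probabilistic objective over the whole corpus rather than a fixed suffix list.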

  2. Techniques for Analysing Problems in Engineering Projects

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1998-01-01

    Description of how a CPM network can be used for analysing complex problems in engineering projects.
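The core CPM computation, a forward pass over the activity network to find earliest start times and the project length, can be sketched in a few lines. The four-activity network below is invented for illustration.

```python
# Critical Path Method (CPM) forward pass over a small activity network.
durations = {"A": 3, "B": 2, "C": 4, "D": 2}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

earliest = {}
for task in ("A", "B", "C", "D"):    # tasks listed in topological order
    earliest[task] = max((earliest[p] + durations[p] for p in preds[task]),
                         default=0)

project_length = max(earliest[t] + durations[t] for t in durations)
print(earliest)         # {'A': 0, 'B': 3, 'C': 3, 'D': 7}
print(project_length)   # 9, via the critical path A -> C -> D
```

A backward pass over the same network would give latest start times, and activities with zero float form the critical path.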

  3. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic programs...

  4. Fracture analyses of WWER reactor pressure vessels

    International Nuclear Information System (INIS)

    Sievers, J.; Liu, X.

    1997-01-01

    In the paper, first the methodology of fracture assessment based on finite element (FE) calculations is described and compared with simplified methods. The FE based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab
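The "simplified methods" contrasted with FE analysis above typically rest on closed-form stress-intensity factors. As a hedged illustration (not the GRS methodology), the textbook through-crack formula K_I = sigma*sqrt(pi*a) compared with an assumed toughness:

```python
import math

# Simplified fracture screening: Mode-I stress intensity factor for a
# through crack in an infinite plate. All numbers are invented; real RPV
# assessments use FE-based methods as the abstract describes.

def k_i(sigma_mpa: float, a_m: float) -> float:
    """K_I = sigma * sqrt(pi * a), in MPa*sqrt(m)."""
    return sigma_mpa * math.sqrt(math.pi * a_m)

K = k_i(sigma_mpa=200.0, a_m=0.01)   # 200 MPa stress, 10 mm crack
print(K)                             # ~35.4 MPa*sqrt(m)
print(K < 100.0)                     # vs. an assumed fracture toughness
```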

  5. Fracture analyses of WWER reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Sievers, J; Liu, X [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany)

    1997-09-01

    In the paper first the methodology of fracture assessment based on finite element (FE) calculations is described and compared with simplified methods. The FE based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab.

  6. [Anne Arold. Kontrastive Analyse...] / Paul Alvre

    Index Scriptorium Estoniae

    Alvre, Paul, 1921-2008

    2001-01-01

    Review of: Arold, Anne. Kontrastive Analyse der Wortbildungsmuster im Deutschen und im Estnischen (am Beispiel der Aussehensadjektive). Tartu, 2000. (Dissertationes philologiae germanicae Universitatis Tartuensis)

  7. An MDE Approach for Modular Program Analyses

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Bockisch, Christoph; Aksit, Mehmet; Rensink, Arend

    Program analyses are an important tool to check if a system fulfills its specification. A typical implementation strategy for program analyses is to use an imperative, general-purpose language like Java, and access the program to be analyzed through libraries that offer an API for reading, writing

  8. Random error in cardiovascular meta-analyses

    DEFF Research Database (Denmark)

    Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian

    2013-01-01

    BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We exam...

  9. Diversity of primary care systems analysed.

    NARCIS (Netherlands)

    Kringos, D.; Boerma, W.; Bourgueil, Y.; Cartier, T.; Dedeu, T.; Hasvold, T.; Hutchinson, A.; Lember, M.; Oleszczyk, M.; Pavlick, D.R.

    2015-01-01

    This chapter analyses differences between countries and explains why countries differ regarding the structure and process of primary care. The components of primary care strength that are used in the analyses are health policy-making, workforce development and in the care process itself (see Fig.

  10. Approximate analyses of inelastic effects in pipework

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

    This presentation shows figures concerned with analyses of inelastic effects in pipework as follows: comparison of experimental and calculated simplified analysis results for free end rotation and for circumferential strain; interrupted stress relaxation; regenerated relaxation caused by reversed yield; buckling of straight pipe under combined bending and torsion; results of fatigue tests of pipe bends

  11. Period Study and Analyses of 2017 Observations of the Totally Eclipsing, Solar Type Binary, MT Camelopardalis

    Science.gov (United States)

    Faulkner, Danny R.; Samec, Ronald G.; Caton, Daniel B.

    2018-06-01

    We report here on a period study and the analysis of BVRcIc light curves (taken in 2017) of MT Cam (GSC03737-01085), which is a solar type (T ~ 5500K) eclipsing binary. D. Caton observed MT Cam on 05, 14, 15, 16, and 17 December 2017 with the 0.81-m reflector at Dark Sky Observatory. Six times of minimum light were calculated from four primary eclipses and two secondary eclipses: HJD I = 2458092.4937±0.0002, 2458102.74600±0.0021, 2458104.5769±0.0002, 2458104.9434±0.0029; HJD II = 2458103.6610±0.0001, 2458104.7607±0.0020. Six times of minimum light were also calculated from data taken by Terrell, Gross, and Cooney in their 2016 and 2004 observations (reported in IBVS #6166; TGC, hereafter). In addition, six more times of minimum light were taken from the literature. From all 18 times of minimum light, we determined the following light elements: JD Hel Min I = 2458102.7460(4) + 0.36613937(5) E. We found the orbital period was constant over the 14 years spanning all observations. We note that TGC found a slightly increasing period. However, our results were obtained from a period study rather than comparison of observations from only two epochs by the Wilson-Devinney (W-D) Program. A BVRcIc Johnson-Cousins filtered simultaneous W-D Program solution gives a mass ratio (0.3385±0.0014) very nearly the same as TGC’s (0.347±0.003), and a component temperature difference of only ~40 K. As with TGC, no spot was needed in the modeling. Our modeling (beginning with Binary Maker 3.0 fits) was done without prior knowledge of TGC’s. This shows the agreement achieved when independent analyses are done with the W-D code. The present observations were taken 1.8 years later than the last curves by TGC, so some variation is expected. The Roche lobe fill-out of the binary is ~13% and the inclination is ~83.5 degrees. The system is a shallow contact W-type W UMa binary, albeit the amplitudes of the primary and secondary eclipse are very nearly identical. An eclipse duration of ~21
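The period study behind light elements of the form JD Hel Min I = T0 + P·E is a least-squares fit of observed times of minimum against cycle count E, with O-C residuals revealing any period change. A sketch follows; the epochs and times below are synthetic (generated from the published ephemeris), not the actual MT Cam minima.

```python
# Linear-ephemeris fit T(E) = T0 + P*E and O-C residuals for a period
# study. Times are synthetic, built from the ephemeris in the abstract.
epochs = [0, 10, 27, 41]
times = [2458102.7460 + 0.36613937 * e for e in epochs]

n = len(epochs)
me = sum(epochs) / n
mt = sum(times) / n
P = (sum((e - me) * (t - mt) for e, t in zip(epochs, times))
     / sum((e - me) ** 2 for e in epochs))   # least-squares period
T0 = mt - P * me                             # least-squares epoch
oc = [t - (T0 + P * e) for e, t in zip(epochs, times)]  # O-C residuals

print(round(P, 8))                 # recovers the input period
print(max(abs(r) for r in oc))     # ~0 for a constant period
```

A constant period gives structureless O-C residuals, as reported here; TGC's slightly increasing period would instead show a parabolic O-C trend.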

  12. Level II Ergonomic Analyses, Dover AFB, DE

    Science.gov (United States)

    1999-02-01

    IERA-RS-BR-TR-1999-0002, United States Air Force IERA: Level II Ergonomic Analyses, Dover AFB, DE. Andrew Marcotte, Marilyn Joyce (The Joyce...). Contents: 1.0 INTRODUCTION; 1.1 Purpose Of The Level II Ergonomic Analyses; 1.2 Approach; 1.2.1 Initial Shop Selection and Administration of the

  13. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...
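The idea of incremental tabled evaluation, cache analysis results in memo tables and invalidate only the entries affected by a change, can be illustrated with a deliberately small toy (this merely sketches the caching idea in Python; the paper works with a Prolog engine and dependency-aware table updates).

```python
# Toy memo-table analysis with per-fact invalidation. The "analysis"
# (doubling a fact's value) is a stand-in for a real static analysis.
class IncrementalAnalysis:
    def __init__(self, facts):
        self.facts = dict(facts)
        self.memo = {}

    def analyse(self, key):
        """Return the cached result, computing it only on a miss."""
        if key not in self.memo:
            self.memo[key] = self.facts[key] * 2
        return self.memo[key]

    def update_fact(self, key, value):
        """A change invalidates only the dependent memo entry."""
        self.facts[key] = value
        self.memo.pop(key, None)

a = IncrementalAnalysis({"m": 3, "n": 5})
print(a.analyse("m"), a.analyse("n"))   # 6 10
a.update_fact("m", 4)
print(a.analyse("m"), a.analyse("n"))   # 8 10 (n is served from the memo)
```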

  14. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

    This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural environment...... willingness-to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences...

  15. Use of particles other than neutrons in activation analysis; Emploi de particules autres que les neutrons en analyse par actuation

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Ch [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1964-07-15

    Nuclear reactions obtained by irradiation with {gamma} bremsstrahlung, {alpha} particles and protons are particularly suitable for determining very small traces of certain light elements. We consider the possibilities presented by activation in {gamma} radiation of 28 MeV maximum energy, mainly for the measurement of C, F, N, O, P and S. Non-destructive methods of analysis for beryllium are described. Under certain conditions they may also be used for other elements such as B, Ca, Li and Na. We also give the results of our first experiments carried out to develop methods for determining carbon and oxygen by irradiation with {alpha} particles and protons. For each type of activation, the possible interferences with other nuclear reactions are considered. (author) [French] Des reactions nucleaires obtenues par irradiation dans des rayons {gamma} de freinage, des particules {alpha} et des protons, sont particulierement indiquees pour les dosages de traces ultimes de certains elements legers. Nous etudions les possibilites offertes par les activations en rayons {alpha} d'energie maximum 28 MeV, principalement pour les dosages de C, F, N, O, P et S. Des methodes d'analyse non destructives appliquees au beryllium sont decrites. Sous certaines conditions, elles peuvent egalement etre utilisees pour d'autres materiaux comme B, Ca, Li et Na. Nous donnons d'autre part les resultats de nos premieres experiences effectuees pour la mise au point des methodes de dosage du carbone et de l'oxygene par irradiation dans les particules {alpha} et les protons. Pour chaque type d'activation, les possibilites d'interferences avec d'autres reactions nucleaires sont examinees. (auteur)

  16. Comparison with Russian analyses of meteor impact

    Energy Technology Data Exchange (ETDEWEB)

    Canavan, G.H.

    1997-06-01

    The inversion model for meteor impacts is used to discuss Russian analyses and compare principal results. For common input parameters, the models produce consistent estimates of impactor parameters. Directions for future research are discussed and prioritized.

  17. 7 CFR 94.102 - Analyses available.

    Science.gov (United States)

    2010-01-01

    ... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... glycol, SLS, and zeolex. There are also tests for starch, total sugars, sugar profile, whey, standard...

  18. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Science.gov (United States)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  19. Analyse of Maintenance Cost in ST

    CERN Document Server

    Jenssen, B W

    2001-01-01

    An analysis has been carried out in ST concerning the total costs for the division. Even though the target was the maintenance costs in ST, the global budget has been analysed, since there is a close relation between investments & consolidation and the required level of maintenance. The purpose of the analysis was to focus on maintenance cost in ST as a ratio of total maintenance costs over the replacement value of the equipment, and to make some comparisons with other industries and laboratories. Families of equipment have been defined and their corresponding ratios calculated. This first approach gives us some "quantitative" measurements. This analysis should be combined with performance indicators (more "qualitative" measurements) that tell us how well we are performing. This will help us defend our budget, set better priorities, and satisfy the requirements of our external auditors.

  20. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  1. Safety analyses for reprocessing and waste processing

    International Nuclear Information System (INIS)

    1983-03-01

    Presentation of an incident analysis of process steps of the RP, simplified considerations concerning safety, and safety analyses of the storage and solidification facilities of the RP. A release tree method is developed and tested. In particular, an incident analysis of process steps, the evaluation of the SRL study and safety analyses of the storage and solidification facilities of the RP are performed. (DG) [de]

  2. Risk analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Jehee, J.N.T.; Seebregts, A.J.

    1991-02-01

    Probabilistic risk analyses of nuclear power plants are carried out by systematically analyzing the possible consequences of a broad spectrum of causes of accidents. The risk can be expressed in the probabilities of meltdown, radioactive releases, or harmful effects on the environment. Following risk policies for chemical installations, as expressed in the mandatory nature of External Safety Reports (EVRs) or, e.g., the publication ''How to deal with risks'', probabilistic risk analyses are required for nuclear power plants.

  3. Mass separated neutral particle energy analyser

    International Nuclear Information System (INIS)

    Takeuchi, Hiroshi; Matsuda, Toshiaki; Miura, Yukitoshi; Shiho, Makoto; Maeda, Hikosuke; Hashimoto, Kiyoshi; Hayashi, Kazuo.

    1983-09-01

    A mass separated neutral particle energy analyser which could simultaneously measure hydrogen and deuterium atoms emitted from tokamak plasma was constructed. The analyser was calibrated for the energy and mass separation in the energy range from 0.4 keV to 9 keV. In order to investigate the behavior of deuteron and proton in the JFT-2 tokamak plasma heated with ion cyclotron wave and neutral beam injection, this analyser was installed in JFT-2 tokamak. It was found that the energy spectrum could be determined with sufficient accuracy. The obtained ion temperature and ratio of deuteron and proton density from the energy spectrum were in good agreement with the value deduced from Doppler broadening of TiXIV line and the line intensities of H sub(α) and D sub(α) respectively. (author)

  4. Advanced toroidal facility vacuum vessel stress analyses

    International Nuclear Information System (INIS)

    Hammonds, C.J.; Mayhall, J.A.

    1987-01-01

    The complex geometry of the Advanced Toroidal Facility (ATF) vacuum vessel required special analysis techniques in investigating the structural behavior of the design. The response of a large-scale finite element model was found for transportation and operational loading. Several computer codes and systems, including the National Magnetic Fusion Energy Computer Center Cray machines, were implemented in accomplishing these analyses. The work combined complex methods that taxed the limits of both the codes and the computer systems involved. Using MSC/NASTRAN cyclic-symmetry solutions permitted using only 1/12 of the vessel geometry to mathematically analyze the entire vessel. This allowed the greater detail and accuracy demanded by the complex geometry of the vessel. Critical buckling-pressure analyses were performed with the same model. The development, results, and problems encountered in performing these analyses are described. 5 refs., 3 figs

  5. Thermal and stress analyses with ANSYS program

    International Nuclear Information System (INIS)

    Kanoo, Iwao; Kawaguchi, Osamu; Asakura, Junichi.

    1975-03-01

    Some analyses of the heat conduction and elastic/inelastic stresses, carried out in Power Reactor and Nuclear Fuel Development Corporation (PNC) in fiscal 1973 using ANSYS (Engineering Analysis System) program, are summarized. In chapter I, the present state of structural analysis programs available for a FBR (fast breeder reactor) in PNC is explained. Chapter II is a brief description of the ANSYS current status. In chapter III are presented 8 examples of the steady-state and transient thermal analyses for fast-reactor plant components, and in chapter IV 5 examples of the inelastic structural analysis. With the advance in the field of finite element method, its applications in design study should extend progressively in the future. The present report, it is hoped, will contribute as references in similar analyses and at the same time help to understand the deformation and strain behaviors of structures. (Mori, K.)

  6. Periodic safety analyses; Les essais periodiques

    Energy Technology Data Exchange (ETDEWEB)

    Gouffon, A; Zermizoglou, R

    1990-12-01

    The IAEA Safety Guide 50-SG-S8, devoted to 'Safety Aspects of Foundations of Nuclear Power Plants', indicates that the operator of an NPP should establish a program for the inspection of safe operation during construction, start-up and the service life of the plant, to obtain the data needed for estimating the lifetime of structures and components; at the same time the program should ensure that the safety margins remain appropriate. Periodic safety analyses are an important part of this safety inspection program. A periodic safety report is a method for testing the whole safety system, or a part of it, against precise criteria. Periodic safety analyses are not meant for the qualification of plant components; separate analyses are devoted to start-up, qualification of components and materials, and aging. All these analyses are described in this presentation. The last chapter describes the experience obtained for the PWR-900 and PWR-1300 units from 1986-1989.

  7. A Simple, Reliable Precision Time Analyser

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, B. V.; Nargundkar, V. R.; Subbarao, K.; Kamath, M. S.; Eligar, S. K. [Atomic Energy Establishment Trombay, Bombay (India)

    1966-06-15

    A 30-channel time analyser is described. The time analyser was designed and built for pulsed neutron research but can be applied to other uses. Most of the logic is performed by means of ferrite memory core and transistor switching circuits. This leads to great versatility, low power consumption, extreme reliability and low cost. The analyser described provides channel widths from 10 {mu}s to 10 ms; arbitrarily wider channels are easily obtainable. It can handle counting rates up to 2000 counts/min in each channel with less than 1% dead-time loss. There is a provision for an initial delay equal to 100 channel widths. An input pulse de-randomizer unit using tunnel diodes ensures exactly equal channel widths. A brief description of the principles involved in core switching circuitry is given. The core-transistor transfer loop is compared with the usual core-diode loops and is shown to be more versatile and better adapted to the making of a time analyser. The circuits derived from the basic loop are described. These include the scale of ten, the frequency dividers and the delay generator. The current drivers developed for driving the cores are described. The crystal-controlled clock which controls the width of the time channels and synchronizes the operation of the various circuits is described. The detector pulse de-randomizer unit using tunnel diodes is described. The scheme of the time analyser is then described, showing how the various circuits can be integrated to form a versatile time analyser. (author)
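
    The channel-sorting behaviour of such a multichannel time analyser can be mimicked in software: each event's arrival time, after an initial delay, is assigned to a fixed-width channel. A minimal sketch assuming the 10 {mu}s channel width and 100-channel-width initial delay quoted above (the event times are invented for illustration):

    ```python
    # Sort event arrival times into fixed-width time channels, with an
    # initial delay expressed in channel widths, as in the analyser above.

    def bin_events(arrival_times_us, channel_width_us=10.0, n_channels=30,
                   initial_delay_channels=100):
        """Return per-channel counts; events outside the window are ignored."""
        delay = initial_delay_channels * channel_width_us
        counts = [0] * n_channels
        for t in arrival_times_us:
            ch = int((t - delay) // channel_width_us)
            if 0 <= ch < n_channels:
                counts[ch] += 1
        return counts

    events = [1005.0, 1012.0, 1015.0, 1299.9, 1300.0]  # microseconds
    counts = bin_events(events)
    print(counts[0], counts[1], counts[29], sum(counts))
    ```

    With a 1000 {mu}s delay, the first event falls in channel 0, the next two in channel 1, the fourth in the last channel, and the fifth lies just past the 30-channel window and is dropped.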

  8. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1987-02-01

    A set of measurement control data analyses was selected for use by analysts responsible for maintaining the measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision and statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems, possibly before statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs
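
    In its simplest Shewhart form, a bias control chart of the kind described reduces to checking each measurement against fixed control limits around a target value. A minimal sketch with hypothetical limits and data (not the authors' actual procedure):

    ```python
    # Shewhart-style control chart check: flag bias measurements that fall
    # outside the target +/- k*sigma control limits.

    def out_of_control(measurements, target, sigma, k=3.0):
        """Return indices of points outside target +/- k*sigma."""
        lo, hi = target - k * sigma, target + k * sigma
        return [i for i, x in enumerate(measurements) if not (lo <= x <= hi)]

    biases = [0.1, -0.2, 0.05, 1.4, -0.1]   # hypothetical bias data
    print(out_of_control(biases, target=0.0, sigma=0.3))   # → [3]
    ```

    The statistical tests mentioned in the abstract would supplement such a chart, e.g. by testing for trends or non-normality that single-point limits miss.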

  9. A theoretical framework for analysing preschool teaching

    DEFF Research Database (Denmark)

    Chaiklin, Seth

    2014-01-01

    This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across...... national traditions. The framework has two main components, an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated...

  10. Power System Oscillatory Behaviors: Sources, Characteristics, & Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dosiek, Luke A. [Union College, Schenectady, NY (United States); Pierre, John W. [Univ. of Wyoming, Laramie, WY (United States)

    2017-05-17

    This document is intended to provide a broad overview of the sources, characteristics, and analyses of natural and forced oscillatory behaviors in power systems. These aspects are necessarily linked. Oscillations appear in measurements with distinguishing characteristics derived from the oscillation’s source. These characteristics determine which analysis methods can be appropriately applied, and the results from these analyses can only be interpreted correctly with an understanding of the oscillation’s origin. To describe oscillations both at their source within a physical power system and within measurements, a perspective from the boundary between power system and signal processing theory has been adopted.

  11. 10 CFR 61.13 - Technical analyses.

    Science.gov (United States)

    2010-01-01

    ... air, soil, groundwater, surface water, plant uptake, and exhumation by burrowing animals. The analyses... processes such as erosion, mass wasting, slope failure, settlement of wastes and backfill, infiltration through covers over disposal areas and adjacent soils, and surface drainage of the disposal site. The...

  12. Analysing Simple Electric Motors in the Classroom

    Science.gov (United States)

    Yap, Jeff; MacIsaac, Dan

    2006-01-01

    Electromagnetic phenomena and devices such as motors are typically unfamiliar to both teachers and students. To better visualize and illustrate the abstract concepts (such as magnetic fields) underlying electricity and magnetism, we suggest that students construct and analyse the operation of a simply constructed Johnson electric motor. In this…

  13. En kvantitativ metode til analyse af radio

    Directory of Open Access Journals (Sweden)

    Christine Lejre

    2014-06-01

    In the Danish as well as the international radio literature, proposed methods for analysing the radio medium are sparse. This is presumably because radio is difficult to analyse, since it is a medium that is not visualised in the form of images or supported by printed text. This article aims to describe a new quantitative method for the analysis of radio that takes particular account of the modality of the radio medium: sound structured as a linear progression in time. The method thereby supports the radio medium both as a medium in time and as a blind medium. The method was developed in connection with a comparative analysis of cultural programmes on P1 and Radio24syv carried out for Danmarks Radio. The article argues that the method is well suited to the analysis not only of radio but also of other media platforms and various journalistic subject areas.

  14. Analysing User Lifetime in Voluntary Online Collaboration

    DEFF Research Database (Denmark)

    McHugh, Ronan; Larsen, Birger

    2010-01-01

    This paper analyses persuasion in online collaboration projects. It introduces a set of heuristics that can be applied to such projects and combines these with a quantitative analysis of user activity over time. Two example sites are studied, Open Street Map and The Pirate Bay. Results show that ...

  15. Analyses of hydraulic performance of velocity caps

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Degn Eskesen, Mark Chr.; Buhrkall, Jeppe

    2014-01-01

    The hydraulic performance of a velocity cap has been investigated. Velocity caps are often used in connection with offshore intakes. CFD (computational fluid dynamics) examined the flow through the cap openings and further down into the intake pipes. This was combined with dimension analyses...

  16. Quantitative analyses of shrinkage characteristics of neem ...

    African Journals Online (AJOL)

    Quantitative analyses of shrinkage characteristics of neem (Azadirachta indica A. Juss.) wood were carried out. Forty five wood specimens were prepared from the three ecological zones of north eastern Nigeria, viz: sahel savanna, sudan savanna and guinea savanna for the research. The results indicated that the wood ...

  17. UMTS signal measurements with digital spectrum analysers

    International Nuclear Information System (INIS)

    Licitra, G.; Palazzuoli, D.; Ricci, A. S.; Silvi, A. M.

    2004-01-01

    The launch of the Universal Mobile Telecommunications System (UMTS), the most recent mobile telecommunications standard, has imposed the requirement of updating measurement instrumentation and methodologies. In order to define the most reliable measurement procedure, aimed at assessing exposure to electromagnetic fields, modern spectrum analysers' features for correct signal characterisation have been reviewed. (authors)

  18. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, Rene Rydhof

    2010-01-01

    In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can...

  19. Micromechanical photothermal analyser of microfluidic samples

    DEFF Research Database (Denmark)

    2014-01-01

    The present invention relates to a micromechanical photothermal analyser of microfluidic samples comprising an oblong micro-channel extending longitudinally from a support element, the micro-channel is made from at least two materials with different thermal expansion coefficients, wherein...

  20. Systematic review and meta-analyses

    DEFF Research Database (Denmark)

    Dreier, Julie Werenberg; Andersen, Anne-Marie Nybo; Berg-Beckhoff, Gabriele

    2014-01-01

    1990 were excluded. RESULTS: The available literature supported an increased risk of adverse offspring health in association with fever during pregnancy. The strongest evidence was available for neural tube defects, congenital heart defects, and oral clefts, in which meta-analyses suggested between a 1...

  1. Secundaire analyses organisatiebeleid psychosociale arbeidsbelasting (PSA)

    NARCIS (Netherlands)

    Kraan, K.O.; Houtman, I.L.D.

    2016-01-01

    The central questions of this study are what organisational policy on psychosocial workload (PSA) looks like as of 2014 and how it relates to other policies and outcome measures. The results of these in-depth analyses can benefit the ongoing campaign 'Check je

  2. Exergoeconomic and environmental analyses of CO2

    NARCIS (Netherlands)

    Mosaffa, A. H.; Garousi Farshi, L; Infante Ferreira, C.A.; Rosen, M. A.

    2016-01-01

    Exergoeconomic and environmental analyses are presented for two CO2/NH3 cascade refrigeration systems equipped with (1) two flash tanks and (2) a flash tank along with a flash intercooler with indirect subcooler. A comparative study is performed for the proposed systems, and

  3. Meta-analyses on viral hepatitis

    DEFF Research Database (Denmark)

    Gluud, Lise L; Gluud, Christian

    2009-01-01

    This article summarizes the meta-analyses of interventions for viral hepatitis A, B, and C. Some of the interventions assessed are described in small trials with unclear bias control. Other interventions are supported by large, high-quality trials. Although attempts have been made to adjust...

  4. Multivariate differential analyses of adolescents' experiences of ...

    African Journals Online (AJOL)

    Aggression is reasoned to be dependent on aspects such as self-concept, moral reasoning, communication, frustration tolerance and family relationships. To analyse the data from questionnaires of 101 families (95 adolescents, 95 mothers and 91 fathers) Cronbach Alpha, various consecutive first and second order factor ...

  5. Chromosomal evolution and phylogenetic analyses in Tayassu ...

    Indian Academy of Sciences (India)

    Chromosome preparation and karyotype description. The material analysed consists of chromosome preparations of the tayassuid species T. pecari (three individuals) and. P. tajacu (four individuals) and were made from short-term lymphocyte cultures of whole blood samples using standard protocols (Chaves et al. 2002).

  6. Grey literature in meta-analyses.

    Science.gov (United States)

    Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J

    2003-01-01

    In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. To examine evidence regarding whether grey literature should be included in meta-analyses and strategies to manage grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic as a rationale for inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.
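
    The reported inflation of pooled effect sizes when grey literature is excluded can be illustrated with a toy fixed-effect (inverse-variance weighted) calculation; the study values below are invented for illustration and are not data from the article:

    ```python
    # Fixed-effect (inverse-variance weighted) pooled effect size, computed
    # with and without hypothetical unpublished ("grey") studies.

    def pooled_effect(effects, variances):
        """Inverse-variance weighted mean of per-study effect sizes."""
        weights = [1.0 / v for v in variances]
        return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

    published = [(0.45, 0.04), (0.50, 0.05), (0.40, 0.04)]  # (effect, variance)
    grey = [(0.10, 0.06), (0.15, 0.05)]                     # smaller effects

    pub_only = pooled_effect(*zip(*published))
    all_studies = pooled_effect(*zip(*(published + grey)))
    print(round(pub_only, 3), round(all_studies, 3))   # → 0.446 0.337
    ```

    Because the grey studies carry smaller effects, excluding them inflates the pooled estimate, which is the pattern the review describes.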

  7. Thermal analyses. Information on the expected baking process; Thermische analyses. Informatie over een te verwachten bakgedrag

    Energy Technology Data Exchange (ETDEWEB)

    Van Wijck, H. [Stichting Technisch Centrum voor de Keramische Industrie TCKI, Velp (Netherlands)

    2009-09-01

    The design process and the drying process for architectural ceramics and pottery partly determine the characteristics of the final product, but the largest changes occur during the baking process. An overview is provided of the different thermal analyses and how the information from these analyses can predict the process in practice. (mk) [Dutch] Het vormgevingsproces en het droogproces voor bouwkeramische producten en aardewerk bepalen voor een deel de eigenschappen van de eindproducten, maar de grootste veranderingen treden op bij het bakproces. Een overzicht wordt gegeven van de verschillende thermische analyses en hoe de informatie uit deze analyses het in de praktijk te verwachten gedrag kan voorspellen.

  8. Analyses and characterization of double shell tank

    Energy Technology Data Exchange (ETDEWEB)

    1994-10-04

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are within not only safe operating limits, but should also be relevant to functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams.

  9. DCH analyses using the CONTAIN code

    International Nuclear Information System (INIS)

    Hong, Sung Wan; Kim, Hee Dong

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project of 'DCH issue resolution for ice condenser plants', which is sponsored by NRC at SNL. Even though the calculations were performed for the ice condenser plant, the CONTAIN code has been used for analyses of many phenomena in the PWR containment, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort is intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. Availability of ignition sources prior to vessel breach 2. Availability and effectiveness of ice in the ice condenser 3. Loads modeling uncertainties related to co-ejected RPV water 4. Other loads modeling uncertainties 10 tabs., 3 figs., 14 refs. (Author)

  10. DCH analyses using the CONTAIN code

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Sung Wan; Kim, Hee Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project of 'DCH issue resolution for ice condenser plants', which is sponsored by NRC at SNL. Even though the calculations were performed for the ice condenser plant, the CONTAIN code has been used for analyses of many phenomena in the PWR containment, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort is intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. Availability of ignition sources prior to vessel breach 2. Availability and effectiveness of ice in the ice condenser 3. Loads modeling uncertainties related to co-ejected RPV water 4. Other loads modeling uncertainties 10 tabs., 3 figs., 14 refs. (Author).

  11. Analyses and characterization of double shell tank

    International Nuclear Information System (INIS)

    1994-01-01

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are within not only safe operating limits, but should also be relevant to functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams

  12. Soil analyses by ICP-MS (Review)

    International Nuclear Information System (INIS)

    Yamasaki, Shin-ichi

    2000-01-01

    Soil analyses by inductively coupled plasma mass spectrometry (ICP-MS) are reviewed. The first half of the paper is devoted to the development of techniques applicable to soil analyses, where diverse analytical parameters are carefully evaluated. However, the choice of soil samples is somewhat arbitrary, and only a limited number of samples (mostly reference materials) are examined. In the second half, efforts are mostly concentrated on the introduction of reports, where a large number of samples and/or very precious samples have been analyzed. Although the analytical techniques used in these reports are not necessarily novel, valuable information concerning such topics as background levels of elements in soils, chemical forms of elements in soils and behavior of elements in soil ecosystems and the environment can be obtained. The major topics discussed are total elemental analysis, analysis of radionuclides with long half-lives, speciation, leaching techniques, and isotope ratio measurements. (author)

  13. Sorption analyses in materials science: selected oxides

    International Nuclear Information System (INIS)

    Fuller, E.L. Jr.; Condon, J.B.; Eager, M.H.; Jones, L.L.

    1981-01-01

    Physical adsorption studies have been shown to be extremely valuable in studying the chemistry and structure of dispersed materials. Many processes rely on the access to the large amount of surface made available by the high degree of dispersion. Conversely, there are many applications where consolidation of the dispersed solids is required. Several systems (silica gel, alumina catalysts, mineralogic alumino-silicates, and yttrium oxide plasters) have been studied to show the type and amount of chemical and structural information that can be obtained. Some review of current theories is given and additional concepts are developed based on statistical and thermodynamic arguments. The results are applied to sorption data to show that detailed sorption analyses are extremely useful and can provide valuable information that is difficult to obtain by any other means. Considerable emphasis has been placed on data analyses and interpretation of a nonclassical nature to show the potential of such studies that is often not recognized nor utilized

  14. Standardized analyses of nuclear shipping containers

    International Nuclear Information System (INIS)

    Parks, C.V.; Hermann, O.W.; Petrie, L.M.; Hoffman, T.J.; Tang, J.S.; Landers, N.F.; Turner, W.D.

    1983-01-01

    This paper describes improved capabilities for analyses of nuclear fuel shipping containers within SCALE -- a modular code system for Standardized Computer Analyses for Licensing Evaluation. Criticality analysis improvements include the new KENO V.a code, which contains an enhanced geometry package, and a new control module that uses KENO V.a to perform a criticality search on optimum pitch (maximum k-effective). The SAS2 sequence is a new shielding analysis module which couples fuel burnup, source term generation, and radial cask shielding. The SAS5 shielding sequence allows a multidimensional Monte Carlo analysis of a shipping cask with code-generated biasing of the particle histories. The thermal analysis sequence (HTAS1) provides an easy-to-use tool for evaluating a shipping cask's response to accident conditions. Together, these capabilities give the cask designer or evaluator a computational system with automated procedures and easy-to-understand input that promotes standardization.

  15. Quantitative Analyse und Visualisierung der Herzfunktionen

    Science.gov (United States)

    Sauer, Anne; Schwarz, Tobias; Engel, Nicole; Seitel, Mathias; Kenngott, Hannes; Mohrhardt, Carsten; Loßnitzer, Dirk; Giannitsis, Evangelos; Katus, Hugo A.; Meinzer, Hans-Peter

    Computer-assisted, image-based analysis of cardiac function is now standard in cardiology. Available products usually require a high degree of user interaction and therefore considerable time. This work presents an approach that provides the cardiologist with a largely automatic analysis of cardiac function from MRI image data, thereby saving time. All relevant cardiac physiological parameters are computed and visualized by means of diagrams and graphs. These computations are evaluated by comparing the derived values with manually measured ones. The mean error thus computed, 2.85 mm for wall thickness and 1.61 mm for wall thickening, is still within one pixel of the images used.

  16. Exergetic and thermoeconomic analyses of power plants

    International Nuclear Information System (INIS)

    Kwak, H.-Y.; Kim, D.-J.; Jeon, J.-S.

    2003-01-01

    Exergetic and thermoeconomic analyses were performed for a 500-MW combined cycle plant. In these analyses, mass and energy conservation laws were applied to each component of the system. Quantitative balances of the exergy and exergetic cost for each component, and for the whole system, were carefully considered. The exergoeconomic model, which represents the productive structure of the system considered, was used to visualize the cost formation process and the productive interaction between components. The computer program developed in this study can determine the production costs of power plants, such as gas- and steam-turbine plants and gas-turbine cogeneration plants. The program can also be used to study plant characteristics, namely thermodynamic performance and sensitivity to changes in process and/or component design variables.
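
The component-wise exergy balance described above can be sketched in a few lines. The component names and exergy values below are purely illustrative, not data from the 500-MW plant of the paper:

```python
# Hypothetical exergy streams (MW): exergy of fuel (input) and product (output)
components = {
    "compressor":    {"E_fuel": 180.0, "E_product": 165.0},
    "combustor":     {"E_fuel": 520.0, "E_product": 410.0},
    "gas_turbine":   {"E_fuel": 430.0, "E_product": 400.0},
    "HRSG":          {"E_fuel": 150.0, "E_product": 110.0},
    "steam_turbine": {"E_fuel": 105.0, "E_product": 95.0},
}

def exergy_report(components):
    """Per component: exergy destruction (fuel minus product) and
    exergetic efficiency (product over fuel)."""
    report = {}
    for name, d in components.items():
        destruction = d["E_fuel"] - d["E_product"]
        efficiency = d["E_product"] / d["E_fuel"]
        report[name] = (destruction, efficiency)
    return report

report = exergy_report(components)
total_destruction = sum(d for d, _ in report.values())
```

Ranking components by `destruction` is the usual first step of a thermoeconomic study, since it points to where design changes pay off most.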

  17. Pratique de l'analyse fonctionelle

    CERN Document Server

    Tassinari, Robert

    1997-01-01

    Developing a product or service that is perfectly suited to the customer's needs and requirements is essential for a company. To leave nothing to chance, a rigorous methodology must be followed: that of functional analysis. This book precisely defines the method and its fields of application. It describes the most effective approaches to product design and the pursuit of quality, and introduces the notion of internal functional analysis. A key work for optimizing product design processes within one's company. -- Key ideas, by Business Digest

  18. Kinetic stability analyses in a bumpy cylinder

    International Nuclear Information System (INIS)

    Dominguez, R.R.; Berk, H.L.

    1981-01-01

    Recent interest in the ELMO Bumpy Torus (EBT) has prompted a number of stability analyses of both the hot electron rings and the toroidal plasma. Typically these works employ the local approximation, neglecting radial eigenmode structure and ballooning effects to perform the stability analysis. In the present work we develop a fully kinetic formalism for performing nonlocal stability analyses in a bumpy cylinder. We show that the Vlasov-Maxwell integral equations (with one ignorable coordinate) are self-adjoint and hence amenable to analysis using numerical techniques developed for self-adjoint systems of equations. The representation we obtain for the kernel of the Vlasov-Maxwell equations is a differential operator of arbitrarily high order. This form leads to a manifestly self-adjoint system of differential equations for long wavelength modes

  19. Sectorial Group for Incident Analyses (GSAI)

    International Nuclear Information System (INIS)

    Galles, Q.; Gamo, J. M.; Jorda, M.; Sanchez-Garrido, P.; Lopez, F.; Asensio, L.; Reig, J.

    2013-01-01

    In 2008, the UNESA Nuclear Energy Committee (CEN) proposed the creation of a working group formed by experts from all Spanish NPPs for the purpose of jointly analyzing relevant incidents that occurred in each of the plants. This initiative was a response to a historical situation in which the exchange of information on incidents between the Spanish NPPs was below the desired level. In June 2009, UNESA's Guide CEN-29 established the performance criteria for the so-called Sectorial Group for Incident Analyses (GSAI), whose activity would be coordinated by UNESA's Group of Operating Experience, under the Operations Commission (COP). (Author)

  20. Analyses of cavitation instabilities in ductile metals

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2007-01-01

    Cavitation instabilities have been predicted for a single void in a ductile metal stressed under high triaxiality conditions. In experiments for a ceramic reinforced by metal particles a single dominant void has been observed on the fracture surface of some of the metal particles bridging a crack......, and also tests for a thin ductile metal layer bonding two ceramic blocks have indicated rapid void growth. Analyses for these material configurations are discussed here. When the void radius is very small, a nonlocal plasticity model is needed to account for observed size-effects, and recent analyses......, while the surrounding voids are represented by a porous ductile material model in terms of a field quantity that specifies the variation of the void volume fraction in the surrounding metal....

  1. Analysing organic transistors based on interface approximation

    International Nuclear Information System (INIS)

    Akiyama, Yuto; Mori, Takehiko

    2014-01-01

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region

  2. New environmental metabarcodes for analysing soil DNA

    DEFF Research Database (Denmark)

    Epp, Laura S.; Boessenkool, Sanne; Bellemain, Eva P.

    2012-01-01

    Metabarcoding approaches use total and typically degraded DNA from environmental samples to analyse biotic assemblages and can potentially be carried out for any kinds of organisms in an ecosystem. These analyses rely on specific markers, here called metabarcodes, which should be optimized for taxonomic resolution, minimal bias in amplification of the target organism group and short sequence length. Using bioinformatic tools, we developed metabarcodes for several groups of organisms: fungi, bryophytes, enchytraeids, beetles and birds. The ability of these metabarcodes to amplify the target groups was systematically evaluated by (i) in silico PCRs using all standard sequences in the EMBL public database as templates, (ii) in vitro PCRs of DNA extracts from surface soil samples from a site in Varanger, northern Norway and (iii) in vitro PCRs of DNA extracts from permanently frozen sediment samples of late…

  3. Visuelle Analyse von E-mail-Verkehr

    OpenAIRE

    Mansmann, Florian

    2003-01-01

    This work describes methods for the visual geographic analysis of e-mail traffic. Host addresses and IP addresses can be filtered out of an e-mail's header. Using a database, geographic coordinates are assigned to these host and IP addresses. A visualization displays several thousand e-mail routes in a clear manner. In addition, interactive manipulation facilities are presented which allow a visual exploration of the data…

  4. BWR core melt progression phenomena: Experimental analyses

    International Nuclear Information System (INIS)

    Ott, L.J.

    1992-01-01

    In the BWR Core Melt Progression Phenomena Program, experimental results concerning severe fuel damage and core melt progression in BWR core geometry are used to evaluate existing models of the governing phenomena. These include control blade eutectic liquefaction and the subsequent relocation and attack on the channel box structure; oxidation heating and hydrogen generation; Zircaloy melting and relocation; and the continuing oxidation of zirconium with metallic blockage formation. Integral data have been obtained from the BWR DF-4 experiment in the ACRR and from BWR tests in the German CORA ex-reactor fuel-damage test facility. Additional integral data will be obtained from new CORA BWR tests, the full-length FLHT-6 BWR test in the NRU test reactor, and the new program of ex-reactor experiments at Sandia National Laboratories (SNL) on metallic melt relocation and blockage formation. An essential part of this activity is interpretation and use of the results of the BWR tests. The Oak Ridge National Laboratory (ORNL) has developed experiment-specific models for analysis of the BWR experiments; to date, these models have permitted far more precise analyses of the conditions in these experiments than were previously available. These analyses have provided a basis for more accurate interpretation of the phenomena that the experiments are intended to investigate. The results of posttest analyses of BWR experiments are discussed and significant findings from these analyses are explained. The ORNL control blade/canister models, with materials interaction, relocation and blockage models, are currently being implemented in SCDAP/RELAP5 as an optional structural component

  5. En Billig GPS Data Analyse Platform

    DEFF Research Database (Denmark)

    Andersen, Ove; Christiansen, Nick; Larsen, Niels T.

    2011-01-01

    This article presents a complete software platform for the analysis of GPS data. The platform is built exclusively from open-source components. The individual components of the platform are described in detail. Advantages and disadvantages of using open source are discussed, including which IT-policy measures… organisations with a digital road map and GPS data can begin performing traffic analyses on these data. It is a requirement that adequate IT competences are present in the organisation.

  6. Neuronal network analyses: premises, promises and uncertainties

    OpenAIRE

    Parker, David

    2010-01-01

    Neuronal networks assemble the cellular components needed for sensory, motor and cognitive functions. Any rational intervention in the nervous system will thus require an understanding of network function. Obtaining this understanding is widely considered to be one of the major tasks facing neuroscience today. Network analyses have been performed for some years in relatively simple systems. In addition to the direct insights these systems have provided, they also illustrate some of the difficulties…

  7. Modelling and analysing oriented fibrous structures

    International Nuclear Information System (INIS)

    Rantala, M; Lassas, M; Siltanen, S; Sampo, J; Takalo, J; Timonen, J

    2014-01-01

    A mathematical model for fibrous structures using a direction-dependent scaling law is presented. The orientation of fibrous nets (e.g. paper) is analysed with a method based on the curvelet transform. The curvelet-based orientation analysis has been tested successfully on real data from paper samples: the major directions of fibre orientation can apparently be recovered. Similar results are achieved in tests on data simulated by the new model, allowing a comparison with ground truth

  8. Kinematic gait analyses in healthy Golden Retrievers

    OpenAIRE

    Silva, Gabriela C.A.; Cardoso, Mariana Trés; Gaiad, Thais P.; Brolio, Marina P.; Oliveira, Vanessa C.; Assis Neto, Antonio; Martins, Daniele S.; Ambrósio, Carlos E.

    2014-01-01

    Kinematic analysis describes the relative movement between rigid bodies and finds application in gait analysis and other body movements; interpretation of these data, when changes are present, determines the choice of treatment to be instituted. The objective of this study was to standardize the gait of healthy Golden Retriever dogs to assist in the diagnosis and treatment of musculoskeletal disorders. We used a kinematic analysis system to analyse the gait of seven female Golden Retriever dogs,…

  9. Evaluation of periodic safety status analyses

    International Nuclear Information System (INIS)

    Faber, C.; Staub, G.

    1997-01-01

    In order to carry out the evaluation of safety status analyses by the safety assessor within the periodic safety reviews of nuclear power plants, safety-goal-oriented requirements have been formulated together with complementary evaluation criteria. Their application in an interdisciplinary cooperation covering the subject areas involved facilitates a complete safety-goal-oriented assessment of the plant status. The procedure is outlined briefly by an example for the safety goal 'reactivity control' for BWRs. (orig.)

  10. Application of RUNTA code in flood analyses

    International Nuclear Information System (INIS)

    Perez Martin, F.; Benitez Fonzalez, F.

    1994-01-01

    Flood probability analyses carried out to date indicate the need to evaluate a large number of flood scenarios. This necessity is due to a variety of reasons, the most important of which include: - Large number of potential flood sources - Wide variety of characteristics of flood sources - Large possibility of flood-affected areas becoming interlinked, depending on the location of the potential flood sources - Diversity of flood flows from one flood source, depending on the size of the rupture and mode of operation - Isolation times applicable - Uncertainties regarding the structural resistance of doors, penetration seals and floors - Applicable degrees of obstruction of the floor drainage system. Consequently, a tool which carries out the large number of calculations usually required in flood analyses, with speed and flexibility, is considered necessary. The RUNTA code enables the range of possible scenarios to be calculated numerically, in accordance with all those parameters which, as a result of previous flood analyses, it is necessary to take into account in order to cover all the possible floods associated with each flood area

  11. An analyser for power plant operations

    International Nuclear Information System (INIS)

    Rogers, A.E.; Wulff, W.

    1990-01-01

    Safe and reliable operation of power plants is essential. Power plant operators need a forecast of what the plant will do when its current state is disturbed. The in-line plant analyser provides precisely this information at relatively low cost. The plant analyser scheme uses a mathematical model of the dynamic behaviour of the plant to establish a numerical simulation. Over a period of time, the simulation is calibrated with measurements from the particular plant in which it is used. The analyser then provides a reference against which to evaluate the plant's current behaviour. It can be used to alert the operator to any atypical excursions or combinations of readings that indicate malfunction or off-normal conditions that, as the Three Mile Island event suggests, are not easily recognised by operators. In a look-ahead mode, it can forecast the behaviour resulting from an intended change in settings or operating conditions. Then, when such changes are made, the plant's behaviour can be tracked against the forecast in order to assure that the plant is behaving as expected. It can be used to investigate malfunctions that have occurred and test possible adjustments in operating procedures. Finally, it can be used to consider how far from the limits of performance the elements of the plant are operating. Then by adjusting settings, the required power can be generated with as little stress as possible on the equipment. (6 figures) (Author)
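
The look-ahead idea described above, running a calibrated simulation forward and flagging readings that drift from the forecast, can be sketched with a toy first-order plant model. All names and parameters below are hypothetical illustrations, not the plant analyser's actual model:

```python
def simulate(x0, u, k, dt, steps):
    """Forecast a first-order lag plant, dx/dt = k*(u - x), by Euler integration.
    x0: initial state, u: setpoint, k: rate constant, dt: time step."""
    x, traj = x0, []
    for _ in range(steps):
        x += dt * k * (u - x)
        traj.append(x)
    return traj

def track(forecast, measured, tol):
    """Return the time steps at which the plant deviates from the forecast
    by more than tol -- the 'atypical excursion' alarm of the analyser idea."""
    return [i for i, (f, m) in enumerate(zip(forecast, measured)) if abs(f - m) > tol]

forecast = simulate(x0=0.0, u=1.0, k=0.5, dt=0.1, steps=50)
# Synthetic measurements: the plant follows the forecast, then a fault
# introduces a constant offset from step 30 onward.
measured = [f + (0.2 if i >= 30 else 0.0) for i, f in enumerate(forecast)]
alarms = track(forecast, measured, tol=0.1)
```

In the scheme the abstract describes, the model parameters (here `k`) would be recalibrated over time against the particular plant's measurements before each look-ahead run.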

  12. High-energy elastic recoil detection with heavy ions for light element analysis

    International Nuclear Information System (INIS)

    Goppelt-Langer, P.; Yamamoto, S.; Takeshita, H.; Aoki, Y.; Naramoto, H.

    1994-01-01

    The detection of light and medium-heavy elements in inhomogeneous solids is a severe problem in ion beam analysis. Heavy elements can be detected by the well-established Rutherford backscattering technique (RBS). In a homogeneous host material most impurities can easily be analyzed by secondary ion mass spectroscopy (SIMS). Some isotopes (3He, 6Li, 10B) can be measured by nuclear reaction analysis (NRA) using thermal neutrons to induce (n,p) or (n,α) reactions. Others can be detected with energetic ion beams via nuclear reactions (e.g. 15N(1H,αγ)12C for the analysis of hydrogen). A high content of H, D or T can also be determined by elastic recoil detection using an energetic He beam. The latter technique has been developed into a universal method for the detection of light and heavy elements in any target, using a high-energy heavy ion beam and a detector system which is able to identify the recoils and delivers the energy and position of the particles. (author)
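
The recoil energies underlying elastic recoil detection follow from two-body kinematics. A minimal sketch, assuming the standard lab-frame recoil formula and an illustrative 2 MeV 4He beam (the beam energy and detection angle are examples, not values from the paper):

```python
import math

def recoil_energy(e0, m_beam, m_recoil, phi_deg):
    """Lab-frame energy of a target atom elastically recoiled at angle phi:
    E_r = E0 * 4*M1*M2*cos^2(phi) / (M1 + M2)^2  (masses in any common unit)."""
    phi = math.radians(phi_deg)
    return e0 * 4.0 * m_beam * m_recoil * math.cos(phi) ** 2 / (m_beam + m_recoil) ** 2

# 2 MeV 4He beam, recoils detected at 30 degrees from the beam axis
e_h = recoil_energy(2.0, 4.0, 1.0, 30.0)  # hydrogen recoil energy (MeV)
e_d = recoil_energy(2.0, 4.0, 2.0, 30.0)  # deuterium recoil energy (MeV)
```

The distinct kinematic factors for H and D are what let a detector that measures recoil energy (and, in the heavy-ion variant, particle identity and position) separate the light isotopes in the target.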

  13. Neutron cross sections measurements for light elements at ORELA and their application in nuclear criticality

    International Nuclear Information System (INIS)

    Guber, Klaus H.; Leal, Luiz C.; Sayer, Royce O.; Spencer, Robert R.; Koehler, Paul E.; Valentine, Timothy E.; Derrien, Herve; Harvey, John A.

    2002-01-01

    The Oak Ridge Electron Linear Accelerator (ORELA) was used to measure neutron total and capture cross sections of aluminium, natural chlorine and silicon in the energy range from 100 eV to ∼600 keV. ORELA, the only high-power white neutron source with excellent time resolution still operating in the USA, is ideally suited for these experiments. These measurements were carried out to support the Nuclear Criticality Predictability Program. Concerns about the use of existing cross section data in nuclear criticality calculations using Monte Carlo codes and benchmarks have been a prime motivator for the new cross section measurements. More accurate nuclear data are not only needed for these calculations but also serve as input parameters for s-process stellar models. (author)

  14. Separation and sampling technique of light element isotopes by chemical exchange process

    International Nuclear Information System (INIS)

    Kato, Shunsaku; Oi, Kenta; Takagi, Norio; Hirotsu, Takafumi; Kano, Hirofumi; Sonoda, Akinari; Makita, Yoji

    2000-01-01

    Lithium and boron isotope separation techniques were studied. Granulation of the lithium isotope separation agent was carried out by cure covering in solution. Separation of lithium isotopes was improved by using ammonium carbonate as the elution agent. Three kinds of agents derived from styrene and ester resins, 2-amino-1,3-propanediol (1,3-PD), 2-amino-2-methyl-1,3-propanediol (Me-1,3-PD) and tris(2-hydroxyethyl)amine (Tris), were used as adsorbents. The ester resin with Tris showed a larger adsorption capacity (1.4 mmol/g) than the other resins; moreover, all resins with these agents showed boron adsorption capacities larger than the objective value (0.5 mmol/g). A large isotope shift was shown by the unsymmetrical vibration mode of the lithium ion, on the basis of a quantum chemical calculation of the isotope effect on dehydration of the hydrated lithium ion. (S.Y.)

  15. Modeling and assessment of the response of super-light elements to fire

    DEFF Research Database (Denmark)

    Hertz, Kristian Dahl; Campeanu, B.M.; Giraudo, M.

    2013-01-01

    Due to the significant weight of the elements, which raises construction and transportation costs and CO2 production, concrete buildings may not meet the requirements for sustainable construction. Furthermore, concrete is quite vulnerable to fire, as it undergoes a permanent degradation of its mechanical properties at temperatures commonly reached by structural elements during a fire in a building. As a consequence, several multi-story concrete buildings have collapsed or suffered major structural damage because of fire, causing injuries and casualties among the occupants. Even in those cases where a safe evacuation of the building is ensured, the costs associated with the downtime and repair of the building can be very high and not acceptable in view of a safe and sustainable design of structures. In this respect, the newly patented building technology…

  16. Inequilibrium cosmological light element nucleosynthesis. Calculations by the Monte Carlo method

    International Nuclear Information System (INIS)

    Khlopov, M.Yu.; Levitan, Yu.L.; Sedel'nikov, E.V.; Sobol, I.M.

    1993-07-01

    Formation of light nuclei (6Li, 7Li, 7Be) is studied by Monte Carlo simulation of interactions between 4He nuclei and non-equilibrium fluxes of D, T, 3He and 4He nuclei produced in nuclear cascades induced by the decay products of hypothetical metastable objects in the early Universe. The dependence of the amount of 6Li, 7Li and 7Be nuclei on the parameters of an analytic approximation of the experimental momentum distribution of secondary nuclei in N(N-bar)-induced 4He dissociation is analyzed. (author). 3 refs, 2 tabs
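
The Monte Carlo estimation of reaction yields from a flux with an analytic spectrum can be sketched generically. The exponential spectrum, mean energy and reaction threshold below are purely illustrative stand-ins, not the paper's momentum distribution or cross sections:

```python
import math
import random

random.seed(42)  # fixed seed so the estimate is reproducible

def sample_energy(mean_mev):
    """Draw a projectile energy from an assumed exponential spectrum,
    standing in for the analytic momentum approximation of the paper."""
    return random.expovariate(1.0 / mean_mev)

def mc_yield(n, mean_mev, threshold_mev):
    """Fraction of cascade nuclei energetic enough to drive a production
    reaction with the given threshold (sharp-threshold toy cross section)."""
    hits = sum(1 for _ in range(n) if sample_energy(mean_mev) > threshold_mev)
    return hits / n

frac = mc_yield(n=100_000, mean_mev=10.0, threshold_mev=20.0)
# Analytic check for this toy spectrum: P(E > t) = exp(-t/mean) = exp(-2)
```

With 10^5 samples the statistical error of the estimate is well below a percent, which is why the same sampling machinery can map out how the 6Li, 7Li and 7Be yields depend on the spectrum parameters.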

  17. Improvement of graphite crystal analyzer for light elements on X-ray fluorescence holography measurement

    Science.gov (United States)

    Happo, Naohisa; Hada, Takuma; Kubota, Atsushi; Ebisu, Yoshihiro; Hosokawa, Shinya; Kimura, Koji; Tajiri, Hiroo; Matsushita, Tomohiro; Hayashi, Kouichi

    2018-05-01

    Using a graphite crystal analyzer, focused monochromatic fluorescent X-rays can be obtained in an X-ray fluorescence holography (XFH) measurement. To measure the holograms of elements lighter than Ti, we improved a cylindrical-type crystal analyzer and constructed a small C-shaped analyzer. Using the constructed C-shaped analyzer, a Ca Kα hologram of a fluorite single crystal was obtained, from which we reconstructed a clear atomic image. XFH measurements for the elements K, Ca, and Sc become possible using the presently constructed analyzer.

  18. Use of silicon drift detectors for the detection of medium-light elements in PIXE

    International Nuclear Information System (INIS)

    Alberti, R.; Bjeoumikhov, A.; Grassi, N.; Guazzoni, C.; Klatka, T.; Longoni, A.; Quattrone, A.

    2008-01-01

    In order to fully exploit in PIXE the superior performance of silicon drift detectors especially for the detection of low- and medium-energy X-rays, avoiding in particular the negative effects of backscattered particles, we developed a custom spectrometer based on a 10 mm² chip with a thermoelectric Peltier cooler and home-designed front-end electronics, coupled to a weakly focusing polycapillary lens. This paper briefly describes the detector + lens assembly and reports the results of first tests carried out at an external beam line of the LABEC laboratory in Florence. Excellent energy resolution is achieved under real operating conditions in a PIXE run (measured FWHM at 1 keV is 81 eV with a count-rate of 480 cps) and also the lineshapes are very good (FW1/10M over FWHM ratio is 2.1). As a whole, our preliminary tests gave encouraging results and also helped to point out some aspects which it is worthwhile to investigate further (e.g. how X-ray peak intensity ratios may be affected by inaccurate lens alignment), in order to profit fully from such a good performance of the spectrometer
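
The quoted FW1/10M-over-FWHM figure of 2.1 can be put in context by comparing it with an ideal Gaussian line shape, for which the ratio is √(ln 10 / ln 2) ≈ 1.82 regardless of the resolution; the excess over 1.82 quantifies the non-Gaussian tails of the real peak. A quick check:

```python
import math

def full_width(sigma, fraction):
    """Full width of a Gaussian peak at the given fraction of its maximum:
    2 * sigma * sqrt(2 * ln(1/fraction))."""
    return 2.0 * sigma * math.sqrt(2.0 * math.log(1.0 / fraction))

sigma = 1.0  # arbitrary; the ratio below is independent of sigma
ratio = full_width(sigma, 0.1) / full_width(sigma, 0.5)  # FW1/10M over FWHM
# For a pure Gaussian: ratio = sqrt(ln 10 / ln 2) ~ 1.823,
# versus the 2.1 measured for the real spectrometer line shape.
```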

  19. Light elements burning reaction rates at stellar temperatures as deduced by the Trojan Horse measurements

    Energy Technology Data Exchange (ETDEWEB)

    Lamia, L. [Dipartimento di Fisica e Astronomia, Università degli Studi di Catania, Catania (Italy); Spitaleri, C. [Dipartimento di Fisica e Astronomia, Università degli Studi di Catania, Catania, Italy and INFN-Laboratori Nazionali del Sud, Catania (Italy); La Cognata, M.; Palmerini, S.; Sergi, M. L. [INFN-Laboratori Nazionali del Sud, Catania (Italy); Puglia, S. M. R. [INFN-Laboratori Nazionali del Sud, Catania, Italy and Dipartimento di Fisica e Astronomia, Università degli Studi di Catania, Catania (Italy)

    2015-02-24

    Experimental nuclear astrophysics aims at determining the reaction rates for astrophysically relevant reactions at their Gamow energies. For charged-particle induced reactions, access to these energies is usually hindered in direct measurements by the presence of the Coulomb barrier between the interacting particles or by electron screening effects, which complicate the determination of the bare-nucleus S(E) factor needed by astrophysical codes. The use of the Trojan Horse Method (THM) appears as one of the most suitable tools for investigating nuclear processes of interest for astrophysics. Here, in view of the recent TH measurements, the main destruction channels for deuterium (2H), the two lithium isotopes 6,7Li, 9Be and the two boron isotopes 10,11B will be discussed.

  20. Determination of Fluorine and other light elements in Syrian teeth by PIGE and PIXE techniques

    International Nuclear Information System (INIS)

    Bakraji, E.; Ahmad, M.; Doueer, M.

    2013-01-01

    The aim of this study is to determine the concentration of fluorine in Syrian teeth collected from the Southern and Coastal regions, where both populations have almost similar dietary habits and occupational activities. The conventional PIGE method is used to determine the fluorine concentration in human teeth, since fluorine plays an important role in bony and dental tissues: low levels of fluorine can play a positive role against dental caries and act as an inhibitor of certain enzyme systems, whereas high levels can cause dental fluorosis and bone abnormalities. Several cations and anions have been studied in the drinking water of the studied areas in order to investigate the role played by these elements and whether fluoridation is within the normal levels of worldwide standards. Na, Mg, P and Ca concentrations have been determined in the studied samples in order to compare with earlier works. (author)

  1. The determination of light elements in heavy matrix using proton induced X-ray emission

    International Nuclear Information System (INIS)

    Levenets, V.V.; Omel'nik, A.P.; Shchur, A.A.; Chernov, A.E.; Usikov, N.P.; Zats, A.V.

    2007-01-01

    In this report the possibility of determining light impurities in heavy matrices is studied using proton-induced X-ray emission. A wide-band X-ray emission filter made from pyrolytic graphite was used in the spectrometric scheme of the experiment. The results of a study of the filter's features in the X-ray energy range from 4 to 12 keV are presented. The possibility was examined of applying the pyrolytic graphite filter to modify the X-ray spectrum for the determination of iron, using the characteristic emission of the K-series, and of hafnium, using the L-series, in substances based on zirconium (glasses, alloys etc.). It was shown that the use of such a filter significantly improves the metrological characteristics of the analysis of these impurities: the limits of detection of iron and hafnium were lowered by an order of magnitude. (authors)

  2. Models based on multichannel R-matrix theory for evaluating light element reactions

    International Nuclear Information System (INIS)

    Dodder, D.C.; Hale, G.M.; Nisley, R.A.; Witte, K.; Young, P.G.

    1975-01-01

    Multichannel R-matrix theory has been used as a basis for models for analysis and evaluation of light nuclear systems. These models have the characteristic that data predictions can be made utilizing information derived from other reactions related to the one of primary interest. Several examples are given where such an approach is valid and appropriate. (auth.)

  3. Standard error propagation in R-matrix model fitting for light elements

    International Nuclear Information System (INIS)

    Chen Zhenpeng; Zhang Rui; Sun Yeying; Liu Tingjin

    2003-01-01

    The error propagation features of R-matrix model fitting of the 7Li, 11B and 17O systems were researched systematically. Some laws of error propagation were revealed, an empirical formula P_j = U_j^c / U_j^d = K_j · S̄ · √m / √N for describing standard error propagation was established, and the most likely error ranges for the standard cross sections of 6Li(n,t), 10B(n,α0) and 10B(n,α1) were estimated. The problem that the standard errors of light-nuclei standard cross sections may be too small results mainly from the R-matrix model fitting, which is not perfect; yet R-matrix model fitting is the most reliable evaluation method for such data. The error propagation features of R-matrix model fitting for the compound nucleus systems of 7Li, 11B and 17O have been studied systematically, some laws of error propagation are revealed, and these findings are important in solving the problem mentioned above. Furthermore, these conclusions are suitable for similar model fitting in other scientific fields. (author)
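
The empirical formula can be evaluated directly once its parameters are known. The numbers below are purely illustrative placeholders (the paper's K_j and S̄ are fit-specific quantities), shown only to make the scaling with m and N explicit:

```python
import math

def error_propagation_ratio(k_j, s_bar, m, n):
    """Empirical ratio P_j = U_j^c / U_j^d = K_j * S_bar * sqrt(m) / sqrt(N):
    the standard error of a fitted (calculated) cross section relative to the
    error of the input data, for m parameters fitted to N data points."""
    return k_j * s_bar * math.sqrt(m) / math.sqrt(n)

# Illustrative values only: the ratio shrinks as the data set grows (sqrt(N))
# and grows with the number of fitted parameters (sqrt(m)).
p = error_propagation_ratio(k_j=0.8, s_bar=1.2, m=4, n=400)
```

The √m/√N scaling is the formula's key message: fitting many data points with few parameters drives the fitted-curve error well below the point-wise data error, which is exactly the "too small" standard-error effect the abstract discusses.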

  4. Determination of light elements concentration in aerosols by X emission induced by deuteron

    International Nuclear Information System (INIS)

    Morales, J.R.; Romo, C.

    1983-01-01

    Absolute concentrations of Al, Si, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe and Cu were obtained in the range from 10 ng/m³ to 10⁴ ng/m³ in aerosols from Santiago. A 4.2 MeV deuteron beam was used to induce characteristic X-ray emission. It was found that the relative abundance of these elements is maintained on days of both high and low total suspended particulates. (Author)

  5. Forensic Applications of Light-Element Stable Isotope Ratios of Ricinus communis Seeds and Ricin Preparations

    Energy Technology Data Exchange (ETDEWEB)

    Kreuzer, Helen W.; West, Jason B.; Ehleringer, James

    2013-01-01

    Seeds of the castor plant Ricinus communis, also known as castor beans, are of forensic interest because they are the source of the poison ricin. We have tested whether stable isotope ratios of castor seeds and ricin prepared by various methods can be used as a forensic signature. We collected over 300 castor seed samples from locations around the world and measured the C, N, O, and H stable isotope ratios of the whole seeds, oil, and three types of ricin preparations. Our results demonstrate that N isotope ratios can be used to correlate ricin prepared by any of these methods to source seeds. Further, stable isotope ratios distinguished >99% of crude and purified ricin protein samples in pair-wise comparison tests. Stable isotope ratios therefore constitute a valuable forensic signature for ricin preparations.

  6. Electrical resistivity of liquid iron with high concentration of light element impurities

    Science.gov (United States)

    Wagle, F.; Steinle-Neumann, G.

    2017-12-01

The Earth's outer core consists mainly of liquid iron, enriched with several weight percent of lighter elements such as silicon, oxygen, sulfur or carbon. The electrical resistivities of alloys of this type determine the stability of the geodynamo. Both computational and experimental results show that resistivities of Fe-based alloys deviate significantly from values for pure Fe. Using optical conductivity values computed with the Kubo-Greenwood formalism from DFT-based molecular dynamics results, we analyze the high-P and -T behavior of resistivities for Fe alloys containing various concentrations of sulfur, oxygen and silicon. As the electron mean free path in amorphous and liquid material becomes comparable to interatomic distances at high P and T, electron scattering is expected to be dominated by the short-range order rather than by T-dependent vibrational contributions, and we describe such correlations in our results. In analogy to macroscopic porous media, we further show that the resistivity of a liquid metal-nonmetal alloy is determined to first order by the resistivity of the metallic matrix and the volume fraction of non-metallic impurities.

  7. Igniting the Light Elements: The Los Alamos Thermonuclear Weapon Project, 1942-1952

    Energy Technology Data Exchange (ETDEWEB)

    Fitzpatrick, Anne C. [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States)

    1999-07-01

The American system of nuclear weapons research and development was conceived and developed not as a result of technological determinism, but by a number of individual architects who promoted the growth of this large technologically-based complex. While some of the technological artifacts of this system, such as the fission weapons used in World War II, have been the subject of many historical studies, their technical successors--fusion (or hydrogen) devices--are representative of the largely unstudied highly secret realms of nuclear weapons science and engineering. In the postwar period a small number of Los Alamos Scientific Laboratory's staff and affiliates were responsible for theoretical work on fusion weapons, yet the program was subject to both the provisions and constraints of the US Atomic Energy Commission, of which Los Alamos was a part. The Commission leadership's struggle to establish a mission for its network of laboratories, not least to keep them operating, affected Los Alamos's leaders' decisions as to the course of weapons design and development projects. Adapting Thomas P. Hughes's "large technological systems" thesis, I focus on the technical, social, political, and human problems that nuclear weapons scientists faced while pursuing the thermonuclear project, demonstrating why the early American thermonuclear bomb project was an immensely complicated scientific and technological undertaking. I concentrate mainly on Los Alamos Scientific Laboratory's Theoretical, or T, Division, and its members' attempts to complete an accurate mathematical treatment of the "Super"--the most difficult problem in physics in the postwar period--and other fusion weapon theories. Although tackling a theoretical problem, theoreticians had to address technical and engineering issues as well. I demonstrate the relative value and importance of H-bomb research over time in the postwar era to the scientific, political, and military participants in this project. I analyze how and when participants in the H-bomb project recognized both blatant and subtle problems facing the project, how scientists solved them, and the relationship this process had to official nuclear weapons policies. Consequently, I show how the practice of nuclear weapons science in the postwar period became an extremely complex, technologically-based endeavor.

  8. Accurate core ionization potentials and photoelectron kinetic energies for light elements

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, T D [Oregon State Univ., Corvallis; Shaw, Jr, R W

    1974-01-01

By electron spectroscopy, accurate values have been determined for the neon 1s ionization potential (870.312 ± 0.017 eV) and the neon Auger (¹D₂) kinetic energy (804.557 ± 0.017 eV). Using these together with the neon 2s ionization potential for calibration, 1s ionization potentials have been determined for CF₄ (C = 301.96, F = 695.57), CO₂ (C = 297.71, O = 541.32) and N₂ (N = 409.93), as well as ionization potentials for Ar (2s = 326.37, 2p3/2 = 248.60, 2p1/2 = 250.70). These are known with an accuracy of 0.05 eV. The results are in good agreement with those of other measurements but have significantly smaller uncertainties. Comparison is made between experimental and theoretical ionization potentials. The value for neon is quite close to a recently reported theoretical value of 870.0 eV. The relativistic corrections for a cylindrical mirror analyzer, which are much smaller at low energies than would be expected from an approximate formula, are discussed.

  9. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    Ammerman, D.J.; Heinstein, M.W.; Wellman, G.W.

    1992-01-01

The use of inelastic analysis methods instead of the traditional elastic analysis methods in the design of radioactive material (RAM) transport packagings leads to a better understanding of the response of the package to mechanical loadings. Thus, better assessment of the containment, thermal protection, and shielding integrity of the package after a structural accident event can be made. A more accurate prediction of the package response can lead to enhanced safety and also allow for a more efficient use of materials, possibly leading to a package with higher capacity or lower weight. This paper discusses the advantages and disadvantages of using inelastic analysis in the design of RAM shipping packages. The use of inelastic analysis presents several problems to the package designer. When using inelastic analysis the entire nonlinear response of the material must be known, including the effects of temperature changes and strain rate. Another problem is that there currently are no acceptance criteria for this type of analysis approved by regulatory agencies. Inelastic analysis acceptance criteria based on failure stress, failure strain, or plastic energy density could be developed. For both elastic and inelastic analyses it is also important to include other sources of stress in the analyses, such as fabrication stresses, thermal stresses, stresses from bolt preloading, and contact stresses at material interfaces. Offsetting these added difficulties is the improved knowledge of the package behavior. This allows for incorporation of a more uniform margin of safety, which can result in weight savings and a higher level of confidence in the post-accident configuration of the package. In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding to illustrate the differences in the two analysis techniques

  10. IDEA: Interactive Display for Evolutionary Analyses.

    Science.gov (United States)

    Egan, Amy; Mahurkar, Anup; Crabtree, Jonathan; Badger, Jonathan H; Carlton, Jane M; Silva, Joana C

    2008-12-08

    The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  11. IDEA: Interactive Display for Evolutionary Analyses

    Directory of Open Access Journals (Sweden)

    Carlton Jane M

    2008-12-01

Background: The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. Results: We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. Conclusion: IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  12. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

Applications of Bayesian principles to uncertainty analyses are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so-called probabilistic structures is presented in order to develop a method for the quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed.
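The maximum-entropy determination of a prior mentioned in the abstract can be illustrated with a minimal sketch, assuming a discrete support and a known mean: the maximum-entropy distribution then takes the exponential form p_i ∝ exp(-λ·x_i), and λ can be found by bisection. The dice support and the target mean below are illustrative choices (the classic dice example), not values from the paper.

```python
import math

def maxent_prior(support, target_mean, tol=1e-10):
    """Maximum-entropy distribution on `support` subject to a mean
    constraint: p_i proportional to exp(-lam * x_i), with lam found
    by bisection (the mean is monotone decreasing in lam)."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in support]
        z = sum(w)
        return sum(x * wi for x, wi in zip(support, w)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid          # mean too high: increase lam
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in support]
    z = sum(w)
    return [wi / z for wi in w]

# Illustration: die faces 1..6 constrained to have mean 4.5
probs = maxent_prior([1, 2, 3, 4, 5, 6], 4.5)
```

Because the constrained mean (4.5) exceeds the uniform mean (3.5), the resulting prior places monotonically increasing probability on the larger faces.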

  13. Safety analyses for high-temperature reactors

    International Nuclear Information System (INIS)

    Mueller, A.

    1978-01-01

The safety evaluation of HTRs may be based on the three methods presented here: the licensing procedure, probabilistic risk analysis, and damage extent analysis. Together these cover all safety aspects of the HTR, from normal operation to extreme (hypothetical) accidents. The analyses within the licensing procedure of the HTR-1160 have shown that for normal operation and for the design basis accidents the radiation exposures remain clearly below the maximum permissible levels prescribed by the radiation protection ordinance, so that no real hazard for the population will arise from them. (orig./RW)

  14. Introduction: Analysing Emotion and Theorising Affect

    Directory of Open Access Journals (Sweden)

    Peta Tait

    2016-08-01

    Full Text Available This discussion introduces ideas of emotion and affect for a volume of articles demonstrating the scope of approaches used in their study within the humanities and creative arts. The volume offers multiple perspectives on emotion and affect within 20th-century and 21st-century texts, arts and organisations and their histories. The discussion explains how emotion encompasses the emotions, emotional feeling, sensation and mood and how these can be analysed particularly in relation to literature, art and performance. It briefly summarises concepts of affect theory within recent approaches before introducing the articles.

  15. Applications of neural network to numerical analyses

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki; Fukuhara, Makoto; Ma, Xiao-Feng; Liaqat, Ali

    1999-01-01

Applications of a multi-layer neural network to numerical analyses are described. We are mainly concerned with computed tomography and the solution of differential equations. In both cases, as the objective function for the training process of the neural network we employed the residuals of the integral equation or of the differential equations. This differs from conventional neural network training, where the sum of the squared errors of the output values is adopted as the objective function. For model problems both methods gave satisfactory results, and the methods are considered promising for certain kinds of problems. (author)
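The residual-as-objective idea can be sketched in a self-contained toy. To keep it short, a quadratic trial function stands in for the multi-layer network, and the ODE y' = -y with y(0) = 1 is an illustrative choice, not a problem from the paper: rather than minimizing output errors against known target values, gradient descent minimizes the squared ODE residual at collocation points.

```python
# Hedged sketch: train by minimizing the squared residual of the ODE
# y' = -y, y(0) = 1, at collocation points on [0, 1]. The trial
# function u(t) = 1 + c0*t + c1*t^2 (which builds in u(0) = 1) is an
# illustrative stand-in for a neural network; all settings are invented.

def fit_residual(steps=20000, lr=0.01):
    ts = [i / 10 for i in range(11)]       # collocation points on [0, 1]
    c0, c1 = 0.0, 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for t in ts:
            u = 1 + c0 * t + c1 * t * t    # trial solution
            du = c0 + 2 * c1 * t           # its derivative
            r = du + u                     # ODE residual u' + u
            g0 += 2 * r * (1 + t)          # d(r^2)/dc0
            g1 += 2 * r * (2 * t + t * t)  # d(r^2)/dc1
        c0 -= lr * g0 / len(ts)
        c1 -= lr * g1 / len(ts)
    return c0, c1

c0, c1 = fit_residual()
u1 = 1 + c0 + c1   # approximation of y(1); exact value is exp(-1) ~ 0.368
```

No solution values are ever supplied during training; the residual alone drives the fit, which is the point the abstract makes.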

  16. Komparativ analyse - Scandinavian Airlines & Norwegian Air Shuttle

    OpenAIRE

    Kallesen, Martin Nystrup; Singh, Ravi Pal; Boesen, Nana Wiaberg

    2017-01-01

The project is based on a consideration of how a company the size of Scandinavian Airlines or Norwegian Air Shuttle uses its finances and how it sees its external environment. This has led us to research the relationship between the companies and their finances as well as their external environment, and how they differ in both. To do this we have utilised a myriad of different methods to analyse the companies, including PESTEL, SWOT, TOWS, DCF, risk analysis, sensitivity, Porter’s ...

  17. Implementing partnerships in nonreactor facility safety analyses

    International Nuclear Information System (INIS)

    Courtney, J.C.; Perry, W.H.; Phipps, R.D.

    1996-01-01

    Faculty and students from LSU have been participating in nuclear safety analyses and radiation protection projects at ANL-W at INEL since 1973. A mutually beneficial relationship has evolved that has resulted in generation of safety-related studies acceptable to Argonne and DOE, NRC, and state regulatory groups. Most of the safety projects have involved the Hot Fuel Examination Facility or the Fuel Conditioning Facility; both are hot cells that receive spent fuel from EBR-II. A table shows some of the major projects at ANL-W that involved LSU students and faculty

  18. Cost/benefit analyses of environmental impact

    International Nuclear Information System (INIS)

    Goldman, M.I.

    1974-01-01

    Various aspects of cost-benefit analyses are considered. Some topics discussed are: regulations of the National Environmental Policy Act (NEPA); statement of AEC policy and procedures for implementation of NEPA; Calvert Cliffs decision; AEC Regulatory Guide; application of risk-benefit analysis to nuclear power; application of the as low as practicable (ALAP) rule to radiation discharges; thermal discharge restrictions proposed by EPA under the 1972 Amendment to the Water Pollution Control Act; estimates of somatic and genetic insult per unit population exposure; occupational exposure; EPA Point Source Guidelines for Discharges from Steam Electric Power Plants; and costs of closed-cycle cooling using cooling towers. (U.S.)

  19. The phaco machine: analysing new technology.

    Science.gov (United States)

    Fishkind, William J

    2013-01-01

The phaco machine is frequently overlooked as the crucial surgical instrument it is. Understanding how to set parameters begins with understanding fundamental concepts of machine function. This study analyses the critical concepts of partial occlusion phaco, occlusion phaco and pump technology. In addition, phaco energy categories as well as variations in phaco energy production are explored. Contemporary power modulations and pump controls allow for the enhancement of partial occlusion phacoemulsification. These significant changes in anterior chamber dynamics produce a balanced environment for phaco, fewer complications, and improved patient outcomes.

  20. Nuclear analyses of the Pietroasa gold hoard

    International Nuclear Information System (INIS)

    Cojocaru, V.; Besliu, C.

    1999-01-01

By means of nuclear analyses, the concentrations of Au, Ag, Cu, Ir, Os, Pt, Co and Hg were measured in the 12 artifacts of the gold hoard discovered in 1837 at Pietroasa, Buzau county, in Romania. The concentrations of the first four elements were used to compare different stylistic groups postulated by historians. Comparisons with gold nuggets from the old Dacian territory and Roman imperial gold coins were also made. Good agreement was found with the oldest hypothesis, which considers that the hoard comprises three styles attributed mainly to the Goths. (author)

  1. An evaluation of the Olympus "Quickrate" analyser.

    Science.gov (United States)

    Williams, D G; Wood, R J; Worth, H G

    1979-02-01

The Olympus "Quickrate", a photometer built for both kinetic and end-point analysis, was evaluated in this laboratory. Aspartate transaminase, lactate dehydrogenase, hydroxybutyrate dehydrogenase, creatine kinase, alkaline phosphatase and gamma-glutamyl transpeptidase were measured in the kinetic mode, and glucose, urea, total protein, albumin, bilirubin, calcium and iron in the end-point mode. Overall, good correlation was observed with routine methodologies and the precision of the methods was acceptable. An electrical evaluation was also performed. In our hands, the instrument proved to be simple to use and gave no trouble. It should prove useful for paediatric and emergency work, and as a back-up for other analysers.

  2. Analyses of containment structures with corrosion damage

    International Nuclear Information System (INIS)

    Cherry, J.L.

    1997-01-01

Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties

  3. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

Today it is important for software companies to build software systems in a short time interval, to reduce costs and to maintain a good market position. Therefore, well organized and systematic development approaches are required. Reusing well-tested software components can be a good way to develop software applications efficiently, since the reuse of software components is less expensive and less time consuming than development from scratch. But it is dangerous to assume that software components can be combined without any problems. The components themselves are well tested, of course, but problems can still occur when they are composed, and most of these problems arise from interaction and communication. To avoid such errors, a framework has to be developed for analysing software components; that framework determines the compatibility of the corresponding components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment is designed that checks the compatibility of black-box software components. This article addresses the question of how coupled software components can be verified using an analyser framework, and describes the usage of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed, and results are shown using a test environment.

  4. Passive safety injection experiments and analyses (PAHKO)

    International Nuclear Information System (INIS)

    Tuunanen, J.

    1998-01-01

The PAHKO project involved experiments on the PACTEL facility and computer simulations of selected experiments. The experiments focused on the performance of Passive Safety Injection Systems (PSIS) of Advanced Light Water Reactors (ALWRs) in Small Break Loss-Of-Coolant Accident (SBLOCA) conditions. The PSIS consisted of a Core Make-up Tank (CMT) and two pipelines (Pressure Balancing Line, PBL, and Injection Line, IL). The examined PSIS worked efficiently in SBLOCAs, although the flow through the PSIS stopped temporarily if the break was very small and hot water filled the CMT. The experiments demonstrated the importance of the flow distributor in the CMT to limit rapid condensation. The project included validation of three thermal-hydraulic computer codes (APROS, CATHARE and RELAP5). The analyses showed that the codes are capable of simulating the overall behaviour of the transients. The detailed analyses of the results showed that some models in the codes still need improvement. In particular, further development of models for thermal stratification, condensation and natural circulation flow with small driving forces would be necessary for accurate simulation of the PSIS phenomena. (orig.)

  5. Used Fuel Management System Interface Analyses - 13578

    Energy Technology Data Exchange (ETDEWEB)

    Howard, Robert; Busch, Ingrid [Oak Ridge National Laboratory, P.O. Box 2008, Bldg. 5700, MS-6170, Oak Ridge, TN 37831 (United States); Nutt, Mark; Morris, Edgar; Puig, Francesc [Argonne National Laboratory (United States); Carter, Joe; Delley, Alexcia; Rodwell, Phillip [Savannah River National Laboratory (United States); Hardin, Ernest; Kalinina, Elena [Sandia National Laboratories (United States); Clark, Robert [U.S. Department of Energy (United States); Cotton, Thomas [Complex Systems Group (United States)

    2013-07-01

    Preliminary system-level analyses of the interfaces between at-reactor used fuel management, consolidated storage facilities, and disposal facilities, along with the development of supporting logistics simulation tools, have been initiated to provide the U.S. Department of Energy (DOE) and other stakeholders with information regarding the various alternatives for managing used nuclear fuel (UNF) generated by the current fleet of light water reactors operating in the United States. An important UNF management system interface consideration is the need for ultimate disposal of UNF assemblies contained in waste packages that are sized to be compatible with different geologic media. Thermal analyses indicate that waste package sizes for the geologic media under consideration by the Used Fuel Disposition Campaign may be significantly smaller than the canisters being used for on-site dry storage by the nuclear utilities. Therefore, at some point along the UNF disposition pathway, there could be a need to repackage fuel assemblies already loaded and being loaded into the dry storage canisters currently in use. The implications of where and when the packaging or repackaging of commercial UNF will occur are key questions being addressed in this evaluation. The analysis demonstrated that thermal considerations will have a major impact on the operation of the system and that acceptance priority, rates, and facility start dates have significant system implications. (authors)

  6. Sensitivity in risk analyses with uncertain numbers.

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
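A minimal sketch of the "pinching" strategy described above, in a purely probabilistic setting and assuming a toy two-input model (the model, the input distributions, and the nominal values are all invented for illustration): each uncertain input is hypothetically fixed at its nominal value, and the resulting reduction in output spread estimates that input's contribution to the output uncertainty.

```python
import random
import statistics

random.seed(0)  # reproducible Monte Carlo draws

def model(a, b):
    return a * a + 3 * b   # toy model, purely illustrative

def output_sd(sample_a, sample_b):
    return statistics.pstdev([model(a, b) for a, b in zip(sample_a, sample_b)])

n = 20000
a = [random.gauss(2.0, 0.5) for _ in range(n)]   # uncertain input a
b = [random.gauss(1.0, 0.8) for _ in range(n)]   # uncertain input b

base = output_sd(a, b)                 # spread with all inputs uncertain
pinched_a = output_sd([2.0] * n, b)    # pinch a to its nominal value
pinched_b = output_sd(a, [1.0] * n)    # pinch b to its nominal value

print(f"reduction from pinching a: {1 - pinched_a / base:.2%}")
print(f"reduction from pinching b: {1 - pinched_b / base:.2%}")
```

The input whose pinching shrinks the output spread the most is the one most worth learning more about; the report's full method applies the same idea to uncertain numbers carrying both aleatory and epistemic components.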

  7. Fractal and multifractal analyses of bipartite networks

    Science.gov (United States)

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-03-01

Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but these properties of bipartite networks have not. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m) and Delicious data sets and the (u, v)-flower model, while we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attributes of a bipartite network with two different types of nodes, we assign different weights to the nodes of the different classes and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.

  8. Special analyses reveal coke-deposit structure

    International Nuclear Information System (INIS)

    Albright, L.F.

    1988-01-01

A scanning electron microscope (SEM) and an energy-dispersive X-ray analyzer (EDAX) have been used to obtain information that clarifies the three mechanisms of coke formation in ethylene furnaces, and to analyze the metal condition at the exit of the furnace. The results can be used to examine furnace operations and develop improved ethylene plant practices. In this first of four articles on the analyses of coke and metal samples, the coking mechanisms and coke deposits in a section of tube from an actual ethylene furnace (Furnace A) from a plant on the Texas Gulf Coast are discussed. The second article in the series will analyze the condition of the tube metal in the same furnace. To show how coke deposition and metal condition depend on the operating parameters of an ethylene furnace, the third article in the series will show the coke deposition in a Texas Gulf Coast furnace tube (Furnace B) that operated at shorter residence time. The fourth article discusses the metal condition in that furnace. Some recommendations, based on the analyses and findings, are offered in the fourth article that could help extend the life of ethylene furnace tubes and also improve overall ethylene plant operations

  9. Overview of cooperative international piping benchmark analyses

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1982-01-01

This paper presents an overview of an effort initiated in 1976 by the International Working Group on Fast Reactors (IWGFR) of the International Atomic Energy Agency (IAEA) to evaluate detailed and simplified inelastic analysis methods for piping systems, with particular emphasis on piping bends. The procedure was to collect from participating IAEA member countries descriptions of tests and test results for piping systems or bends (with emphasis on high-temperature inelastic tests); to compile, evaluate, and issue a selected number of these problems for analysis; and to compile and make a preliminary evaluation of the analysis results. Of the problem descriptions submitted, three were selected to be used: a 90°-elbow at 600 °C with an in-plane transverse force; a 90°-elbow with an in-plane moment; and a 180°-elbow at room temperature with a reversed, cyclic, in-plane transverse force. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this paper. 15 figures

  10. Ethics of cost analyses in medical education.

    Science.gov (United States)

    Walsh, Kieran

    2013-11-01

    Cost analyses in medical education are rarely straightforward, and rarely lead to clear-cut conclusions. Occasionally they do lead to clear conclusions, but even when that happens, some stakeholders will ask difficult but valid questions about what to do following cost analyses, specifically about distributive justice in the allocation of resources. At present there are few or no debates about these issues, and rationing decisions that are taken in medical education are largely made subconsciously. Distributive justice 'concerns the nature of a socially just allocation of goods in a society'. Inevitably there is a large degree of subjectivity in the judgment as to whether an allocation is seen as socially just or ethical. There are different principles by which we can view distributive justice and which therefore affect the prism of subjectivity through which we see certain problems. For example, we might say that distributive justice at a certain institution or in a certain medical education system operates according to the principle that resources must be divided equally amongst learners. Another system may say that resources should be distributed according to the needs of learners or even of patients. No ethical system or model is inherently right or wrong; they depend on the context in which the educator is working.

  11. Pathway analyses implicate glial cells in schizophrenia.

    Directory of Open Access Journals (Sweden)

    Laramie E Duncan

    Full Text Available The quest to understand the neurobiology of schizophrenia and bipolar disorder is ongoing with multiple lines of evidence indicating abnormalities of glia, mitochondria, and glutamate in both disorders. Despite high heritability estimates of 81% for schizophrenia and 75% for bipolar disorder, compelling links between findings from neurobiological studies, and findings from large-scale genetic analyses, are only beginning to emerge. Ten publicly available gene sets (pathways) related to glia, mitochondria, and glutamate were tested for association to schizophrenia and bipolar disorder using MAGENTA as the primary analysis method. To determine the robustness of associations, secondary analyses were performed with: ALIGATOR, INRICH, and Set Screen. Data from the Psychiatric Genomics Consortium (PGC) were used for all analyses. There were 1,068,286 SNP-level p-values for schizophrenia (9,394 cases/12,462 controls), and 2,088,878 SNP-level p-values for bipolar disorder (7,481 cases/9,250 controls). The Glia-Oligodendrocyte pathway was associated with schizophrenia, after correction for multiple tests, according to primary analysis (MAGENTA p = 0.0005, 75% requirement for individual gene significance) and also achieved nominal levels of significance with INRICH (p = 0.0057) and ALIGATOR (p = 0.022). For bipolar disorder, Set Screen yielded nominally and method-wide significant associations to all three glial pathways, with strongest association to the Glia-Astrocyte pathway (p = 0.002). Consistent with findings of white matter abnormalities in schizophrenia by other methods of study, the Glia-Oligodendrocyte pathway was associated with schizophrenia in our genomic study. These findings suggest that the abnormalities of myelination observed in schizophrenia are at least in part due to inherited factors, contrasted with the alternative of purely environmental causes (e.g. medication effects or lifestyle). While not the primary purpose of our study

  12. DEPUTY: analysing architectural structures and checking style

    International Nuclear Information System (INIS)

    Gorshkov, D.; Kochelev, S.; Kotegov, S.; Pavlov, I.; Pravilnikov, V.; Wellisch, J.P.

    2001-01-01

    The DepUty (dependencies utility) can be classified as a project and process management tool. The main goal of DepUty is to assist, by means of source code analysis and graphical representation using UML, in understanding dependencies of sub-systems and packages in CMS Object Oriented software, to understand architectural structure, and to schedule code release in modularised integration. It also allows a newcomer to more easily understand the global structure of CMS software, and to avoid circular dependencies up-front or re-factor the code, in case it was already too close to the edge of non-maintainability. The authors will discuss the various views DepUty provides to analyse package dependencies, and illustrate both the metrics and style checking facilities it provides

  13. Response surface use in safety analyses

    International Nuclear Information System (INIS)

    Prosek, A.

    1999-01-01

    When thousands of complex computer code runs related to nuclear safety are needed for statistical analysis, the response surface is used to replace the computer code. The main purpose of the study was to develop and demonstrate a tool called the optimal statistical estimator (OSE), intended for response surface generation of complex and non-linear phenomena. The performance of the optimal statistical estimator was tested against the results of 59 different RELAP5/MOD3.2 code calculations of the small-break loss-of-coolant accident in a two-loop pressurized water reactor. The results showed that the OSE adequately predicted the response surface for the peak cladding temperature. Some good characteristics of the OSE, such as a monotonic function between two neighbouring points and independence from the number of output parameters, suggest that the OSE can be used for response surface generation of any safety or system parameter in thermal-hydraulic safety analyses. (author)
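The surrogate idea behind a response surface can be illustrated with a minimal sketch (hypothetical data and a plain polynomial fit, not the OSE algorithm itself): fit a cheap model to a handful of expensive code runs, then query the fit instead of rerunning the code.

```python
import numpy as np

def fit_response_surface(X, y, degree=2):
    """Fit a 1-D polynomial surrogate to (input, output) pairs taken
    from expensive code runs; returns a callable surrogate."""
    coeffs = np.polyfit(X, y, degree)
    return np.poly1d(coeffs)

# Hypothetical stand-in for expensive code runs: peak cladding
# temperature (K) as a function of a break-size parameter.
break_size = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
pct = np.array([900.0, 1010.0, 1080.0, 1110.0, 1100.0])

surrogate = fit_response_surface(break_size, pct)

# Evaluate the surrogate at an untried input instead of rerunning the code.
estimate = surrogate(0.5)
```

Once fitted, thousands of surrogate evaluations cost essentially nothing, which is what makes the statistical (e.g. Monte Carlo) analysis over the input space feasible.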

  14. Spatial Analyses of Harappan Urban Settlements

    Directory of Open Access Journals (Sweden)

    Hirofumi Teramura

    2006-12-01

    Full Text Available The Harappan Civilization occupies a unique place among the early civilizations of the world with its well planned urban settlements, advanced handicraft and technology, religious and trade activities. Using a Geographical Information Systems (GIS, this study presents spatial analyses that locate urban settlements on a digital elevation model (DEM according to the three phases of early, mature and late. Understanding the relationship between the spatial distribution of Harappan sites and the change in some factors, such as topographic features, river passages or sea level changes, will lead to an understanding of the dynamism of this civilization. It will also afford a glimpse of the factors behind the formation, development, and decline of the Harappan Civilization.

  15. The plant design analyser and its applications

    International Nuclear Information System (INIS)

    Whitmarsh-Everiss, M.J.

    1992-01-01

    Consideration is given to the history of computational methods for the non-linear dynamic analysis of plant behaviour. This is traced from analogue to hybrid computers. When these were phased out simulation languages were used in the batch mode and the interactive computational capabilities were lost. These have subsequently been recovered using mainframe computing architecture in the context of small models using the Prototype Plant Design Analyser. Given the development of parallel processing architectures, the restriction on model size can be lifted. This capability and the use of advanced Work Stations and graphics software has enabled an advanced interactive design environment to be developed. This system is generic and can be used, with suitable graphics development, to study the dynamics and control behaviour of any plant or system for minimum cost. Examples of past and possible future uses are identified. (author)

  16. Abundance analyses of thirty cool carbon stars

    International Nuclear Information System (INIS)

    Utsumi, Kazuhiko

    1985-01-01

    The results were previously obtained by use of the absolute gf-values and the cosmic abundance as a standard. These gf-values were found to contain large systematic errors, and as a result, the solar photospheric abundances were revised. Our previous results, therefore, must be revised by using new gf-values, and abundance analyses are extended for as many carbon stars as possible. In conclusion, in normal cool carbon stars heavy metals are overabundant by factors of 10 - 100 and rare-earth elements are overabundant by a factor of about 10, and in J-type cool carbon stars, the C12/C13 ratio is smaller, the C2 and CN bands and the Li 6708 line are stronger than in normal cool carbon stars, and the abundances of s-process elements with respect to Fe are nearly normal. (Mori, K.)

  17. Analysing Medieval Urban Space; a methodology

    Directory of Open Access Journals (Sweden)

    Marlous L. Craane MA

    2007-08-01

    Full Text Available This article has been written in reaction to recent developments in medieval history and archaeology, to study not only the buildings in a town but also the spaces that hold them together. It discusses a more objective and interdisciplinary approach for analysing urban morphology and use of space. It proposes a 'new' methodology by combining town plan analysis and space syntax. This methodology was trialled on the city of Utrecht in the Netherlands. By comparing the results of this 'new' methodology with the results of previous, more conventional, research, this article shows that space syntax can be applied successfully to medieval urban contexts. It does this by demonstrating a strong correlation between medieval economic spaces and the most integrated spaces, just as is found in the study of modern urban environments. It thus provides a strong basis for the use of this technique in future research of medieval urban environments.

  18. Reliability and safety analyses under fuzziness

    International Nuclear Information System (INIS)

    Onisawa, T.; Kacprzyk, J.

    1995-01-01

    Fuzzy theory, for example possibility theory, is compatible with probability theory. What has been shown so far is that probability theory need not be replaced by fuzzy theory, but rather that the former works much better in applications if it is combined with the latter. In fact, it is said that there are two essential uncertainties in the field of reliability and safety analyses: one is a probabilistic uncertainty which is more relevant for mechanical systems and the natural environment, and the other is fuzziness (imprecision) caused by the existence of human beings in systems. The classical probability theory alone is therefore not sufficient to deal with uncertainties in humanistic systems. In such a context this collection of works sets a milestone in the debate between probability theory and fuzzy theory. This volume covers fault analysis, life time analysis, reliability, quality control, safety analysis and risk analysis. (orig./DG). 106 figs

  19. Precise Chemical Analyses of Planetary Surfaces

    Science.gov (United States)

    Kring, David; Schweitzer, Jeffrey; Meyer, Charles; Trombka, Jacob; Freund, Friedemann; Economou, Thanasis; Yen, Albert; Kim, Soon Sam; Treiman, Allan H.; Blake, David

    1996-01-01

    We identify the chemical elements and element ratios that should be analyzed to address many of the issues identified by the Committee on Planetary and Lunar Exploration (COMPLEX). We determined that most of these issues require two sensitive instruments to analyze the necessary complement of elements. In addition, it is useful in many cases to use one instrument to analyze the outermost planetary surface (e.g. to determine weathering effects), while a second is used to analyze a subsurface volume of material (e.g., to determine the composition of unaltered planetary surface material). This dual approach to chemical analyses will also facilitate the calibration of orbital and/or Earth-based spectral observations of the planetary body. We determined that in many cases the scientific issues defined by COMPLEX can only be fully addressed with combined packages of instruments that would supplement the chemical data with mineralogic or visual information.

  20. Seismic analyses of structures. 1st draft

    International Nuclear Information System (INIS)

    David, M.

    1995-01-01

    The dynamic analysis presented in this paper refers to the seismic analysis of the main building of Paks NPP. The aim of the analysis was to determine the floor response spectra as the response to seismic input. This analysis was performed with a 3-dimensional calculation model, and the floor response spectra were determined for a number of levels from the floor response time histories; no other adjustments were applied. The following results of the seismic analysis are presented: 3-dimensional finite element model; basic assumptions of dynamic analyses; table of frequencies and included factors; modal masses for all modes; floor response spectra in all the selected nodes with figures of indicated nodes and important nodes of free vibration

  1. Analysing Terrorism from a Systems Thinking Perspective

    Directory of Open Access Journals (Sweden)

    Lukas Schoenenberger

    2014-02-01

    Full Text Available Given the complexity of terrorism, solutions based on single factors are destined to fail. Systems thinking offers various tools for helping researchers and policy makers comprehend terrorism in its entirety. We have developed a semi-quantitative systems thinking approach for characterising relationships between variables critical to terrorism and their impact on the system as a whole. For a better understanding of the mechanisms underlying terrorism, we present a 16-variable model characterising the critical components of terrorism and perform a series of highly focused analyses. We show how to determine which variables are best suited for government intervention, describing in detail their effects on the key variable—the political influence of a terrorist network. We also offer insights into how to elicit variables that destabilise and ultimately break down these networks. Because we clarify our novel approach with fictional data, the primary importance of this paper lies in the new framework for reasoning that it provides.
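The kind of semi-quantitative, matrix-based reasoning described above can be sketched with a small cross-impact calculation; the variable names, weights, and the activity/passivity heuristic below are illustrative assumptions, not the paper's 16-variable model.

```python
import numpy as np

# Fictional influence matrix: entry [i][j] is the strength (0-3) with
# which variable i influences variable j.
variables = ["recruitment", "funding", "public support", "political influence"]
M = np.array([
    [0, 1, 2, 3],   # recruitment
    [3, 0, 1, 2],   # funding
    [2, 2, 0, 3],   # public support
    [1, 1, 2, 0],   # political influence
])

activity = M.sum(axis=1)   # how strongly each variable drives the system
passivity = M.sum(axis=0)  # how strongly each variable is driven by it

# Variables that are highly active but weakly driven are candidate
# levers for government intervention.
leverage = activity - passivity
best_lever = variables[int(np.argmax(leverage))]
```

Richer systems-thinking tools layer indirect effects (powers of M) and feedback-loop detection on top of this same matrix representation.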

  2. Seismic analyses of structures. 1st draft

    Energy Technology Data Exchange (ETDEWEB)

    David, M [David Consulting, Engineering and Design Office (Czech Republic)

    1995-07-01

    The dynamic analysis presented in this paper refers to the seismic analysis of the main building of Paks NPP. The aim of the analysis was to determine the floor response spectra as the response to seismic input. This analysis was performed with a 3-dimensional calculation model, and the floor response spectra were determined for a number of levels from the floor response time histories; no other adjustments were applied. The following results of the seismic analysis are presented: 3-dimensional finite element model; basic assumptions of dynamic analyses; table of frequencies and included factors; modal masses for all modes; floor response spectra in all the selected nodes with figures of indicated nodes and important nodes of free vibration.

  3. Project analysis and integration economic analyses summary

    Science.gov (United States)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules involving silicon ingot/sheet, growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative aspects for complex decision-making to the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance to industry; and demonstration of how to evaluate and understand the worth of research and development both to JPL and other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profits from its present-day sales of photovoltaic equipment.

  4. Level 2 probabilistic event analyses and quantification

    International Nuclear Information System (INIS)

    Boneham, P.

    2003-01-01

    In this paper an example of quantification of a severe accident phenomenological event is given. The analysis performed to assess the probability that the debris released from the reactor vessel was in a coolable configuration in the lower drywell is presented, together with an assessment of the type of core/concrete attack that would occur. The evaluation of ex-vessel debris coolability, represented by an event in the Simplified Boiling Water Reactor (SBWR) Containment Event Tree (CET), and a detailed Decomposition Event Tree (DET) developed to aid in the quantification of this CET event are considered. The headings in the DET selected to represent plant physical states (e.g., reactor vessel pressure at the time of vessel failure) and the uncertainties associated with the occurrence of critical physical phenomena (e.g., debris configuration in the lower drywell) considered important to assessing whether the debris was coolable or not coolable ex-vessel are also discussed
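How a decomposition event tree quantifies such a top event can be sketched in a few lines: each path through the tree multiplies its branch probabilities, and the paths ending in the same outcome are summed. The headings and numbers below are invented for illustration, not values from the SBWR study.

```python
# Minimal event-tree quantification sketch: a path's probability is the
# product of its branch probabilities; an outcome's probability is the
# sum over all paths reaching that outcome.

def path_probability(branches):
    p = 1.0
    for prob in branches:
        p *= prob
    return p

# Hypothetical DET paths that end in "debris coolable ex-vessel",
# keyed by (vessel-pressure state, debris configuration).
coolable_paths = {
    ("low vessel pressure", "dispersed debris"):  [0.6, 0.7],
    ("high vessel pressure", "dispersed debris"): [0.4, 0.9],
}

p_coolable = sum(path_probability(b) for b in coolable_paths.values())
```

A real DET carries many more headings (one per plant state or phenomenon), but the roll-up to the CET event probability is exactly this product-and-sum structure.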

  5. Externalizing Behaviour for Analysing System Models

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, René Rydhof

    2013-01-01

    System models have recently been introduced to model organisations and evaluate their vulnerability to threats, especially insider threats. For the latter these models are very suitable, since insiders can be assumed to have more knowledge about the attacked organisation than outside attackers. Therefore, many attacks are considerably easier to perform for insiders than for outsiders. However, current models do not support explicit specification of different behaviours. Instead, behaviour is deeply embedded in the analyses supported by the models, meaning that it is a complex, if not impossible, task to change behaviours. Especially when considering social engineering or the human factor in general, the ability to use different kinds of behaviours is essential. In this work we present an approach to make the behaviour a separate component in system models, and explore how to integrate

  6. ATLAS helicity analyses in beauty hadron decays

    CERN Document Server

    Smizanska, M

    2000-01-01

    The ATLAS detector will allow a precise spatial reconstruction of the kinematics of B hadron decays. In combination with the efficient lepton identification applied already at trigger level, ATLAS is expected to provide large samples of exclusive decay channels cleanly separable from background. These data sets will allow spin-dependent analyses leading to the determination of production and decay parameters, which are not accessible if the helicity amplitudes are not separated. Measurement feasibility studies for the decays B_s^0 → J/ψ φ and Λ_b^0 → Λ J/ψ, presented in this document, show the experimental precision that can be achieved in the determination of B_s^0 and Λ_b^0 characteristics. (19 refs).

  7. Thermal hydraulic reactor safety analyses and experiments

    International Nuclear Information System (INIS)

    Holmstroem, H.; Eerikaeinen, L.; Kervinen, T.; Kilpi, K.; Mattila, L.; Miettinen, J.; Yrjoelae, V.

    1989-04-01

    The report introduces the results of the thermal hydraulic reactor safety research performed in the Nuclear Engineering Laboratory of the Technical Research Centre of Finland (VTT) during the years 1972-1987. Practical applications, i.e. analyses for the safety authorities and power companies, are also presented. The emphasis is on description of the state-of-the-art know-how. The report describes VTT's most important computer codes, both those of foreign origin and those developed at VTT, and their assessment work, VTT's own experimental research, as well as international experimental projects and other forms of cooperation VTT has participated in. Appendix 8 contains a comprehensive list of the most important publications and technical reports produced. They present the content and results of the research in detail. (orig.)

  8. Digital analyses of cartometric Fruska Gora guidelines

    Directory of Open Access Journals (Sweden)

    Živković Dragica

    2013-01-01

    Full Text Available Modern geomorphological research uses quantitative statistical and cartographic methods to analyse topographic relief features and their mutual connections on the basis of good-quality numeric parameters. Topographic features are important for many natural processes. Important morphometric characteristics include slope angle, hypsometry and topographic exposition. Small and often overlooked relief slopes can deeply affect land configuration, hypsometry, topographic exposition etc. Expositions modify light and heat and thereby a set of interconnected phenomena: soil and air temperature, soil disintegration, the length of the vegetation period, the intensity of photosynthesis, the yield of agricultural crops, the height of the snow limit etc. [Projekat Ministarstva nauke Republike Srbije, br. 176008 i br. III44006]

  9. Attitude stability analyses for small artificial satellites

    International Nuclear Information System (INIS)

    Silva, W R; Zanardi, M C; Formiga, J K S; Cabette, R E S; Stuchi, T J

    2013-01-01

    The objective of this paper is to analyze the stability of the rotational motion of a symmetrical spacecraft in a circular orbit. The equilibrium points and regions of stability are established when components of the gravity gradient torque acting on the spacecraft are included in the equations of rotational motion, which are described by the Andoyer variables. The nonlinear stability of the equilibrium points of the rotational motion is analysed here by the Kovalev-Savchenko theorem, which makes it possible to verify whether they remain stable under the influence of the higher-order terms of the normal Hamiltonian. In this paper, numerical simulations are made for a small hypothetical artificial satellite. Several stable equilibrium points were determined and regions around these points have been established by variations in the orbital inclination and in the spacecraft principal moment of inertia. The present analysis can directly contribute to the maintenance of the spacecraft's attitude

  10. Cointegration Approach to Analysing Inflation in Croatia

    Directory of Open Access Journals (Sweden)

    Lena Malešević-Perović

    2009-06-01

    Full Text Available The aim of this paper is to analyse the determinants of inflation in Croatia in the period 1994:6-2006:6. We use a cointegration approach and find that increases in wages positively influence inflation in the long-run. Furthermore, in the period from June 1994 onward, the depreciation of the currency also contributed to inflation. Money does not explain Croatian inflation. This irrelevance of the money supply is consistent with its endogeneity to exchange rate targeting, whereby the money supply is determined by developments in the foreign exchange market. The value of inflation in the previous period is also found to be significant, thus indicating some inflation inertia.
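A cointegration analysis of this kind can be illustrated with an Engle-Granger-style sketch on simulated data (not the Croatian wage and price series): two series sharing a stochastic trend are each nonstationary, yet the residual of one regressed on the other is, which is the signature of a long-run relationship.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two cointegrated series: both carry the same random-walk
# trend, so each is nonstationary, but a linear combination is not.
n = 500
trend = np.cumsum(rng.normal(size=n))           # common stochastic trend
wages = trend + rng.normal(scale=0.5, size=n)
prices = 0.8 * trend + rng.normal(scale=0.5, size=n)

# Engle-Granger step 1: OLS of prices on wages (with a constant).
X = np.column_stack([np.ones(n), wages])
beta, *_ = np.linalg.lstsq(X, prices, rcond=None)
residual = prices - X @ beta

# Step 2 (informal check): a cointegrating residual should have far
# smaller variance than the trending series themselves. A full analysis
# would apply a unit-root (ADF) test to this residual instead.
ratio = residual.var() / prices.var()
```

In practice the residual would be tested formally for stationarity, and the long-run coefficient (here about 0.8) interpreted as the equilibrium pass-through from wages to prices.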

  11. Comprehensive immunoproteogenomic analyses of malignant pleural mesothelioma.

    Science.gov (United States)

    Lee, Hyun-Sung; Jang, Hee-Jin; Choi, Jong Min; Zhang, Jun; de Rosen, Veronica Lenge; Wheeler, Thomas M; Lee, Ju-Seog; Tu, Thuydung; Jindra, Peter T; Kerman, Ronald H; Jung, Sung Yun; Kheradmand, Farrah; Sugarbaker, David J; Burt, Bryan M

    2018-04-05

    We generated a comprehensive atlas of the immunologic cellular networks within human malignant pleural mesothelioma (MPM) using mass cytometry. Data-driven analyses of these high-resolution single-cell data identified 2 distinct immunologic subtypes of MPM with vastly different cellular composition, activation states, and immunologic function; mass spectrometry demonstrated differential abundance of MHC-I and -II neopeptides directly identified between these subtypes. The clinical relevance of this immunologic subtyping was investigated with a discriminatory molecular signature derived through comparison of the proteomes and transcriptomes of these 2 immunologic MPM subtypes. This molecular signature, representative of a favorable intratumoral cell network, was independently associated with improved survival in MPM and predicted response to immune checkpoint inhibitors in patients with MPM and melanoma. These data additionally suggest a potentially novel mechanism of response to checkpoint blockade: requirement for high measured abundance of neopeptides in the presence of high expression of MHC proteins specific for these neopeptides.

  12. Deterministic analyses of severe accident issues

    International Nuclear Information System (INIS)

    Dua, S.S.; Moody, F.J.; Muralidharan, R.; Claassen, L.B.

    2004-01-01

    Severe accidents in light water reactors involve complex physical phenomena. In the past there has been a heavy reliance on simple assumptions regarding physical phenomena alongside probability methods to evaluate risks associated with severe accidents. Recently GE has developed realistic methodologies that permit deterministic evaluations of severe accident progression and of some of the associated phenomena in the case of Boiling Water Reactors (BWRs). These deterministic analyses indicate that with appropriate system modifications and operator actions, core damage can be prevented in most cases. Furthermore, in cases where core-melt is postulated, containment failure can either be prevented or significantly delayed to allow sufficient time for recovery actions to mitigate severe accidents

  13. Natural hazards in mountain areas and spatial analysis

    Directory of Open Access Journals (Sweden)

    Yannick Manche

    1999-06-01

    Full Text Available The concept of risk rests on two notions: the hazard, which represents the physical phenomenon through its magnitude and return period; and the vulnerability, which represents all the assets and persons liable to be affected by a natural phenomenon. Risk is then defined as the intersection of these two notions. This theoretical view makes it possible to model hazards and vulnerability independently. This work focuses primarily on taking vulnerability into account in the management of natural risks. Its evaluation necessarily involves some form of spatial analysis that takes into account human occupation and the different scales of land use. But spatial evaluation, whether of assets and persons or of indirect effects, runs into numerous problems. The extent of land occupation has to be estimated. Moreover, data processing involves constant changes of scale to pass from point elements to surfaces, something that geographical information systems do not handle perfectly. Risk management imposes strong planning constraints; taking vulnerability into account makes it possible to better understand and manage the spatial constraints implied by natural risks. Keywords: hazard, spatial analysis, natural risks, GIS, vulnerability

  14. Isotropy analyses of the Planck convergence map

    Science.gov (United States)

    Marques, G. A.; Novaes, C. P.; Bernui, A.; Ferreira, I. S.

    2018-01-01

    The presence of matter in the path of relic photons causes distortions in the angular pattern of the cosmic microwave background (CMB) temperature fluctuations, modifying their properties in a slight but measurable way. Recently, the Planck Collaboration released the estimated convergence map, an integrated measure of the large-scale matter distribution that produced the weak gravitational lensing (WL) phenomenon observed in Planck CMB data. We perform exhaustive analyses of this convergence map, calculating the variance in small and large regions of the sky, but excluding the area masked due to Galactic contaminations, and compare them with the features expected in the set of simulated convergence maps, also released by the Planck Collaboration. Our goal is to search for sky directions or regions where the WL imprints anomalous signatures in the variance estimator, revealed through a χ² analysis at a statistically significant level. In the local analysis of the Planck convergence map, we identified eight patches of the sky in disagreement, at more than 2σ, with what is observed in the average of the simulations. In contrast, in the large-regions analysis we found no statistically significant discrepancies, but, interestingly, the regions with the highest χ² values surround the ecliptic poles. Thus, our results show a good agreement with the features expected by the Λ cold dark matter concordance model, as given by the simulations. Yet, the outlier regions found here could suggest that the data still contain residual contamination, like noise, due to over- or underestimation of systematic effects in the simulation data set.
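The local variance estimator described above can be sketched on a toy "sky" (a 1-D pixel array standing in for sky patches; the patching scheme and thresholds are illustrative assumptions): compute the variance in each patch of the data map and compare it with the distribution found in simulated isotropic maps.

```python
import numpy as np

rng = np.random.default_rng(42)

def patch_variances(sky, n_patches):
    """Split a 1-D pixel array into patches and return each patch's variance."""
    return np.array([p.var() for p in np.array_split(sky, n_patches)])

n_pix, n_patches, n_sims = 4096, 8, 200

# Simulated convergence maps: statistically isotropic Gaussian skies.
sims = rng.normal(size=(n_sims, n_pix))
sim_vars = np.array([patch_variances(s, n_patches) for s in sims])
mean_var, std_var = sim_vars.mean(axis=0), sim_vars.std(axis=0)

# "Observed" map with one anomalous patch injected (extra power).
data = rng.normal(size=n_pix)
data[: n_pix // n_patches] *= 2.0

# Chi-square-like statistic per patch: squared deviation from the
# simulation mean, in units of the simulation scatter.
chi2 = ((patch_variances(data, n_patches) - mean_var) / std_var) ** 2
anomalous = np.where(chi2 > 4.0)[0]   # patches deviating by more than 2 sigma
```

On a real sphere the patches would be HEALPix-style caps and the simulations would carry the full lensing power spectrum, but the comparison logic is the same.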

  15. The radiation analyses of ITER lower ports

    International Nuclear Information System (INIS)

    Petrizzi, L.; Brolatti, G.; Martin, A.; Loughlin, M.; Moro, F.; Villari, R.

    2010-01-01

    The ITER Vacuum Vessel has upper, equatorial, and lower ports used for equipment installation, diagnostics, heating and current drive systems, cryo-vacuum pumping, and access inside the vessel for maintenance. At the level of the divertor, the nine lower ports for remote handling, cryo-vacuum pumping and diagnostics are inclined downwards and toroidally located every 40°. The cryopump port additionally has a branch to accommodate a second cryopump. The ports, as openings in the Vacuum Vessel, permit radiation streaming out of the vessel, which affects the heating of the components in the outer regions of the machine inside and outside the ports. Safety concerns are also raised with respect to the dose after shutdown at the cryostat behind the ports: in such zones the radiation dose level must be kept below the regulatory limit to allow personnel access for maintenance purposes. Neutronic analyses have been required to qualify the ITER project with respect to the lower ports. A 3-D model was used to take into account full details of the ports and the lower machine surroundings. MCNP5 version 1.40 has been used with the FENDL 2.1 nuclear data library. The ITER 40° model distributed by the ITER Organization was developed in the lower part to include the relevant details. The results of a first analysis, focused on the cryopump system only, were recently published. In this paper more complete data on the cryopump port and analyses for the remote handling port and the diagnostic rack are presented; the results of both analyses give a complete map of the radiation loads in the outer divertor ports. Nuclear heating, dpa, tritium production, and dose rates after shutdown are provided and the implications for the design are discussed.

  16. Application of a low energy x-ray spectrometer to analyses of suspended air particulate matter

    International Nuclear Information System (INIS)

    Giauque, R.D.; Garrett, R.B.; Goda, L.Y.; Jaklevic, J.M.; Malone, D.F.

    1975-01-01

    A semiconductor detector x-ray spectrometer has been constructed for the analysis of elements in air particulate specimens. The excitation radiation is provided, either directly or indirectly, by a low power (40 watts) Ag anode x-ray tube. Less than 100 ng of most of the elements in the range Mg - Zr, plus Pb, is easily detected within two 1-minute counting intervals. A calibration technique for light element analysis and an experimental method which compensates for particle size effects are discussed. (auth)

  17. Database-Driven Analyses of Astronomical Spectra

    Science.gov (United States)

    Cami, Jan

    2012-03-01

    Spectroscopy is one of the most powerful tools to study the physical properties and chemical composition of very diverse astrophysical environments. In principle, each nuclide has a unique set of spectral features; thus, establishing the presence of a specific material at astronomical distances requires no more than finding a laboratory spectrum of the right material that perfectly matches the astronomical observations. Once the presence of a substance is established, a careful analysis of the observational characteristics (wavelengths or frequencies, intensities, and line profiles) allows one to determine many physical parameters of the environment in which the substance resides, such as temperature, density, velocity, and so on. Because of this great diagnostic potential, ground-based and space-borne astronomical observatories often include instruments to carry out spectroscopic analyses of various celestial objects and events. Of particular interest is molecular spectroscopy at infrared wavelengths. From the spectroscopic point of view, molecules differ from atoms in their ability to vibrate and rotate, and quantum physics inevitably causes those motions to be quantized. The energies required to excite vibrations or rotations are such that vibrational transitions generally occur at infrared wavelengths, whereas pure rotational transitions typically occur at sub-mm wavelengths. Molecular vibration and rotation are coupled though, and thus at infrared wavelengths, one commonly observes a multitude of ro-vibrational transitions (see Figure 13.1). At lower spectral resolution, all transitions blend into one broad ro-vibrational molecular band. Since the transition frequencies depend on the masses of the nuclei involved, each isotopologue has its own distinct spectrum; molecular spectroscopy thus allows us to see a difference of one neutron in an atomic nucleus that is located at astronomical distances! Since the detection of the first interstellar molecules (the CH [21] and CN [14] radicals), more than 150 species have been detected in space, ranging in size from diatomic
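
    As a toy illustration of the ro-vibrational band structure described above, the rigid-rotor approximation places the R-branch and P-branch lines at ν = ν₀ + 2B(J+1) and ν = ν₀ − 2BJ around the band origin. The constants below are approximate values for the CO fundamental band, assumed here only for illustration; real analyses use fitted spectroscopic constants with centrifugal and vibration-rotation corrections.

```python
# Ro-vibrational line positions in the rigid-rotor approximation:
# R branch (J -> J+1): nu = nu0 + 2B(J+1); P branch (J -> J-1): nu = nu0 - 2BJ.
# nu0 and B below are rough values for the CO fundamental (illustrative only).

NU0 = 2143.3  # band origin, cm^-1
B = 1.93      # rotational constant, cm^-1

def r_branch(J):
    """Line position (cm^-1) of the R(J) transition, J = 0, 1, 2, ..."""
    return NU0 + 2 * B * (J + 1)

def p_branch(J):
    """Line position (cm^-1) of the P(J) transition, J = 1, 2, 3, ..."""
    return NU0 - 2 * B * J

if __name__ == "__main__":
    # The band is a ladder of lines spaced ~2B on either side of the origin.
    for J in range(3):
        print(f"R({J}) = {r_branch(J):.2f} cm^-1")
    for J in range(1, 4):
        print(f"P({J}) = {p_branch(J):.2f} cm^-1")
```

    Because B scales inversely with the reduced mass, swapping one isotope for another shifts every line, which is what makes the one-neutron difference observable.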

  18. High performance liquid chromatography in pharmaceutical analyses

    Directory of Open Access Journals (Sweden)

    Branko Nikolin

    2004-05-01

    Full Text Available In the testing, pre-sale procedures, marketing and control of drugs over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; today, however, it has nearly completely replaced gas chromatography in pharmaceutical analysis. The use of a liquid mobile phase, with the possibility of modifying its polarity during chromatography and all other modifications of the mobile phase depending upon the characteristics of the substance being tested, is a great advantage of the separation process in comparison to other methods. The greater choice of stationary phases is the next factor enabling good separation. The separation column is connected to specific and sensitive detector systems (spectrofluorimeter, diode-array detector, electrochemical detector) as well as hyphenated systems such as HPLC-MS and HPLC-NMR; these are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm its identity, provide quantitative results, and monitor the progress of therapy of a disease.1 The measurement presented in Fig. 1 is a chromatogram obtained for the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during investigations before drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or

  19. High performance liquid chromatography in pharmaceutical analyses.

    Science.gov (United States)

    Nikolin, Branko; Imamović, Belma; Medanhodzić-Vuk, Saira; Sober, Miroslav

    2004-05-01

    In the testing, pre-sale procedures, marketing and control of drugs over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; today, however, it has nearly completely replaced gas chromatography in pharmaceutical analysis. The use of a liquid mobile phase, with the possibility of modifying its polarity during chromatography and all other modifications of the mobile phase depending upon the characteristics of the substance being tested, is a great advantage of the separation process in comparison to other methods. The greater choice of stationary phases is the next factor enabling good separation. The separation column is connected to specific and sensitive detector systems (spectrofluorimeter, diode-array detector, electrochemical detector) as well as hyphenated systems such as HPLC-MS and HPLC-NMR; these are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm its identity, provide quantitative results, and monitor the progress of therapy of a disease.1) The measurement presented in Fig. 1 is a chromatogram obtained for the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during investigations before drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or serum contains numerous endogenous

  20. Uncertainty Analyses for Back Projection Methods

    Science.gov (United States)

    Zeng, H.; Wei, S.; Wu, W.

    2017-12-01

    So far, few comprehensive error analyses for back projection methods have been conducted, although it is evident that high frequency seismic waves can be easily affected by earthquake depth, focal mechanisms and the Earth's 3D structures. Here we perform 1D and 3D synthetic tests for two back projection methods, MUltiple SIgnal Classification (MUSIC) (Meng et al., 2011) and Compressive Sensing (CS) (Yao et al., 2011). We generate synthetics for both point sources and finite rupture sources with different depths and focal mechanisms, as well as 1D and 3D structures in the source region. The 3D synthetics are generated through a hybrid scheme combining the Direct Solution Method and the Spectral Element Method. We then back project the synthetic data using MUSIC and CS. The synthetic tests show that depth phases can be back projected as artificial sources both in space and time. For instance, for a source depth of 10 km, back projection gives a strong signal 8 km away from the true source. Such bias increases with depth; e.g., the error in horizontal location can exceed 20 km for a depth of 40 km. If the array is located around the nodal direction of direct P-waves, the teleseismic P-waves are dominated by the depth phases. In that case, back projections are actually imaging the reflection points of depth phases rather than the rupture front. Besides depth phases, the strong and long-lasting coda waves caused by 3D effects near the trench can introduce additional complexities into the images. The strength contrast of different frequency contents in the rupture models also produces some variations in the back projection results. In the synthetic tests, MUSIC and CS derive consistent results. While MUSIC is more computationally efficient, CS works better for sparse arrays. In summary, our analyses indicate that the impact of the various factors mentioned above should be taken into consideration when interpreting back projection images, before we can use them to infer earthquake rupture physics.
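
    The shift-and-stack core that MUSIC and CS both build on can be sketched with a purely synthetic 1D example; the geometry, velocity, and source position below are made-up values, and neither MUSIC's eigen-analysis nor CS's sparse inversion is implemented, only the travel-time alignment they share.

```python
import math

# Minimal shift-and-stack back projection in one dimension: align each trace
# on the travel times of a trial source, stack, and pick the trial with the
# highest stacked power. All numbers are synthetic assumptions.

V = 6.0        # wave speed, km/s
DT = 0.05      # sample interval, s
N = 800        # samples per trace (40 s records)
STATIONS = [-120.0, -60.0, 40.0, 90.0, 150.0]   # station positions, km

def pulse(t, t0, width=0.3):
    """Gaussian pulse centred on the arrival time t0."""
    return math.exp(-((t - t0) / width) ** 2)

def make_traces(src):
    """Synthetic traces for a point source at position src (km)."""
    return [[pulse(i * DT, abs(x - src) / V) for i in range(N)]
            for x in STATIONS]

def stack_power(traces, trial):
    """Peak power of the traces aligned on the trial source's travel times."""
    shifts = [int(round(abs(x - trial) / V / DT)) for x in STATIONS]
    best = 0.0
    for i in range(N):
        s = 0.0
        for sh, tr in zip(shifts, traces):
            if i + sh < N:
                s += tr[i + sh]
        best = max(best, s * s)
    return best

def back_project(traces, grid):
    """Trial source position with the highest stacked power."""
    return max(grid, key=lambda g: stack_power(traces, g))

if __name__ == "__main__":
    traces = make_traces(10.0)
    grid = [float(g) for g in range(-20, 41, 2)]
    print("recovered source position:", back_project(traces, grid), "km")
```

    Adding a delayed copy of each pulse (a depth phase) to these traces reproduces the artificial secondary source the abstract describes.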

  1. Electrochemical analyses of diffusion behaviors and nucleation mechanisms for neodymium complexes in [DEME][TFSA] ionic liquid

    International Nuclear Information System (INIS)

    MATSUMIYA, Masahiko; ISHII, Mai; KAZAMA, Ryo; KAWAKAMI, Satoshi

    2014-01-01

    Highlights: • The cathodic reaction Nd(III) + 3e⁻ → Nd(0) was observed at −3.30 V in [DEME][TFSA]. • The diffusion coefficient of Nd(III) in [DEME][TFSA] was evaluated from semi-integral analysis. • The nucleation mechanism of Nd nuclei changed from instantaneous to progressive nucleation. • The number density of Nd nuclei increased as the overpotential was increased. • The electrodeposits from [DEME][TFSA] were identified as Nd metal and oxide mixtures by XPS. - ABSTRACT: The electrochemical and nucleation behavior of Nd(III) in the ammonium-based ionic liquid (IL) N,N-diethyl-N-methyl-N-(2-methoxyethyl)ammonium bis(trifluoromethylsulfonyl)amide, [DEME][TFSA], was investigated in this study. The cathodic reaction of Nd(III) [Nd(III) + 3e⁻ → Nd(0)] was observed at −3.30 V vs. Ag/Ag(I) using cyclic voltammetry at 353 K. The diffusion coefficient of Nd(III) was estimated to be 1.35 ± 0.10 × 10⁻¹³ m² s⁻¹ at 353 K using semi-integral and semi-differential analyses. The initial process of Nd electrodeposition was also evaluated by chronoamperometry, indicating that the initial nucleation and growth of Nd on the Pt electrode occurred via instantaneous nucleation at −3.40 V. As the applied potential became more negative, the mechanism changed from instantaneous to progressive nucleation. The number density of Nd nuclei in the initial stage of nucleation decreased as the overpotential increased. Furthermore, the electrodeposition of Nd was carried out at −3.40 V and −3.60 V at 353 K. SEM observations of the electrodeposits were consistent with the series of results obtained by chronoamperometry. The electrodeposits consisted mainly of Nd metal and oxide mixtures, whereas bonding with the light elements (C, F, and S) of the IL was suppressed, as demonstrated by EDX and XPS. The results suggest that sufficient dehydration and control of the water content of the electrolyte are important factors
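
    The instantaneous versus progressive distinction drawn above is conventionally diagnosed by comparing chronoamperometric transients against the Scharifker-Hills dimensionless curves; the two theoretical limits are sketched below. This is a standard diagnostic assumed here for illustration, since the abstract does not name the authors' exact formulation.

```python
import math

# Scharifker-Hills dimensionless current transients for 3D diffusion-limited
# nucleation. Experimental (I/Im)^2 vs t/tm data are compared with these two
# theoretical limits to classify the nucleation mechanism.

def instantaneous(x):
    """(I/Im)^2 for instantaneous nucleation, x = t/tm."""
    return 1.9542 / x * (1.0 - math.exp(-1.2564 * x)) ** 2

def progressive(x):
    """(I/Im)^2 for progressive nucleation, x = t/tm."""
    return 1.2254 / x * (1.0 - math.exp(-2.3367 * x * x)) ** 2

if __name__ == "__main__":
    # Both curves are normalized to pass through (1, 1); at short times the
    # progressive limit lies below the instantaneous one.
    for x in (0.5, 1.0, 2.0):
        print(f"t/tm={x}: inst={instantaneous(x):.3f}  prog={progressive(x):.3f}")
```

    Plotting the measured transient at −3.40 V and −3.60 V in these coordinates is how the shift from the instantaneous to the progressive limit would show up.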

  2. Multichannel amplitude analyser for nuclear spectrometry

    International Nuclear Information System (INIS)

    Jankovic, S.; Milovanovic, B.

    2003-01-01

    A multichannel amplitude analyser with 4096 channels was designed. It is based on a fast 12-bit analog-to-digital converter. The intended purpose of the instrument is recording nuclear spectra by means of scintillation detectors. The computer link is established through an opto-isolated serial connection cable, thus reducing the instrument's sensitivity to disturbances originating from digital circuitry. The data displayed on the screen are refreshed every 2.5 seconds. Pulse peak detection is implemented through differentiation of the amplified input signal, while synchronization with the data coming from the converter output is established by taking advantage of the internal 'pipeline' structure of the converter itself. The mode of operation of the built-in microcontroller ensures that no pulses are missed, and a simple logic network prevents the initiation of the amplitude reading sequence for the next pulse if it arrives shortly after the preceding one. The solution proposed here demonstrated good performance at a comparatively low manufacturing cost, and is thus suitable for educational purposes (author)
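
    The peak-detection scheme described above (differentiate the signal, look for a positive-to-negative zero crossing, then bin the amplitude) can be sketched in software. The 4096-channel count matches the instrument, but the threshold and full-scale voltage below are assumptions for illustration.

```python
# Software sketch of pulse-height analysis by differentiation: a + to - sign
# change of the first difference marks a pulse peak, whose amplitude is binned
# into one of 4096 channels. Threshold and full-scale values are assumed.

N_CHANNELS = 4096
FULL_SCALE = 10.0   # assumed full-scale pulse amplitude, volts
THRESHOLD = 0.05    # assumed noise threshold, volts

def analyse(samples):
    """Return a 4096-channel histogram of detected pulse amplitudes."""
    spectrum = [0] * N_CHANNELS
    deriv = [b - a for a, b in zip(samples, samples[1:])]
    for i in range(1, len(deriv)):
        # derivative changes sign from + to -: a local maximum (pulse peak)
        if deriv[i - 1] > 0 >= deriv[i] and samples[i] > THRESHOLD:
            channel = min(int(samples[i] / FULL_SCALE * N_CHANNELS), N_CHANNELS - 1)
            spectrum[channel] += 1
    return spectrum

if __name__ == "__main__":
    # two triangular pulses with peak amplitudes of 2.5 V and 7.5 V
    signal = ([0.0] * 5 + [0.5, 1.5, 2.5, 1.5, 0.5] + [0.0] * 5
              + [2.5, 5.0, 7.5, 5.0, 2.5] + [0.0] * 5)
    hits = [(ch, n) for ch, n in enumerate(analyse(signal)) if n]
    print(hits)
```

    In the hardware the same logic runs continuously, with the pipeline latency of the ADC compensated when pairing each detected peak with its digitized amplitude.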

  3. Scleral topography analysed by optical coherence tomography.

    Science.gov (United States)

    Bandlitz, Stefan; Bäumer, Joachim; Conrad, Uwe; Wolffsohn, James

    2017-08-01

    A detailed evaluation of the corneo-scleral profile (CSP) is of particular relevance in soft and scleral lens fitting. The aim of this study was to use optical coherence tomography (OCT) to analyse the profile of the limbal sclera and to evaluate the relationship between central corneal radii, corneal eccentricity and scleral radii. Using OCT (Optos OCT/SLO; Dunfermline, Scotland, UK) the limbal scleral radii (SR) of 30 subjects (11M, 19F; mean age 23.8 ± 2.0 SD years) were measured in eight meridians 45° apart. Central corneal radii (CR) and corneal eccentricity (CE) were evaluated using the Oculus Keratograph 4 (Oculus, Wetzlar, Germany). Differences between SR in the meridians and the associations between SR and corneal topography were assessed. Median SR measured along 45° (58.0 mm; interquartile range, 46.8-84.8 mm) was significantly different from the radii measured in the other meridians. SR measurements complement corneal topography and may provide additional data useful in fitting soft and scleral contact lenses. Copyright © 2017 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  4. Bayesian analyses of seasonal runoff forecasts

    Science.gov (United States)

    Krzysztofowicz, R.; Reese, S.

    1991-12-01

    Forecasts of seasonal snowmelt runoff volume provide indispensable information for rational decision making by water project operators, irrigation district managers, and farmers in the western United States. Bayesian statistical models and communication frames have been researched in order to enhance the forecast information disseminated to the users, and to characterize forecast skill from the decision maker's point of view. Four products are presented: (i) a Bayesian Processor of Forecasts, which provides a statistical filter for calibrating the forecasts, and a procedure for estimating the posterior probability distribution of the seasonal runoff; (ii) the Bayesian Correlation Score, a new measure of forecast skill, which is related monotonically to the ex ante economic value of forecasts for decision making; (iii) a statistical predictor of monthly cumulative runoffs within the snowmelt season, conditional on the total seasonal runoff forecast; and (iv) a framing of the forecast message that conveys the uncertainty associated with the forecast estimates to the users. All analyses are illustrated with numerical examples of forecasts for six gauging stations from the period 1971-1988.
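
    The calibration idea behind a Bayesian Processor of Forecasts can be sketched with a conjugate normal-normal model: a climatological prior on the seasonal runoff is updated by a forecast treated as a noisy linear observation. The actual BPF uses distributions fitted to the historical record, so every number here is a hypothetical placeholder.

```python
# Normal-normal sketch of forecast calibration: runoff W ~ N(prior_mean,
# prior_var) a priori, and the forecast is modelled as F = a*W + b + eps with
# eps ~ N(0, noise_var). The posterior of W given F is again normal.

def bpf_posterior(prior_mean, prior_var, a, b, noise_var, forecast):
    """Posterior N(mean, var) of runoff W given forecast F = a*W + b + eps."""
    # standard conjugate update for a linear-Gaussian observation model
    post_var = 1.0 / (1.0 / prior_var + a * a / noise_var)
    post_mean = post_var * (prior_mean / prior_var + a * (forecast - b) / noise_var)
    return post_mean, post_var

if __name__ == "__main__":
    # climatology: mean 500, sd 100 (hypothetical units); forecast of 650
    m, v = bpf_posterior(prior_mean=500.0, prior_var=100.0 ** 2,
                         a=1.0, b=0.0, noise_var=60.0 ** 2, forecast=650.0)
    print(f"posterior mean {m:.1f}, sd {v ** 0.5:.1f}")
```

    The posterior mean lands between climatology and the raw forecast, and the posterior variance is smaller than either input variance, which is precisely the "statistical filter" role described in product (i).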

  5. Analyses of demand response in Denmark

    International Nuclear Information System (INIS)

    Moeller Andersen, F.; Grenaa Jensen, S.; Larsen, Helge V.; Meibom, P.; Ravn, H.; Skytte, K.; Togeby, M.

    2006-10-01

    Due to characteristics of the power system, costs of producing electricity vary considerably over short time intervals. Yet, many consumers do not experience corresponding variations in the price they pay for consuming electricity. The topic of this report is: are consumers willing and able to respond to short-term variations in electricity prices, and if so, what is the social benefit of consumers doing so? Taking Denmark and the Nord Pool market as a case, the report focuses on what is known as short-term consumer flexibility or demand response in the electricity market. With focus on market efficiency, efficient allocation of resources and security of supply, the report describes demand response from a micro-economic perspective and provides empirical observations and case studies. The report aims at evaluating benefits from demand response. However, only elements contributing to an overall value are presented. In addition, the analyses are limited to benefits for society, and costs of obtaining demand response are not considered. (au)

  6. WIND SPEED AND ENERGY POTENTIAL ANALYSES

    Directory of Open Access Journals (Sweden)

    A. TOKGÖZLÜ

    2013-01-01

    Full Text Available This paper provides a case study on the application of wavelet techniques to analyze wind speed and energy (renewable and environmentally friendly energy). Solar and wind are the main sources of energy that allow farmers to harness the kinetic energy captured by a windmill for pumping water, drying crops, heating greenhouses, rural electrification, or cooking. Larger wind turbines (over 1 MW) can pump enough water for small-scale irrigation. This study initiated a data-gathering process for wavelet analyses of different scale effects and their role in wind speed and direction variations. The wind data gathering system is mounted at latitude 37° 50′ N, longitude 30° 33′ E, at a height of 1200 m above mean sea level on a hill near the Süleyman Demirel University campus. Ten-minute average values of wind speed and direction at two levels (10 m and 30 m above ground level) were recorded by a data logger between July 2001 and February 2002. Wind speed values ranged between 0 m/s and 54 m/s. The annual mean speed value is 4.5 m/s at the 10 m level. Prevalent wind
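
    A single level of the Haar transform, the simplest wavelet such a multi-scale analysis could build on (the abstract does not state which wavelet family the authors used, so this is an illustrative assumption), separates a wind-speed series into a slow trend and fast fluctuations; the speed values below are invented.

```python
# One level of the Haar wavelet transform: pairwise averages capture the slow
# trend (approximation), pairwise differences the fast fluctuations (detail).
# Repeating the step on the approximation gives coarser and coarser scales.

def haar_step(x):
    """Split a series of even length into (approximation, detail) coefficients."""
    approx = [(x[2 * i] + x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Reconstruct the original series from one decomposition level."""
    x = []
    for a, d in zip(approx, detail):
        x.extend([a + d, a - d])
    return x

if __name__ == "__main__":
    wind = [4.2, 5.1, 3.8, 4.4, 6.0, 5.5, 4.9, 5.3]  # invented 10-min means, m/s
    a, d = haar_step(wind)
    print("trend:      ", a)
    print("fluctuation:", d)
```

    Applied recursively to ten-minute averages, the approximation coefficients at successive levels correspond to 20-minute, 40-minute, etc. scales, which is the scale separation the study is after.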

  7. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    S. Tsai

    2005-01-12

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations arise primarily from radionuclides released by opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling Facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations arise from resuspension of waste package surface contamination and from neutron activation of ventilated air and of silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2.

  8. Soil deflation analyses from wind erosion events

    Directory of Open Access Journals (Sweden)

    Lenka Lackóová

    2015-09-01

    Full Text Available There are various methods to assess soil erodibility for wind erosion. This paper focuses on aggregate analysis by a laser particle sizer, the ANALYSETTE 22 (FRITSCH GmbH), used to determine the size distribution of soil particles detached by wind (deflated particles). Ten soil samples, trapped along the same length of the erosion surface (150–155 m) but at different wind speeds, were analysed. The soil was sampled from a flat, smooth area without vegetation cover or soil crust, not affected by the impact of windbreaks or other barriers, from a depth of at most 2.5 cm. Prior to analysis the samples were prepared according to the relevant specifications. An experiment was also conducted using a device that enables characterisation of the vertical movement of the deflated material. The trapped samples showed no differences in particle size and the proportions of size fractions at different hourly average wind speeds. It was observed that most of the particles travelling in saltation mode (size 50–500 μm), 58–70%, moved vertically up to 26 cm above the soil surface. At greater heights, particles moving in suspension mode (floating in the air; size < 100 μm) accounted for up to 90% of the samples. This result suggests that the boundary between the two modes of vertical movement of deflated soil particles lies at about 25 cm above the soil surface.
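
    The size and height boundaries reported above suggest a simple rule-of-thumb classifier for the transport mode of a trapped particle. The thresholds follow the figures in the abstract, but the tie-breaking logic for the overlapping 50–100 μm band is an assumption for illustration.

```python
# Rule-of-thumb transport-mode classifier based on the boundaries reported in
# the abstract: saltation roughly 50-500 um and confined below ~25 cm,
# suspension < ~100 um and found higher up. The overlap band (50-100 um) is
# resolved here by sampling height, which is an illustrative assumption.

SALTATION_CEILING_CM = 25.0   # approximate boundary height from the abstract

def transport_mode(size_um, height_cm):
    """Classify a deflated particle by size (um) and sampling height (cm)."""
    if size_um < 50.0:
        return "suspension"
    if size_um > 500.0:
        return "creep/other"   # too heavy for saltation or suspension
    if size_um <= 100.0 and height_cm > SALTATION_CEILING_CM:
        return "suspension"
    return "saltation"

if __name__ == "__main__":
    for size, height in [(30, 60), (80, 10), (80, 40), (300, 15)]:
        print(f"{size} um at {height} cm -> {transport_mode(size, height)}")
```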

  9. Genomic analyses of modern dog breeds.

    Science.gov (United States)

    Parker, Heidi G

    2012-02-01

    A rose may be a rose by any other name, but when you call a dog a poodle it becomes a very different animal than if you call it a bulldog. Both the poodle and the bulldog are examples of dog breeds of which there are >400 recognized worldwide. Breed creation has played a significant role in shaping the modern dog from the length of his leg to the cadence of his bark. The selection and line-breeding required to maintain a breed has also reshaped the genome of the dog, resulting in a unique genetic pattern for each breed. The breed-based population structure combined with extensive morphologic variation and shared human environments have made the dog a popular model for mapping both simple and complex traits and diseases. In order to obtain the most benefit from the dog as a genetic system, it is necessary to understand the effect structured breeding has had on the genome of the species. That is best achieved by looking at genomic analyses of the breeds, their histories, and their relationships to each other.

  10. Interim Basis for PCB Sampling and Analyses

    International Nuclear Information System (INIS)

    BANNING, D.L.

    2001-01-01

    This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double-shell tank (DST) waste as Toxic Substances Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61(c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to ensure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency's EPA QA/G-4, Guidance for the Data Quality Objectives Process (EPA 1994), and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842/Rev.1 A, Vol. IV, Section 4.16 (Banning 1999)

  11. Achieving reasonable conservatism in nuclear safety analyses

    International Nuclear Information System (INIS)

    Jamali, Kamiar

    2015-01-01

    In the absence of methods that explicitly account for uncertainties, seeking reasonable conservatism in nuclear safety analyses can quickly lead to extreme conservatism. The rate of divergence to extreme conservatism is often beyond the expert analysts’ intuitive feeling, but can be demonstrated mathematically. Too much conservatism in addressing the safety of nuclear facilities is not beneficial to society. Using certain properties of lognormal distributions for representation of input parameter uncertainties, example calculations for the risk and consequence of a fictitious facility accident scenario are presented. Results show that there are large differences between the calculated 95th percentiles and the extreme bounding values derived from using all input variables at their upper-bound estimates. Showing the relationship of the mean values to the key parameters of the output distributions, the paper concludes that the mean is the ideal candidate for representation of the value of an uncertain parameter. The mean value is proposed as the metric that is consistent with the concept of reasonable conservatism in nuclear safety analysis, because its value increases towards higher percentiles of the underlying positively skewed distribution with increasing levels of uncertainty. Insensitivity of the results to the actual underlying distributions is briefly demonstrated. - Highlights: • Multiple conservative assumptions can quickly diverge into extreme conservatism. • Mathematics and attractive properties provide basis for wide use of lognormal distribution. • Mean values are ideal candidates for representation of parameter uncertainties. • Mean values are proposed as reasonably conservative estimates of parameter uncertainties
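
    The abstract's central observation can be made concrete: for a lognormal distribution with log-scale parameters μ and σ, the mean equals exp(μ + σ²/2), which sits at the Φ(σ/2) percentile of the distribution, independent of μ. The mean therefore automatically moves toward the 95th percentile as the uncertainty σ grows, which is the "reasonably conservative" behavior the paper argues for. A small check of this property:

```python
import math

# Percentile at which the lognormal mean falls. For ln X ~ N(mu, sigma^2),
# mean = exp(mu + sigma^2/2) and P(X <= mean) = Phi(sigma/2): the mean climbs
# toward higher percentiles as the log-scale uncertainty sigma increases.

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def mean_percentile(sigma):
    """Percentile of the lognormal distribution at which its mean falls."""
    # P(X <= mean) = Phi((ln(mean) - mu)/sigma) = Phi(sigma/2), for any mu
    return normal_cdf(sigma / 2.0)

if __name__ == "__main__":
    for sigma in (0.5, 1.0, 2.0, 3.29):
        print(f"sigma = {sigma}: mean falls at the "
              f"{100 * mean_percentile(sigma):.1f}th percentile")
```

    At σ = 3.29 the mean coincides with the 95th percentile, illustrating why the mean tracks the upper tail under large uncertainty without diverging to the extreme bounding values.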

  12. CFD analyses of coolant channel flowfields

    Science.gov (United States)

    Yagley, Jennifer A.; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

    The flowfield characteristics in rocket engine coolant channels are analyzed by means of a numerical model. The channels are characterized by large length to diameter ratios, high Reynolds numbers, and asymmetrical heating. At representative flow conditions, the channel length is approximately twice the hydraulic entrance length so that fully developed conditions would be reached for a constant property fluid. For the supercritical hydrogen that is used as the coolant, the strong property variations create significant secondary flows in the cross-plane which have a major influence on the flow and the resulting heat transfer. Comparison of constant and variable property solutions show substantial differences. In addition, the property variations prevent fully developed flow. The density variation accelerates the fluid in the channels increasing the pressure drop without an accompanying increase in heat flux. Analyses of the inlet configuration suggest that side entry from a manifold can affect the development of the velocity profile because of vortices generated as the flow enters the channel. Current work is focused on studying the effects of channel bifurcation on the flow field and the heat transfer characteristics.

  13. Fast and accurate methods for phylogenomic analyses

    Directory of Open Access Journals (Sweden)

    Warnow Tandy

    2011-10-01

    Full Text Available Abstract Background Species phylogenies are not estimated directly, but rather through phylogenetic analyses of different gene datasets. However, true gene trees can differ from the true species tree (and hence from one another) due to biological processes such as horizontal gene transfer, incomplete lineage sorting, and gene duplication and loss, so that no single gene tree is a reliable estimate of the species tree. Several methods have been developed to estimate species trees from estimated gene trees, differing according to the specific algorithmic technique used and the biological model used to explain differences between species and gene trees. Relatively little is known about the relative performance of these methods. Results We report on a study evaluating several different methods for estimating species trees from sequence datasets, simulating sequence evolution under a complex model including indels (insertions and deletions), substitutions, and incomplete lineage sorting. The most important finding of our study is that some fast and simple methods are nearly as accurate as the most accurate methods, which employ sophisticated statistical methods and are computationally quite intensive. We also observe that methods that explicitly consider errors in the estimated gene trees produce more accurate trees than methods that assume the estimated gene trees are correct. Conclusions Our study shows that highly accurate estimations of species trees are achievable, even when gene trees differ from each other and from the species tree, and that these estimations can be obtained using fairly simple and computationally tractable methods.
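
    A toy version of the species-tree-from-gene-trees problem for three taxa shows the core idea behind the fast, simple methods the study evaluates: take the grouping supported by the majority of gene trees. Real coalescent-aware methods generalize this over triplets or quartets of many taxa; the gene-tree counts below are invented.

```python
from collections import Counter

# Majority-vote species tree estimation for three taxa. Each rooted gene tree
# on {A, B, C} groups exactly one pair together (its "cherry"); under the
# multispecies coalescent the most frequent cherry matches the species tree.

def species_tree(gene_trees):
    """gene_trees: iterable of 2-element sets naming each tree's cherry."""
    counts = Counter(frozenset(t) for t in gene_trees)
    pair, _ = counts.most_common(1)[0]
    return pair

if __name__ == "__main__":
    # incomplete lineage sorting lets a minority of gene trees disagree
    genes = [{"A", "B"}] * 6 + [{"A", "C"}] * 2 + [{"B", "C"}] * 2
    print("inferred cherry:", sorted(species_tree(genes)))
```

    Note that the two discordant topologies appear at equal frequency, as the coalescent model predicts; methods that also model errors in the estimated gene trees weight this vote instead of counting it raw.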

  14. Mediation Analyses in the Real World

    DEFF Research Database (Denmark)

    Lange, Theis; Starkopf, Liis

    2016-01-01

    The paper by Nguyen et al.1 published in this issue of Epidemiology presents a comparison of the recently suggested inverse odds ratio approach for addressing mediation and a more conventional Baron and Kenny-inspired method. Interestingly, the comparison is not done through a discussion of … it simultaneously ensures that the comparison is based on properties which matter in actual applications, and makes the comparison accessible to a broader audience. In a wider context, the choice to stay close to real-life problems mirrors a general trend within the literature on mediation analysis, namely to put … applications using the inverse odds ratio approach, as it simply has not had enough time to move from theoretical concept to published applied paper; we do expect to be able to judge the willingness of authors and journals to employ the causal inference-based approach to mediation analyses. Our hope …

  15. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
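
    The accumulation effect described above is easy to reproduce with the standard library alone: rounding every intermediate result through IEEE-754 single precision makes a long sum drift away from its double-precision counterpart, just as small per-call differences in library math functions compound along a long pipeline.

```python
import struct

# The same sum computed in double precision and in simulated float32
# (rounding every intermediate value through a 4-byte IEEE-754 float).
# Individual rounding errors are tiny, but they accumulate over many
# operations, mirroring how pipelines diverge across operating systems.

def to_f32(x):
    """Round a Python float to the nearest IEEE-754 single-precision value."""
    return struct.unpack("f", struct.pack("f", x))[0]

def sum_f64(values):
    total = 0.0
    for v in values:
        total += v
    return total

def sum_f32(values):
    total = 0.0
    for v in values:
        total = to_f32(total + to_f32(v))
    return total

if __name__ == "__main__":
    data = [0.1] * 100_000
    print("float64 sum:", sum_f64(data))
    print("float32 sum:", sum_f32(data))
```

    This is why the authors suggest moving critical pipeline sections to more precise floating-point representations: the divergence is a property of the arithmetic, not of any single faulty library.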

  16. Activation analyses for different fusion structural alloys

    International Nuclear Information System (INIS)

    Attaya, H.; Smith, D.

    1991-01-01

    The leading candidate structural materials, viz., the vanadium alloys, the nickel- or manganese-stabilized austenitic steels, and the ferritic steels, are analysed in terms of their induced activation in the TPSS fusion power reactor. The TPSS reactor has 1950 MW fusion power and inboard and outboard average neutron wall loadings of 3.75 and 5.35 MW/m², respectively. The results show that, after one year of continuous operation, the vanadium alloys have the least radioactivity at reactor shutdown. The maximum difference between the induced radioactivity in the vanadium alloys and in the other iron-based alloys occurs at about 10 years after reactor shutdown. At this time, the total reactor radioactivity using the vanadium alloys is about two orders of magnitude less than the total reactor radioactivity utilizing any other alloy. The difference is even larger in the first wall: the vanadium first-wall activation is three orders of magnitude less than that of the other alloys. 2 refs., 7 figs

  17. Statistical analyses of extreme food habits

    International Nuclear Information System (INIS)

    Breuninger, M.; Neuhaeuser-Berthold, M.

    2000-01-01

    This report summarizes the results of the project ''Statistical analyses of extreme food habits'', commissioned by the National Office for Radiation Protection as a contribution to the amendment of the ''General Administrative Regulation to paragraph 45 of the Decree on Radiation Protection: determination of the radiation exposition by emission of radioactive substances from facilities of nuclear technology''. Its aim is to determine whether the calculation of the radiation dose ingested with food by 95% of the population, as planned in a provisional draft, overestimates the true exposure and, if so, by how much. The existence of this overestimation could be proven, but its magnitude could only be estimated roughly. To identify its real extent, it is necessary to include the specific activities of the nuclides, which were not available for this investigation. In addition, the report shows how the amounts consumed from different groups of foods influence each other and which connections between these amounts should be taken into account in order to estimate the radiation exposition as precisely as possible. (orig.)

  18. Evaluation of the Olympus AU-510 analyser.

    Science.gov (United States)

    Farré, C; Velasco, J; Ramón, F

    1991-01-01

    The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comision de Instrumentacion de la Sociedad Española de Quimica Clinica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units, followed by an evaluation under routine work conditions. The operational characteristics of the system were also studied. The first stage included a photometric study: depending on the absorbance, the inaccuracy varied from +0.5% to -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranged between -0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from a very low absorbance for NADH at 340 nm, and the imprecision of the serum sample pipetter was satisfactory. Twelve serum analytes were studied under routine conditions: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron, and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carryover in a study of the serum sample pipetter. Carry-over studies with the reagent and sample pipetters showed some cross contamination in the iron assay.
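The within-run imprecision figures above are coefficients of variation, CV% = 100 × (sample standard deviation / mean) over replicate measurements of the same sample. A short sketch of the calculation (the replicate values are invented for illustration, not taken from the evaluation):

```python
import math

def cv_percent(values):
    """Coefficient of variation (CV%): 100 * sample standard deviation / mean."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return 100.0 * sd / mean

# Hypothetical within-run iron replicates (umol/L)
replicates = [18.2, 18.9, 17.6, 18.4, 18.0]
cv = cv_percent(replicates)  # roughly 2.6%, comparable to the iron CV reported
```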

  19. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    International Nuclear Information System (INIS)

    S. Tsai

    2005-01-01

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling Facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2

  20. Genomic analyses of the CAM plant pineapple.

    Science.gov (United States)

    Zhang, Jisen; Liu, Juan; Ming, Ray

    2014-07-01

    The innovation of crassulacean acid metabolism (CAM) photosynthesis in arid and/or low CO2 conditions is a remarkable case of adaptation in flowering plants. As the most important crop that utilizes CAM photosynthesis, the genetic and genomic resources of pineapple have been developed over many years. Genetic diversity studies using various types of DNA markers led to the reclassification of the two genera Ananas and Pseudananas and nine species into one genus Ananas and two species, A. comosus and A. macrodontes, with five botanical varieties in A. comosus. Five genetic maps have been constructed using F1 or F2 populations, and high-density genetic maps generated by genotype sequencing are essential resources for sequencing and assembling the pineapple genome and for marker-assisted selection. There are abundant expressed sequence tag resources but limited genomic sequences in pineapple. Genes involved in the CAM pathway have been analysed in several CAM plants, but only a few of them are from pineapple. A reference genome of pineapple is being generated and will accelerate genetic and genomic research in this major CAM crop. This reference genome provides the foundation for studying the origin and regulatory mechanism of CAM photosynthesis, and the opportunity to evaluate the classification of Ananas species and botanical cultivars.

  1. Social Media Analyses for Social Measurement

    Science.gov (United States)

    Schober, Michael F.; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G.

    2016-01-01

    Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or “found” social media content. But just how trustworthy such measurement can be—say, to replace official statistics—is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys. PMID:27257310

  2. Reliability Analyses of Groundwater Pollutant Transport

    Energy Technology Data Exchange (ETDEWEB)

    Dimakis, Panagiotis

    1997-12-31

    This thesis develops a probabilistic finite element model for the analysis of groundwater pollution problems. Two computer codes were developed: (1) a finite element code that solves the two-dimensional steady state equations of groundwater flow and pollution transport, and (2) a first order reliability method code that can perform a probabilistic analysis of any given analytical or numerical equation. The two codes were combined into one model, PAGAP (Probability Analysis of Groundwater And Pollution). PAGAP can be used to obtain (1) the probability that the concentration at a given point at a given time will exceed a specified value, (2) the probability that the maximum concentration at a given point will exceed a specified value, and (3) the probability that the residence time at a given point will exceed a specified period. PAGAP can be used as a tool for assessment purposes and risk analyses, for instance assessing the efficiency of a proposed remediation technique or studying the effects of parameter distributions for a given problem (sensitivity study). The model has been applied to study the largest self-sustained, precipitation-controlled aquifer in northern Europe, which underlies Oslo's new major airport. 92 refs., 187 figs., 26 tabs.
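PAGAP couples a finite element solver to a first-order reliability method. As an illustration of the kind of exceedance probability it computes, here is a cruder Monte Carlo sketch over a toy one-dimensional transport model; the model form, parameter distribution, and threshold are all hypothetical and not taken from PAGAP:

```python
import math
import random

def concentration(K, t=1.0, x=10.0, C0=100.0):
    """Toy transport model: concentration at distance x after time t,
    given an uncertain transport parameter K (hypothetical, not PAGAP's)."""
    return C0 * math.exp(-x / (K * t))

random.seed(42)
threshold = 5.0
trials = 100_000
# Draw K from a lognormal distribution to represent parameter uncertainty,
# then count how often the predicted concentration exceeds the threshold
hits = sum(concentration(random.lognormvariate(1.0, 0.5)) > threshold
           for _ in range(trials))
p_exceed = hits / trials  # estimate of P(concentration > threshold)
```

A reliability method like FORM approximates the same probability analytically from the gradient of the limit-state function, which is far cheaper than sampling when each model evaluation requires a full finite element solve.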

  3. System for analysing sickness absenteeism in Poland.

    Science.gov (United States)

    Indulski, J A; Szubert, Z

    1997-01-01

    The National System of Sickness Absenteeism Statistics has been functioning in Poland since 1977, as the part of the national health statistics. The system is based on a 15-percent random sample of copies of certificates of temporary incapacity for work issued by all health care units and authorised private medical practitioners. A certificate of temporary incapacity for work is received by every insured employee who is compelled to stop working due to sickness, accident, or due to the necessity to care for a sick member of his/her family. The certificate is required on the first day of sickness. Analyses of disease- and accident-related sickness absenteeism carried out each year in Poland within the statistical system lead to the main conclusions: 1. Diseases of the musculoskeletal and peripheral nervous systems accounting, when combined, for 1/3 of the total sickness absenteeism, are a major health problem of the working population in Poland. During the past five years, incapacity for work caused by these diseases in males increased 2.5 times. 2. Circulatory diseases, and arterial hypertension and ischaemic heart disease in particular (41% and 27% of sickness days, respectively), create an essential health problem among males at productive age, especially, in the 40 and older age group. Absenteeism due to these diseases has increased in males more than two times.

  4. Comparative analyses of bidirectional promoters in vertebrates

    Directory of Open Access Journals (Sweden)

    Taylor James

    2008-05-01

    Full Text Available. Background: Orthologous genes with deep phylogenetic histories are likely to retain similar regulatory features. In this report we utilize orthology assignments for pairs of genes co-regulated by bidirectional promoters to map the ancestral history of the promoter regions. Results: Our mapping of bidirectional promoters from humans to fish shows that many such promoters emerged after the divergence of chickens and fish. Furthermore, annotations of promoters in deep phylogenies enable detection of missing data or assembly problems present in higher vertebrates. The functional importance of bidirectional promoters is indicated by selective pressure to maintain the arrangement of genes regulated by the promoter over long evolutionary time spans. Characteristics unique to bidirectional promoters are further elucidated using a technique for unsupervised classification, known as ESPERR. Conclusion: Results of these analyses will aid in our understanding of the evolution of bidirectional promoters, including whether the regulation of two genes evolved as a consequence of their proximity or if function dictated their co-regulation.

  5. Thermomagnetic Analyses to Test Concrete Stability

    Science.gov (United States)

    Geiss, C. E.; Gourley, J. R.

    2017-12-01

    Over the past decades, pyrrhotite-containing aggregate has been used in concrete to build basements and foundations in central Connecticut. The sulphur in the pyrrhotite reacts to form several secondary minerals, and the associated changes in volume lead to a loss of structural integrity. As a result, hundreds of homes have been rendered worthless, as remediation costs often exceed the value of the homes, and the value of many other homes constructed during the same period is in question because concrete provenance and potential future structural issues are unknown. While minor abundances of pyrrhotite are difficult to detect or quantify by traditional means, the mineral is easily identified through its magnetic properties. All concrete samples from affected homes show a clear increase in magnetic susceptibility above 220°C, due to the γ-transition of Fe9S10 [1], and a clearly defined Curie temperature near 320°C for Fe7S8. X-ray analyses confirm the presence of pyrrhotite and ettringite in these samples. Synthetic mixtures of commercially available concrete and pyrrhotite show that the method is semiquantitative but needs to be calibrated for specific pyrrhotite mineralogies. 1. Schwarz, E.J., Magnetic properties of pyrrhotite and their use in applied geology and geophysics. Geological Survey of Canada, Ottawa, ON, Canada, 1975.

  6. Social Media Analyses for Social Measurement.

    Science.gov (United States)

    Schober, Michael F; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G

    2016-01-01

    Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or "found" social media content. But just how trustworthy such measurement can be-say, to replace official statistics-is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys.

  7. Validating experimental and theoretical Langmuir probe analyses

    Science.gov (United States)

    Pilling, L. S.; Carnegie, D. A.

    2007-08-01

    Analysis of Langmuir probe characteristics contains a paradox in that it is unknown a priori which theory is applicable before it is applied. Theories are often assumed to be correct when certain criteria are met, although meeting those criteria may not validate the approach used. We have analysed Langmuir probe data from cylindrical double and single probes acquired from a dc discharge plasma over a wide variety of conditions. This discharge contains a dual-temperature distribution, and hence fitting a theoretically generated curve is impractical. To determine the densities, an examination of the current theories was necessary. For conditions where the probe radius is of the same order of magnitude as the Debye length, the gradient expected for orbital-motion-limited (OML) theory is approximately the same as the radial-motion gradients. An analysis of the 'gradients' from the radial-motion theory was able to resolve the differences from the OML gradient value of two. The method was also able to determine whether radial or OML theories applied without knowledge of the electron temperature, or separation of the ion and electron contributions. Only the value of the space potential is necessary to determine the applicable theory.

  8. Bench top and portable mineral analysers, borehole core analysers and in situ borehole logging

    International Nuclear Information System (INIS)

    Howarth, W.J.; Watt, J.S.

    1982-01-01

    Bench top and portable mineral analysers are usually based on balanced filter techniques using scintillation detectors or on low resolution proportional detectors. The application of radioisotope x-ray techniques to in situ borehole logging is increasing, and is particularly suited for logging for tin and higher atomic number elements

  9. Integrated Field Analyses of Thermal Springs

    Science.gov (United States)

    Shervais, K.; Young, B.; Ponce-Zepeda, M. M.; Rosove, S.

    2011-12-01

    A group of undergraduate researchers, through the SURE internship offered by the Southern California Earthquake Center (SCEC), have examined thermal springs in southern Idaho and northern Utah, as well as mud volcanoes in the Salton Sea, California. We used an integrated approach to estimate the setting and maximum temperature, including water chemistry, iPad-based image and database management, microbiology, and gas analyses with a modified Giggenbach sampler. All springs were characterized using GISRoam (tmCogent3D). We are performing geothermometry calculations as well as comparisons with temperature gradient data while also analyzing biological samples. Analyses include water temperature, pH, electrical conductivity, and TDS measured in the field. Each sample is sealed, chilled and delivered to a water lab within 12 hours. Temperatures are continuously monitored with Solinst Levelogger Juniors. Through partnership with a local community college geology club, we receive results on a monthly basis and are able to process initial data earlier in order to evaluate data over a longer time span. The springs and mudpots contained microbial organisms, which were analyzed using single colony isolation, polymerase chain reaction, and DNA sequencing, showing the impact of the organisms on the springs or vice versa. Soon we will collect gas samples at sites that show signs of gas, using a hybrid of the Giggenbach method and our own methods. Drawing gas samples has proven a challenge; however, we devised a method to draw out gas samples utilizing the Giggenbach flask, transferring samples to glass blood sample tubes, replacing the NaOH in the Giggenbach flask, and evacuating it in the field for multiple samples using a vacuum pump. We also use a floating platform devised to carry and lower a levelogger, and an in-line fuel filter from a tractor to keep mud from contaminating the equipment. The use of raster

  10. Transient Seepage for Levee Engineering Analyses

    Science.gov (United States)

    Tracy, F. T.

    2017-12-01

    Historically, steady-state seepage analyses have been a key tool for designing levees by practicing engineers. However, with the advances in computer modeling, transient seepage analysis has become a potentially viable tool. A complication is that the levees usually have partially saturated flow, and this is significantly more complicated in transient flow. This poster illustrates four elements of our research in partially saturated flow relating to the use of transient seepage for levee design: (1) a comparison of results from SEEP2D, SEEP/W, and SLIDE for a generic levee cross section common to the southeastern United States; (2) the results of a sensitivity study of varying saturated hydraulic conductivity, the volumetric water content function (as represented by van Genuchten), and volumetric compressibility; (3) a comparison of when soils do and do not exhibit hysteresis, and (4) a description of proper and improper use of transient seepage in levee design. The variables considered for the sensitivity and hysteresis studies are pore pressure beneath the confining layer at the toe, the flow rate through the levee system, and a levee saturation coefficient varying between 0 and 1. Getting results for SEEP2D, SEEP/W, and SLIDE to match proved more difficult than expected. After some effort, the results matched reasonably well. Differences in results were caused by various factors, including bugs, different finite element meshes, different numerical formulations of the system of nonlinear equations to be solved, and differences in convergence criteria. Varying volumetric compressibility affected the above test variables the most. The levee saturation coefficient was most affected by the use of hysteresis. The improper use of pore pressures from a transient finite element seepage solution imported into a slope stability computation was found to be the most grievous mistake in using transient seepage in the design of levees.

  11. Summary of the analyses for recovery factors

    Science.gov (United States)

    Verma, Mahendra K.

    2017-07-17

    Introduction: In order to determine the hydrocarbon potential of oil reservoirs within the U.S. sedimentary basins for which the carbon dioxide enhanced oil recovery (CO2-EOR) process has been considered suitable, the CO2 Prophet model was chosen by the U.S. Geological Survey (USGS) to be the primary source for estimating recovery-factor values for individual reservoirs. The choice was made because of the model’s reliability and the ease with which it can be used to assess a large number of reservoirs. The other two approaches—the empirical decline curve analysis (DCA) method and a review of published literature on CO2-EOR projects—were deployed to verify the results of the CO2 Prophet model. This chapter discusses the results from CO2 Prophet (chapter B, by Emil D. Attanasi, this report) and compares them with results from decline curve analysis (chapter C, by Hossein Jahediesfanjani) and those reported in the literature for selected reservoirs with adequate data for analyses (chapter D, by Ricardo A. Olea). To estimate the technically recoverable hydrocarbon potential for oil reservoirs where CO2-EOR has been applied, two of the three approaches—CO2 Prophet modeling and DCA—do not include analysis of economic factors, while the third approach—review of published literature—implicitly includes economics. For selected reservoirs, DCA has provided estimates of the technically recoverable hydrocarbon volumes, which, in combination with calculated amounts of original oil in place (OOIP), helped establish incremental CO2-EOR recovery factors for individual reservoirs. The review of published technical papers and reports has provided substantial information on recovery factors for 70 CO2-EOR projects that are either commercially profitable or classified as pilot tests. When comparing the results, it is important to bear in mind the differences and limitations of these three approaches.
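Decline curve analysis, one of the two verification approaches mentioned above, extrapolates a fitted production-rate model to an economic rate limit to estimate recoverable volumes. A sketch of the simplest (exponential) decline case, with invented reservoir numbers rather than any from this assessment:

```python
import math

def exponential_decline(qi, D, t):
    """Production rate at time t under exponential decline: q(t) = qi * exp(-D*t)."""
    return qi * math.exp(-D * t)

def eur_exponential(qi, D, q_limit):
    """Cumulative production from the initial rate qi down to the economic
    limit rate q_limit, for exponential decline: Np = (qi - q_limit) / D."""
    return (qi - q_limit) / D

# Invented reservoir: 1000 bbl/d initial rate, 25%/yr decline, 50 bbl/d limit
qi, D, q_limit = 1000.0, 0.25, 50.0
np_est = eur_exponential(qi, D, q_limit)  # 3800 bbl/d-years (x365 for barrels)
```

Dividing the recoverable volume by the OOIP then yields the incremental recovery factor used to cross-check the CO2 Prophet results.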

  12. The ABC (Analysing Biomolecular Contacts) database

    Directory of Open Access Journals (Sweden)

    Walter Peter

    2007-03-01

    Full Text Available. As protein-protein interactions are one of the basic mechanisms in most cellular processes, it is desirable to understand the molecular details of protein-protein contacts and ultimately be able to predict which proteins interact. Interface areas on a protein surface that are involved in protein interactions exhibit certain characteristics. Therefore, several attempts have been made to distinguish protein interactions from each other and to categorize them. One such classification divides interactions into transient and permanent ones. Previously, two of the authors analysed several properties of transient complexes, such as the amino acid and secondary structure element composition and pairing preferences. Certainly, interfaces can be characterized by many more possible attributes, and this is a subject of intense ongoing research. Although several freely available online databases exist that illuminate various aspects of protein-protein interactions, we decided to construct a new database collecting all desired interface features and allowing for facile selection of subsets of complexes. MySQL is used as the database server, and the program logic was written in Java. Furthermore, several class extensions and tools were included, such as Jmol to visualize the interfaces and JFreeChart for the representation of diagrams and statistics. The contact data are automatically generated from standard PDB files by a Tcl/Tk script running through the molecular visualization package VMD. Currently the database contains 536 interfaces extracted from 479 PDB files, and it can be queried by various types of parameters. Here, we describe the database design and demonstrate its usefulness with a number of selected features.

  13. Trend analyses with river sediment rating curves

    Science.gov (United States)

    Warrick, Jonathan A.

    2015-01-01

    Sediment rating curves, which are fitted relationships between river discharge (Q) and suspended-sediment concentration (C), are commonly used to assess patterns and trends in river water quality. In many of these studies it is assumed that rating curves have a power-law form (i.e., C = aQ^b, where a and b are fitted parameters). Two fundamental questions about the utility of these techniques are assessed in this paper: (i) How well do the parameters, a and b, characterize trends in the data? (ii) Are trends in rating curves diagnostic of changes to river water or sediment discharge? As noted in previous research, the offset parameter, a, is not an independent variable for most rivers, but rather strongly dependent on b and Q. Here it is shown that a is a poor metric for trends in the vertical offset of a rating curve, and a new parameter, â, as determined by the discharge-normalized power function [C = â (Q/QGM)^b], where QGM is the geometric mean of the Q values sampled, provides a better characterization of trends. However, these techniques must be applied carefully, because curvature in the relationship between log(Q) and log(C), which exists for many rivers, can produce false trends in â and b. Also, it is shown that trends in â and b are not uniquely diagnostic of river water or sediment supply conditions. For example, an increase in â can be caused by an increase in sediment supply, a decrease in water supply, or a combination of these conditions. Large changes in water and sediment supplies can occur without any change in the parameters, â and b. Thus, trend analyses using sediment rating curves must include additional assessments of the time-dependent rates and trends of river water, sediment concentrations, and sediment discharge.
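The discharge-normalized fit described above reduces to ordinary least squares in log-log space, since log C = log â + b·(log Q − log QGM). A sketch with synthetic data (the sample values are fabricated to follow an exact power law, so the fit recovers the parameters exactly):

```python
import math

def fit_rating_curve(Q, C):
    """Least-squares fit of C = a_hat * (Q / Q_GM)**b in log-log space,
    where Q_GM is the geometric mean of the sampled discharges."""
    n = len(Q)
    logQ = [math.log(q) for q in Q]
    logC = [math.log(c) for c in C]
    mean_logQ = sum(logQ) / n            # log(Q_GM)
    mean_logC = sum(logC) / n
    x = [lq - mean_logQ for lq in logQ]  # discharge-normalized predictor
    b = sum(xi * (yi - mean_logC) for xi, yi in zip(x, logC)) / sum(xi * xi for xi in x)
    a_hat = math.exp(mean_logC)          # fitted concentration at Q = Q_GM
    return a_hat, b

# Synthetic samples obeying C = 0.5 * (Q/Q_GM)**1.8 exactly
Q = [10.0, 50.0, 100.0, 500.0]
Q_GM = math.exp(sum(math.log(q) for q in Q) / len(Q))
C = [0.5 * (q / Q_GM) ** 1.8 for q in Q]
a_hat, b = fit_rating_curve(Q, C)  # recovers a_hat = 0.5, b = 1.8
```

Because the predictor is centered on log(QGM), the fitted â is simply the concentration at the geometric-mean discharge, which is why it tracks the vertical offset of the curve better than the raw intercept a.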

  14. BN-600 hybrid core benchmark analyses

    International Nuclear Information System (INIS)

    Kim, Y.I.; Stanculescu, A.; Finck, P.; Hill, R.N.; Grimm, K.N.

    2003-01-01

    Benchmark analyses for the hybrid BN-600 reactor, which contains three uranium enrichment zones and one plutonium zone in the core, have been performed within the frame of an IAEA sponsored Coordinated Research Project. The results for several relevant reactivity parameters obtained by the participants with their own state-of-the-art basic data and codes were compared in terms of calculational uncertainty, and their effects on the ULOF transient behavior of the hybrid BN-600 core were evaluated. The comparison of the diffusion and transport results obtained for the homogeneous representation generally shows good agreement for most parameters between the RZ and HEX-Z models. The burnup effect and the heterogeneity effect on most reactivity parameters also show good agreement for the HEX-Z diffusion and transport theory results. The large difference noticed for the sodium and steel density coefficients is mainly due to differences in the spatial coefficient predictions for non-fuelled regions. The burnup reactivity loss was evaluated to be 0.025 (4.3 $) within ∼5.0% standard deviation. The heterogeneity effect on most reactivity coefficients was estimated to be small. The heterogeneity treatment reduced the control rod worth by 2.3%. The heterogeneity effect on the k-eff and control rod worth appeared to differ strongly depending on the heterogeneity treatment method. The substantial spread noticed for several reactivity coefficients did not have a significant impact on the transient behavior prediction. This result is attributable to compensating effects between several reactivity effects and to the specific design of the partially MOX-fuelled hybrid core. (author)
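The abstract reports the burnup reactivity loss both as Δk/k (0.025) and in dollar units (4.3 $); the two are related by ρ($) = (Δk/k) / β_eff, where β_eff is the effective delayed neutron fraction. A quick consistency check; note that β_eff ≈ 0.0058 is an assumed value typical of a mixed U/Pu fast core, not a number given in the abstract:

```python
def reactivity_in_dollars(delta_k, beta_eff):
    """Reactivity in dollar units: rho($) = (delta_k/k) / beta_eff."""
    return delta_k / beta_eff

# beta_eff ~ 0.0058 is assumed here (typical for a U/Pu-fuelled fast core)
rho_dollars = reactivity_in_dollars(0.025, 0.0058)  # about 4.3 $
```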

  15. Vibro-spring particle size distribution analyser

    International Nuclear Information System (INIS)

    Patel, Ketan Shantilal

    2002-01-01

    This thesis describes the design and development of an automated pre-production particle size distribution analyser for particles in the 20 - 2000 μm size range. This work is a follow-up to the vibro-spring particle sizer reported by Shaeri. In its most basic form, the instrument comprises a horizontally held closed-coil helical spring that is partly filled with the test powder and sinusoidally vibrated in the transverse direction. Particle size distribution data are obtained by stretching the spring to known lengths and measuring the mass of the powder discharged from the spring's coils. The size of the particles, on the other hand, is determined from the spring 'intercoil' distance. The instrument developed by Shaeri had limited use due to its inability to measure sample mass directly. For the device reported here, modifications are made to the original configuration to establish means of direct sample mass measurement. The feasibility of techniques for measuring the mass of powder retained within the spring is investigated in detail. Initially, the mass is measured in situ from the vibration characteristics, based on the spring's first-harmonic resonant frequency. This method is often erratic and unreliable due to particle-particle-spring wall interactions and spring bending. A much more successful alternative is a more complicated arrangement in which the spring forms part of a stiff cantilever system pivoted along its main axis. Here, the sample mass is determined in the 'static mode' by monitoring the cantilever beam's deflection following the wanton termination of vibration. The system performance has been optimised through variations of the mechanical design of the key components and the operating procedure, as well as by taking into account the effect of changes in ambient temperature on the system's response. The thesis also describes the design and development of the ancillary mechanisms. These include the pneumatic
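The in-situ mass method described above infers the retained powder mass from the spring's first-harmonic resonant frequency, via the simple-harmonic relation f = (1/2π)·√(k/m). A sketch of that relation with hypothetical stiffness and frequency values (not the thesis's calibration data):

```python
import math

def mass_from_resonance(k, f):
    """Vibrating mass from first-harmonic resonant frequency:
    f = (1 / (2*pi)) * sqrt(k / m)  =>  m = k / (2*pi*f)**2."""
    return k / (2.0 * math.pi * f) ** 2

# Hypothetical spring stiffness (N/m) and measured resonance (Hz)
k, f = 500.0, 12.0
m_kg = mass_from_resonance(k, f)  # about 0.088 kg
```

The thesis notes this approach proved erratic in practice, since particle-wall interactions and spring bending shift the resonance away from the ideal single-mass behaviour assumed here.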

  16. Kuosheng Mark III containment analyses using GOTHIC

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Ansheng, E-mail: samuellin1999@iner.gov.tw; Chen, Yen-Shu; Yuann, Yng-Ruey

    2013-10-15

    Highlights: • The Kuosheng Mark III containment model is established using GOTHIC. • Containment pressure and temperature responses due to LOCA are presented. • The calculated results are all below the design values and compared with the FSAR results. • The calculated results can serve as an analysis reference for an SPU project in the future. -- Abstract: Kuosheng nuclear power plant in Taiwan is a twin-unit BWR/6 plant, and both units utilize the Mark III containment. Currently, the plant is performing a stretch power uprate (SPU) project to increase the core thermal power to 103.7% OLTP (original licensed thermal power). However, the containment response in the Kuosheng Final Safety Analysis Report (FSAR) was completed more than twenty-five years ago. The purpose of this study is to establish a Kuosheng Mark III containment model using the containment program GOTHIC. The containment pressure and temperature responses under the design-basis accidents, which are the main steam line break (MSLB) and the recirculation line break (RCLB) accidents, are investigated. Short-term and long-term analyses are presented in this study. The short-term analysis calculates the drywell peak pressure and temperature, which occur in the early stage of the LOCAs. The long-term analysis calculates the peak pressure and temperature of the reactor building space. In the short-term analysis, the calculated peak drywell-to-wetwell differential pressure is 140.6 kPa for the MSLB, which is below the design value of 189.6 kPa. The calculated peak drywell temperature is 158 °C, which is still below the design value of 165.6 °C. In addition, in the long-term analysis, the calculated peak containment pressure is 47 kPa G, which is below the design value of 103.4 kPa G. The calculated peak containment temperature is 74.7 °C, which is lower than the design value of 93.3 °C. Therefore, the Kuosheng Mark III containment can maintain the integrity after

  17. YALINA Booster subcritical assembly modeling and analyses

    International Nuclear Information System (INIS)

    Talamo, A.; Gohar, Y.; Aliberti, G.; Cao, Y.; Zhong, Z.; Kiyavitskaya, H.; Bournos, V.; Fokov, Y.; Routkovskaya, C.; Sadovich, S.

    2010-01-01

    Full text: Accurate simulation models of the YALINA Booster assembly of the Joint Institute for Power and Nuclear Research (JIPNR)-Sosny, Belarus have been developed by Argonne National Laboratory (ANL) of the USA. YALINA-Booster has coupled zones operating with fast and thermal neutron spectra, which requires special attention in the modelling process. Three different uranium enrichments of 90%, 36% or 21% were used in the fast zone and 10% uranium enrichment was used in the thermal zone. Two of the most advanced Monte Carlo computer programs have been utilized for the ANL analyses: MCNP of the Los Alamos National Laboratory and MONK of British Nuclear Fuels Limited and SERCO Assurance. The developed geometrical models for both computer programs modelled all the details of the YALINA Booster facility as described in the technical specifications defined in the International Atomic Energy Agency (IAEA) report, without any geometrical approximation or material homogenization. Material impurities and the measured material densities have been used in the models. The obtained results for the neutron multiplication factors calculated in criticality mode (keff) and in source mode (ksrc) with an external neutron source from the two Monte Carlo programs are very similar. Different external neutron sources have been investigated, including californium, deuterium-deuterium (D-D), and deuterium-tritium (D-T) neutron sources. The spatial neutron flux profiles and the neutron spectra in the experimental channels were calculated. In addition, the kinetic parameters were defined, including the effective delayed neutron fraction, the prompt neutron lifetime, and the neutron generation time. A new calculation methodology has been developed at ANL to simulate the pulsed neutron source experiments. In this methodology, the MCNP code is used to simulate the detector response from a single pulse of the external neutron source and a C code is used to superimpose the pulse until the
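The superposition step described at the end of this abstract can be sketched in a few lines: the response to a pulse train is built by summing time-shifted copies of the single-pulse response until it stops changing. The exponential decay constant and the pulse period below are illustrative placeholders, not YALINA-Booster parameters:

```python
import math

# Sketch of the pulse-superposition methodology: the detector response to a
# train of source pulses is the sum of time-shifted copies of the response
# to a single pulse (here an illustrative decaying exponential).
def single_pulse(t, decay=200.0):
    """Detector response (arbitrary units) t microseconds after one pulse."""
    return math.exp(-t / decay) if t >= 0.0 else 0.0

def train_response(t, period=1000.0, n_pulses=50):
    """Superimpose n_pulses time-shifted copies of the single-pulse response."""
    return sum(single_pulse(t - k * period) for k in range(n_pulses))

# After enough pulses the pre-pulse level stops changing: the decayed tails
# form a geometric series, which is the convergence the superposition exploits.
print(train_response(48_999.0), train_response(49_999.0))  # nearly identical
```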

  18. Altools: a user friendly NGS data analyser.

    Science.gov (United States)

    Camiolo, Salvatore; Sablok, Gaurav; Porceddu, Andrea

    2016-02-17

    Genotyping by re-sequencing has become a standard approach to estimate single nucleotide polymorphism (SNP) diversity, haplotype structure and biodiversity, and has been defined as an efficient approach to address the geographical population genomics of several model species. To access core SNPs and insertion/deletion polymorphisms (indels), and to infer the phyletic patterns of speciation, most such approaches map short reads to the reference genome. Variant calling is important to establish patterns of genome-wide association studies (GWAS) for quantitative trait loci (QTLs), and to determine the population and haplotype structure based on SNPs, thus allowing content-dependent trait and evolutionary analysis. Several tools have been developed to investigate such polymorphisms, as well as more complex genomic rearrangements such as copy number variations, presence/absence variations and large deletions. The programs available for this purpose have different strengths (e.g. accuracy, sensitivity and specificity) and weaknesses (e.g. low computation speed, complex installation procedures and the absence of a user-friendly interface). Here we introduce Altools, a software package that is easy to install and use, and which allows the precise detection of polymorphisms and structural variations. Altools uses the BWA/SAMtools/VarScan pipeline to call SNPs and indels, and the dnaCopy algorithm to achieve genome segmentation according to local coverage differences in order to identify copy number variations. It also uses insert size information from the alignment of paired-end reads to detect potential large deletions. A double mapping approach (BWA/BLASTn) identifies precise breakpoints while ensuring rapid elaboration. Finally, Altools implements several processes that yield deeper insight into the genes affected by the detected polymorphisms. Altools was used to analyse both simulated and real next-generation sequencing (NGS) data and performed satisfactorily in terms of
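The BWA/SAMtools/VarScan pipeline named in this abstract can be outlined as a command sequence. This is a sketch only: the file names are placeholders, exact flags vary between tool versions, and Altools orchestrates these steps internally rather than exposing them:

```python
# Outline of a BWA -> SAMtools -> VarScan SNP/indel-calling pipeline.
# File names ("ref.fa", "reads.fq", "sample") are hypothetical placeholders.
def pipeline_commands(ref="ref.fa", reads="reads.fq", prefix="sample"):
    return [
        ["bwa", "mem", ref, reads],                          # map short reads
        ["samtools", "sort", "-o", f"{prefix}.sorted.bam"],  # sort alignments
        ["samtools", "mpileup", "-f", ref, f"{prefix}.sorted.bam"],
        ["varscan", "mpileup2snp"],                          # call SNPs
        ["varscan", "mpileup2indel"],                        # call indels
    ]

for cmd in pipeline_commands():
    print(" ".join(cmd))
```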

  19. First Super-Earth Atmosphere Analysed

    Science.gov (United States)

    2010-12-01

    The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are
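The radius and mass quoted in this record already constrain the bulk composition. A quick check (assuming Earth's mean density of 5.51 g/cm³; the factors 2.6 and 6.5 are taken from the text above) gives a bulk density far below that of a rocky planet, consistent with the water-rich interpretation:

```python
# Bulk density of GJ 1214b from the figures quoted above:
# 2.6 Earth radii and 6.5 Earth masses. Earth's mean density is assumed
# to be 5.51 g/cm^3; density scales as mass / radius^3.
radius_in_earths = 2.6
mass_in_earths = 6.5

density = 5.51 * mass_in_earths / radius_in_earths**3  # g/cm^3
print(f"bulk density: {density:.2f} g/cm^3")  # ~2 g/cm^3, far below Earth's 5.51
```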

  20. Systems reliability analyses and risk analyses for the licencing procedure under atomic law

    International Nuclear Information System (INIS)

    Berning, A.; Spindler, H.

    1983-01-01

    For the licencing procedure under atomic law in accordance with Article 7 AtG, the nuclear power plant as a whole needs to be assessed, and the reliability of systems and plant components that are essential to safety is to be determined with probabilistic methods. This requirement is a consequence of the safety criteria for nuclear power plants issued by the Home Department (BMI). Systems reliability studies and risk analyses used in licencing procedures under atomic law are identified. The emphasis is on licencing decisions, mainly for PWR-type reactors. Reactor Safety Commission (RSK) guidelines, examples of reasoning in legal proceedings and arguments put forth by objectors are also dealt with. Correlations between reliability analyses made by experts and licencing decisions are shown by means of examples. (orig./HP) [de

  1. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
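For the single-sample correlation case (procedure 1 above), the power computation can be approximated with the Fisher z transform. This stdlib sketch is a normal approximation, not G*Power's exact routine, so its values differ slightly from G*Power's; the example values r = 0.3 and n = 84 are illustrative:

```python
from math import atanh, sqrt
from statistics import NormalDist

# Approximate power of a two-sided test of H0: rho = 0 against a true
# correlation r, via the Fisher z transform: z = atanh(r) is roughly
# normal with standard error 1/sqrt(n - 3).
def correlation_power(r, n, alpha=0.05):
    nd = NormalDist()
    delta = atanh(r) * sqrt(n - 3)        # noncentrality under H1
    z_crit = nd.inv_cdf(1 - alpha / 2)    # two-sided critical value
    return nd.cdf(delta - z_crit) + nd.cdf(-delta - z_crit)

print(round(correlation_power(0.3, 84), 3))  # close to the conventional 0.80
```

Larger samples give higher power for the same r, which is the trade-off such programs are used to explore.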

  2. Forced vibration tests and simulation analyses of a nuclear reactor building. Part 2: simulation analyses

    International Nuclear Information System (INIS)

    Kuno, M.; Nakagawa, S.; Momma, T.; Naito, Y.; Niwa, M.; Motohashi, S.

    1995-01-01

    Forced vibration tests of a BWR-type reactor building, Hamaoka Unit 4, were performed. Valuable data on the dynamic characteristics of the soil-structure interaction system were obtained through the tests. Simulation analyses of the fundamental dynamic characteristics of the soil-structure system were conducted using a basic lumped-mass soil-structure model (lattice model), and strong correlation with the measured data was obtained. Furthermore, detailed simulation models were employed to investigate the effects of simultaneously induced vertical response and of the response of the adjacent turbine building on the lateral response of the reactor building. (author). 4 refs., 11 figs

  3. Integrated Waste Treatment Unit (IWTU) Input Coal Analyses and Off-Gas Filter (OGF) Content Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Jantzen, Carol M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Missimer, David M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Guenther, Chris P. [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Shekhawat, Dushyant [National Energy Technology Lab. (NETL), Morgantown, WV (United States); VanEssendelft, Dirk T. [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Means, Nicholas C. [AECOM Technology Corp., Oak Ridge, TN (United States)

    2015-04-23

    in process piping and materials, in excessive off-gas absorbent loading, and in undesired process emissions. The ash content of the coal is important as the ash adds to the DMR and other vessel products, which affect the final waste product mass and composition. The amount and composition of the ash also affect the reaction kinetics. Thus ash content and composition contribute to the mass balance. In addition, sodium, potassium, calcium, sulfur, and possibly silica and alumina in the ash may contribute to wall-scale formation. Sodium, potassium, and alumina in the ash will be overwhelmed by the sodium, potassium, and alumina from the feed, but the impact from the other ash components needs to be quantified. A maximum coal particle size is specified so the feed system does not plug, and a minimum particle size is specified to prevent excess elutriation from the DMR to the Process Gas Filter (PGF). A vendor specification was used to procure the calcined coal for IWTU processing. While the vendor supplied a composite analysis for the 22 tons of coal (Appendix A), this study compares independent analyses of the coal performed at the Savannah River National Laboratory (SRNL) and at the National Energy Technology Laboratory (NETL). Three supersacks were sampled at three different heights within the sack in order to determine within-bag variability and between-bag variability of the coal. These analyses were also compared to the vendor's composite analyses and to the coal specification. These analyses were also compared to historic data on Bestac coal analyses that had been performed at Hazen Research Inc. (HRI) between 2004 and 2011.

  4. Integrating and scheduling an open set of static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Mezini, Mira; Kloppenburg, Sven

    2006-01-01

    to keep the set of analyses open. We propose an approach to integrating and scheduling an open set of static analyses which decouples the individual analyses and coordinates the analysis executions such that the overall time and space consumption is minimized. The approach has been implemented...... for the Eclipse IDE and has been used to integrate a wide range of analyses such as finding bug patterns, detecting violations of design guidelines, or type system extensions for Java....

  5. Angular analyses in relativistic quantum mechanics; Analyses angulaires en mecanique quantique relativiste

    Energy Technology Data Exchange (ETDEWEB)

    Moussa, P [Commissariat a l' Energie Atomique, 91 - Saclay (France). Centre d' Etudes Nucleaires

    1968-06-01

    This work describes the angular analysis of reactions between particles with spin in a fully relativistic fashion. One particle states are introduced, following Wigner's method, as representations of the inhomogeneous Lorentz group. In order to perform the angular analyses, the reduction of the product of two representations of the inhomogeneous Lorentz group is studied. Clebsch-Gordan coefficients are computed for the following couplings: l-s coupling, helicity coupling, multipolar coupling, and symmetric coupling for more than two particles. Massless and massive particles are handled simultaneously. On the way we construct spinorial amplitudes and free fields; we recall how to establish convergence theorems for angular expansions from analyticity hypotheses. Finally we substitute these hypotheses for the idea of 'potential radius', which gives at low energy the usual 'centrifugal barrier' factors. The presence of such factors had never been deduced from hypotheses compatible with relativistic invariance. (author) [French] On decrit un formalisme permettant de tenir compte de l'invariance relativiste, dans l'analyse angulaire des amplitudes de reaction entre particules de spin quelconque. Suivant Wigner, les etats a une particule sont introduits a l'aide des representations du groupe de Lorentz inhomogene. Pour effectuer les analyses angulaires, on etudie la reduction du produit de deux representations du groupe de Lorentz inhomogene. Les coefficients de Clebsch-Gordan correspondants sont calcules dans les couplages suivants: couplage l-s couplage d'helicite, couplage multipolaire, couplage symetrique pour plus de deux particules. Les particules de masse nulle et de masse non nulle sont traitees simultanement. Au passage, on introduit les amplitudes spinorielles et on construit les champs libres, on rappelle comment des hypotheses d'analyticite permettent d'etablir des theoremes de convergence pour les developpements angulaires. Enfin on fournit un substitut a la

  6. Comparative biochemical analyses of venous blood and peritoneal fluid from horses with colic using a portable analyser and an in-house analyser.

    Science.gov (United States)

    Saulez, M N; Cebra, C K; Dailey, M

    2005-08-20

    Fifty-six horses with colic were examined over a period of three months. The concentrations of glucose, lactate, sodium, potassium and chloride, and the pH, of samples of blood and peritoneal fluid were determined with a portable clinical analyser and with an in-house analyser, and the results were compared. Compared with the in-house analyser, the portable analyser gave higher pH values for blood and peritoneal fluid, with greater variability in the alkaline range and lower pH values in the acidic range; lower concentrations of glucose in the range below 8.3 mmol/l; and lower concentrations of lactate, with less variability, in venous blood in the range below 5 mmol/l and in peritoneal fluid in the range below 2 mmol/l. On average, the portable analyser underestimated the concentrations of lactate and glucose in peritoneal fluid in comparison with the in-house analyser. Its measurements of the concentrations of sodium and chloride in peritoneal fluid had a higher bias and were more variable than the measurements in venous blood, and its measurements of potassium in venous blood and peritoneal fluid had a smaller bias and less variability than the measurements made with the in-house analyser.
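Bias and variability statements of the kind made in this abstract are commonly quantified Bland-Altman style: the mean of the paired differences gives the bias, and bias ± 1.96 standard deviations gives the limits of agreement. In this sketch the paired lactate readings are invented for illustration, not data from the study:

```python
from statistics import mean, stdev

# Bland-Altman-style agreement summary for two analysers measuring the
# same samples. The paired lactate readings (mmol/l) are hypothetical.
portable = [1.1, 2.0, 3.8, 4.9, 1.5, 2.7]
in_house = [1.3, 2.4, 4.1, 5.5, 1.8, 3.0]

diffs = [p - h for p, h in zip(portable, in_house)]
bias = mean(diffs)                       # negative: portable reads lower
sd = stdev(diffs)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
print(f"bias = {bias:.2f} mmol/l, limits of agreement = "
      f"({loa[0]:.2f}, {loa[1]:.2f})")
```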

  7. ATHENA/INTRA analyses for ITER, NSSR-2

    International Nuclear Information System (INIS)

    Shen, Kecheng; Eriksson, John; Sjoeberg, A.

    1999-02-01

    The present report is a summary report including thermal-hydraulic analyses made at Studsvik Eco and Safety AB for the ITER NSSR-2 safety documentation. The objective of the analyses was to reveal the safety characteristics of various heat transfer systems at specified operating conditions and to indicate the conditions for which there were obvious risks of jeopardising the structural integrity of the coolant systems. In the latter case some analyses were also made to indicate conceivable mitigating measures for maintaining the integrity. The analyses were primarily concerned with the First Wall and Divertor heat transfer systems. Several enveloping transients were analysed with associated specific flow and heat load boundary conditions. The analyses were performed with the ATHENA and INTRA codes

  8. ATHENA/INTRA analyses for ITER, NSSR-2

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Kecheng; Eriksson, John; Sjoeberg, A

    1999-02-01

    The present report is a summary report including thermal-hydraulic analyses made at Studsvik Eco and Safety AB for the ITER NSSR-2 safety documentation. The objective of the analyses was to reveal the safety characteristics of various heat transfer systems at specified operating conditions and to indicate the conditions for which there were obvious risks of jeopardising the structural integrity of the coolant systems. In the latter case some analyses were also made to indicate conceivable mitigating measures for maintaining the integrity. The analyses were primarily concerned with the First Wall and Divertor heat transfer systems. Several enveloping transients were analysed with associated specific flow and heat load boundary conditions. The analyses were performed with the ATHENA and INTRA codes. 8 refs, 14 figs, 15 tabs

  9. Methods and procedures for shielding analyses for the SNS

    International Nuclear Information System (INIS)

    Popova, I.; Ferguson, F.; Gallmeier, F.X.; Iverson, E.; Lu, Wei

    2011-01-01

    In order to provide radiologically safe Spallation Neutron Source operation, shielding analyses are performed according to Oak Ridge National Laboratory internal regulations and to comply with the Code of Federal Regulations. An overview of on-going shielding work for the accelerator facility and neutron beam lines, methods used for the analyses, and associated procedures and regulations are presented. Methods used to perform shielding analyses are described as well. (author)

  10. SENSITIVITY ANALYSIS FOR SALTSTONE DISPOSAL UNIT COLUMN DEGRADATION ANALYSES

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G.

    2014-10-28

    PORFLOW-related analyses supporting a sensitivity analysis for Saltstone Disposal Unit (SDU) column degradation were performed. Previous analyses (Flach and Taylor 2014) used a model in which the SDU columns degraded in a piecewise manner from the top and bottom simultaneously. The current analyses employ a model in which all pieces of the column degrade at the same time. Information was extracted from the analyses which may be useful in determining the distribution of Tc-99 in the various SDUs throughout time and in determining flow balances for the SDUs.

  11. Analysing harmonic motions with an iPhone’s magnetometer

    Science.gov (United States)

    Yavuz, Ahmet; Kağan Temiz, Burak

    2016-05-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone’s (or iPad’s) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone’s magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone’s screen using the Sensor Kinetics application. Data from this application was analysed with Eureqa software to establish the equation of the harmonic motion. Analyses show that the use of an iPhone’s magnetometer to analyse harmonic motion is a practical and effective method for small oscillations and frequencies less than 15-20 Hz.
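The frequency estimate such an experiment relies on can be sketched without any curve-fitting software: count zero crossings of the mean-subtracted magnetometer signal. The 5 Hz oscillation, 100 Hz sampling rate and field values below are synthetic stand-ins for Sensor Kinetics output, chosen well inside the usable range quoted above:

```python
import math

# Estimate an oscillation frequency from magnetometer samples by counting
# zero crossings of the mean-subtracted signal (two crossings per cycle).
# The synthetic signal: a 5 Hz oscillation riding on a 40 uT background.
fs, f_true, duration = 100.0, 5.0, 4.0   # Hz, Hz, seconds (illustrative)
n = int(fs * duration)
samples = [40.0 + 3.0 * math.sin(2 * math.pi * f_true * k / fs + 0.3)
           for k in range(n)]

m = sum(samples) / len(samples)
centered = [s - m for s in samples]       # remove the static background field
crossings = sum(1 for a, b in zip(centered, centered[1:])
                if a < 0.0 <= b or b < 0.0 <= a)
f_est = crossings / (2.0 * duration)      # two zero crossings per cycle
print(f"estimated frequency: {f_est:.2f} Hz")
```

Real magnetometer data is noisier, so in practice a least-squares fit (as Eureqa performs) is more robust, but the zero-crossing count already recovers the frequency to within a few percent here.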

  12. Discrete frequency identification using the HP 5451B Fourier analyser

    International Nuclear Information System (INIS)

    Holland, L.; Barry, P.

    1977-01-01

    The frequency analysis performed by the HP 5451B discrete frequency Fourier analyser is studied. The advantages of cross-correlation analysis for identifying discrete frequencies in background noise are discussed in conjunction with the elimination of aliasing and wraparound error. Discrete frequency identification is illustrated by a series of graphs giving the results of analysing 'electrical' and 'acoustical' white noise and sinusoidal signals [pt
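The reason correlation analysis helps here can be shown in a few lines: the autocorrelation of (sinusoid + white noise) retains the sinusoid's period, while the noise contributes mainly at lag 0 and decorrelates elsewhere. The signal parameters and noise level below are illustrative, not taken from the HP 5451B study:

```python
import math
import random

# A sinusoid with a 25-sample period buried in white noise: its
# autocorrelation peaks at lags near multiples of the period, while the
# noise's contribution is concentrated at lag 0.
random.seed(1)
n, period = 2000, 25
x = [math.sin(2 * math.pi * k / period) + random.gauss(0, 0.5)
     for k in range(n)]

def autocorr(x, lag):
    return sum(x[k] * x[k + lag] for k in range(len(x) - lag)) / (len(x) - lag)

# Search a lag window around the expected period, away from the lag-0 peak.
peak_lag = max(range(5, 40), key=lambda lag: autocorr(x, lag))
print(f"autocorrelation peak at lag {peak_lag}; true period {period}")
```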

  13. A Java Bytecode Metamodel for Composable Program Analyses

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Bockisch, Christoph; Rensink, Arend; Aksit, Mehmet; Seidl, Martina; Zschaler, Steffen

    Program analyses are an important tool to check if a system fulfills its specification. A typical implementation strategy for program analyses is to use an imperative, general-purpose language like Java; and access the program to be analyzed through libraries for manipulating intermediate code, such

  14. Finite strain analyses of deformations in polymer specimens

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2016-01-01

    Analyses of the stress and strain state in test specimens or structural components made of polymer are discussed. This includes the Izod impact test, based on full 3D transient analyses. Also a long thin polymer tube under internal pressure has been studied, where instabilities develop, such as b...

  15. Multipole analyses and photo-decay couplings at intermediate energies

    International Nuclear Information System (INIS)

    Workman, R.L.; Arndt, R.A.; Zhujun Li

    1992-01-01

    The authors describe the results of several multipole analyses of pion-photoproduction data to 2 GeV in the lab photon energy. Comparisons are made with previous analyses. The photo-decay couplings for the delta are examined in detail. Problems in the representation of photoproduction data are discussed, with an emphasis on the recent LEGS data. 16 refs., 4 tabs

  16. Storage life and preservation of groundwater samples for inorganic analyses

    NARCIS (Netherlands)

    Cleven RFMJ; Gast LFL; Boshuis-Hilverdink ME; LAC

    1995-01-01

    The storage life and the possibilities for preservation of inorganic analyses of groundwater samples have been investigated. Groundwater samples, with and without preservation with acid, from four locations in the Netherlands have been analysed ten times over a period of three months on six

  17. Uranium price trends for use in strategy analyses

    International Nuclear Information System (INIS)

    James, R.A.

    1979-09-01

    Long-term price forecasts for mined uranium are quoted. These will be used in Ontario Hydro's nuclear fuel cycle strategy analyses. They are, of necessity, speculative. The accuracy of the forecasts is considered adequate for long-term strategy analyses, but not for other purposes. (auth)

  18. 46 CFR Appendix B to Part 154 - Stress Analyses Definitions

    Science.gov (United States)

    2010-10-01

    46 CFR, Appendix B to Part 154—Stress Analyses Definitions. The following are the standard definitions of stresses for the analysis of an independent tank type B: Normal stress means the component of stress normal to the plane of reference...

  19. Agreements on the working environment - an analysis of selected collective agreements

    DEFF Research Database (Denmark)

    Petersen, Jens Voxtrup; Wiegmann, Inger-Marie; Vogt-Nielsen, Karl

    An analysis of the significance of collective agreements for the working environment in industry, slaughterhouses, cleaning, the green sector, hotels and restaurants, and bus transport.

  20. The role of CFD computer analyses in hydrogen safety management

    International Nuclear Information System (INIS)

    Komen, E.M.J; Visser, D.C; Roelofs, F.; Te Lintelo, J.G.T

    2014-01-01

    The risks of hydrogen release and combustion during a severe accident in a light water reactor have attracted considerable attention after the Fukushima accident in Japan. Reliable computer analyses are needed for the optimal design of hydrogen mitigation systems, e.g. passive autocatalytic recombiners (PARs), and for the assessment of the associated residual risk of hydrogen combustion. Traditionally, so-called Lumped Parameter (LP) computer codes have been used for these purposes. In the last decade, significant progress has been made in the development, validation, and application of more detailed, three-dimensional Computational Fluid Dynamics (CFD) simulations for hydrogen safety analyses. The objective of the current paper is to address the following questions: When are CFD computer analyses needed to complement the traditional LP code analyses for hydrogen safety management? What is the validation status of CFD computer codes for hydrogen distribution, mitigation, and combustion analyses? Can CFD computer analyses nowadays be executed in a practical and reliable way for full-scale containments? The validation status and reliability of CFD code simulations are illustrated by validation analyses performed for experiments executed in the PANDA, THAI, and ENACCEF facilities. (authors)

  1. Graphite analyser upgrade for the IRIS spectrometer at ISIS

    International Nuclear Information System (INIS)

    Campbell, S.I.; Telling, M.T.F.; Carlile, C.J.

    1999-01-01

    Complete text of publication follows. The pyrolytic graphite (PG) analyser bank on the IRIS high resolution inelastic spectrometer [1] at ISIS is to be upgraded. At present the analyser consists of 1350 graphite pieces (6 rows by 225 columns) cooled to 25 K [2]. The new analyser array, however, will provide a three-fold increase in area and employ 4212 crystal pieces (18 rows by 234 columns). In addition, the graphite crystals will be cooled close to liquid helium temperature to further reduce thermal diffuse scattering (TDS) and improve the sensitivity of the spectrometer [2]. For an instrument such as IRIS, with its analyser in near back-scattering geometry, optical aberration and variation in the time-of-flight of the analysed neutrons are introduced as one moves out from the horizontal scattering plane. To minimise such effects, the profile of the analyser array has been redesigned. The concept behind the design of the new analyser bank and the factors that affect the overall resolution of the instrument are discussed. Results of Monte Carlo simulations of the expected resolution and intensity of the complete instrument are presented and compared to the current instrument performance. (author) [1] C.J. Carlile et al, Physica B 182 (1992) 431-440; [2] C.J. Carlile et al, Nuclear Instruments and Methods In Physics Research A 338 (1994) 78-82

  2. A vector matching method for analysing logic Petri nets

    Science.gov (United States)

    Du, YuYue; Qi, Liang; Zhou, MengChu

    2011-11-01

    Batch processing function and passing value indeterminacy in cooperative systems can be described and analysed by logic Petri nets (LPNs). To directly analyse the properties of LPNs, the concept of transition enabling vector sets is presented and a vector matching method used to judge the enabling transitions is proposed in this article. The incidence matrix of LPNs is defined; an equation describing the marking change due to a transition's firing is given; and a reachable tree is constructed. The state space explosion is mitigated to a certain extent by directly analysing LPNs. Finally, the validity and reliability of the proposed method are illustrated by an example in electronic commerce.
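The marking-change equation mentioned in this abstract is, for an ordinary Petri net, the state equation M' = M + C·u, where C is the incidence matrix and u the firing-count vector. The minimal sketch below shows that equation only; the logic annotations that make LPNs special are omitted:

```python
# State equation of an ordinary Petri net: M' = M + C . u, where C is the
# incidence matrix (places x transitions) and u the firing-count vector.
def fire(marking, incidence, firing):
    return [
        m + sum(c * u for c, u in zip(row, firing))
        for m, row in zip(marking, incidence)
    ]

# Two places, two transitions: t1 moves a token p1 -> p2, t2 moves it back.
C = [
    [-1,  1],   # p1: consumed by t1, produced by t2
    [ 1, -1],   # p2: produced by t1, consumed by t2
]
M0 = [1, 0]
M1 = fire(M0, C, [1, 0])    # fire t1 once
print(M1)                   # [0, 1]
```

Enumerating the markings reachable this way is what builds the reachable tree; the vector matching method of the article additionally checks the logic conditions before a transition is considered enabled.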

  3. Systematic Derivation of Static Analyses for Software Product Lines

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Brabrand, Claus; Wasowski, Andrzej

    2014-01-01

    A recent line of work lifts particular verification and analysis methods to Software Product Lines (SPL). In an effort to generalize such case-by-case approaches, we develop a systematic methodology for lifting program analyses to SPLs using abstract interpretation. Abstract interpretation...... for lifting analyses and Galois connections. We prove that for analyses developed using our method, the soundness of lifting follows by construction. Finally, we discuss approximating variability in an analysis and we derive variational data-flow equations for an example analysis, a constant propagation...

  4. Short narratives in analyses of employment initiatives

    DEFF Research Database (Denmark)

    Olesen, Søren Peter; Eskelinen, Leena

    2009-01-01

    employment initiatives be met, in terms of research method, with a relevant strategy for data collection and analysis? and 2) How can the perspective of the unemployed be related to effect studies and evaluation? We take as our point of departure two separate development trends: a turn in narrative analysis towards small stories and a new approach...... to evaluation called relational evaluation. We combine these trends in a concept of short narratives about work identity as a qualitative approach to analysing work-life perspectives and the consequences of employment initiatives as seen through the eyes of the unemployed. Publication date: December...

  5. Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses

    Science.gov (United States)

    Abstract: Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative...

  6. Optical region elemental abundance analyses of B and A stars

    International Nuclear Information System (INIS)

    Adelman, S.J.

    1984-01-01

    Abundance analyses using optical region data and fully line-blanketed model atmospheres have been performed for six moderately sharp-lined middle to late B-type stars. The derived abundances have values similar to those of the Sun. (author)

  7. Thermodynamic and Quantum Thermodynamic Analyses of Brownian Movement

    OpenAIRE

    Gyftopoulos, Elias P.

    2006-01-01

    Thermodynamic and quantum thermodynamic analyses are presented of the Brownian movement of a solvent and a colloid passing through neutral thermodynamic equilibrium states only. It is shown that Brownian motors and E. coli do not represent Brownian movement.

  8. Methods for analysing cardiovascular studies with repeated measures

    NARCIS (Netherlands)

    Cleophas, T. J.; Zwinderman, A. H.; van Ouwerkerk, B. M.

    2009-01-01

    Background. Repeated measurements in a single subject are generally more similar than unrepeated measurements in different subjects. Unrepeated analyses of repeated data cause underestimation of the treatment effects. Objective. To review methods adequate for the analysis of cardiovascular studies

  9. Multielement trace analyses of SINQ materials by ICP-OES

    Energy Technology Data Exchange (ETDEWEB)

    Keil, R.; Schwikowski, M. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-09-01

    Inductively Coupled Plasma Optical Emission Spectrometry was used to analyse 70 elements in various materials used for construction of the SINQ. Detection limits for individual elements depend strongly on the matrix and had to be determined separately. (author) 1 tab.

  10. The MAFLA (Mississippi, Alabama, Florida) Study, Grain Size Analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The MAFLA (Mississippi, Alabama, Florida) Study was funded by NOAA as part of the Outer Continental Shelf Program. Dr. L.J. Doyle produced grain size analyses in the...

  11. Climate Prediction Center (CPC) US daily temperature analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The U.S. daily temperature analyses are maps depicting various temperature quantities utilizing daily maximum and minimum temperature data across the US. Maps are...

  12. Finite element analyses for RF photoinjector gun cavities

    International Nuclear Information System (INIS)

    Marhauser, F.

    2006-01-01

    This paper details electromagnetic, thermal and structural 3D Finite Element Analyses (FEA) for normal-conducting RF photoinjector gun cavities. The simulation methods are described extensively, and the results achieved are presented. (orig.)

  13. Summary of Prometheus Radiation Shielding Nuclear Design Analyses, for information

    International Nuclear Information System (INIS)

    J. Stephens

    2006-01-01

    This report transmits a summary of radiation shielding nuclear design studies performed to support the Prometheus project. Together, the enclosures and references associated with this document describe the NRPCT (KAPL and Bettis) shielding nuclear design analyses done for the project.

  14. Book Review: Qualitative-Quantitative Analyses of Dutch and ...

    African Journals Online (AJOL)

    Abstract. Book Title: Qualitative-Quantitative Analyses of Dutch and Afrikaans Grammar and Lexicon. Book Author: Robert S. Kirsner. 2014. John Benjamins Publishing Company ISBN 9789027215772, price ZAR481.00. 239 pages ...

  15. Finite element analyses for RF photoinjector gun cavities

    Energy Technology Data Exchange (ETDEWEB)

    Marhauser, F. [Berliner Elektronenspeicherring-Gesellschaft fuer Synchrotronstrahlung mbH (BESSY), Berlin (Germany)

    2006-07-01

    This paper details electromagnetic, thermal and structural 3D Finite Element Analyses (FEA) for normal-conducting RF photoinjector gun cavities. The simulation methods are described extensively, and the results achieved are presented. (orig.)

  16. In service monitoring based on fatigue analyses, possibilities and limitations

    International Nuclear Information System (INIS)

    Dittmar, S.; Binder, F.

    2004-01-01

    German LWR reactors are equipped with monitoring systems intended to enable a comparison of real transients with load case catalogues and fatigue catalogues for fatigue analyses. The accuracy of this information depends on the accuracy of the measurements, on the consideration of parameters influencing fatigue (medium, component surface, component size, etc.), and on the accuracy of the load analyses. The contribution attempts a critical evaluation, also in view of the fact that real fatigue damage is often impossible to quantify on the basis of fatigue analyses at a later stage. The effects of considering or neglecting various influencing factors are discussed, as well as the consequences of the scatter of the material characteristics on which the analyses are based. Possible measures to be taken in operational monitoring are derived. (orig.) [de]

  17. Selection of interest and inflation rates for infrastructure investment analyses.

    Science.gov (United States)

    2014-12-01

    The South Dakota Department of Transportation (SDDOT) uses engineering economic analyses (EEA) to : support planning, design, and construction decision-making such as project programming and planning, : pavement type selection, and the occasional val...

  18. Economic Analyses of Ware Yam Production in Orlu Agricultural ...

    African Journals Online (AJOL)

    Economic Analyses of Ware Yam Production in Orlu Agricultural Zone of Imo State. ... International Journal of Agriculture and Rural Development ... statistics, gross margin analysis, marginal analysis and multiple regression analysis. Results ...

  19. Analyse des formes et modes traditionnels de communication en ...

    African Journals Online (AJOL)

    Analysis of traditional forms and modes of communication in rural areas ... is found in both the verbal and the non-verbal form of language. ... In fact, messages are expressed in both verbal and non-verbal language.

  20. Elemental abundance analyses with coadded DAO spectrograms: Pt. 5

    International Nuclear Information System (INIS)

    Adelman, S.J.

    1988-01-01

    Elemental abundance analyses of three mercury-manganese stars were performed in a manner consistent with the previous analyses of this series. A few correlations are found among the derived abundances and with the effective temperature, in accordance with the expectations of radiative diffusion explanations of the derived abundances. The helium abundances are smaller than the value required to sustain the superficial helium convection zone in the atmospheres of these stars. (author)

  1. Analysing Information Systems Security In Higher Learning Institutions Of Uganda

    OpenAIRE

    Mugyenyi Raymond

    2017-01-01

    Information communication technology has increased globalisation in higher learning institution all over the world. This has been achieved through introduction of systems that ease operations related to information handling in the institutions. The paper assessed and analysed the information systems security performance status in higher learning institutions of Uganda. The existing policies that govern the information security have also been analysed together with the current status of inform...

  2. Cooling tower wood sampling and analyses: A case study

    International Nuclear Information System (INIS)

    Haymore, J.L.

    1985-01-01

    Extensive wood sampling and analyses programs were initiated on crossflow and counterflow cooling towers that have been in service since 1951 and 1955, respectively. Wood samples were taken from all areas of the towers and were subjected to biological, chemical and physical tests. The tests and results for the analyses are discussed. The results indicate the degree of wood deterioration, and areas of the towers which experience the most advanced degree of degradation

  3. A protocol for analysing mathematics teacher educators' practices

    OpenAIRE

    Kuzle , Ana; Biehler , Rolf

    2015-01-01

    Studying practices in a teaching-learning environment, such as professional development programmes, is a complex and multi-faceted endeavour. While several frameworks exist to help researchers analyse teaching practices, none exist to analyse practices of those who organize professional development programmes, namely mathematics teacher educators. In this paper, based on theoretical as well as empirical results, we present a protocol for capturing different aspects of ...

  4. Structural analyses of ITER toroidal field coils under fault conditions

    International Nuclear Information System (INIS)

    Jong, C.T.J.

    1992-04-01

    ITER (International Thermonuclear Experimental Reactor) is intended to be an experimental thermonuclear tokamak reactor testing the basic physics performance and technologies essential to future fusion reactors. The magnet system of ITER consists essentially of 4 sub-systems, i.e. toroidal field coils (TFCs), poloidal field coils (PFCs), power supplies, and cryogenic supplies. These subsystems do not contain significant radioactivity inventories, but the large energy inventory is a potential accident initiator. The aim of the structural analyses is to prevent accidents from propagating into the vacuum vessel, tritium system and cooling system, which all contain significant amounts of radioactivity. As part of the design process, 3 conditions are defined for the PF and TF coils at which the mechanical behaviour has to be analyzed in some detail, viz. normal operating conditions, upset conditions and fault conditions. This paper describes the work carried out by ECN to create a detailed finite element model of 16 TFCs, as well as the results of some fault-condition analyses made with the model. Under fault conditions, either electrical or mechanical, the magnetic loading of the TFCs becomes abnormal and further mechanical failure of parts of the overall structure might occur (e.g. failure of the coil, gravitational supports, or intercoil structure). The analyses performed consist of linear elastic stress analyses and electro-magneto-structural analyses (coupled field analyses). 8 refs.; 5 figs.; 5 tabs

  5. Selection, rejection and optimisation of pyrolytic graphite (PG) crystal analysers for use on the new IRIS graphite analyser bank

    International Nuclear Information System (INIS)

    Marshall, P.J.; Sivia, D.S.; Adams, M.A.; Telling, M.T.F.

    2000-01-01

    This report discusses design problems incurred in equipping the IRIS high-resolution inelastic spectrometer at the ISIS pulsed neutron source, UK, with a new 4212-piece pyrolytic graphite crystal analyser array. Of the 4212 graphite pieces required, approximately 2500 will be newly purchased PG crystals, with the remainder comprising the currently installed graphite analysers. The quality of the new analyser pieces with respect to manufacturing specifications is assessed, as is the optimum arrangement of new PG pieces amongst old to circumvent degradation of the spectrometer's current angular resolution. Techniques employed to achieve these criteria include accurate calliper measurements, FORTRAN programming and statistical analysis. (author)

  6. Accelerated safety analyses - structural analyses Phase I - structural sensitivity evaluation of single- and double-shell waste storage tanks

    International Nuclear Information System (INIS)

    Becker, D.L.

    1994-11-01

    Accelerated Safety Analyses - Phase I (ASA-Phase I) have been conducted to assess the appropriateness of existing tank farm operational controls and/or limits as now stipulated in the Operational Safety Requirements (OSRs) and Operating Specification Documents, and to establish a technical basis for the waste tank operating safety envelope. Structural sensitivity analyses were performed to assess the response of the different waste tank configurations to variations in loading conditions, uncertainties in loading parameters, and uncertainties in material characteristics. Extensive documentation of the sensitivity analyses conducted and results obtained are provided in the detailed ASA-Phase I report, Structural Sensitivity Evaluation of Single- and Double-Shell Waste Tanks for Accelerated Safety Analysis - Phase I. This document provides a summary of the accelerated safety analyses sensitivity evaluations and the resulting findings

  7. Statistical mechanics of light elements at high pressure. VII. A perturbative free energy for arbitrary mixtures of H and He

    International Nuclear Information System (INIS)

    Hubbard, W.B.; Dewitt, H.E.

    1985-01-01

    A model free energy is presented which accurately represents results from 45 high-precision Monte Carlo calculations of the thermodynamics of hydrogen-helium mixtures at pressures of astrophysical and planetophysical interest. The free energy is calculated using free-electron perturbation theory (dielectric function theory), and is an extension of the expression given in an earlier paper in this series. However, it fits the Monte Carlo results more accurately, and is valid for the full range of compositions from pure hydrogen to pure helium. Using the new free energy, the phase diagram of mixtures of liquid metallic hydrogen and helium is calculated and compared with earlier results. Sample results for mixing volumes are also presented, and the new free energy expression is used to compute a theoretical Jovian adiabat and compare the adiabat with results from three-dimensional Thomas-Fermi-Dirac theory. The present theory gives slightly higher densities at pressures of about 10 megabars. 20 references

  8. Statistical mechanics of light elements at high pressure. VII - A perturbative free energy for arbitrary mixtures of H and He

    Science.gov (United States)

    Hubbard, W. B.; Dewitt, H. E.

    1985-01-01

    A model free energy is presented which accurately represents results from 45 high-precision Monte Carlo calculations of the thermodynamics of hydrogen-helium mixtures at pressures of astrophysical and planetophysical interest. The free energy is calculated using free-electron perturbation theory (dielectric function theory), and is an extension of the expression given in an earlier paper in this series. However, it fits the Monte Carlo results more accurately, and is valid for the full range of compositions from pure hydrogen to pure helium. Using the new free energy, the phase diagram of mixtures of liquid metallic hydrogen and helium is calculated and compared with earlier results. Sample results for mixing volumes are also presented, and the new free energy expression is used to compute a theoretical Jovian adiabat and compare the adiabat with results from three-dimensional Thomas-Fermi-Dirac theory. The present theory gives slightly higher densities at pressures of about 10 megabars.

  9. Particle-induced X-ray emission: thick-target analysis of inorganic materials in the determination of light elements

    International Nuclear Information System (INIS)

    Perez-Arantegui, J.; Castillo, J.R.; Querre, G.

    1994-01-01

    Particle-induced X-ray emission (PIXE) has been applied to the analysis of inorganic materials to determine some elements with Z < 27: Na, Mg, Al, Si, K, Ca, Ti, Mn and Fe, in thick-target analysis. A PIXE method has been developed for the analysis of geological materials, ceramics and pottery. Work has been carried out with an ion beam analytical system, using a low particle beam energy. Relative sensitivity, detection limits, reproducibility and accuracy of the method were calculated based on the analysis of geological standard materials (river sediments, argillaceous limestone, basalt, diorite and granite). Analysis using PIXE offers a number of advantages, such as short analysis time, multi-elemental and nondestructive determinations, and the results are similar to those obtained with other instrumental techniques of analysis. (Author)

  10. Detection of X-ray fluorescence of light elements by electron counting in a low-pressure gaseous electron multiplier

    International Nuclear Information System (INIS)

    Pansky, A.; Breskin, A.; Chechik, R.; Malamud, G.

    1992-12-01

    Ionization electrons deposited by soft X-rays in a low-pressure (10 Torr) gas medium are efficiently counted by a multistage electron multiplier, providing an accurate measurement of the X-ray photon energy. Energy resolutions of 56-28% FWHM were measured for X-rays of 110-676 eV, recording electrically induced charges or visible photons emitted during the avalanche process. It is demonstrated that a combined analysis of the number of electrons and the trail length of an event provides a powerful and competitive way of resolving ultra-soft X-rays. We present the experimental technique, discuss the advantages and limitations of the Primary Electron Counter, and suggest ways to improve its performance. (authors)
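
    Because the energy measurement above rests on counting primary ionization electrons, the attainable resolution is bounded by counting statistics. As a hedged illustration (not from the paper, and ignoring avalanche-gain fluctuations and counting losses), a purely Poissonian spread over an average of N primaries gives a relative FWHM resolution of about 2.355/sqrt(N):

```python
import math

def fwhm_resolution_percent(n_primary):
    """Poisson-limited relative energy resolution (FWHM, in percent)
    for an event depositing n_primary primary electrons on average.
    FWHM = 2.355 * sigma, and sigma/N = 1/sqrt(N) for Poisson counting."""
    return 2.355 / math.sqrt(n_primary) * 100.0

# Softer X-rays free fewer primaries, so the resolution degrades:
for n in (16, 50, 100):
    print(f"N = {n:3d}: {fwhm_resolution_percent(n):5.1f}% FWHM")
```

    With these illustrative numbers, roughly 16 primaries give about 59% FWHM and 100 primaries about 24%, the same order as the 56-28% range reported; a real detector adds gain and loss terms on top of this limit.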

  11. The SNS target station preliminary Title I shielding analyses

    International Nuclear Information System (INIS)

    Johnson, J.O.; Santoro, R.T.; Lillie, R.A.; Barnes, J.M.; McNeilly, G.S.

    2000-01-01

    The Department of Energy (DOE) has given the Spallation Neutron Source (SNS) project approval to begin Title I design of the proposed facility to be built at Oak Ridge National Laboratory (ORNL). During the conceptual design phase of the SNS project, the target station bulk-biological shield was characterized and the activation of the major target station components was calculated. Shielding requirements were assessed with respect to weight, space, and dose-rate constraints for operating, shut-down, and accident conditions utilizing the SNS shield design criteria, DOE Order 5480.25, and requirements specified in 10 CFR 835. Since completion of the conceptual design phase, there have been major design changes to the target station as a result of the initial shielding and activation analyses, modifications brought about by engineering concerns, and feedback from numerous external review committees. These design changes have impacted the results of the conceptual design analyses and consequently have required a re-investigation of the new design. Furthermore, the conceptual design shielding analysis did not address many of the details associated with the engineering design of the target station. In this paper, some of the proposed SNS target station preliminary Title I shielding design analyses will be presented. The SNS facility (with emphasis on the target station), shielding design requirements, calculational strategy, and source terms used in the analyses will be described. Preliminary results and conclusions, along with recommendations for additional analyses, will also be presented. (author)

  12. Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses

    Science.gov (United States)

    Harper, Sam; Ruder, Eric; Roman, Henry A.; Geggel, Amelia; Nweke, Onyemaechi; Payne-Sturges, Devon; Levy, Jonathan I.

    2013-01-01

    Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative measures of health inequality in other settings, and these measures may be applicable to environmental regulatory analyses. In this paper, we provide information to assist policy decision makers in determining the viability of using measures of health inequality in the context of environmental regulatory analyses. We conclude that quantification of the distribution of inequalities in health outcomes across social groups of concern, considering both within-group and between-group comparisons, would be consistent with both the structure of regulatory analysis and the core definition of environmental justice. Appropriate application of inequality indicators requires thorough characterization of the baseline distribution of exposures and risks, leveraging data generally available within regulatory analyses. Multiple inequality indicators may be applicable to regulatory analyses, and the choice among indicators should be based on explicit value judgments regarding the dimensions of environmental justice of greatest interest. PMID:23999551
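
    The within-group and between-group comparison called for above can be made concrete with an additive inequality index. As an illustrative sketch only (the paper does not prescribe a specific indicator), Theil's T index of an exposure distribution decomposes exactly into between-group and within-group components:

```python
import math

def theil(values):
    """Theil's T inequality index of a list of positive exposures."""
    n = len(values)
    mu = sum(values) / n
    return sum((v / mu) * math.log(v / mu) for v in values) / n

def theil_decomposition(groups):
    """Split total Theil T over named groups into (between, within) parts.
    `groups` maps a group label to that group's list of exposures."""
    all_values = [v for vs in groups.values() for v in vs]
    n = len(all_values)
    mu = sum(all_values) / n
    between = within = 0.0
    for vs in groups.values():
        group_mean = sum(vs) / len(vs)
        share = (len(vs) / n) * (group_mean / mu)  # group's exposure share
        between += share * math.log(group_mean / mu)
        within += share * theil(vs)
    return between, within

# Hypothetical exposure data for two social groups:
groups = {"A": [1.0, 2.0, 3.0], "B": [4.0, 5.0, 6.0]}
b, w = theil_decomposition(groups)
# b + w reproduces theil([1, 2, 3, 4, 5, 6]) up to rounding
```

    The exact additivity of the two components is what makes indices of this family convenient for the kind of regulatory comparison the authors describe.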

  13. Methodological Quality Assessment of Meta-analyses in Endodontics.

    Science.gov (United States)

    Kattan, Sereen; Lee, Su-Min; Kohli, Meetu R; Setzer, Frank C; Karabucak, Bekir

    2018-01-01

    The objectives of this review were to assess the methodological quality of published meta-analyses related to endodontics using the assessment of multiple systematic reviews (AMSTAR) tool and to provide a follow-up to previously published reviews. Three electronic databases were searched for eligible studies according to the inclusion and exclusion criteria: Embase via Ovid, The Cochrane Library, and Scopus. The electronic search was amended by a hand search of 6 dental journals (International Endodontic Journal; Journal of Endodontics; Australian Endodontic Journal; Oral Surgery, Oral Medicine, Oral Pathology, Oral Radiology; Endodontics and Dental Traumatology; and Journal of Dental Research). The searches were conducted to include articles published after July 2009, and the deadline for inclusion of the meta-analyses was November 30, 2016. The AMSTAR assessment tool was used to evaluate the methodological quality of all included studies. A total of 36 reports of meta-analyses were included. The overall quality of the meta-analyses reports was found to be medium, with an estimated mean overall AMSTAR score of 7.25 (95% confidence interval, 6.59-7.90). The most poorly assessed areas were providing an a priori design, the assessment of the status of publication, and publication bias. In recent publications in the field of endodontics, the overall quality of the reported meta-analyses is medium according to AMSTAR. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  14. Seismic risk analyses in the German Risk Study, phase B

    International Nuclear Information System (INIS)

    Hosser, D.; Liemersdorf, H.

    1991-01-01

    The paper discusses some aspects of the seismic risk part of the German Risk Study for Nuclear Power Plants, Phase B. First simplified analyses in Phase A of the study allowed only a rough classification of structures and systems of the PWR reference plant according to their seismic risk contribution. These studies were extended in Phase B using improved models for the dynamic analyses of buildings, structures and components as well as for the probabilistic analyses of seismic loading, failure probabilities and event trees. The methodology of deriving probabilistic seismic load descriptions is explained and compared with the methods in Phase A of the study and in other studies. Some details of the linear and nonlinear dynamic analyses of structures are reported in order to demonstrate the influence of different assumptions for material behaviour and failure criteria. The probabilistic structural and event tree analyses are discussed with respect to distribution assumptions, acceptable simplifications and model uncertainties. Some results for the PWR reference plant are given. (orig.)

  15. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Background: Large-scale molecular evolutionary analyses of protein-coding sequences require a number of inter-related preparatory steps, from finding gene families to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein-coding sequences in a genome) from a large number of species. Methods: We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results: We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion: Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  16. [Methods, challenges and opportunities for big data analyses of microbiome].

    Science.gov (United States)

    Sheng, Hua-Fang; Zhou, Hong-Wei

    2015-07-01

    Microbiome is a novel research field related to a variety of chronic inflammatory diseases. Technically, there are two major approaches to the analysis of the microbiome: metataxonomics, by sequencing of 16S rRNA variable tags, and metagenomics, by shotgun sequencing of the total microbial (mainly bacterial) genome mixture. The 16S rRNA sequencing analysis pipeline includes sequence quality control, diversity analyses, taxonomy and statistics; metagenome analysis further includes gene annotation and functional analyses. With the development of sequencing techniques, the cost of sequencing will decrease, and big data analyses will become the central task. Data standardization, accumulation, modeling and disease prediction are crucial for the future exploitation of these data. Meanwhile, the informational properties of these data, and functional verification with culture-dependent and culture-independent experiments, remain the focus of future research. Studies of the human microbiome will bring a better understanding of the relations between the human body and the microbiome, especially in the context of disease diagnosis and therapy, which promises rich research opportunities.
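
    One of the diversity analyses in the 16S pipeline above can be sketched in a few lines. This is a generic illustration over a hypothetical OTU count vector, not any particular pipeline's API: the Shannon index of a sample's taxon counts.

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over taxon counts."""
    total = sum(counts)
    proportions = (c / total for c in counts if c > 0)
    return -sum(p * math.log(p) for p in proportions)

# A perfectly even community of k taxa scores H' = ln(k);
# a community dominated by one taxon scores close to 0.
even = shannon_index([25, 25, 25, 25])
skewed = shannon_index([97, 1, 1, 1])
```

    Comparing such per-sample (alpha) diversity values across cohorts is a typical early step before the taxonomy and statistics stages mentioned in the abstract.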

  17. Performance and Vibration Analyses of Lift-Offset Helicopters

    Directory of Open Access Journals (Sweden)

    Jeong-In Go

    2017-01-01

    A validation study on the performance and vibration analyses of the XH-59A compound helicopter is conducted to establish techniques for the comprehensive analysis of lift-offset compound helicopters. This study considers the XH-59A lift-offset compound helicopter, which uses a rigid coaxial rotor system, as a verification model. CAMRAD II (Comprehensive Analytical Method of Rotorcraft Aerodynamics and Dynamics II), a comprehensive analysis code, is used as a tool for the performance, vibration, and loads analyses. A general free wake model, which is more sophisticated than other wake models, is used to obtain good results for the comprehensive analysis. Performance analyses of the XH-59A helicopter with and without auxiliary propulsion are conducted in various flight conditions. In addition, vibration analyses of the XH-59A compound helicopter configuration are conducted in the forward flight condition. The present comprehensive analysis results are in good agreement with the flight test and previous analyses. Therefore, techniques for the comprehensive analysis of lift-offset compound helicopters are appropriately established. Furthermore, the rotor lifts are calculated for the XH-59A lift-offset compound helicopter in the forward flight condition to investigate the airload characteristics of the ABC™ (Advancing Blade Concept) rotor.

  18. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    Science.gov (United States)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large-scale surface models with small spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large-scale data for visibility analyses on the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on the classic Boolean visibility that is usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. A case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and the global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. In addition, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is presented. The case study proved that large-scale models are an appropriate data source for visibility analyses on the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
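
    The Boolean viewshed mentioned above reduces, per target cell, to a line-of-sight test: a cell is visible if no intermediate cell subtends a larger vertical angle from the observer than the target itself. A minimal sketch on a raster DEM follows (a hypothetical helper for illustration, not the workflow or software used in the article):

```python
def line_of_sight(dem, observer, target, observer_height=1.7):
    """Boolean visibility of `target` from `observer` on a square-grid DEM.
    dem is a 2-D list of elevations; observer/target are (row, col) cells.
    Intermediate cells are sampled along the straight line between them."""
    (r0, c0), (r1, c1) = observer, target
    eye = dem[r0][c0] + observer_height
    steps = max(abs(r1 - r0), abs(c1 - c0))
    if steps <= 1:
        return True  # adjacent or identical cells: nothing in between
    # Vertical slope per step at which the target itself is seen.
    target_slope = (dem[r1][c1] - eye) / steps
    for k in range(1, steps):
        rr = round(r0 + (r1 - r0) * k / steps)
        cc = round(c0 + (c1 - c0) * k / steps)
        if (dem[rr][cc] - eye) / k >= target_slope:
            return False  # an intermediate cell blocks the sightline
    return True

flat = [[0.0] * 5 for _ in range(5)]
ridge = [row[:] for row in flat]
ridge[0][2] = 10.0  # a wall between observer and target
# line_of_sight(flat, (0, 0), (0, 4))  blocks nothing;
# line_of_sight(ridge, (0, 0), (0, 4)) is blocked by the wall.
```

    A full Boolean viewshed repeats this test for every cell of the raster; the extended viewsheds discussed in the article additionally record quantities such as the angle above the local horizon instead of a plain true/false.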

  19. HLA region excluded by linkage analyses of early onset periodontitis

    Energy Technology Data Exchange (ETDEWEB)

    Sun, C.; Wang, S.; Lopez, N.

    1994-09-01

    Previous studies suggested that HLA genes may influence susceptibility to early-onset periodontitis (EOP). Segregation analyses indicate that EOP may be due to a single major gene. We conducted linkage analyses to assess possible HLA effects on EOP. Fifty families with two or more close relatives affected by EOP were ascertained in Virginia and Chile. A microsatellite polymorphism within the HLA region (at the tumor necrosis factor beta locus) was typed using PCR. Linkage analyses used a dominant model most strongly supported by previous studies. Assuming locus homogeneity, our results exclude a susceptibility gene within 10 cM on either side of our marker locus. This encompasses all of the HLA region. Analyses assuming alternative models gave qualitatively similar results. Allowing for locus heterogeneity, our data still provide no support for HLA-region involvement. However, our data do not statistically exclude (LOD <-2.0) hypotheses of disease-locus heterogeneity, including models where up to half of our families could contain an EOP disease gene located in the HLA region. This is due to the limited power of even our relatively large collection of families and the inherent difficulties of mapping genes for disorders that have complex and heterogeneous etiologies. Additional statistical analyses, recruitment of families, and typing of flanking DNA markers are planned to more conclusively address these issues with respect to the HLA region and other candidate locations in the human genome. Additional results for markers covering most of the human genome will also be presented.

  20. Microcomputer-controlled thermoluminescent analyser IJS MR-200

    International Nuclear Information System (INIS)

    Mihelic, M.; Miklavzic, U.; Rupnik, Z.; Satalic, P.; Spreizer, F.; Zerovnik, I.

    1985-01-01

    The performance and concept of the multipurpose, microcomputer-controlled thermoluminescent analyser, designed for laboratory work with TL dosemeters as well as for routine dose readings in the range from ecological to accident doses, are described. The main features of the analyser are: time-linear sampling, digitalisation, storing, and subsequent displaying on the monitor of the glow curve and temperature curve of the TL material on a time scale; digital stabilisation, control and diagnostics of the analog unit; the ability to store 7 different 8-parameter heating programs; and the ability to store 15 evaluation programs defined by 2 or 4 parameters and 3 different algorithms (altogether 5 types of evaluations). The analyser has several features intended for routine work: 9 function keys and the possibility of forming files on cassette or disc, of dose calculation and averaging, of printing reports with names, and of additional programming in Basic. (author)

  1. A computer program for multiple decrement life table analyses.

    Science.gov (United States)

    Poole, W K; Cooley, P C

    1977-06-01

    Life table analysis has traditionally been the tool of choice for analyzing distributions of "survival" times when a parametric form for the survival curve cannot reasonably be assumed. Chiang, in two papers [1,2], formalized the theory of life table analyses in a Markov chain framework and derived maximum likelihood estimates of the relevant parameters for the analyses. He also discussed how the techniques could be generalized to consider competing risks and follow-up studies. Although various computer programs exist for doing different types of life table analysis [3], to date there has not been a generally available, well-documented computer program to carry out multiple decrement analyses, either by Chiang's or any other method. This paper describes such a program developed by Research Triangle Institute. A user's manual, which supplements the contents of this paper with a discussion of the formulas used and the program listing, is available at printing cost.
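
    The core multiple decrement bookkeeping that such a program automates can be sketched briefly (an illustrative reimplementation under simplified assumptions, not the Research Triangle Institute code): starting from a radix cohort, each age interval's cause-specific conditional probabilities of decrement allocate exits by cause, and all causes jointly shrink the surviving cohort.

```python
def multiple_decrement_table(q_by_interval, radix=100_000.0):
    """Build a simple multiple decrement life table.
    q_by_interval: one dict per age interval mapping cause -> conditional
    probability of decrement from that cause within the interval.
    Returns (rows, survivors): each row holds (l_x, deaths-by-cause)."""
    survivors = radix
    rows = []
    for q in q_by_interval:
        deaths = {cause: survivors * p for cause, p in q.items()}
        rows.append((survivors, deaths))
        survivors *= 1.0 - sum(q.values())  # causes compete within the interval
    return rows, survivors

# Two age intervals, two competing causes in the first:
rows, alive = multiple_decrement_table(
    [{"cause_a": 0.10, "cause_b": 0.05}, {"cause_a": 0.20}]
)
# alive is about 68000 (100000 x 0.85 x 0.80)
```

    Chiang's formulation goes further, supplying maximum likelihood estimates and variances for these quantities; the sketch above only shows the deterministic table construction.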

  2. Conducting qualitative research in mental health: Thematic and content analyses.

    Science.gov (United States)

    Crowe, Marie; Inder, Maree; Porter, Richard

    2015-07-01

    The objective of this paper is to describe two methods of qualitative analysis - thematic analysis and content analysis - and to examine their use in a mental health context. A description of the processes of thematic analysis and content analysis is provided. These processes are then illustrated by conducting two analyses of the same qualitative data. Transcripts of qualitative interviews are analysed using each method to illustrate these processes. The illustration of the processes highlights the different outcomes from the same set of data. Thematic and content analyses are qualitative methods that serve different research purposes. Thematic analysis provides an interpretation of participants' meanings, while content analysis is a direct representation of participants' responses. These methods provide two ways of understanding meanings and experiences and provide important knowledge in a mental health context. © The Royal Australian and New Zealand College of Psychiatrists 2015.

  3. Fourier Transform Mass Spectrometry: The Transformation of Modern Environmental Analyses

    Science.gov (United States)

    Lim, Lucy; Yan, Fangzhi; Bach, Stephen; Pihakari, Katianna; Klein, David

    2016-01-01

    Unknown compounds in environmental samples are difficult to identify using standard mass spectrometric methods. Fourier transform mass spectrometry (FTMS) has revolutionized how environmental analyses are performed. With its unsurpassed mass accuracy, high resolution, and sensitivity, researchers now have a tool for difficult and complex environmental analyses. Two features of FTMS are responsible for changing the face of how complex analyses are accomplished. First is the ability to determine, quickly and with high mass accuracy, the presence of unknown chemical residues in samples. For years, the field had been limited by mass spectrometric methods that depended on knowing in advance what the compounds of interest were. Second, by utilizing the high-resolution capabilities coupled with the low detection limits of FTMS, analysts can also dilute the sample sufficiently to minimize ionization changes from varied matrices. PMID:26784175
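    The "unsurpassed mass accuracy" referred to in the record is conventionally reported in parts per million (ppm) between measured and theoretical m/z; a minimal sketch of that calculation, with hypothetical values:

```python
# Illustrative sketch (not from the record): the ppm mass-error figure of
# merit commonly quoted for FTMS instruments. The m/z values are hypothetical.

def ppm_error(measured_mz, theoretical_mz):
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

# e.g. a measured ion at m/z 285.0762 vs a theoretical 285.0768
err = ppm_error(285.0762, 285.0768)   # ≈ -2.1 ppm
```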

  4. PWR plant transient analyses using TRAC-PF1

    International Nuclear Information System (INIS)

    Ireland, J.R.; Boyack, B.E.

    1984-01-01

    This paper describes some of the pressurized water reactor (PWR) transient analyses performed at Los Alamos for the US Nuclear Regulatory Commission using the Transient Reactor Analysis Code (TRAC-PF1). Many of the transient analyses performed directly address current PWR safety issues. Included in this paper are examples of two safety issues addressed by TRAC-PF1. These examples are pressurized thermal shock (PTS) and feed-and-bleed cooling for Oconee-1. The calculations performed were plant specific in that details of both the primary and secondary sides were modeled, in addition to models of the plant integrated control systems. The results of these analyses show that for these two transients, the reactor cores remained covered and cooled at all times, posing no real threat to the reactor system or to the public.

  5. Finite element analyses of a linear-accelerator electron gun

    Science.gov (United States)

    Iqbal, M.; Wasy, A.; Islam, G. U.; Zhou, Z.

    2014-02-01

    Thermo-structural analyses of the Beijing Electron-Positron Collider (BEPCII) linear-accelerator electron gun were performed for the gun operating with the cathode at 1000 °C. The gun was modeled in computer aided three-dimensional interactive application for finite element analyses through the ANSYS workbench. This was followed by simulations using the SLAC electron beam trajectory program EGUN for beam optics analyses. The simulations were compared with experimental results of the assembly to verify its beam parameters under the same boundary conditions. Simulation and test results were found to be in good agreement and hence confirmed the design parameters under the defined operating temperature. The gun has been operating continuously since commissioning, without any thermally induced failures, in the BEPCII linear accelerator.

  6. Finite element analyses of a linear-accelerator electron gun

    Energy Technology Data Exchange (ETDEWEB)

    Iqbal, M., E-mail: muniqbal.chep@pu.edu.pk, E-mail: muniqbal@ihep.ac.cn [Centre for High Energy Physics, University of the Punjab, Lahore 45590 (Pakistan); Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China); Wasy, A. [Department of Mechanical Engineering, Changwon National University, Changwon 641773 (Korea, Republic of); Islam, G. U. [Centre for High Energy Physics, University of the Punjab, Lahore 45590 (Pakistan); Zhou, Z. [Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China)

    2014-02-15

    Thermo-structural analyses of the Beijing Electron-Positron Collider (BEPCII) linear-accelerator electron gun were performed for the gun operating with the cathode at 1000 °C. The gun was modeled in computer aided three-dimensional interactive application for finite element analyses through the ANSYS workbench. This was followed by simulations using the SLAC electron beam trajectory program EGUN for beam optics analyses. The simulations were compared with experimental results of the assembly to verify its beam parameters under the same boundary conditions. Simulation and test results were found to be in good agreement and hence confirmed the design parameters under the defined operating temperature. The gun has been operating continuously since commissioning, without any thermally induced failures, in the BEPCII linear accelerator.

  7. Finite element analyses of a linear-accelerator electron gun

    International Nuclear Information System (INIS)

    Iqbal, M.; Wasy, A.; Islam, G. U.; Zhou, Z.

    2014-01-01

    Thermo-structural analyses of the Beijing Electron-Positron Collider (BEPCII) linear-accelerator electron gun were performed for the gun operating with the cathode at 1000 °C. The gun was modeled in computer aided three-dimensional interactive application for finite element analyses through the ANSYS workbench. This was followed by simulations using the SLAC electron beam trajectory program EGUN for beam optics analyses. The simulations were compared with experimental results of the assembly to verify its beam parameters under the same boundary conditions. Simulation and test results were found to be in good agreement and hence confirmed the design parameters under the defined operating temperature. The gun has been operating continuously since commissioning, without any thermally induced failures, in the BEPCII linear accelerator.

  8. Towards Reproducible Research Data Analyses in LHC Particle Physics

    CERN Document Server

    Simko, Tibor

    2017-01-01

    The reproducibility of a research data analysis requires having access not only to the original datasets, but also to the computing environment, the analysis software and the workflow used to produce the original results. We present the nascent CERN Analysis Preservation platform with a set of tools developed to support particle physics researchers in preserving the knowledge around analyses so that capturing, sharing, reusing and reinterpreting data becomes easier. The presentation will focus on three pillars: (i) capturing structured knowledge information about data analysis processes; (ii) capturing the computing environment, the software code, the datasets, the configuration and other information assets used in data analyses; (iii) re-instantiating preserved analyses on a containerised computing cloud for the purposes of re-validation and re-interpretation.

  9. Fourier Transform Mass Spectrometry: The Transformation of Modern Environmental Analyses

    Directory of Open Access Journals (Sweden)

    Lucy Lim

    2016-01-01

    Full Text Available Unknown compounds in environmental samples are difficult to identify using standard mass spectrometric methods. Fourier transform mass spectrometry (FTMS) has revolutionized how environmental analyses are performed. With its unsurpassed mass accuracy, high resolution, and sensitivity, researchers now have a tool for difficult and complex environmental analyses. Two features of FTMS are responsible for changing the face of how complex analyses are accomplished. First is the ability to determine, quickly and with high mass accuracy, the presence of unknown chemical residues in samples. For years, the field had been limited by mass spectrometric methods that depended on knowing in advance what the compounds of interest were. Second, by utilizing the high-resolution capabilities coupled with the low detection limits of FTMS, analysts can also dilute the sample sufficiently to minimize ionization changes from varied matrices.

  10. Scenario evolution: Interaction between event tree construction and numerical analyses

    International Nuclear Information System (INIS)

    Barr, G.E.; Barnard, R.W.; Dockery, H.A.; Dunn, E.; MacIntyre, A.T.

    1990-01-01

    Construction of well-posed scenarios for the range of conditions possible at any proposed repository site is a critical first step to assessing total system performance. Event tree construction is the method that is being used to develop potential failure scenarios for the proposed nuclear waste repository at Yucca Mountain. An event tree begins with an initial event or condition. Subsequent events are listed in a sequence, leading eventually to release of radionuclides to the accessible environment. Ensuring the validity of the scenarios requires iteration between problems constructed using scenarios contained in the event tree sequence, experimental results, and numerical analyses. Details not adequately captured within the tree initially may become more apparent as a result of analyses. To illustrate this process, the authors discuss the iterations used to develop numerical analyses for PACE-90 (Performance Assessment Calculational Exercises) using basaltic igneous activity and human-intrusion event trees

  11. Scenario evolution: Interaction between event tree construction and numerical analyses

    International Nuclear Information System (INIS)

    Barr, G.E.; Barnard, R.W.; Dockery, H.A.; Dunn, E.; MacIntyre, A.T.

    1991-01-01

    Construction of well-posed scenarios for the range of conditions possible at any proposed repository site is a critical first step to assessing total system performance. Event tree construction is the method that is being used to develop potential failure scenarios for the proposed nuclear waste repository at Yucca Mountain. An event tree begins with an initial event or condition. Subsequent events are listed in a sequence, leading eventually to release of radionuclides to the accessible environment. Ensuring the validity of the scenarios requires iteration between problems constructed using scenarios contained in the event tree sequence, experimental results, and numerical analyses. Details not adequately captured within the tree initially may become more apparent as a result of analyses. To illustrate this process, we discuss the iterations used to develop numerical analyses for PACE-90 using basaltic igneous activity and human-intrusion event trees

  12. Socioeconomic issues and analyses for radioactive waste disposal facilities

    International Nuclear Information System (INIS)

    Ulland, L.

    1988-01-01

    Radioactive Waste facility siting and development can raise major social and economic issues in the host area. Initial site screening and analyses have been conducted for both potential high-level and low-level radioactive waste facilities; more detailed characterization and analyses are being planned. Results of these assessments are key to developing community plans that identify and implement measures to mitigate adverse socioeconomic impacts. Preliminary impact analyses conducted at high-level sites in Texas and Nevada, and site screening activities for low-level facilities in Illinois and California have identified a number of common socioeconomic issues and characteristics as well as issues and characteristics that differ between the sites and the type of facilities. Based on these comparisons, implications for selection of an appropriate methodology for impact assessment and elements of impact mitigation are identified

  13. Financial relationships in economic analyses of targeted therapies in oncology.

    Science.gov (United States)

    Valachis, Antonis; Polyzos, Nikolaos P; Nearchou, Andreas; Lind, Pehr; Mauri, Davide

    2012-04-20

    A potential financial relationship between investigators and pharmaceutical manufacturers has been associated with an increased likelihood of reporting favorable conclusions about a sponsor's proprietary agent in pharmacoeconomic studies. The purpose of this study is to investigate whether there is an association between financial relationships and outcome in economic analyses of new targeted therapies in oncology. We searched PubMed (last update June 2011) for economic analyses of targeted therapies (including monoclonal antibodies, tyrosine-kinase inhibitors, and mammalian target of rapamycin inhibitors) in oncology. The trials were qualitatively rated regarding the cost assessment as favorable, neutral, or unfavorable on the basis of prespecified criteria. Overall, 81 eligible studies were identified. Economic analyses that were funded by pharmaceutical companies were more likely to report favorable qualitative cost estimates (28 [82%] of 34 v 21 [45%] of 47; P = .003). The presence of an author affiliated with manufacturer was not associated with study outcome. Furthermore, if only studies including a conflict of interest statement were included (66 of 81), studies that reported any financial relationship with manufacturers (author affiliation and/or funding and/or other financial relationship) were more likely to report favorable results of targeted therapies compared with studies without financial relationship (32 [71%] of 45 v nine [43%] of 21; P = .025). Our study reveals a potential threat for industry-related bias in economic analyses of targeted therapies in oncology in favor of analyses with financial relationships between authors and manufacturers. A more balanced funding of economic analyses from other sources may allow greater confidence in the interpretation of their results.
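    The funded-versus-unfunded comparison reported above (28 favorable of 34 vs 21 of 47) is a 2×2 contingency problem. As an illustrative sketch only: a Pearson chi-square on those counts. The record does not state which test the authors used, so the implied P value need not match the reported P = .003.

```python
# Illustrative sketch: Pearson chi-square statistic (no continuity
# correction) for a 2x2 table. Counts are taken from the abstract:
# industry-funded studies: 28 favorable, 6 not; others: 21 favorable, 26 not.
def chi_square_2x2(a, b, c, d):
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

stat = chi_square_2x2(28, 6, 21, 26)   # ≈ 11.7, well above the df=1 5% cutoff of 3.84
```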

  14. Publication bias in dermatology systematic reviews and meta-analyses.

    Science.gov (United States)

    Atakpo, Paul; Vassar, Matt

    2016-05-01

    Systematic reviews and meta-analyses in dermatology provide high-level evidence for clinicians and policy makers that influences clinical decision making and treatment guidelines. One methodological problem with systematic reviews is the under-representation of unpublished studies. This problem is due in part to publication bias. Omission of statistically non-significant data from meta-analyses may result in overestimation of treatment effect sizes, which may lead to clinical consequences. Our goal was to assess whether systematic reviewers in dermatology evaluate and report publication bias. Further, we wanted to conduct our own evaluation of publication bias on meta-analyses that failed to do so. Our study considered systematic reviews and meta-analyses from ten dermatology journals from 2006 to 2016. A PubMed search was conducted, and all full-text articles that met our inclusion criteria were retrieved and coded by the primary author. 293 articles were included in our analysis. Additionally, we formally evaluated publication bias in meta-analyses that failed to do so, using trim and fill and cumulative meta-analysis by precision methods. Publication bias was mentioned in 107 articles (36.5%) and was formally evaluated in 64 articles (21.8%). Visual inspection of a funnel plot was the most common method of evaluating publication bias. Publication bias was present in 45 articles (15.3%), not present in 57 articles (19.5%) and not determined in 191 articles (65.2%). Using the trim and fill method, 7 meta-analyses (33.33%) showed evidence of publication bias. Although the trim and fill method only found evidence of publication bias in 7 meta-analyses, the cumulative meta-analysis by precision method found evidence of publication bias in 15 meta-analyses (71.4%). Many of the reviews in our study did not mention or evaluate publication bias. Further, of the 42 articles that stated following PRISMA reporting guidelines, 19 (45.2%) evaluated for publication bias.

  15. Experimental technique of stress analyses by neutron diffraction

    International Nuclear Information System (INIS)

    Sun, Guangai; Chen, Bo; Huang, Chaoqiang

    2009-09-01

    The structure and main components of the neutron diffraction stress analysis spectrometer SALSA, as well as the functions and parameters of each component, are presented. The technical characteristics and structural parameters of SALSA are described. On this basis, the choice of gauge volume, the method of positioning the sample, the determination of the diffraction plane, and the measurement of the stress-free lattice spacing d0 are discussed. Combined with practical experiments, the basic experimental measurements and related settings are introduced, including the adjustment of components, pattern scattering, data recording and checking, etc. The above can serve as a guide for stress analysis experiments by neutron diffraction and for the construction of neutron stress spectrometers. (authors)
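    The stress-free spacing d0 mentioned in the record enters the analysis through the standard lattice-strain relation ε = (d − d0)/d0. A minimal sketch, assuming a simple uniaxial Hooke's law (real diffraction stress analysis uses the full triaxial relations); the d-spacings and modulus below are hypothetical:

```python
# Hedged sketch of the strain/stress relations behind diffraction stress
# analysis: lattice strain from the measured d-spacing and the stress-free
# spacing d0, then a uniaxial Hooke's law conversion. Values are hypothetical.

def lattice_strain(d, d0):
    return (d - d0) / d0

def uniaxial_stress_pa(d, d0, youngs_modulus_gpa):
    return youngs_modulus_gpa * 1e9 * lattice_strain(d, d0)

# e.g. a steel reflection: d0 = 1.1702 Å, measured d = 1.1708 Å, E ≈ 210 GPa
stress = uniaxial_stress_pa(1.1708, 1.1702, 210)   # ≈ 108 MPa, tensile
```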

  16. [The maintenance of automatic analysers and associated documentation].

    Science.gov (United States)

    Adjidé, V; Fournier, P; Vassault, A

    2010-12-01

    The maintenance of automatic analysers and the associated documentation, which form part of the requirements of the ISO 15189 standard and of French regulation, have to be defined in the laboratory's policy. The management of periodic maintenance and its documentation shall be implemented and fulfilled. The organisation of corrective maintenance has to be managed so as to avoid interrupting the work of the laboratory. The various recommendations concern the identification of equipment, including automatic analysers, the environmental conditions to take into account, the documentation provided by the manufacturer, and the documents prepared by the laboratory, including maintenance procedures.

  17. Designing and recasting LHC analyses with MadAnalysis 5

    CERN Document Server

    Conte, Eric; Fuks, Benjamin; Wymant, Chris

    2014-01-01

    We present an extension of the expert mode of the MadAnalysis 5 program dedicated to the design or reinterpretation of high-energy physics collider analyses. We detail the predefined classes, functions and methods available to the user and emphasize the most recent developments. The latter include the possible definition of multiple sub-analyses and a novel user-friendly treatment for the selection criteria. We illustrate this approach by two concrete examples: a CMS search for supersymmetric partners of the top quark and a phenomenological analysis targeting hadronically decaying monotop systems.

  18. Nuclear power plants: Results of recent safety analyses

    International Nuclear Information System (INIS)

    Steinmetz, E.

    1987-01-01

    The contributions deal with the problems posed by low radiation doses, with the information currently available from analyses of the Chernobyl reactor accident, and with risk assessments in connection with nuclear power plant accidents. Other points of interest include latest results on fission product release from reactor core or reactor building, advanced atmospheric dispersion models for incident and accident analyses, reliability studies on safety systems, and assessment of fire hazard in nuclear installations. The various contributions are found as separate entries in the database. (DG) [de

  19. Economical analyses of construction of a biomass boiler house

    International Nuclear Information System (INIS)

    Normak, A.

    2002-01-01

    To reduce energy costs we can fire our boiler with cheaper fuel. One of the cheapest fuels is wood biomass. How to use cheaper wood biomass in heat generation, so as to decrease energy costs and increase the share of biomass in our energy balance, is a topical issue. Before deciding to build a biomass boiler house, it is advisable to analyse the economic situation and work out the most profitable, efficient, reliable and ecological boiler plant design for the particular conditions. The best way to perform the analyses is to use the economic model presented. It saves time and gives an objective evaluation of the project. (author)

  20. Certification of a uranium dioxide reference material for chemical analyses

    International Nuclear Information System (INIS)

    Le Duigou, Y.

    1984-01-01

    This report, issued by the Central Bureau for Nuclear Measurements (CBNM), describes the characterization of a uranium dioxide reference material with an accurately determined uranium mass fraction for chemical analyses. The preparation, conditioning, homogeneity tests and the analyses performed on this material are described in Annex 1. The evaluation of the individual impurity results, the total of impurities and the uranium mass fraction are given in Annex 2. Information on a direct determination of uranium by titration is given in Annex 3. The uranium mass fraction of (881.34 ± 0.13) g·kg⁻¹ calculated in Annex 2 is given on the certificate.

  1. Design and manufacture of TL analyser by using the microcomputer

    International Nuclear Information System (INIS)

    Doh, Sih Hong; Woo, Chong Ho

    1986-01-01

    This paper describes the design of a thermoluminescence analyser using a microcomputer. The TL analyser is designed to perform a three-step heat treatment: pre-read heating, the readout procedure, and a post-heating (or pre-irradiation) anneal. We used a 12-bit A/D converter to obtain precise measurements and the phase-control method to control the heating temperature. Since the Apple II microcomputer used is cheap and popular, an economical system can be designed. Experimental results showed successful and flexible operation. The error of temperature control was less than ± 0.2% of the expected value. (Author)
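    For context on the 12-bit A/D converter mentioned above: one least significant bit (LSB) corresponds to the full-scale range divided by 2^12 = 4096 codes. A sketch with a hypothetical 5 V full-scale range (the record does not state the actual range):

```python
# Sketch: voltage resolution of an N-bit ADC. The 5 V full-scale range is a
# hypothetical value chosen for illustration; the record does not specify it.
def adc_resolution(v_full_scale, bits):
    return v_full_scale / (2 ** bits)

lsb = adc_resolution(5.0, 12)   # ≈ 1.22 mV per code
```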

  2. VIPRE modeling of VVER-1000 reactor core for DNB analyses

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Y.; Nguyen, Q. [Westinghouse Electric Corporation, Pittsburgh, PA (United States); Cizek, J. [Nuclear Research Institute, Prague, (Czech Republic)

    1995-09-01

    Based on the one-pass modeling approach, the hot channels and the VVER-1000 reactor core can be modeled in 30 channels for DNB analyses using the VIPRE-01/MOD02 (VIPRE) code (VIPRE is owned by Electric Power Research Institute, Palo Alto, California). The VIPRE one-pass model does not compromise any accuracy in the hot channel local fluid conditions. Extensive qualifications include sensitivity studies of radial noding and crossflow parameters and comparisons with the results from THINC and CALOPEA subchannel codes. The qualifications confirm that the VIPRE code with the Westinghouse modeling method provides good computational performance and accuracy for VVER-1000 DNB analyses.

  3. RELAP5 analyses and support of Oconee-1 PTS studies

    International Nuclear Information System (INIS)

    Charlton, T.R.

    1983-01-01

    The integrity of a reactor vessel during a severe overcooling transient with primary system pressurization is a current safety concern and has been identified as an Unresolved Safety Issue(USI) A-49 by the US Nuclear Regulatory Commission (NRC). Resolution of USI A-49, denoted as Pressurized Thermal Shock (PTS), is being examined by the US NRC sponsored PTS integration study. In support of this study, the Idaho National Engineering Laboratory (INEL) has performed RELAP5/MOD1.5 thermal-hydraulic analyses of selected overcooling transients. These transient analyses were performed for the Oconee-1 pressurized water reactor (PWR), which is Babcock and Wilcox designed nuclear steam supply system

  4. Simulation-based Investigations of Electrostatic Beam Energy Analysers

    CERN Document Server

    Pahl, Hannes

    2015-01-01

    An energy analyser is needed to measure the beam energy profile behind the REX-EBIS at ISOLDE. The device should be able to operate with an accuracy of 1 V at voltages up to 30 kV. In order to find a working concept for an electrostatic energy analyser different designs were evaluated with simulations. A spherical device and its design issues are presented. The potential deformation effects of grids at high voltages and their influence on the energy resolution were investigated. First tests were made with a grid-free ring electrode device and show promising results.

  5. Cost-benefit analyses for the development of magma power

    International Nuclear Information System (INIS)

    Haraden, John

    1992-01-01

    Magma power is the potential generation of electricity from shallow magma bodies in the crust of the Earth. Considerable uncertainty still surrounds the development of magma power, but most of that uncertainty may be eliminated by drilling the first deep magma well. The uncertainty presents no serious impediments to the private drilling of the well. For reasons unrelated to the uncertainty, there may be no private drilling and there may be justification for public drilling. In this paper, we present cost-benefit analyses for private and public drilling of the well. Both analyses indicate there is incentive for drilling. (Author)

  6. The development of an on-line gold analyser

    International Nuclear Information System (INIS)

    Robert, R.V.D.; Ormrod, G.T.W.

    1982-01-01

    An on-line analyser to monitor the gold in solutions from the carbon-in-pulp process is described. The automatic system is based on the delivery of filtered samples of the solutions to a distribution valve for measurement by flameless atomic-absorption spectrophotometry. The sample is introduced by the aerosol-deposition method. Operation of the analyser on a pilot plant and on a full-scale carbon-in-pulp plant has shown that the system is economically feasible and capable of providing a continuous indication of the efficiency of the extraction process.

  7. Design basis event consequence analyses for the Yucca Mountain project

    International Nuclear Information System (INIS)

    Orvis, D.D.; Haas, M.N.; Martin, J.H.

    1997-01-01

    Design basis event (DBE) definition and analysis is an ongoing and integrated activity among the design and analysis groups of the Yucca Mountain Project (YMP). DBE's are those that potentially lead to breach of the waste package and waste form (e.g., spent fuel rods) with consequent release of radionuclides to the environment. A Preliminary Hazards Analysis (PHA) provided a systematic screening of external and internal events that were candidate DBE's that will be subjected to analyses for radiological consequences. As preparation, pilot consequence analyses for the repository subsurface and surface facilities have been performed to define the methodology, data requirements, and applicable regulatory limits

  8. Detection of defects of Kenaf/Epoxy by Thermography Analyses

    International Nuclear Information System (INIS)

    Suriani, M J; Ali, Aidi; Sapuan, S M; Khalina, A; Abdullah, S

    2012-01-01

    Quite a few defects can occur during the manufacture of composites, such as voids, resin-rich zones, pockets of undispersed cross-linker, misaligned fibres and regions where the resin has poorly wetted the fibres. Such defects can reduce the mechanical properties as well as the mechanical performance of the structure and thus must be determined. In this study, the defects of kenaf/epoxy reinforced composite materials were determined by thermography analyses, and the mechanical properties of the composites were measured by tensile testing. 95% of the thermography analyses proved that the defects occurring in the composite reduced the mechanical properties of the specimens.

  9. Using Microsoft Office Excel 2007 to conduct generalized matching analyses.

    Science.gov (United States)

    Reed, Derek D

    2009-01-01

    The generalized matching equation is a robust and empirically supported means of analyzing relations between reinforcement and behavior. Unfortunately, no simple task analysis is available to behavior analysts interested in using the matching equation to evaluate data in clinical or applied settings. This technical article presents a task analysis for the use of Microsoft Excel to analyze and plot the generalized matching equation. Using a data-based case example and a step-by-step guide for completing the analysis, these instructions are intended to promote the use of quantitative analyses by researchers with little to no experience in quantitative analyses or the matching law.
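    The generalized matching equation that the article's Excel task analysis automates is log(B1/B2) = a·log(R1/R2) + log b, fitted by ordinary least squares on the log ratios. A hedged Python sketch of the same fit (the data are hypothetical, and this is a stand-in for, not a copy of, the article's spreadsheet procedure):

```python
import math

# Sketch: ordinary-least-squares fit of the generalized matching equation
# log10(B1/B2) = a * log10(R1/R2) + log10(b) on hypothetical ratio data.
def fit_matching(behavior_ratios, reinforcement_ratios):
    xs = [math.log10(r) for r in reinforcement_ratios]
    ys = [math.log10(b) for b in behavior_ratios]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, 10 ** intercept     # sensitivity a, bias b

# Perfect matching data should recover a = 1 (unit sensitivity), b = 1 (no bias).
a, b = fit_matching([0.5, 1.0, 2.0], [0.5, 1.0, 2.0])
```

    The slope a measures sensitivity to the reinforcement ratio (undermatching when a < 1) and b measures bias toward one alternative, which is why the fit is done in log coordinates.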

  10. USING MICROSOFT OFFICE EXCEL® 2007 TO CONDUCT GENERALIZED MATCHING ANALYSES

    Science.gov (United States)

    Reed, Derek D

    2009-01-01

    The generalized matching equation is a robust and empirically supported means of analyzing relations between reinforcement and behavior. Unfortunately, no simple task analysis is available to behavior analysts interested in using the matching equation to evaluate data in clinical or applied settings. This technical article presents a task analysis for the use of Microsoft Excel to analyze and plot the generalized matching equation. Using a data-based case example and a step-by-step guide for completing the analysis, these instructions are intended to promote the use of quantitative analyses by researchers with little to no experience in quantitative analyses or the matching law. PMID:20514196

  11. Preserving and reusing high-energy-physics data analyses

    CERN Document Server

    Simko, Tibor; Dasler, Robin; Fokianos, Pamfilos; Kuncar, Jiri; Lavasa, Artemis; Mattmann, Annemarie; Rodriguez, Diego; Trzcinska, Anna; Tsanaktsidis, Ioannis

    2017-01-01

    The revalidation, reuse and reinterpretation of data analyses require having access to the original virtual environments, datasets and software that was used to produce the original scientific result. The CERN Analysis Preservation pilot project is developing a set of tools that support particle physics researchers in preserving the knowledge around analyses so that capturing, sharing, reusing and reinterpreting data becomes easier. In this talk, we shall notably focus on the aspects of reusing a preserved analysis. We describe a system that permits to instantiate the preserved analysis workflow on the computing cloud, paving the way to allowing researchers to revalidate and reinterpret research data even many years after the original publication.

  12. Race, Gender, and Reseacher Positionality Analysed Through Memory Work

    DEFF Research Database (Denmark)

    Andreassen, Rikke; Myong, Lene

    2017-01-01

    Drawing upon feminist standpoint theory and memory work, the authors analyse racial privilege by investigating their own racialized and gendered subjectifications as academic researchers. By looking at their own experiences within academia, they show how authority and agency are contingent upon...

  13. Effects of GPS sampling intensity on home range analyses

    Science.gov (United States)

    Jeffrey J. Kolodzinski; Lawrence V. Tannenbaum; David A. Osborn; Mark C. Conner; W. Mark Ford; Karl V. Miller

    2010-01-01

    The two most common methods for determining home ranges, minimum convex polygon (MCP) and kernel analyses, can be affected by sampling intensity. Despite prior research, it remains unclear how high-intensity sampling regimes affect home range estimations. We used datasets from 14 GPS-collared, white-tailed deer (Odocoileus virginianus) to describe...
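    The minimum convex polygon (MCP) estimator named above reduces to taking the convex hull of the GPS fixes and reporting its area. A minimal sketch (monotone-chain hull plus the shoelace formula); the coordinates are hypothetical planar fixes, e.g. UTM metres:

```python
# Hedged sketch of the MCP home-range estimator: convex hull of the fixes
# (Andrew's monotone chain) and its area (shoelace formula). Fixes are
# hypothetical planar coordinates in metres.
def convex_hull(points):
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:                          # build lower hull
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):                # build upper hull
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]         # endpoints shared, drop duplicates

def mcp_area(points):
    hull = convex_hull(points)
    n = len(hull)
    return abs(sum(hull[i][0] * hull[(i + 1) % n][1]
                   - hull[(i + 1) % n][0] * hull[i][1]
                   for i in range(n))) / 2.0

fixes = [(0, 0), (100, 0), (100, 100), (0, 100), (50, 50)]
area = mcp_area(fixes)   # interior fix is ignored; area of the 100 m square
```

    The sensitivity to sampling intensity the record studies is visible here: adding more peripheral fixes can only grow the hull, so MCP area tends to increase with the number of locations.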

  14. Prenominal and postnominal reduced relative clauses: arguments against unitary analyses

    NARCIS (Netherlands)

    Sleeman, P.

    2007-01-01

    These last years, several analyses have been proposed in which prenominal and postnominal reduced relatives are merged in the same position. Kayne (1994) claims that both types of reduced relative clauses are the complement of the determiner. More recently, Cinque (2005) has proposed that both types

  15. Optical region elemental abundance analyses of B and A stars

    International Nuclear Information System (INIS)

    Adelman, S.J.; Young, J.M.; Baldwin, H.E.

    1984-01-01

Abundance analyses using optical region data and fully line-blanketed model atmospheres have been performed for two sharp-lined hot Am stars, o Pegasi and σ Aquarii, and for the sharp-lined marginally peculiar A star ν Cancri. The derived abundances exhibit definite anomalies compared with those of normal B-type stars and the Sun. (author)

  16. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2009-01-01

It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of sample requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  17. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2013-01-01

It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of sample requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  18. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2012-01-01

It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of sample requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  19. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2009-01-01

It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of sample requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author)

  20. Laser Beam Caustic Measurement with Focal Spot Analyser

    DEFF Research Database (Denmark)

    Olsen, Flemming Ove; Gong, Hui; Bagger, Claus

    2005-01-01

In industrial applications of high-power CO2 lasers, the caustic characteristics of the laser beam have great effects on the performance of the lasers. A well-defined, highly intense focused spot is essential for reliable production results. This paper presents a focal spot analyser that is developed...

  1. A bromine-based dichroic X-ray polarization analyser

    CERN Document Server

    Collins, S P; Brown, S D; Thompson, P

    2001-01-01

    We have demonstrated the advantages offered by dichroic X-ray polarization filters for linear polarization analysis, and describe such a device, based on a dibromoalkane/urea inclusion compound. The polarizer has been successfully tested by analysing the polarization of magnetic diffraction from holmium.

  2. Multitrait-Multimethod Analyses of Two Self-Concept Instruments.

    Science.gov (United States)

    Marsh, Herbert W.; Smith, Ian D.

    1982-01-01

The multidimensionality of self-concept and the use of factor analysis in the development of self-concept instruments are supported in multitrait-multimethod analyses of the Sears and Coopersmith instruments. Convergent validity and discriminant validity of subscales in factor analysis and multitrait-multimethod analysis of longitudinal data are…

  3. Persuading Collaboration: Analysing Persuasion in Online Collaboration Projects

    DEFF Research Database (Denmark)

    McHugh, Ronan; Larsen, Birger

    2010-01-01

    In this paper we propose that online collaborative production sites can be fruitfully analysed in terms of the general theoretical framework of Persuasive Design. OpenStreetMap and The Pirate Bay are used as examples of collaborative production sites. Results of a quantitative analysis of persuas...

  4. Global post-Kyoto scenario analyses at PSI

    Energy Technology Data Exchange (ETDEWEB)

    Kypreos, S [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1999-08-01

Scenario analyses are described here using the Global MARKAL-Macro Trade (GMMT) model to study the economic implications of the Kyoto Protocol to the UN Convention on Climate Change. Some conclusions are derived in terms of efficient implementations of the post-Kyoto extensions of the Protocol. (author) 2 figs., 5 refs.

  5. Fundamental issues in finite element analyses of localization of deformation

    NARCIS (Netherlands)

    Borst, de R.; Sluys, L.J.; Mühlhaus, H.-B.; Pamin, J.

    1993-01-01

    Classical continuum models, i.e. continuum models that do not incorporate an internal length scale, suffer from excessive mesh dependence when strain-softening models are used in numerical analyses and cannot reproduce the size effect commonly observed in quasi-brittle failure. In this contribution

  6. Installation and performance evaluation of an indigenous surface area analyser

    International Nuclear Information System (INIS)

    Pillai, S.N.; Solapurkar, M.N.; Venkatesan, V.; Prakash, A.; Khan, K.B.; Kumar, Arun; Prasad, R.S.

    2014-01-01

    An indigenously available surface area analyser was installed inside glove box and checked for its performance by analyzing uranium oxide and thorium oxide powders at RMD. The unit has been made ready for analysis of Plutonium oxide powders after incorporating several important features. (author)

  7. Analysing the performance of dynamic multi-objective optimisation algorithms

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

...and the goal of the algorithm is to track a set of tradeoff solutions over time. Analysing the performance of a dynamic multi-objective optimisation algorithm (DMOA) is not a trivial task. For each environment (before a change occurs) the DMOA has to find a set...

  8. Application of digital image correlation method for analysing crack ...

    Indian Academy of Sciences (India)

...concentrated strain by imitating the treatment of micro-cracks using the finite element ... water and moisture to penetrate the concrete, leading to serious rust of the ... The correlations among various grey values of digital images are analysed for ...

  9. A Calculus for Modelling, Simulating and Analysing Compartmentalized Biological Systems

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Ihekwaba, Adoha

    2007-01-01

    A. Ihekwaba, R. Mardare. A Calculus for Modelling, Simulating and Analysing Compartmentalized Biological Systems. Case study: NFkB system. In Proc. of International Conference of Computational Methods in Sciences and Engineering (ICCMSE), American Institute of Physics, AIP Proceedings, N 2...

  10. Nuclear Analyses of Indian LLCB Test Blanket System in ITER

    Science.gov (United States)

    Swami, H. L.; Shaw, A. K.; Danani, C.; Chaudhuri, Paritosh

    2017-04-01

Heading towards the Nuclear Fusion Reactor Program, India is developing a Lead Lithium Ceramic Breeder (LLCB) tritium breeding blanket for its future fusion reactor. A mock-up of the LLCB blanket is proposed to be tested in ITER equatorial port no. 2, to ensure the overall performance of the blanket in a reactor-relevant nuclear fusion environment. Nuclear analyses play an important role in LLCB Test Blanket System design and development. They are required for tritium breeding estimation, thermal-hydraulic design, coolant process design, radioactive waste management, equipment maintenance and replacement strategies, and nuclear safety. The nuclear behaviour of the LLCB test blanket module in ITER is predicted in terms of nuclear responses such as tritium production, nuclear heating, neutron fluxes and radiation damage. The radiation shielding capability of the LLCB TBS inside and outside the bio-shield was also assessed to fulfil ITER shielding requirements. In order to support the rad-waste and safety assessments, nuclear activation analyses were carried out and radioactivity data were generated for LLCB TBS components. Nuclear analyses of the LLCB TBS are performed using ITER-recommended nuclear analysis codes (i.e. MCNP, EASY), nuclear cross-section data libraries (i.e. FENDL 2.1, EAF) and the neutronic model (ITER C-lite v.1). The paper describes a comprehensive nuclear performance of the LLCB TBS in ITER.

  11. A review of bioinformatic methods for forensic DNA analyses.

    Science.gov (United States)

    Liu, Yao-Yuan; Harbison, SallyAnn

    2018-03-01

    Short tandem repeats, single nucleotide polymorphisms, and whole mitochondrial analyses are three classes of markers which will play an important role in the future of forensic DNA typing. The arrival of massively parallel sequencing platforms in forensic science reveals new information such as insights into the complexity and variability of the markers that were previously unseen, along with amounts of data too immense for analyses by manual means. Along with the sequencing chemistries employed, bioinformatic methods are required to process and interpret this new and extensive data. As more is learnt about the use of these new technologies for forensic applications, development and standardization of efficient, favourable tools for each stage of data processing is being carried out, and faster, more accurate methods that improve on the original approaches have been developed. As forensic laboratories search for the optimal pipeline of tools, sequencer manufacturers have incorporated pipelines into sequencer software to make analyses convenient. This review explores the current state of bioinformatic methods and tools used for the analyses of forensic markers sequenced on the massively parallel sequencing (MPS) platforms currently most widely used. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Morphometric analyses of the river basins in Goa

    Digital Repository Service at National Institute of Oceanography (India)

    Iyer, S.D.; Wagle, B.G.

    Morphometric analyses of seven river basins in Goa, India have been carried out. The linear and areal aspects of these basins are reported here. The plots of stream order versus stream numbers and stream orders versus mean stream lengths are found...

  13. Medical Isotope Production Analyses In KIPT Neutron Source Facility

    International Nuclear Information System (INIS)

    Talamo, Alberto; Gohar, Yousry

    2016-01-01

Medical isotope production analyses in Kharkov Institute of Physics and Technology (KIPT) neutron source facility were performed to include the details of the irradiation cassette and the self-shielding effect. An updated detailed model of the facility was used for the analyses. The facility consists of an accelerator-driven system (ADS), which has a subcritical assembly using low-enriched uranium fuel elements with a beryllium-graphite reflector. The beryllium assemblies of the reflector have the same outer geometry as the fuel elements, which permits loading the subcritical assembly with different number of fuel elements without impacting the reflector performance. The subcritical assembly is driven by an external neutron source generated from the interaction of 100-kW electron beam with a tungsten target. The facility construction was completed at the end of 2015, and it is planned to start the operation during the year of 2016. It is the first ADS in the world, which has a coolant system for removing the generated fission power. Argonne National Laboratory has developed the design concept and performed extensive design analyses for the facility including its utilization for the production of different radioactive medical isotopes. 99Mo is the parent isotope of 99mTc, which is the most commonly used medical radioactive isotope. Detailed analyses were performed to define the optimal sample irradiation location and the generated activity, for several radioactive medical isotopes, as a function of the irradiation time.

  14. Radiocarbon analyses along the EDML ice core in Antarctica

    NARCIS (Netherlands)

    van de Wal, R.S.W.; Meijer, H.A.J.; van Rooij, M.; van der Veen, C.

    2007-01-01

    Samples, 17 in total, from the EDML core drilled at Kohnen station Antarctica are analysed for 14CO and 14CO2 with a dry-extraction technique in combination with accelerator mass spectrometry. Results of the in situ produced 14CO fraction show a very low concentration of in situ produced 14CO.

  15. Radiocarbon analyses along the EDML ice core in Antarctica

    NARCIS (Netherlands)

    Van de Wal, R. S. W.; Meijer, H. A. J.; De Rooij, M.; Van der Veen, C.

Samples, 17 in total, from the EDML core drilled at Kohnen station Antarctica are analysed for 14CO and 14CO2 with a dry-extraction technique in combination with accelerator mass spectrometry. Results of the in situ produced 14CO fraction show a very low concentration of in situ

  16. Weight analyses and nitrogen balance assay in rats fed extruded ...

    African Journals Online (AJOL)

    Weight analyses and nitrogen balance assay in adult rats in raw and extruded African breadfruit (Treculia africana) based diets were carried out using response surface methodology in a central composite design. Process variables were feed composition (40 - 100 % African breadfruit, 0 - 5 % corn and 0 - 55 % soybean, ...

  17. Medical Isotope Production Analyses In KIPT Neutron Source Facility

    Energy Technology Data Exchange (ETDEWEB)

    Talamo, Alberto [Argonne National Lab. (ANL), Argonne, IL (United States); Gohar, Yousry [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-01-01

    Medical isotope production analyses in Kharkov Institute of Physics and Technology (KIPT) neutron source facility were performed to include the details of the irradiation cassette and the self-shielding effect. An updated detailed model of the facility was used for the analyses. The facility consists of an accelerator-driven system (ADS), which has a subcritical assembly using low-enriched uranium fuel elements with a beryllium-graphite reflector. The beryllium assemblies of the reflector have the same outer geometry as the fuel elements, which permits loading the subcritical assembly with different number of fuel elements without impacting the reflector performance. The subcritical assembly is driven by an external neutron source generated from the interaction of 100-kW electron beam with a tungsten target. The facility construction was completed at the end of 2015, and it is planned to start the operation during the year of 2016. It is the first ADS in the world, which has a coolant system for removing the generated fission power. Argonne National Laboratory has developed the design concept and performed extensive design analyses for the facility including its utilization for the production of different radioactive medical isotopes. 99Mo is the parent isotope of 99mTc, which is the most commonly used medical radioactive isotope. Detailed analyses were performed to define the optimal sample irradiation location and the generated activity, for several radioactive medical isotopes, as a function of the irradiation time.

  18. Karyotype analyses of the species of the genus Jurinea Cass ...

    African Journals Online (AJOL)

    In this study, karyotype analyses of 13 species belonging to the genus Jurinea Cass. (Compositae) and grown naturally in Turkey were conducted. These taxa include Jurinea alpigena C. Koch, Jurinea ancyrensis Bornm., Jurinea aucherana DC., Jurinea cadmea Boiss., Jurinea cataonica Boiss. and Hausskn., Jurinea ...

  19. Consumer Brand Choice: Individual and Group Analyses of Demand Elasticity

    Science.gov (United States)

    Oliveira-Castro, Jorge M.; Foxall, Gordon R.; Schrezenmaier, Teresa C.

    2006-01-01

    Following the behavior-analytic tradition of analyzing individual behavior, the present research investigated demand elasticity of individual consumers purchasing supermarket products, and compared individual and group analyses of elasticity. Panel data from 80 UK consumers purchasing 9 product categories (i.e., baked beans, biscuits, breakfast…
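
The demand-elasticity analyses described in this record rest on the standard definition elasticity = d ln Q / d ln P. A minimal sketch with synthetic prices and quantities (assumed values, not the panel data of the study) estimates it by log-log regression:

```python
# Demand elasticity via log-log regression: ln Q = a + e * ln P,
# where the slope e is the (constant) price elasticity of demand.
import numpy as np

prices = np.array([1.0, 1.2, 1.5, 1.8, 2.0])
# Synthetic quantities generated with a true elasticity of -1.5: Q = 10 * P**-1.5
quantities = 10.0 * prices ** -1.5

slope, intercept = np.polyfit(np.log(prices), np.log(quantities), 1)
# slope is the estimated elasticity; on this noise-free data it recovers -1.5
```

With real panel data one would fit per consumer to compare individual elasticities with the group-level estimate, which is the comparison the abstract describes.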

  20. Global post-Kyoto scenario analyses at PSI

    International Nuclear Information System (INIS)

    Kypreos, S.

    1999-01-01

Scenario analyses are described here using the Global MARKAL-Macro Trade (GMMT) model to study the economic implications of the Kyoto Protocol to the UN Convention on Climate Change. Some conclusions are derived in terms of efficient implementations of the post-Kyoto extensions of the Protocol. (author) 2 figs., 5 refs

  1. Techniques for Scaling Up Analyses Based on Pre-interpretations

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Henriksen, Kim Steen; Banda, Gourinath

    2005-01-01

    a variety of analyses, both generic (such as mode analysis) and program-specific (with respect to a type describing some particular property of interest). Previous work demonstrated the approach using pre-interpretations over small domains. In this paper we present techniques that allow the method...

  2. Review of HEDL fuel pin transient analyses analytical programs

    International Nuclear Information System (INIS)

    Scott, J.H.; Baars, R.E.

    1975-05-01

    Methods for analysis of transient fuel pin performance are described, as represented by the steady-state SIEX code and the PECT series of codes used for steady-state and transient mechanical analyses. The empirical fuel failure correlation currently in use for analysis of transient overpower accidents is described. (U.S.)

  3. Geospatial analyses in support of heavy metal contamination ...

    African Journals Online (AJOL)

    This paper presents an exploratory assessment of heavy metal contamination along the main highways in Mafikeng, and illustrates how spatial analyses of the contamination for environmental management purposes can be supported by GIS and Remote Sensing. Roadside soil and grass (Stenotaphrum sp.) samples were ...

  4. Physico-Chemical and Bacteriological Analyses of Water Used for ...

    African Journals Online (AJOL)

    Samuel Olaleye

Physicochemical and bacteriological analyses were carried out on well water, stream water and river water used for drinking and swimming purposes in Abeokuta, Nigeria. The results obtained were compared with WHO and EPA standards for drinking and recreational water. With the exception of Sokori stream and a well ...

  5. CPN Tools for Editing, Simulating, and Analysing Coloured Petri Nets

    DEFF Research Database (Denmark)

    Ratzer, Anne Vinter; Wells, Lisa Marie; Lassen, Henry Michael

    2003-01-01

    elements. The tool features incremental syntax checking and code generation which take place while a net is being constructed. A fast simulator efficiently handles both untimed and timed nets. Full and partial state spaces can be generated and analysed, and a standard state space report contains...

  6. Application of digital-image-correlation techniques in analysing ...

    Indian Academy of Sciences (India)

    Basis theory of strain analysis using the digital image correlation method .... Type 304N Stainless Steel (Modulus of Elasticity = 193 MPa, Tensile Yield .... also proves the accuracy of the qualitative analyses by using the DIC ... We thank the National Science Council of Taiwan for supporting this research through grant. No.

  7. Preparation of Kepler light curves for asteroseismic analyses

    NARCIS (Netherlands)

    García, R.A.; Hekker, S.; Stello, D.; Gutiérrez-Soto, J.; Handberg, R.; Huber, D.; Karoff, C.; Uytterhoeven, K.; Appourchaux, T.; Chaplin, W.J.; Elsworth, Y.; Mathur, S.; Ballot, J.; Christensen-Dalsgaard, J.; Gilliland, R.L.; Houdek, G.; Jenkins, J.M.; Kjeldsen, H.; McCauliff, S.; Metcalfe, T.; Middour, C.K.; Molenda-Zakowicz, J.; Monteiro, M.J.P.F.G.; Smith, J.C.; Thompson, M.J.

    2011-01-01

    The Kepler mission is providing photometric data of exquisite quality for the asteroseismic study of different classes of pulsating stars. These analyses place particular demands on the pre-processing of the data, over a range of time-scales from minutes to months. Here, we describe processing

  8. Environmental analyses of land transportation systems in The Netherlands

    NARCIS (Netherlands)

    Bouwman, Mirjan E.; Moll, Henri C.

    Environmental analyses of the impact of transportation systems on the environment from the cradle to the grave are rare. This article makes a comparison of various Dutch passenger transportation systems by studying their complete life-cycle energy use. Moreover, systems are compared according to

  9. "Analysing Genre: Language Use in Professional Settings." A Review.

    Science.gov (United States)

    Drury, Helen

    1995-01-01

    "Analysing Genre," by Vijay K. Bhatia, is a timely addition to the literature on genre analysis in English for specific purposes. It is divided into three parts: the first provides theoretical background; the second explains how genre analysis works in different academic and professional settings; and the third exemplifies the…

  10. Analysing scientific workflows: Why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, L.J.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  11. Analysing scientific workflows: why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, LJ.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  12. Post-facta Analyses of Fukushima Accident and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Tanabe, Fumiya [Sociotechnical Systems Safety Research Institute, Ichige (Japan)

    2014-08-15

Independent analyses have been performed of the core melt behavior of the Unit 1, Unit 2 and Unit 3 reactors of Fukushima Daiichi Nuclear Power Station on 11-15 March 2011. The analyses are based on a phenomenological methodology combining measured-data investigation with a simple physical model calculation. Time variations of core water level, core material temperature and hydrogen generation rate are estimated. The analyses have revealed characteristics of the accident process in each reactor. In the case of the Unit 2 reactor, the calculated result suggests little hydrogen generation, because no steam was generated in the core for the zirconium-steam reaction during the fuel damage process. This could explain the absence of a hydrogen explosion in the Unit 2 reactor building. Analyses have also been performed on the core material behavior in another chaotic period, 19-31 March 2011, and resulted in a re-melt hypothesis: core material in each reactor should have melted again due to a shortage of cooling water. The hypothesis is consistent with many observed features of the dispersion of radioactive materials into the environment.

  13. Analyser-based phase contrast image reconstruction using geometrical optics

    International Nuclear Information System (INIS)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-01-01

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser
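
The rocking-curve fit mentioned in this record uses a symmetric Pearson VII profile. The sketch below uses the standard symmetric Pearson VII form; the synthetic data and the coarse grid-search fit are illustrative assumptions, not the authors' fitting procedure.

```python
# Symmetric Pearson VII profile and a toy fit of the peak position.
import numpy as np

def pearson_vii(x, x0, amp, w, m):
    """Pearson VII: peak amp at x0, half-width-like parameter w, shape exponent m.
    m -> 1 gives a Lorentzian; m -> infinity approaches a Gaussian."""
    return amp * (1.0 + ((x - x0) / w) ** 2 * (2.0 ** (1.0 / m) - 1.0)) ** (-m)

# Synthetic "rocking curve" centred at 0.3 (arbitrary angle units)
x = np.linspace(-2.0, 2.0, 401)
y = pearson_vii(x, 0.3, 1.0, 0.5, 2.0)

# Coarse least-squares scan over the peak position only (other parameters fixed)
centres = np.linspace(-1.0, 1.0, 201)
sse = [np.sum((y - pearson_vii(x, c, 1.0, 0.5, 2.0)) ** 2) for c in centres]
best = centres[int(np.argmin(sse))]  # recovers the true centre, 0.3
```

In practice all four parameters would be fitted simultaneously with a nonlinear least-squares routine; the point here is only the functional form being fitted to the analyser rocking curve.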

  14. Analyser-based phase contrast image reconstruction using geometrical optics.

    Science.gov (United States)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.

  15. Safety analyses of the electrical systems on VVER NPP

    International Nuclear Information System (INIS)

    Andel, J.

    2004-01-01

    Energoprojekt Praha has been the main entity responsible for the section on 'Electrical Systems' in the safety reports of the Temelin, Dukovany and Mochovce nuclear power plants. The section comprises 2 main chapters, viz. Offsite Power System (issues of electrical energy production in main generators and the link to the offsite transmission grid) and Onsite Power Systems (AC and DC auxiliary system, both normal and safety related). In the chapter on the off-site system, attention is paid to the analysis of transmission capacity of the 400 kV lines, analysis of transient stability, multiple fault analyses, and probabilistic analyses of the grid and NPP power system reliability. In the chapter on the on-site system, attention is paid to the power balances of the electrical sources and switchboards set for various operational and accident modes, checks of loading and function of service and backup sources, short circuit current calculations, analyses of electrical protections, and analyses of the function and sizing of emergency sources (DG sets and UPS systems). (P.A.)

  16. A turbulent jet in crossflow analysed with proper orthogonal decomposition

    DEFF Research Database (Denmark)

    Meyer, Knud Erik; Pedersen, Jakob Martin; Özcan, Oktay

    2007-01-01

    and pipe diameter was 2400 and the jet to crossflow velocity ratios were R = 3.3 and R = 1.3. The experimental data have been analysed by proper orthogonal decomposition (POD). For R = 3.3, the results in several different planes indicate that the wake vortices are the dominant dynamic flow structures...

  17. Energy and exergy analyses of the diffusion absorption refrigeration system

    International Nuclear Information System (INIS)

    Yıldız, Abdullah; Ersöz, Mustafa Ali

    2013-01-01

This paper describes the thermodynamic analyses of a DAR (diffusion absorption refrigeration) cycle. The experimental apparatus is set up as an ammonia–water DAR cycle with helium as the auxiliary inert gas. A thermodynamic model including mass, energy and exergy balance equations is presented for each component of the DAR cycle; this model is then validated by comparison with experimental data. In the thermodynamic analyses, energy and exergy losses for each component of the system are quantified and illustrated, and the system's energy and exergy losses and efficiencies are investigated. The highest energy and exergy losses occur in the solution heat exchanger: the highest energy losses in the experimental and theoretical analyses are found to be 25.7090 W and 25.4788 W respectively, whereas the corresponding exergy losses are calculated as 13.7933 W and 13.9976 W. Although the energy efficiency obtained from both the model and the experimental study is calculated as 0.1858, the exergy efficiencies are found to be 0.0260 and 0.0356. - Highlights: • The diffusion absorption refrigeration system is designed, manufactured and tested. • The energy and exergy analyses of the system are presented theoretically and experimentally. • The energy and exergy losses are investigated for each component of the system. • The highest energy and exergy losses occur in the solution heat exchanger. • The energy and exergy performances are also calculated
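
As a hedged illustration of the kind of energy/exergy bookkeeping this record describes, the sketch below computes a first-law COP and an exergetic efficiency for a heat-driven refrigerator using generic textbook Carnot factors. All temperatures, heat rates and the efficiency formula are illustrative assumptions, not the paper's measured values.

```python
# First-law (energy) and second-law (exergy) efficiency of a heat-driven
# refrigerator: the exergetic efficiency weighs each heat flow by its
# Carnot factor relative to the ambient (dead-state) temperature T0.

T0 = 298.15    # ambient (dead-state) temperature, K
Tg = 430.0     # generator heat-input temperature, K (assumed)
Te = 268.0     # evaporator temperature, K (assumed)
Q_gen = 100.0  # heat supplied to the generator, W (assumed)
Q_evap = 18.0  # cooling effect at the evaporator, W (assumed)

# Energy efficiency (COP): useful cooling per unit of driving heat
cop = Q_evap / Q_gen

# Exergy of the cooling effect over exergy of the driving heat
eta_ex = (Q_evap * (T0 / Te - 1.0)) / (Q_gen * (1.0 - T0 / Tg))
```

The exergetic efficiency comes out much smaller than the COP, which mirrors the gap the abstract reports between the energy efficiency (0.1858) and the exergy efficiencies (0.0260 and 0.0356).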

  18. Matrix Summaries Improve Research Reports: Secondary Analyses Using Published Literature

    Science.gov (United States)

    Zientek, Linda Reichwein; Thompson, Bruce

    2009-01-01

    Correlation matrices and standard deviations are the building blocks of many of the commonly conducted analyses in published research, and AERA and APA reporting standards recommend their inclusion when reporting research results. The authors argue that the inclusion of correlation/covariance matrices, standard deviations, and means can enhance…
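
A minimal sketch of why these matrix summaries enable secondary analyses: given a published correlation matrix R and standard deviations s, the covariance matrix is recovered as S = diag(s) R diag(s), from which many standard analyses (regression, SEM, factor analysis) can be re-run. The values below are invented for illustration.

```python
# Reconstructing a covariance matrix from published summary statistics.
import numpy as np

R = np.array([[1.0, 0.4],
              [0.4, 1.0]])       # published correlation matrix
s = np.array([2.0, 3.0])         # published standard deviations

S = np.diag(s) @ R @ np.diag(s)  # covariance matrix: S_ij = r_ij * s_i * s_j
# e.g. S[0, 1] = 0.4 * 2.0 * 3.0 = 2.4, and the diagonal holds the variances
```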

  19. Genetic analyses for deciphering the status and role of ...

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Genetics; Volume 96; Issue 1. Genetic analyses for deciphering the status and role of photoperiodic and maturity genes in major Indian soybean cultivars. SANJAY GUPTA VIRENDER SINGH BHATIA GIRIRAJ KUMAWAT DEVSHREE THAKUR GOURAV SINGH RACHANA TRIPATHI GYANESH ...

  20. A new Link for Geographic analyses of Inventory Data

    Science.gov (United States)

    David Reed; Kurt Pregitzer; Scott A. Pugh; Patrick D. Miles

    2001-01-01

    The USDA Forest Service Forest Inventory and Analysis (FIA) data are widely used throughout the United States for analyses of forest status and trends, landscape-level forest composition, and other forest characteristics. A new software product, FIAMODEL, is available for analyzing FIA data within the ArcView (ESRI, Inc.) geographic information system. The software...

  1. Cognitive analysis of a public policy: justice ...

    African Journals Online (AJOL)

    Cognitive analysis of a public policy: environmental justice and "rural markets" for wood energy. ... energy sources to poor urban dwellers; and to reduce the poverty of rural households by promoting sustainable forest management, including income generation through producing and marketing charcoal.

  2. Analysing Harmonic Motions with an iPhone's Magnetometer

    Science.gov (United States)

    Yavuz, Ahmet; Temiz, Burak Kagan

    2016-01-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone's (or iPad's) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone's magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone's screen using the "Sensor Kinetics"…
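
    A minimal version of such an analysis can be reproduced offline: sample a sinusoidal field, then recover its period from the spectrum. The sampling rate and field values below are assumptions, not figures from the paper.

```python
import numpy as np

# Synthetic magnetometer record: a magnet oscillating harmonically
# near the sensor modulates the measured field.
fs = 100.0                      # sampling rate [Hz] (assumed)
t = np.arange(0, 20, 1 / fs)    # 20 s record
true_period = 2.0               # s
B = 30 + 5 * np.sin(2 * np.pi * t / true_period)  # field [uT]

def estimate_period(signal, fs):
    """Dominant period from the FFT peak of the detrended signal."""
    s = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(s))
    freqs = np.fft.rfftfreq(len(s), d=1 / fs)
    return 1.0 / freqs[np.argmax(spectrum)]
```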

  3. A process mining approach to analyse user behaviour

    NARCIS (Netherlands)

    Maruster, Laura; Faber, Niels R.; Jorna, Rene J.; van Haren, Rob J. F.; Cordeiro, J; Filipe, J; Hammoudi, S

    2008-01-01

    Designing and personalising systems for specific user groups encompasses a lot of effort with respect to analysing and understanding user behaviour. The goal of our paper is to provide a new methodology for determining navigational patterns of behaviour of specific user groups. We consider

  4. Diagnostic Comparison of Meteorological Analyses during the 2002 Antarctic Winter

    Science.gov (United States)

    Manney, Gloria L.; Allen, Douglas R.; Kruger, Kirstin; Naujokat, Barbara; Santee, Michelle L.; Sabutis, Joseph L.; Pawson, Steven; Swinbank, Richard; Randall, Cora E.; Simmons, Adrian J.; hide

    2005-01-01

    Several meteorological datasets, including U.K. Met Office (MetO), European Centre for Medium-Range Weather Forecasts (ECMWF), National Centers for Environmental Prediction (NCEP), and NASA's Goddard Earth Observation System (GEOS-4) analyses, are being used in studies of the 2002 Southern Hemisphere (SH) stratospheric winter and Antarctic major warming. Diagnostics are compared to assess how these studies may be affected by the meteorological data used. While the overall structure and evolution of temperatures, winds, and wave diagnostics in the different analyses provide a consistent picture of the large-scale dynamics of the SH 2002 winter, several significant differences may affect detailed studies. The NCEP-NCAR reanalysis (REAN) and NCEP-Department of Energy (DOE) reanalysis-2 (REAN-2) datasets are not recommended for detailed studies, especially those related to polar processing, because of lower-stratospheric temperature biases that result in underestimates of polar processing potential, and because their winds and wave diagnostics show increasing differences from other analyses between ∼30 and 10 hPa (their top level). Southern Hemisphere polar stratospheric temperatures in the ECMWF 40-Yr Re-analysis (ERA-40) show unrealistic vertical structure, so this long-term reanalysis is also unsuited for quantitative studies. The NCEP/Climate Prediction Center (CPC) objective analyses give an inferior representation of the upper-stratospheric vortex. Polar vortex transport barriers are similar in all analyses, but there is large variation in the amount, patterns, and timing of mixing, even among the operational assimilated datasets (ECMWF, MetO, and GEOS-4). The higher-resolution GEOS-4 and ECMWF assimilations provide significantly better representation of filamentation and small-scale structure than the other analyses, even when fields gridded at reduced resolution are studied. The choice of which analysis to use is most critical for detailed transport

  5. Radiation physics and shielding codes and analyses applied to design-assist and safety analyses of CANDU® and ACR™ reactors

    International Nuclear Information System (INIS)

    Aydogdu, K.; Boss, C. R.

    2006-01-01

    This paper discusses the radiation physics and shielding codes and analyses applied in the design of CANDU and ACR reactors. The focus is on the types of analyses undertaken rather than the inputs supplied to the engineering disciplines. Nevertheless, the discussion does show how these analyses contribute to the engineering design. Analyses in radiation physics and shielding can be categorized as either design-assist or safety and licensing (accident) analyses. Many of the analyses undertaken are designated 'design-assist' where the analyses are used to generate recommendations that directly influence plant design. These recommendations are directed at mitigating or reducing the radiation hazard of the nuclear power plant with engineered systems and components. Thus the analyses serve a primary safety function by ensuring the plant can be operated with acceptable radiation hazards to the workers and public. In addition to this role of design assist, radiation physics and shielding codes are also deployed in safety and licensing assessments of the consequences of radioactive releases of gaseous and liquid effluents during normal operation and gaseous effluents following accidents. In the latter category, the final consequences of accident sequences, expressed in terms of radiation dose to members of the public, and inputs to accident analysis, e.g., decay heat in fuel following a loss-of-coolant accident, are also calculated. Another role of the analyses is to demonstrate that the design of the plant satisfies the principle of ALARA (as low as reasonably achievable) radiation doses. This principle is applied throughout the design process to minimize worker and public doses. The principle of ALARA is an inherent part of all design-assist recommendations and safety and licensing assessments. The main focus of an ALARA exercise at the design stage is to minimize the radiation hazards at the source. 
This exploits material selection and impurity specifications and relies

  6. Pollen analyses of Pleistocene hyaena coprolites from Montenegro and Serbia

    Directory of Open Access Journals (Sweden)

    Argant Jacqueline

    2007-01-01

    The results of pollen analyses of hyaena coprolites from the Early Pleistocene cave of Trlica in northern Montenegro and the Late Pleistocene cave of Baranica in southeast Serbia are described. The Early Pleistocene Pachycrocuta brevirostris and the Late Pleistocene Crocuta spelaea are the coprolite-producing species. Although the pollen concentration was rather low, the analyses presented here add considerably to the much-needed knowledge of the vegetation of the central Balkans during the Pleistocene. Pollen extracted from a coprolite from the Baranica cave indicates an open landscape with the presence of steppe taxa, which is in accordance with the recorded conditions and faunal remains. Pollen analysis of the Early Pleistocene samples from Trlica indicates fresh and temperate humid climatic conditions, as well as the co-existence of several biotopes which formed a mosaic landscape in the vicinity of the cave.

  7. Criticality safety analyses in SKODA JS a.s

    International Nuclear Information System (INIS)

    Mikolas, P.; Svarny, J.

    1999-01-01

    This paper describes criticality safety analyses of spent fuel systems for the storage and transport of spent fuel performed in SKODA JS s.r.o. Analyses were performed for different systems at the NPP site, including the originally designed spent fuel pool with a large pitch between assemblies and no special absorbing material, a high-density spent fuel pool with additional absorption by boron steel, a depository rack for fresh fuel assemblies with a very large pitch between fuel assemblies, a container for transport of fresh fuel into the reactor pool, a cask for transport and storage of spent fuel, and a container for the final storage depository. The required subcriticality has been proven taking into account all possible unfavourable conditions, uncertainties, etc. In two cases, a burnup credit methodology is expected to be used. (Authors)

  8. Iterative categorization (IC): a systematic technique for analysing qualitative data

    Science.gov (United States)

    2016-01-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155

  9. Multivariate analyses of crater parameters and the classification of craters

    Science.gov (United States)

    Siegal, B. S.; Griffiths, J. C.

    1974-01-01

    Multivariate analyses were performed on certain linear dimensions of six genetic types of craters. A total of 320 craters, consisting of laboratory fluidization craters, craters formed by chemical and nuclear explosives, terrestrial maars and other volcanic craters, and terrestrial meteorite impact craters, authenticated and probable, were analyzed in the first data set in terms of their mean rim crest diameter, mean interior relief, rim height, and mean exterior rim width. The second data set contained an additional 91 terrestrial craters of which 19 were of experimental percussive impact and 28 of volcanic collapse origin, and which was analyzed in terms of mean rim crest diameter, mean interior relief, and rim height. Principal component analyses were performed on the six genetic types of craters. Ninety per cent of the variation in the variables can be accounted for by two components. Ninety-nine per cent of the variation in the craters formed by chemical and nuclear explosives is explained by the first component alone.
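
    The principal-component step can be sketched as follows; the synthetic data below stand in for the crater measurements (rim crest diameter, interior relief, rim height, rim width) and are not the study's values.

```python
import numpy as np

# Synthetic, strongly correlated crater dimensions (320 "craters"),
# all driven by a common size factor, as crater variables tend to be.
rng = np.random.default_rng(0)
n = 320
diameter = rng.lognormal(mean=3.0, sigma=0.8, size=n)
relief = 0.2 * diameter + rng.normal(0, 0.5, n)
rim_height = 0.05 * diameter + rng.normal(0, 0.2, n)
rim_width = 0.3 * diameter + rng.normal(0, 1.0, n)
X = np.column_stack([diameter, relief, rim_height, rim_width])

def explained_variance_ratio(X):
    """PCA via SVD of the standardized data matrix."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    _, s, _ = np.linalg.svd(Z, full_matrices=False)
    var = s ** 2
    return var / var.sum()

ratio = explained_variance_ratio(X)
```

    With size-driven variables like these, most of the variance loads on the first component, mirroring the finding that two components account for ninety per cent of the variation.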

  10. Hydrogen-combustion analyses of large-scale tests

    International Nuclear Information System (INIS)

    Gido, R.G.; Koestel, A.

    1986-01-01

    This report uses results of the large-scale tests with turbulence performed by the Electric Power Research Institute at the Nevada Test Site to evaluate hydrogen burn-analysis procedures based on lumped-parameter codes like COMPARE-H2 and associated burn-parameter models. The test results: (1) confirmed, in a general way, the procedures for application to pulsed burning, (2) increased significantly our understanding of the burn phenomenon by demonstrating that continuous burning can occur, and (3) indicated that steam can terminate continuous burning. Future actions recommended include: (1) modification of the code to perform continuous-burn analyses, which is demonstrated, (2) analyses to determine the type of burning (pulsed or continuous) that will exist in nuclear containments and the stable location if the burning is continuous, and (3) changes to the models for estimating burn parameters

  12. Contribution of thermo-fluid analyses to the LHC experiments

    CERN Document Server

    Gasser, G

    2003-01-01

    The large amount of electrical and electronic equipment that will be installed in the four LHC experiments will cause significant heat dissipation into the detectors’ volumes. This is a major issue for the experimental groups, as temperature stability is often a fundamental requirement for the different sub-detectors to be able to provide a good measurement quality. The thermo-fluid analyses carried out in the ST/CV group are a very efficient tool for understanding and predicting the thermal behaviour of the detectors. These studies are undertaken according to the needs of the experimental groups; they aim to evaluate the thermal stability of a proposed design, or to compare different technical solutions in order to choose the best one for the final design. The usual approach to carrying out these studies is first presented; then some practical examples of thermo-fluid analyses are given, focusing on the main results in order to illustrate their contribution.

  13. Numerical analyses of an aircraft crash on containment building

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Jae Min; Kim, Seung Hyun; Chang, Yoon Suk [Kyunghee University, Yongin (Korea, Republic of)

    2016-05-15

    The containment building is responsible for isolating and protecting internal devices against external conditions such as earthquakes, hurricanes and impact loading. It must also prevent the leakage of radioactivity when severe accidents, such as a LOCA (Loss Of Coolant Accident), occur. Meanwhile, social awareness of threats such as terrorism has increased globally after the aircraft crashes into the World Trade Center and the Pentagon. In this paper, FE (Finite Element) analyses of an aircraft crash on a domestic NPP containment building were carried out for a range of crash locations and aircraft speeds. (1) The amount of concrete failure was dependent on the crash location, with the connector being the most delicate location compared to the dome and wall parts. (2) The maximum stress values generated at the liner plate and rebars did not exceed their UTS values.

  14. A database structure for radiological optimization analyses of decommissioning operations

    International Nuclear Information System (INIS)

    Zeevaert, T.; Van de Walle, B.

    1995-09-01

    The structure of a database for decommissioning experiences is described. Radiological optimization is a major radiation protection principle in practices and interventions, involving radiological protection factors, economic costs and social factors. An important lack of knowledge with respect to these factors exists in the domain of the decommissioning of nuclear power plants, due to the low number of decommissioning operations performed so far. Moreover, decommissioning takes place only once for an installation. Tasks, techniques and procedures are in most cases rather specific, limiting the use of past experiences in the radiological optimization analyses of new decommissioning operations. Therefore, it is important that relevant data and information be acquired from decommissioning experiences. These data have to be stored in a database in such a way that they can be used efficiently in ALARA analyses of future decommissioning activities

  15. Iterative categorization (IC): a systematic technique for analysing qualitative data.

    Science.gov (United States)

    Neale, Joanne

    2016-06-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.

  16. Quantitative metagenomic analyses based on average genome size normalization

    DEFF Research Database (Denmark)

    Frank, Jeremy Alexander; Sørensen, Søren Johannes

    2011-01-01

    Metagenomic analyses provide not just a census of the community members but direct information on metabolic capabilities and potential interactions among community members. Here we introduce a method for the quantitative characterization and comparison of microbial communities based on the normalization of metagenomic data by average genome size. ... marine sources using both conventional small-subunit (SSU) rRNA gene analyses and our quantitative method to calculate the proportion of genomes in each sample that are capable of a particular metabolic trait. ... These analyses demonstrate how genome proportionality compares to SSU rRNA gene relative abundance and how factors such as average genome size and SSU rRNA gene copy number affect sampling probability and therefore both types of community analysis.
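
    The normalization idea can be illustrated with a back-of-the-envelope calculation: for a single-copy gene, the fraction of genomes in the community carrying it follows from the fraction of reads mapping to it, scaled by average genome size over gene length. The numbers below are illustrative, not from the paper.

```python
# Average-genome-size (AGS) normalization sketch. Assumes uniform
# read sampling and one gene copy per positive genome.

def genome_proportion(gene_reads, total_reads, gene_len_bp, avg_genome_bp):
    """Estimated proportion of community genomes carrying the gene."""
    read_fraction = gene_reads / total_reads
    return read_fraction * avg_genome_bp / gene_len_bp
```

    Sanity check: if every 4 Mbp genome carries one 1.5 kbp copy of the gene, then 1.5 kbp of every 4 Mbp sequenced hits the gene and the estimated proportion is 1.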

  17. Thermal analyses of the IF-300 shipping cask

    International Nuclear Information System (INIS)

    Meier, J.K.

    1978-07-01

    In order to supply temperature data for structural testing and analysis of shipping casks, a series of thermal analyses using the TRUMP thermal analyzer program were performed on the GE IF-300 spent fuel shipping cask. Major conclusions of the analyses are: (1) Under normal cooling conditions and a cask heat load of 262,000 BTU/h, the seal area of the cask will be roughly 100 °C (180 °F) above the ambient surroundings. (2) Under these same conditions the uranium shield at the midpoint of the cask will be between 69 °C (125 °F) and 92 °C (166 °F) above the ambient surroundings. (3) Significant thermal gradients are not likely to develop between the head studs and the surrounding metal. (4) A representative time constant for the cask as a whole is on the order of one day

  18. CPN Tools for Editing, Simulating, and Analysing Coloured Petri Nets

    DEFF Research Database (Denmark)

    Ratzer, Anne Vinter; Wells, Lisa Marie; Lassen, Henry Michael

    2003-01-01

    CPN Tools is a tool for editing, simulating and analysing Coloured Petri Nets. The GUI is based on advanced interaction techniques, such as toolglasses, marking menus, and bi-manual interaction. Feedback facilities provide contextual error messages and indicate dependency relationships between net elements, and the state space facilities report information such as boundedness properties and liveness properties. The functionality of the simulation engine and the state space facilities is similar to the corresponding components in Design/CPN, which is a widespread tool for Coloured Petri Nets.

  19. Applications of MIDAS regression in analysing trends in water quality

    Science.gov (United States)

    Penev, Spiridon; Leonte, Daniela; Lazarov, Zdravetz; Mann, Rob A.

    2014-04-01

    We discuss novel statistical methods in analysing trends in water quality. Such analysis uses complex data sets of different classes of variables, including water quality, hydrological and meteorological. We analyse the effect of rainfall and flow on trends in water quality utilising a flexible model called Mixed Data Sampling (MIDAS). This model arises because of the mixed frequency in the data collection. Typically, water quality variables are sampled fortnightly, whereas the rain data is sampled daily. The advantage of using MIDAS regression is in the flexible and parsimonious modelling of the influence of the rain and flow on trends in water quality variables. We discuss the model and its implementation on a data set from the Shoalhaven Supply System and Catchments in the state of New South Wales, Australia. Information criteria indicate that MIDAS modelling improves upon simplistic approaches that do not utilise the mixed data sampling nature of the data.
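
    The core MIDAS device can be sketched as follows: the daily rainfall entering a fortnightly regression is collapsed into a single regressor through a parsimonious exponential Almon lag polynomial. The parameter values below are illustrative assumptions.

```python
import numpy as np

def exp_almon_weights(n_lags, theta1, theta2):
    """Normalized exponential Almon weights w_k, k = 0..n_lags-1."""
    k = np.arange(n_lags)
    w = np.exp(theta1 * k + theta2 * k ** 2)
    return w / w.sum()

def midas_regressor(daily_series, n_lags, theta1, theta2):
    """Weighted sum of the most recent n_lags daily values
    (the newest value gets weight w_0)."""
    w = exp_almon_weights(n_lags, theta1, theta2)
    recent = np.asarray(daily_series, dtype=float)[-n_lags:][::-1]
    return float(recent @ w)
```

    In a full MIDAS fit, theta1 and theta2 are estimated jointly with the regression coefficients, so just two parameters govern the entire daily lag profile; this is the parsimony the abstract refers to.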

  20. Elemental abundance and analyses with coadded DAO spectrograms

    International Nuclear Information System (INIS)

    Adelman, S.J.

    1987-01-01

    One can improve the quality of elemental abundance analyses by using higher signal-to-noise data than has been the practice at high resolution. The procedures developed at the Dominion Astrophysical Observatory to coadd high-dispersion coude spectrograms are used with a minimum of ten 6.5 Å mm⁻¹ IIa-O spectrograms of each of three field horizontal-branch (FHB) A stars to increase the signal-to-noise ratio of the photographic data over a considerable wavelength region. Fine analyses of the sharp-lined prototype FHB stars HD 109995 and 161817 show an internal consistency which justifies this effort. Their photospheric elemental abundances are similar to those of Population II globular cluster giants. (author)

  1. Sensitivity and uncertainty analyses for performance assessment modeling

    International Nuclear Information System (INIS)

    Doctor, P.G.

    1988-08-01

    Sensitivity and uncertainty analyses methods for computer models are being applied in performance assessment modeling in the geologic high level radioactive waste repository program. The models used in performance assessment tend to be complex physical/chemical models with large numbers of input variables. There are two basic approaches to sensitivity and uncertainty analyses: deterministic and statistical. The deterministic approach to sensitivity analysis involves numerical calculation or employs the adjoint form of a partial differential equation to compute partial derivatives; the uncertainty analysis is based on Taylor series expansions of the input variables propagated through the model to compute means and variances of the output variable. The statistical approach to sensitivity analysis involves a response surface approximation to the model with the sensitivity coefficients calculated from the response surface parameters; the uncertainty analysis is based on simulation. The methods each have strengths and weaknesses. 44 refs
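
    The deterministic branch described above can be sketched in a few lines: propagate input means and variances through the model using first-order partial derivatives. The model used in the test is a toy stand-in, not a performance-assessment code.

```python
def first_order_moments(f, means, variances, h=1e-6):
    """First-order Taylor estimate of the output mean and variance,
    assuming independent inputs; partials by central differences."""
    y0 = f(*means)
    var = 0.0
    for i, (m, v) in enumerate(zip(means, variances)):
        hi = list(means)
        lo = list(means)
        hi[i] = m + h
        lo[i] = m - h
        dfdx = (f(*hi) - f(*lo)) / (2 * h)  # sensitivity coefficient
        var += dfdx ** 2 * v
    return y0, var
```

    For a linear model the estimate is exact; for nonlinear models it is the leading-order approximation that the statistical (simulation-based) approach avoids.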

  2. Analysing spatially extended high-dimensional dynamics by recurrence plots

    Energy Technology Data Exchange (ETDEWEB)

    Marwan, Norbert, E-mail: marwan@pik-potsdam.de [Potsdam Institute for Climate Impact Research, 14412 Potsdam (Germany); Kurths, Jürgen [Potsdam Institute for Climate Impact Research, 14412 Potsdam (Germany); Humboldt Universität zu Berlin, Institut für Physik (Germany); Nizhny Novgorod State University, Department of Control Theory, Nizhny Novgorod (Russian Federation); Foerster, Saskia [GFZ German Research Centre for Geosciences, Section 1.4 Remote Sensing, Telegrafenberg, 14473 Potsdam (Germany)

    2015-05-08

    Recurrence plot based measures of complexity are powerful tools for characterizing complex dynamics. In this letter we show the potential of selected recurrence plot measures for the investigation of even high-dimensional dynamics. We apply this method to spatially extended chaos, such as derived from the Lorenz96 model, and show that the recurrence plot based measures can qualitatively characterize typical dynamical properties such as chaotic or periodic dynamics. Moreover, we demonstrate its power by analysing satellite image time series of vegetation cover with contrasting dynamics as a spatially extended and potentially high-dimensional example from the real world. - Highlights: • We use recurrence plots for analysing spatially extended dynamics. • We investigate the high-dimensional chaos of the Lorenz96 model. • The approach distinguishes different spatio-temporal dynamics. • We use the method for studying vegetation cover time series.
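
    A recurrence plot and its simplest measure, the recurrence rate, can be computed in a few lines; the threshold and test signal below are illustrative choices, not the letter's settings.

```python
import numpy as np

def recurrence_matrix(x, eps):
    """R[i, j] = 1 where |x_i - x_j| < eps (scalar time series)."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points in the plot."""
    return R.mean()

# A periodic signal yields a recurrence plot with long diagonal lines;
# chaotic dynamics break those diagonals into short segments.
t = np.linspace(0, 8 * np.pi, 400)
R = recurrence_matrix(np.sin(t), eps=0.1)
```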

  3. SVM models for analysing the headstreams of mine water inrush

    Energy Technology Data Exchange (ETDEWEB)

    Yan Zhi-gang; Du Pei-jun; Guo Da-zhi [China University of Science and Technology, Xuzhou (China). School of Environmental Science and Spatial Informatics

    2007-08-15

    The support vector machine (SVM) model was introduced to analyse the headstream of water inrush in a coal mine. The SVM model, based on a hydrogeochemical method, was constructed for recognising two kinds of headstreams, and the H-SVMs model was constructed for recognising multiple headstreams. The SVM method was applied to analyse the conditions of two mixed headstreams, and the value of the SVM decision function was investigated as a means of denoting hydrogeochemical abnormality. The experimental results show that the SVM is based on a strict mathematical theory and has a simple structure and a good overall performance. Moreover, the parameter W in the decision function can describe the weights of the discrimination indices of the headstream of water inrush. The value of the decision function can denote hydrogeochemical abnormality, which is significant in the prevention of water inrush in a coal mine. 9 refs., 1 fig., 7 tabs.
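
    The role of the decision function can be sketched with a linear SVM: f(x) = w·x + b separates the two headstreams, and values of f(x) near zero flag water mixed from both sources. The weights, bias and concentration vectors below are invented for illustration; in the paper they come from training on hydrogeochemical data.

```python
import numpy as np

# Hypothetical trained linear SVM: weights over discrimination
# indices (e.g. ion concentrations) and a bias term.
w = np.array([0.8, -0.5, 0.3])
b = -0.2

def decision(x):
    """SVM decision function f(x) = w.x + b."""
    return float(np.dot(w, x) + b)

def classify(x, mix_band=0.25):
    """+1 / -1 for the two headstreams; 0 flags a likely mixture,
    i.e. a hydrogeochemical abnormality (|f| near zero)."""
    f = decision(x)
    if abs(f) < mix_band:
        return 0
    return 1 if f > 0 else -1
```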

  4. Physiological and enzymatic analyses of pineapple subjected to ionizing radiation

    International Nuclear Information System (INIS)

    Silva, Josenilda Maria da; Silva, Juliana Pizarro; Spoto, Marta Helena Fillet

    2007-01-01

    The physiological and enzymatic post-harvest characteristics of the pineapple cultivar Smooth Cayenne were evaluated after the fruits were gamma-irradiated with doses of 100 and 150 Gy and stored for 10, 20 and 30 days at 12 °C (±1) and a relative humidity of 85% (±5). Physiological and enzymatic analyses were made for each storage period to evaluate the alterations resulting from the application of ionizing radiation. Control specimens showed higher values of soluble pectins, total pectins, reducing sugars, sucrose and total sugars, and lower values of polyphenoloxidase and polygalacturonase enzyme activities. All the analyses indicated that storage time is a significantly influencing factor. The 100 Gy dose and the 20-day storage period presented the best results from the standpoint of maturation and conservation of fruit quality. (author)

  5. Engineering analyses of ITER divertor diagnostic rack design

    Energy Technology Data Exchange (ETDEWEB)

    Modestov, Victor S., E-mail: modestov@compmechlab.com [St Petersburg State Polytechnical University, 195251 St Petersburg, 29 Polytechnicheskaya (Russian Federation); Nemov, Alexander S.; Borovkov, Aleksey I.; Buslakov, Igor V.; Lukin, Aleksey V. [St Petersburg State Polytechnical University, 195251 St Petersburg, 29 Polytechnicheskaya (Russian Federation); Kochergin, Mikhail M.; Mukhin, Eugene E.; Litvinov, Andrey E.; Koval, Alexandr N. [Ioffe Physico-Technical Institute, 194021 St Petersburg, 26 Polytechnicheskaya (Russian Federation); Andrew, Philip [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France)

    2013-10-15

    Highlights: • The approach developed earlier has been used for the assessment of the new design of the DTS racks and neutron shield units. • Results of the most critical EM and seismic analyses indicate that the introduced changes significantly improved the system behaviour under these loads. • However, further research is required to finalize the design and check that it meets all structural, thermal, seismic, EM and fatigue requirements. -- Abstract: The divertor port racks used as the support structure of the divertor Thomson scattering equipment have been carefully analyzed for consistency with electromagnetic and seismic loads; it is these analyses that reveal the critical challenges associated with the structure design. Based on the results for the reference structure [2], a modified design of the diagnostic racks is proposed and updated simulation results are given. The results show a significant improvement over the previous reference layout, and the design work will be continued towards finalization.

  6. A MULTIVARIATE APPROACH TO ANALYSE NATIVE FOREST TREE SPECIE SEEDS

    Directory of Open Access Journals (Sweden)

    Alessandro Dal Col Lúcio

    2006-03-01

    This work grouped, by species, the most similar seed trees, using the variables observed in exotic forest species of the Brazilian flora from seeds collected in the Forest Research and Soil Conservation Center of Santa Maria, Rio Grande do Sul, and analyzed from January 1997 to March 2003. For the cluster analysis, all the species that possessed four or more analyses per lot were analyzed by the hierarchical clustering method with the standardized Euclidean mean distance, and a principal component analysis technique was also used to reduce the number of variables. The species Callistemon speciosus, Cassia fistula, Eucalyptus grandis, Eucalyptus robusta, Eucalyptus saligna, Eucalyptus tereticornis, Delonix regia, Jacaranda mimosaefolia and Pinus elliottii presented more than four analyses per lot, in which the third and fourth principal components explained 80% of the total variation. The cluster analysis was efficient in separating the groups of all tested species, as was the principal components method.

  7. Lipid analyses of fumigated vs irradiated raw and roasted almonds

    International Nuclear Information System (INIS)

    Uthman, R.S.; Toma, R.B.; Garcia, R.; Medora, N.P.; Cunningham, S.

    1998-01-01

    The purpose of this study was to compare the effects of propylene oxide (PO) and irradiation treatments on the lipid characteristics of raw and roasted almonds. Eight kilograms each of raw and roasted almonds were divided into four batches (2 kg each). Three of the batches were subjected to PO treatment or to irradiation with a dose of 6 or 10·5 kGy. The untreated batch served as the control. Samples were taken from all the batches at three consecutive times during storage (day 0, 8 weeks and 16 weeks) and analysed for iodine number, peroxide value and 2-thiobarbituric acid number. Overall, irradiated almonds incurred a higher variation in lipid stability than PO-treated almonds, while roasted almonds incurred a higher variation than raw almonds

  8. Neoliberalism in education: Five images of critical analyses

    Directory of Open Access Journals (Sweden)

    Branislav Pupala

    2011-03-01

    Full Text Available The survey study brings information about the way educational research copes with neoliberalism as a generalized form of social government in current Western culture. It shows that neoliberalism is considered a universal scope for other changes in the basic segments of education, and that theoretical and critical analyses of this phenomenon represent an important part of production in the area of educational research. It emphasizes the contribution of the formation and development of so-called governmentality studies to the comprehension of the mechanisms and consequences of neoliberal government of society, and shows how the methodology of these studies helps to identify neoliberal strategies used in the regulation of social subjects by education. Five selected segments of critical analysis are elaborated (from the concept of lifelong learning, through preschool and university education, to the education of teachers and the PISA project) that clearly show the ideological and theoretical cohesiveness of analysing education through the scope of neoliberal governmentality.

  9. Energy and exergy analyses of electrolytic hydrogen production

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, M A [Ryerson Polytechnic Univ., Toronto, ON (Canada). Dept. of Mechanical Engineering

    1995-07-01

    The thermodynamic performance of a water-electrolysis process for producing hydrogen, based on current-technology equipment, is investigated using both energy and exergy analyses. Three cases are considered in which the principal driving energy inputs are (i) electricity, (ii) the high-temperature heat used to generate the electricity, and (iii) the heat source used to produce the high-temperature heat. The nature of the heat source (e.g. fossil fuel, nuclear fuel, solar energy) is left as general as possible. The analyses indicate that, when the main driving input is the hypothetical heat source, the principal thermodynamic losses are associated with water splitting, electricity generation and heat production; the losses are mainly due to the irreversibilities associated with converting a heat source to heat, and with heat transfer across large temperature differences. The losses associated with the waste heat in used cooling water, because of its low quality, are not as significant as energy analysis indicates. (Author)
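The distinction the abstract draws between energy and exergy accounting can be sketched with textbook relations: electricity is pure exergy, while heat at temperature T carries only the Carnot fraction (1 − T0/T) of its energy as exergy. The numbers below are illustrative round values, not the paper's data:

```python
T0 = 298.15  # K, reference (dead-state) temperature

def heat_exergy(q, t_source):
    """Exergy of heat q delivered at temperature t_source (Carnot factor)."""
    return q * (1.0 - T0 / t_source)

# Illustrative values (assumptions, not from the paper):
HHV_H2 = 286.0           # kJ/mol, higher heating value of the product hydrogen
EX_H2 = 236.0            # kJ/mol, approximate standard chemical exergy of hydrogen
electricity_in = 350.0   # kJ of electricity per mol H2 fed to the electrolyser
heat_in = electricity_in / 0.40  # heat needed if the power plant is 40% efficient

# Case (i): driving input is electricity (energy input = exergy input)
energy_eff_elec = HHV_H2 / electricity_in
exergy_eff_elec = EX_H2 / electricity_in

# Case (ii)/(iii): driving input is high-temperature heat at, say, 1200 K
energy_eff_heat = HHV_H2 / heat_in
exergy_eff_heat = EX_H2 / heat_exergy(heat_in, 1200.0)
```

The heat-driven case shows why the two analyses disagree: the exergy efficiency divides by the (smaller) exergy of the driving heat, so low-quality streams such as warm cooling water contribute little to the exergy losses, as the abstract notes.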

  10. Phylogenomic analyses data of the avian phylogenomics project

    DEFF Research Database (Denmark)

    Jarvis, Erich D; Mirarab, Siavash; Aberer, Andre J

    2015-01-01

    BACKGROUND: Determining the evolutionary relationships among the major lineages of extant birds has been one of the biggest challenges in systematic biology. To address this challenge, we assembled or collected the genomes of 48 avian species spanning most orders of birds, including all Neognathae...... and two of the five Palaeognathae orders. We used these genomes to construct a genome-scale avian phylogenetic tree and perform comparative genomic analyses. FINDINGS: Here we present the datasets associated with the phylogenomic analyses, which include sequence alignment files consisting of nucleotides......ML algorithm or when using statistical binning with the coalescence-based MP-EST algorithm (which we refer to as MP-EST*). Other data sets, such as the coding sequence of some exons, revealed other properties of genome evolution, namely convergence. CONCLUSIONS: The Avian Phylogenomics Project is the largest...

  11. Process for carrying out analyses based on concurrent reactions

    Energy Technology Data Exchange (ETDEWEB)

    Glover, J S; Shepherd, B P

    1980-01-03

    The invention concerns a process for carrying out analyses based on competing (concurrent) reactions. A portion of the compound to be analysed, together with a standard quantity of the same compound in labelled form, is subjected to a common reaction with a standard quantity of a reagent, which must be less than the sum of the two portions of the reacting compound. The unreacted labelled compound and the labelled final compound resulting from the competition are separated in a tube (e.g. by centrifuging) after a forced phase change (precipitation, absorption, etc.), and the radioactivity of the two phases in contact is measured separately. The shielded measuring device developed for this purpose, suitable for centrifuge tubes of known dimensions, is also included in the patent claims. As an example of the applications of the method, the insulin concentration of a defined serum is measured (radioimmunoassay).
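The competition principle behind such assays can be sketched in an idealised form: when the reagent is scarcer than the total compound (the condition the abstract states), it is shared between the labelled and unlabelled forms in proportion to their abundance, so the captured activity falls as the unknown amount rises and the relation can be inverted. The function names and equal-affinity assumption below are hypothetical simplifications, not the patent's exact chemistry:

```python
def bound_tracer(tracer, unlabelled, reagent):
    """Labelled activity captured when a limited reagent is shared between
    the labelled (tracer) and unlabelled forms in proportion to abundance
    (idealised equal-affinity competition)."""
    assert reagent < tracer + unlabelled  # the condition stated in the abstract
    return reagent * tracer / (tracer + unlabelled)

def estimate_unlabelled(measured_bound, tracer, reagent):
    """Invert the competition relation to recover the unknown amount."""
    return tracer * (reagent / measured_bound - 1.0)

b = bound_tracer(tracer=1.0, unlabelled=3.0, reagent=2.0)
print(estimate_unlabelled(b, tracer=1.0, reagent=2.0))  # recovers 3.0
```

Real radioimmunoassays calibrate against a standard curve rather than this closed form, but the monotonic bound-activity/concentration relation is the same.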

  12. Numerical analyses of an aircraft crash on containment building

    International Nuclear Information System (INIS)

    Sim, Jae Min; Kim, Seung Hyun; Chang, Yoon Suk

    2016-01-01

    The containment building is responsible for isolating and protecting internal devices against external conditions such as earthquake, hurricane and impact loading. It must also prevent leakage of radioactivity when severe accidents such as a LOCA (Loss Of Coolant Accident) occur. Meanwhile, awareness of threats such as terrorism has increased globally after the aircraft crashes into the World Trade Center and the Pentagon. In this paper, FE (Finite Element) analyses of an aircraft crash on a domestic NPP containment building were performed taking into account different crash locations and aircraft speeds. (1) The amount of concrete failure depended on the crash location, and the connector was the most delicate location compared with the dome and wall parts. (2) The maximum stress values generated at the liner plate and rebars did not exceed their UTS values.

  13. Analysing Trust Transitivity and The Effects of Unknown Dependence

    Directory of Open Access Journals (Sweden)

    Touhid Bhuiyan

    2010-03-01

    Full Text Available Trust can be used to improve automated online recommendation within a given domain, and trust transitivity is used to make it successful. But trust transitivity has different interpretations. Trust and trust transitivity are both human mental phenomena, and for this reason there is no such thing as objective transitivity. Trust transitivity and trust fusion are both important elements in computational trust. This paper analyses the parameter-dependence problem in trust transitivity and proposes some definitions considering the effects of the base rate. In addition, it proposes belief functions based on subjective logic to analyse trust transitivity in three specified cases with sensitive and insensitive base rates. It then presents a quantitative analysis of the effects of the unknown-dependence problem in an interconnected network environment such as the Internet.
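In subjective logic, an opinion is a tuple (belief, disbelief, uncertainty, base rate) with b + d + u = 1, and transitivity is modelled by a discounting operator: A's derived opinion about X is B's opinion about X weighted by A's trust in B. The sketch below shows the standard discounting operator from subjective logic, not the paper's own extended definitions:

```python
def discount(trust_ab, opinion_bx):
    """Subjective-logic trust discounting: derive A's opinion about X by
    passing B's opinion about X through A's trust in B.
    Opinions are (belief, disbelief, uncertainty, base_rate), b + d + u = 1."""
    b1, d1, u1, _a1 = trust_ab
    b2, d2, u2, a2 = opinion_bx
    # Only the believed fraction of B's advice is adopted; A's own disbelief
    # and uncertainty about B flow into the uncertainty of the result.
    return (b1 * b2, b1 * d2, d1 + u1 + b1 * u2, a2)

a_trusts_b = (0.8, 0.1, 0.1, 0.5)   # A's trust opinion about advisor B
b_rates_x = (0.6, 0.2, 0.2, 0.5)    # B's opinion about target X
b, d, u, a = discount(a_trusts_b, b_rates_x)
```

Note how the derived opinion is always more uncertain than B's original one, which is the intuitively desirable property of transitive trust; the base rate of the target opinion is carried through unchanged.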

  14. Report of analyses for light hydrocarbons in ground water

    International Nuclear Information System (INIS)

    Dromgoole, E.L.

    1982-04-01

    This report contains on microfiche the results of analyses for methane, ethane, propane, and butane in 11,659 ground water samples collected in 47 western and three eastern 1° × 2° quadrangles of the National Topographic Map Series (Figures 1 and 2), along with a brief description of the analytical technique used and some simple descriptive statistics. The ground water samples were collected as part of the National Uranium Resource Evaluation (NURE) hydrogeochemical and stream sediment reconnaissance. Further information on the ground water samples can be obtained by consulting the NURE data reports for the individual quadrangles. This information includes (1) measurements characterizing water samples (pH, conductivity, and alkalinity), (2) physical measurements, where applicable (water temperature, well description, and other measurements), and (3) elemental analyses.

  15. ATWS analyses for Krsko Full Scope Simulator verification

    Energy Technology Data Exchange (ETDEWEB)

    Cerne, G; Tiselj, I; Parzer, I [Reactor Engineering Div., Inst. Jozef Stefan, Ljubljana (Slovenia)

    2000-07-01

    The purpose of this analysis was to simulate an Anticipated Transient Without Scram (ATWS) for the Krsko NPP. The results of these calculations were used to verify the reactor coolant system thermal-hydraulic response predicted by the Krsko Full Scope Simulator. For the thermal-hydraulic analyses, the RELAP5/MOD2 code and the input card deck for NPP Krsko were used. The ATWS analyses were performed to assess the influence and benefit of the ATWS Mitigation System Actuation Circuitry (AMSAC). In the presented paper the most severe ATWS scenarios have been analyzed, starting with the loss of Main Feedwater at both steam generators, leading to a gradual loss of the secondary heat sink. On top of that, the control rods were assumed not to scram, leaving the chain reaction to be controlled only by the inherent physical properties of the fuel and moderator and possible actions of the BOP system. The primary system response has been studied with regard to AMSAC availability. (author)

  16. The moral economy of austerity: analysing UK welfare reform.

    Science.gov (United States)

    Morris, Lydia

    2016-03-01

    This paper notes the contemporary emergence of 'morality' in both sociological argument and political rhetoric, and analyses its significance in relation to ongoing UK welfare reforms. It revisits the idea of 'moral economy' and identifies two strands in its contemporary application: that all economies depend on an internal moral schema, and that some external moral evaluation is desirable. UK welfare reform is analysed as an example of the former, with reference to three distinct orientations advanced in the work of Freeden (1996), Laclau (2014), and Lockwood (1996). In this light, the paper then considers challenges to the reform agenda, drawn from third-sector and other public sources. It outlines the forms of argument present in these challenges, based respectively on rationality, legality, and morality, which together provide a basis for evaluation of the welfare reforms and for an alternative 'moral economy'. © London School of Economics and Political Science 2016.

  17. Scouting for spirituality: an analysis of the concept of spirituality in the Scout movement

    OpenAIRE

    Holmefjord, Aina

    2015-01-01

    This master's thesis contains analyses of the Scout movement's use of the concept of "spirituality" in two books written by the movement's founder, "Scouting for Boys" and "Rovering to Success", and in two documents of The World Organization of the Scout Movement. Robert Baden-Powell founded the Scout movement in 1908, and his literature and books published in the early 1900s set the framework for much of the present-day Scout movement's ideology and vision. The Scout movement has a r...

  18. GIS-based analysis of landscape changes

    DEFF Research Database (Denmark)

    Kristensen, Søren Bech Pilgaard

    2009-01-01

    Using topographical maps in a GIS analysis, it is possible to identify the areas that have been stable for more than 100 years and which therefore potentially hold great natural value. Over the past 150 years, great changes have taken place in the Danish landscape. Many extensive land types (meadows, dry grasslands, heaths, etc.) have gone...

  19. Prenominal and postnominal reduced relative clauses: arguments against unitary analyses

    Directory of Open Access Journals (Sweden)

    Petra Sleeman

    2007-01-01

    Full Text Available In recent years, several analyses have been proposed in which prenominal and postnominal reduced relatives are merged in the same position. Kayne (1994) claims that both types of reduced relative clauses are the complement of the determiner. More recently, Cinque (2005) has proposed that both types are merged in the functional projections of the noun, at the left edge of the modifier system. In this paper, I argue against a unitary analysis of prenominal and postnominal participial reduced relatives.

  20. A Web-based Tool Combining Different Type Analyses

    DEFF Research Database (Denmark)

    Henriksen, Kim Steen; Gallagher, John Patrick

    2006-01-01

    of both, and they can be goal-dependent or goal-independent. We describe a prototype tool that can be accessed from a web browser, allowing various type analyses to be run. The first goal of the tool is to allow the analysis results to be examined conveniently by clicking on points in the original program...... the minimal "domain model" of the program with respect to the corresponding pre-interpretation, which can give more precise information than the original descriptive type....