WorldWideScience

Sample records for generalised source term

  1. Spray irrigation of landfill leachate: estimating potential exposures to workers and bystanders using a modified air box model and generalised source term

    International Nuclear Information System (INIS)

    Gray, Duncan; Pollard, Simon J.T.; Spence, Lynn; Smith, Richard; Gronow, Jan R.

    2005-01-01

    Generalised source term data from UK leachates and a probabilistic exposure model (BPRISC(4)) were used to evaluate key routes of exposure from chemicals of concern during the spray irrigation of landfill leachate. Risk estimates secured using a modified air box model are reported for a hypothetical worker exposed to selected chemicals within a generalised conceptual exposure model of spray irrigation. Consistent with pesticide spray exposure studies, the key risk driver is dermal exposure to the more toxic components of leachate. Changes in spray droplet diameter (0.02-0.2 cm) and in spray flow rate (50-1000 l/min) have little influence on dermal exposure, although the lesser routes of aerosol ingestion and inhalation are markedly affected. The risk estimates modelled using this conservative worst-case exposure scenario are not of sufficient magnitude to warrant major concerns about chemical risks to workers or bystanders from this practice in the general sense. However, the modelling made use of generic concentration data for only a limited number of potential landfill leachate contaminants, such that individual practices may require assessment on the basis of their own merits. - Modelling approaches are used to assess human exposure routes to chemicals during spray irrigation of landfill leachates

  2. Spray irrigation of landfill leachate: estimating potential exposures to workers and bystanders using a modified air box model and generalised source term.

    Science.gov (United States)

    Gray, Duncan; Pollard, Simon J T; Spence, Lynn; Smith, Richard; Gronow, Jan R

    2005-02-01

    Generalised source term data from UK leachates and a probabilistic exposure model (BPRISC(4)) were used to evaluate key routes of exposure from chemicals of concern during the spray irrigation of landfill leachate. Risk estimates secured using a modified air box model are reported for a hypothetical worker exposed to selected chemicals within a generalised conceptual exposure model of spray irrigation. Consistent with pesticide spray exposure studies, the key risk driver is dermal exposure to the more toxic components of leachate. Changes in spray droplet diameter (0.02-0.2 cm) and in spray flow rate (50-1000 l/min) have little influence on dermal exposure, although the lesser routes of aerosol ingestion and inhalation are markedly affected. The risk estimates modelled using this conservative worst-case exposure scenario are not of sufficient magnitude to warrant major concerns about chemical risks to workers or bystanders from this practice in the general sense. However, the modelling made use of generic concentration data for only a limited number of potential landfill leachate contaminants, such that individual practices may require assessment on the basis of their own merits.
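    The box-model reasoning in this abstract can be sketched with a simple Monte Carlo calculation. This is a hedged illustration, not the BPRISC(4) model: the droplet-diameter and flow-rate ranges come from the abstract, but every distribution, concentration and deposition figure below is an assumption made for the demo.

```python
import numpy as np

# Hedged sketch, not the BPRISC(4) model: Monte Carlo box-model estimate of
# dermal versus inhalation exposure during spray irrigation of leachate.
rng = np.random.default_rng(42)
n = 100_000

conc = rng.lognormal(mean=np.log(0.5), sigma=0.8, size=n)  # mg/l in leachate (assumed)
droplet_d = rng.uniform(0.02, 0.2, size=n)  # cm (abstract's range)
flow = rng.uniform(50, 1000, size=n)        # l/min (abstract's range)

# Dermal route: deposition onto exposed skin is modelled as independent of
# droplet size and flow rate, mirroring the abstract's key finding.
skin_dep = rng.uniform(0.5, 2.0, size=n)    # litres deposited per shift (assumed)
dermal = conc * skin_dep                    # mg per shift

# Inhalation route: the aerosolised fraction falls steeply with droplet
# size, so this lesser route is sensitive to the spray parameters.
aerosol_frac = 1e-4 * (0.02 / droplet_d) ** 2  # illustrative scaling
breathed = 10.0                             # m3 of air per shift (assumed)
inhal = conc * flow * aerosol_frac * breathed * 1e-3  # mg per shift

# under these assumptions dermal exposure dominates across the whole range
```

    By construction the dermal route is insensitive to droplet diameter and flow rate while the inhalation route is not, which reproduces the qualitative sensitivity pattern the abstract reports.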

  3. Generalised boundary terms for higher derivative theories of gravity

    Energy Technology Data Exchange (ETDEWEB)

    Teimouri, Ali; Talaganis, Spyridon; Edholm, James [Consortium for Fundamental Physics, Lancaster University, North West Drive, Lancaster, LA1 4YB (United Kingdom); Mazumdar, Anupam [Consortium for Fundamental Physics, Lancaster University, North West Drive, Lancaster, LA1 4YB (United Kingdom); Kapteyn Astronomical Institute, University of Groningen, 9700 AV Groningen (Netherlands)

    2016-08-24

    In this paper we wish to find the corresponding Gibbons-Hawking-York term for the most general quadratic in curvature gravity by using Coframe slicing within the Arnowitt-Deser-Misner (ADM) decomposition of spacetime in four dimensions. In order to make sure that the higher derivative gravity is ghost and tachyon free at a perturbative level, one requires infinite covariant derivatives, which yields a generalised covariant infinite derivative theory of gravity. We will be exploring the boundary term for such a covariant infinite derivative theory of gravity.

  4. Deformation due to mechanical and thermal sources in generalised ...

    Indian Academy of Sciences (India)

    dimensional problem of thermoelasticity has been considered to investigate the disturbance due to mechanical (horizontal or vertical) and thermal source in a homogeneous, thermally conducting orthorhombic material. Laplace–Fourier ...

  5. Generalised boundary terms for higher derivative theories of gravity

    NARCIS (Netherlands)

    Teimouri, Ali; Talaganis, Spyridon; Edholm, James; Mazumdar, Anupam

    2016-01-01

    In this paper we wish to find the corresponding Gibbons-Hawking-York term for the most general quadratic in curvature gravity by using Coframe slicing within the Arnowitt-Deser-Misner (ADM) decomposition of spacetime in four dimensions. In order to make sure that the higher derivative gravity is

  6. Successful short-term re-learning and generalisation of concepts in semantic dementia.

    Science.gov (United States)

    Suárez-González, Aida; Savage, Sharon A; Caine, Diana

    2016-09-28

    Patients with semantic dementia (SD) can rapidly and successfully re-learn word labels during cognitive intervention. This new learning, however, usually remains rigid and context-dependent. Conceptual enrichment (COEN) training is a therapy approach aimed to produce more flexible and generalisable learning in SD. In this study we compare generalisation and maintenance of learning after COEN with performance achieved using a classical naming therapy (NT). The study recruited a 62-year-old woman with SD. An AB1ACAB2 experimental design was implemented, with naming performance assessed at baseline, post-intervention, and 3 and 6 weeks after the end of each treatment phase. Three generalisation tasks were also assessed pre- and post-intervention. Naming post-intervention improved significantly following both therapies; however, words trained using COEN therapy showed a significantly greater degree of generalisation than those trained under NT. In addition, only words trained with COEN continued to show significant improvements compared with baseline performance when assessed 6 weeks after practice ceased. It was concluded that therapies based on conceptual enrichment of the semantic network facilitate relearning of words and enhance generalisation in patients with SD.

  7. Generalised Predictive Control of a 12-bus Network Using Neutral-Point Clamped Voltage Source Converter UPFC

    OpenAIRE

    Zhang, L; Kokkinakis, M; Chong, BVP

    2014-01-01

    The paper presents the application of a UPFC to a case study of a 12-bus high power network. The UPFC shunt converter employs eight 3-level Neutral Point Clamped (NPC) voltage source converters (VSC) and 12 single-phase three-winding phase shifting transformers (PST), generating a 48-pulse output voltage. The 3-phase H-bridge series converter shares the same dc-link with the shunt one. The novel feature of this work lies in the use of a model-based generalised predictive current control law to th...

  8. Generalised Filtering

    Directory of Open Access Journals (Sweden)

    Karl Friston

    2010-01-01

    We describe a Bayesian filtering scheme for nonlinear state-space models in continuous time. This scheme is called Generalised Filtering and furnishes posterior (conditional) densities on hidden states and unknown parameters generating observed data. Crucially, the scheme operates online, assimilating data to optimise the conditional density on time-varying states and time-invariant parameters. In contrast to Kalman and particle smoothing, Generalised Filtering does not require a backwards pass. In contrast to variational schemes, it does not assume conditional independence between the states and parameters. Generalised Filtering optimises the conditional density with respect to a free-energy bound on the model's log-evidence. This optimisation uses the generalised motion of hidden states and parameters, under the prior assumption that the motion of the parameters is small. We describe the scheme, present comparative evaluations with a fixed-form variational version, and conclude with an illustrative application to a nonlinear state-space model of brain imaging time-series.
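    The idea of tracking states via their generalised motion while descending a free-energy bound can be illustrated with a deliberately crude toy filter. This is not Friston's full scheme: the linear model, single generalised order, fixed precisions and fixed step size are all assumptions made for the demo.

```python
import numpy as np

# Toy sketch (assumptions throughout, not the full Generalised Filtering
# scheme): estimate a hidden state and its motion by gradient descent on
# precision-weighted prediction errors, i.e. a simple free-energy bound.
rng = np.random.default_rng(0)
a, dt, steps = 0.5, 0.01, 2000
sig_w, sig_v = 0.05, 0.1            # process and observation noise levels

# simulate the hidden state dx/dt = -a x + w and noisy data y = x + v
x, xs, ys = 1.0, [], []
for _ in range(steps):
    x += -a * x * dt + np.sqrt(dt) * sig_w * rng.standard_normal()
    xs.append(x)
    ys.append(x + sig_v * rng.standard_normal())

pi_v, pi_w = 1 / sig_v**2, 1 / sig_w**2   # precisions
mu, mud = 0.0, 0.0                        # generalised mean: value and motion
k = 0.1                                   # gradient step size (assumed)
est = []
for y in ys:
    e_v = y - mu            # sensory prediction error
    e_w = mud + a * mu      # dynamical prediction error
    # mean-flow: generalised motion minus the free-energy gradient
    mu  += dt * (mud + k * (pi_v * e_v - pi_w * a * e_w))
    mud += dt * (-k * pi_w * e_w)
    est.append(mu)
```

    The update runs strictly online (one pass over the data, no backwards sweep), which is the property the abstract contrasts with Kalman and particle smoothing.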

  9. Short-term memory treatment: patterns of learning and generalisation to sentence comprehension in a person with aphasia.

    Science.gov (United States)

    Salis, Christos

    2012-01-01

    Auditory-verbal short-term memory deficits (STM) are prevalent in aphasia and can contribute to sentence comprehension deficits. This study investigated the effectiveness of a novel STM treatment in improving STM (measured with span tasks) and sentence comprehension (measured with the Token Test and the Test for the Reception of Grammar, TROG) in a person with severe aphasia (transcortical motor). In particular, the research questions were: (1) Would STM training improve STM? (2) Would improvements from the STM training generalise to improvements in comprehension of sentences? STM was trained using listening span tasks of serial word recognition. No other language or sentence comprehension skills were trained. Following treatment, STM abilities improved (listening span, forward digit span). There was also evidence of generalisation to untreated sentence comprehension (only on the TROG). Backward digit span, phonological processing and single word comprehension did not improve. Improvements in sentence comprehension may have resulted from resilience to rapid decay of linguistic representations within sentences (words and phrases). This in turn facilitated comprehension.

  10. Generalised teleparallel quintom dark energy non-minimally coupled with the scalar torsion and a boundary term

    Science.gov (United States)

    Bahamonde, Sebastian; Marciu, Mihai; Rudra, Prabir

    2018-04-01

    Within this work, we propose a new generalised quintom dark energy model in the teleparallel alternative of general relativity theory, by considering a non-minimal coupling between the scalar fields of a quintom model and the scalar torsion component T and the boundary term B. In this theory, the boundary term represents the divergence of the torsion vector, B = 2∇_μ T^μ, and is related to the Ricci scalar R and the torsion scalar T by the fundamental relation R = -T + B. We have investigated the dynamical properties of the present quintom scenario by performing a dynamical system analysis in the case of decomposable exponential potentials. The study analysed the structure of the phase space, revealing the fundamental dynamical effects of the scalar torsion and boundary couplings in the case of a more general quintom scenario. Additionally, a numerical approach to the model is presented to analyse the cosmological evolution of the system.
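    For reference, the geometric identity quoted in the abstract can be displayed explicitly. The last term is only a schematic of a non-minimal scalar-torsion-boundary coupling with unspecified coupling functions f and g, not the paper's exact action:

```latex
R = -T + B, \qquad
B = 2\nabla_\mu T^\mu = \frac{2}{e}\,\partial_\mu\!\left(e\,T^\mu\right), \qquad
\mathcal{L}_{\text{int}} \sim f(\phi,\sigma)\,T + g(\phi,\sigma)\,B .
```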

  11. Generalised anxiety disorder

    OpenAIRE

    Gale, Christopher K; Millichamp, Jane

    2011-01-01

    Generalised anxiety disorder is characterised by persistent, excessive and difficult-to-control worry, which may be accompanied by several psychic and somatic symptoms, including suicidality. Generalised anxiety disorder is the most common psychiatric disorder in primary care, although it is often under-recognised and undertreated. It is typically a chronic condition with low short- and medium-term remission rates. Clinical presentations often include depression, ...

  12. Procedures: Source Term Measurement Program

    International Nuclear Information System (INIS)

    Dyer, N.C.; Keller, J.H.; Nieschmidt, E.B.; Motes, B.J.

    1977-10-01

    The report contains procedures for the Source Term Measurement Project being performed by the Idaho National Engineering Laboratory for the Nuclear Regulatory Commission. This work is being conducted for the Office of Nuclear Regulatory Research in support of requirements of the Effluent Treatment Systems Branch of the Office of Nuclear Reactor Regulation. This project is designed to obtain source term information at operating light water reactors to update the parameters used in NRC calculational models (GALE codes). Detailed procedures and methods used for collection and analysis of samples are presented. This provides a reference base to supplement a series of reports to be issued by the Source Term Measurement Project which will present data obtained from measurements in specific nuclear power stations. Reference to appropriate parts of these procedures will be made as required.

  13. HTGR Mechanistic Source Terms White Paper

    Energy Technology Data Exchange (ETDEWEB)

    Wayne Moe

    2010-07-01

    The primary purposes of this white paper are: (1) to describe the proposed approach for developing event specific mechanistic source terms for HTGR design and licensing, (2) to describe the technology development programs required to validate the design methods used to predict these mechanistic source terms and (3) to obtain agreement from the NRC that, subject to appropriate validation through the technology development program, the approach for developing event specific mechanistic source terms is acceptable

  14. Generalised twisted partition functions

    CERN Document Server

    Petkova, V B

    2001-01-01

    We consider the set of partition functions that result from the insertion of twist operators compatible with conformal invariance in a given 2D Conformal Field Theory (CFT). A consistency equation, which gives a classification of twists, is written and solved in particular cases. This generalises old results on twisted torus boundary conditions, gives a physical interpretation of Ocneanu's algebraic construction, and might offer a new route to the study of properties of CFT.

  15. A simplified calculation approach for source term

    International Nuclear Information System (INIS)

    Zhiyi Hu; Ning Zhan; Hanming Xu

    1994-01-01

    A simplified method is developed to estimate the radioactive source term released to the environment under severe accidents at nuclear power plants. In the method, the progression of selected accident sequences is divided into several stages, and the release characteristics of each stage are determined by a detailed source term code package calculation. Source terms for other accident sequences can then be estimated by combining the calculated results. The simplified approach has been tentatively applied to nuclear power plants in China and the results are satisfactory. 2 refs., 3 tabs
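    The combination idea described in this abstract can be sketched as follows. The stage names and release fractions below are illustrative placeholders, not values from the paper: stage-wise fractions are computed once with a detailed code, then reassembled for other sequences.

```python
# Hedged sketch of stage-wise source term combination (illustrative
# placeholder values, not the paper's data): each stage releases a
# fraction of the inventory remaining after the previous stages.
detailed_stages = {
    "gap_release": 0.02,     # fraction released in the gap-release stage
    "in_vessel": 0.15,       # in-vessel core degradation stage
    "ex_vessel": 0.05,       # ex-vessel (core-concrete) stage
    "late_in_vessel": 0.01,  # late revaporisation stage
}

def sequence_release(stages, inventory_bq):
    """Combine pre-computed stage release fractions for one sequence."""
    remaining = 1.0
    released = 0.0
    for s in stages:
        f = detailed_stages[s]
        released += remaining * f    # release a fraction of what remains
        remaining *= (1.0 - f)       # deplete the remaining inventory
    return released * inventory_bq

# e.g. a sequence without an ex-vessel phase, for a 1e18 Bq inventory
release = sequence_release(["gap_release", "in_vessel", "late_in_vessel"], 1e18)
```

    The depletion bookkeeping guarantees the combined release can never exceed the starting inventory, whichever stages a sequence includes.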

  16. Coherence generalises duality

    DEFF Research Database (Denmark)

    Carbone, Marco; Lindley, Sam; Montesi, Fabrizio

    2016-01-01

    Wadler introduced Classical Processes (CP), a calculus based on a propositions-as-types correspondence between propositions of classical linear logic and session types. Carbone et al. introduced Multiparty Classical Processes (MCP), a calculus that generalises CP to multiparty session types, by replacing the duality of classical linear logic (relating two types) with a more general notion of coherence (relating an arbitrary number of types). This paper introduces variants of CP and MCP, plus a new intermediate calculus of Globally-governed Classical Processes (GCP). We show a tight relation between ...

  17. Location of collinear equilibrium points in the generalised ...

    African Journals Online (AJOL)

    We have discussed the location of collinear equilibrium points in the generalised photogravitational restricted three body problem. The problem is generalised in the sense that both primaries are oblate spheroid. They are source of radiation as well. We have found the solution for the location of collinear point L1. We found ...

  18. Generalising the staircase models

    International Nuclear Information System (INIS)

    Dorey, P.; Ravanini, F.

    1993-01-01

    Systems of integral equations are proposed which generalise those previously encountered in connection with the so-called staircase models. Under the assumption that these equations describe the finite-size effects of relativistic field theories via the thermodynamic Bethe ansatz, analytical and numerical evidence is given for the existence of a variety of new roaming renormalisation group trajectories. For each positive integer k and s = 0, ..., k-1, there is a one-parameter family of trajectories, passing close by the coset conformal field theories G^(k) x G^(nk+s) / G^((n+1)k+s) before finally flowing to a massive theory for s = 0, or to another coset model for s ≠ 0. (orig.)
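    For orientation, the generic form of thermodynamic Bethe ansatz integral equations and the associated finite-size scaling function is standard in the literature; the paper's contribution lies in its specific choices of kernels φ_ab and mass assignments, which are not reproduced here:

```latex
\varepsilon_a(\theta) = m_a R \cosh\theta
  - \sum_b \int \frac{d\theta'}{2\pi}\,\phi_{ab}(\theta-\theta')\,
    \ln\!\left(1 + e^{-\varepsilon_b(\theta')}\right),
\qquad
c_{\mathrm{eff}}(R) = \frac{3}{\pi^2} \sum_a m_a R
  \int d\theta\,\cosh\theta\,
  \ln\!\left(1 + e^{-\varepsilon_a(\theta)}\right).
```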

  19. Subsurface Shielding Source Term Specification Calculation

    International Nuclear Information System (INIS)

    S.Su

    2001-01-01

    The purpose of this calculation is to establish appropriate and defensible waste-package radiation source terms for use in repository subsurface shielding design. This calculation supports the shielding design for the waste emplacement and retrieval system, and subsurface facility system. The objective is to identify the limiting waste package and specify its associated source terms including source strengths and energy spectra. Consistent with the Technical Work Plan for Subsurface Design Section FY 01 Work Activities (CRWMS M and O 2001, p. 15), the scope of work includes the following: (1) Review source terms generated by the Waste Package Department (WPD) for various waste forms and waste package types, and compile them for shielding-specific applications. (2) Determine acceptable waste package specific source terms for use in subsurface shielding design, using a reasonable and defensible methodology that is not unduly conservative. This calculation is associated with the engineering and design activity for the waste emplacement and retrieval system, and subsurface facility system. The technical work plan for this calculation is provided in CRWMS M and O 2001. Development and performance of this calculation conforms to the procedure, AP-3.12Q, Calculations

  20. BWR Source Term Generation and Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    J.C. Ryman

    2003-07-31

    This calculation is a revision of a previous calculation (Ref. 7.5) that bears the same title and has the document identifier BBAC00000-01717-0210-00006 REV 01. The purpose of this revision is to remove TBV (to-be-verified)-4110 associated with the output files of the previous version (Ref. 7.30). The purpose of this and the previous calculation is to generate source terms for a representative boiling water reactor (BWR) spent nuclear fuel (SNF) assembly for the first one million years after the SNF is discharged from the reactors. This calculation includes an examination of several ways to represent BWR assemblies and operating conditions in SAS2H in order to quantify the effects these representations may have on source terms. These source terms provide information characterizing the neutron and gamma spectra in particles per second, the decay heat in watts, and radionuclide inventories in curies. Source terms are generated for a range of burnups and enrichments (see Table 2) that are representative of the waste stream and stainless steel (SS) clad assemblies. During this revision, it was determined that the burnups used for the computer runs of the previous revision were actually about 1.7% less than the stated, or nominal, burnups. See Section 6.6 for a discussion of how to account for this effect before using any source terms from this calculation. The source term due to the activation of corrosion products deposited on the surfaces of the assembly from the coolant is also calculated. The results of this calculation support many areas of the Monitored Geologic Repository (MGR), which include thermal evaluation, radiation dose determination, radiological safety analyses, surface and subsurface facility designs, and total system performance assessment. This includes MGR items classified as Quality Level 1, for example, the Uncanistered Spent Nuclear Fuel Disposal Container (Ref. 7.27, page 7). Therefore, this calculation is subject to the requirements of the

  1. Source term estimation for small sized HTRs

    International Nuclear Information System (INIS)

    Moormann, R.

    1992-08-01

    Accidents which have to be considered are core heat-up, reactivity transients, water or air ingress, and primary circuit depressurization. The main effort of this paper concerns water/air ingress and depressurization, which requires consideration of fission product plateout under normal operating conditions; for the latter it is clearly shown that absorption (penetration) mechanisms are much less important than has sometimes been assumed in the past. Source term estimation procedures for core heat-up events are briefly reviewed; reactivity transients are apparently covered by them. Besides a general literature survey, including identification of areas with insufficient knowledge, this paper contains some estimates of the thermomechanical behaviour of fission products in water or air ingress accidents. Typical source term examples are also presented. In an appendix, evaluations of the AVR experiments VAMPYR-I and -II with respect to plateout and fission product filter efficiency are outlined and used for a validation step of the new plateout code SPATRA. (orig.)

  2. Reevaluation of HFIR source term: Supplement 2

    International Nuclear Information System (INIS)

    Thomas, W.E.

    1986-11-01

    The HFIR source term has been reevaluated to assess the impact of the increase in core lifetime from 15 to 24 days. Calculations were made to determine the nuclide activities of the iodines, noble gases, and other fission products. The results show that there is no significant change in off-site dose due to the increased fuel cycle for the release scenario postulated in ORNL-3573

  3. Open quantum generalisation of Hopfield neural networks

    Science.gov (United States)

    Rotondo, P.; Marcuzzi, M.; Garrahan, J. P.; Lesanovsky, I.; Müller, M.

    2018-03-01

    We propose a new framework to understand how quantum effects may impact on the dynamics of neural networks. We implement the dynamics of neural networks in terms of Markovian open quantum systems, which allows us to treat thermal and quantum coherent effects on the same footing. In particular, we propose an open quantum generalisation of the Hopfield neural network, the simplest toy model of associative memory. We determine its phase diagram and show that quantum fluctuations give rise to a qualitatively new non-equilibrium phase. This novel phase is characterised by limit cycles corresponding to high-dimensional stationary manifolds that may be regarded as a generalisation of storage patterns to the quantum domain.
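    The Markovian open-quantum-system machinery this abstract builds on can be illustrated on the smallest possible case. This is a hypothetical minimal example, not the paper's Hopfield model: one driven, damped qubit evolved under a Lindblad master equation, with arbitrary rate choices.

```python
import numpy as np

# Minimal Lindblad example (illustrative, not the open quantum Hopfield
# network): Euler integration of d(rho)/dt = -i[H,rho] + L rho L+ - {L+L,rho}/2
# for one qubit. Basis: index 0 = ground, 1 = excited; rates are arbitrary.
sx = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli-x drive
sm = np.array([[0, 1], [0, 0]], dtype=complex)   # lowering operator |g><e|
H = 0.5 * sx                       # coherent driving Hamiltonian
L = np.sqrt(0.3) * sm              # amplitude-damping jump operator

def lindblad_rhs(rho):
    """Right-hand side of the Lindblad master equation."""
    Ld = L.conj().T
    return (-1j * (H @ rho - rho @ H)
            + L @ rho @ Ld
            - 0.5 * (Ld @ L @ rho + rho @ Ld @ L))

rho = np.array([[0, 0], [0, 1]], dtype=complex)  # start fully excited
dt = 0.001
for _ in range(20_000):                          # integrate to t = 20
    rho = rho + dt * lindblad_rhs(rho)

# the dissipator preserves the trace while driving and damping compete,
# so the qubit relaxes to a mixed (non-equilibrium) steady state
```

    The competition between the coherent term and the dissipator is what produces non-trivial stationary states; the paper's point is that in the many-spin Hopfield setting this competition opens a qualitatively new phase.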

  4. Source terms in relation to air cleaning

    International Nuclear Information System (INIS)

    Bernero, R.M.

    1985-01-01

    There are two sets of source terms for consideration in air cleaning: those for routine releases and those for accident releases. With about 1000 reactor-years of commercial operating experience accumulated in the US, there is an excellent data base for routine and expected transient releases. Specifications for air cleaning can be based on this body of experience with confidence. Specifications for air cleaning in accident situations are another matter. Recent investigations of severe accident behavior are offering a new basis for source terms and air cleaning specifications. Reports by many experts in the field describe an accident environment notably different from previous models. It is an atmosphere heavy with aerosols, both radioactive and inert. Temperatures are sometimes very high; radioiodine is typically in the form of cesium iodide aerosol particles; other nuclides, such as tellurium, are also important aerosols. Some of the present air cleaning requirements may be very important in light of these new accident behavior models. Others may be wasteful or even counterproductive. The use of the new data on accident behavior models to reevaluate requirements promptly is discussed.

  5. The Generalised Phase Contrast Method

    DEFF Research Database (Denmark)

    Glückstad, Jesper

    An analytic framework and a complete description for the design and optimisation of on-axis centred spatially filtering common path systems are presented. The Generalised Phase Contrast method is derived and introduced as the common denominator for these systems, basically extending Zernike's original phase contrast scheme into a much wider range of operation and application. It is demonstrated that the Generalised Phase Contrast method can be successfully applied to the interpretation and subsequent optimisation of a number of different, commonly applied spatially filtering architectures ... designs and parameter settings. Finally, a number of original applications facilitated by the parallel light-beam encoding of the Generalised Phase Contrast method are briefly outlined. These include, among others, wavefront sensing and generation, advanced user-controlled optical micro...

  6. Long-term ecosystem level experiment at Toolik Lake, Alaska, and at Abisko, Northern Sweden: generalisations and differences in ecosystem and plant type responses to global change.

    NARCIS (Netherlands)

    van Wijk, M.T.; Clemmensen, K.E.; Shaver, G.R.; Williams, M.; Callaghan, T.V.; Chapin III, F.S.; Cornelissen, J.H.C.; Gough, L.; Hobbie, S.E.; Jonasson, S.; Lee, J.A.; Michelsen, A.; Press, M.C.; Richardson, S.J.; Rueth, H.

    2003-01-01

    Long-term ecosystem-level experiments, in which the environment is manipulated in a controlled manner, are important tools to predict the responses of ecosystem functioning and composition to future global change. We present the results of a meta-analysis performed on the results of long-term

  7. Influence of Chemistry on source term assessment

    International Nuclear Information System (INIS)

    Herranz Puebla, L.E.; Lopez Diez, I.; Rodriguez Maroto, J.J.; Martinez Lopez-Alcorocho, A.

    1991-01-01

    The major goal of a phenomenological analysis of containment during a severe accident can be split into the following: to know the containment response to the different loads and to predict accurately the fission product and aerosol behavior. In this report, the main results from the study of a hypothetical accident scenario, based on the LA-4 experiment of the LACE project, are presented. To this end, several codes have been coupled: CONTEMPT4/MOD5 (thermal hydraulics), NAUA/MOD5 (aerosol physics) and IODE (iodine chemistry). It has been demonstrated that the Source Term cannot be assessed with confidence if the chemical behavior of some radionuclides is not taken into account. In particular, the influence on the iodine retention efficiency of the sump of variables such as pH has been proven. (Author). 12 refs

  8. Generalised compositionality in graph transformation

    NARCIS (Netherlands)

    Ghamarian, A.H.; Rensink, Arend; Ehrig, H; Engels, G.; Kreowski, H.J.; Rozenberg, G.

    We present a notion of composition applying both to graphs and to rules, based on graph and rule interfaces along which they are glued. The current paper generalises a previous result in two different ways. Firstly, rules do not have to form pullbacks with their interfaces; this enables graph

  9. Dyads, a generalisation of monads

    NARCIS (Netherlands)

    Fokkinga, M.M.

    The concept of dyad is defined as the least common generalisation of monads and co-monads. So, taking some of the ingredients to be the identity, the concept specialises to the concept of monad, and taking other ingredients to be the identity it specialises to co-monads. Except for one axiom, all

  10. Source Term Model for Fine Particle Resuspension from Indoor Surfaces

    National Research Council Canada - National Science Library

    Kim, Yoojeong; Gidwani, Ashok; Sippola, Mark; Sohn, Chang W

    2008-01-01

    This Phase I effort developed a source term model for particle resuspension from indoor surfaces to be used as a source term boundary condition for CFD simulation of particle transport and dispersion in a building...

  11. Source term modelling parameters for Project-90

    International Nuclear Information System (INIS)

    Shaw, W.; Smith, G.; Worgan, K.; Hodgkinson, D.; Andersson, K.

    1992-04-01

    This document summarises the input parameters for the source term modelling within Project-90. In the first place, the parameters relate to the CALIBRE near-field code which was developed for the Swedish Nuclear Power Inspectorate's (SKI) Project-90 reference repository safety assessment exercise. An attempt has been made to give best estimate values and, where appropriate, a range which is related to variations around base cases. It should be noted that the data sets contain amendments to those considered by KBS-3. In particular, a completely new set of inventory data has been incorporated. The information given here does not constitute a complete set of parameter values for all parts of the CALIBRE code. Rather, it gives the key parameter values which are used in the constituent models within CALIBRE and the associated studies. For example, the inventory data acts as an input to the calculation of the oxidant production rates, which influence the generation of a redox front. The same data is also an initial value data set for the radionuclide migration component of CALIBRE. Similarly, the geometrical parameters of the near-field are common to both sub-models. The principal common parameters are gathered here for ease of reference and avoidance of unnecessary duplication and transcription errors. (au)

  12. Primary small bowel anastomosis in generalised peritonitis

    NARCIS (Netherlands)

    deGraaf, JS; van Goor, Harry; Bleichrodt, RP

    Objective: To find out if primary small bowel anastomosis is safe in patients with generalised peritonitis who are treated by planned relaparotomies. Design: Retrospective study. Setting: University hospital, The Netherlands. Subjects: 10 patients with generalised purulent peritonitis

  13. Generalised hypercementosis: a case report.

    Science.gov (United States)

    Seed, Rachel; Nixon, Paul P

    2004-10-01

    The following case report describes the clinical and radiographic presentation of a female who attended a general dental practice as a new patient. The patient was diagnosed with generalised hypercementosis, possibly attributable to oral neglect. Hypercementosis is associated with a number of aetiological factors, which may be local or systemic in nature. It is important that the general dental practitioner is aware of these factors and is able to distinguish presentation due to a local cause from that of a systemic disease process. The aims of this paper are to illustrate an unusual presentation of hypercementosis and to discuss the radiographic differentiation that led to diagnosis.

  14. Phase 1 immobilized low-activity waste operational source term

    International Nuclear Information System (INIS)

    Burbank, D.A.

    1998-01-01

    This report presents an engineering analysis of the Phase 1 privatization feeds to establish an operational source term for storage and disposal of immobilized low-activity waste packages at the Hanford Site. The source term information is needed to establish a preliminary estimate of the numbers of remote-handled and contact-handled waste packages. A discussion of the uncertainties and their impact on the source term and waste package distribution is also presented. It should be noted that this study is concerned with operational impacts only. Source terms used for accident scenarios would differ due to alpha and beta radiation which were not significant in this study

  15. Radiological and chemical source terms for Solid Waste Operations Complex

    International Nuclear Information System (INIS)

    Boothe, G.F.

    1994-01-01

    The purpose of this document is to describe the radiological and chemical source terms for the major projects of the Solid Waste Operations Complex (SWOC), including Project W-112, Project W-133 and Project W-100 (WRAP 2A). For purposes of this document, the term "source term" means the design basis inventory. All of the SWOC source terms involve the estimation of the radiological and chemical contents of various waste packages from different waste streams, and the inventories of these packages within facilities or within a scope of operations. The composition of some of the waste is not known precisely; consequently, conservative assumptions were made to ensure that the source term represents a bounding case (i.e., it is expected that the source term would not be exceeded). As better information is obtained on the radiological and chemical contents of waste packages and more accurate facility specific models are developed, this document should be revised as appropriate. Radiological source terms are needed to perform shielding and external dose calculations, to estimate routine airborne releases, to perform release calculations and dose estimates for safety documentation, to calculate the maximum possible fire loss and specific source terms for individual fire areas, etc. Chemical source terms (i.e., inventories of combustible, flammable, explosive or hazardous chemicals) are used to determine combustible loading, fire protection requirements, personnel exposures to hazardous chemicals from routine and accident conditions, and a wide variety of other safety and environmental requirements.

  16. Threshold corrections, generalised prepotentials and Eichler integrals

    CERN Document Server

    Angelantonj, Carlo; Pioline, Boris

    2015-06-12

    We continue our study of one-loop integrals associated to BPS-saturated amplitudes in $\mathcal{N}=2$ heterotic vacua. We compute their large-volume behaviour, and express them as Fourier series in the complexified volume, with Fourier coefficients given in terms of Niebur-Poincaré series in the complex structure modulus. The closure of Niebur-Poincaré series under modular derivatives implies that such integrals derive from holomorphic prepotentials $f_n$, generalising the familiar prepotential of $\mathcal{N}=2$ supergravity. These holomorphic prepotentials transform anomalously under T-duality, in a way characteristic of Eichler integrals. We use this observation to compute their quantum monodromies under the duality group. We extend the analysis to modular integrals with respect to Hecke congruence subgroups, which naturally arise in compactifications on non-factorisable tori and freely-acting orbifolds. In this case, we derive new explicit results including closed-form expressions for integrals involv...

  17. Generalised shot noise Cox processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

    We introduce a new class of Cox cluster processes called generalised shot noise Cox processes (GSNCPs), which extends the definition of shot noise Cox processes (SNCPs) in two directions: the point process which drives the shot noise is not necessarily Poisson, and the kernel of the shot noise can be random. Thereby a very large class of models for aggregated or clustered point patterns is obtained. Due to the structure of GSNCPs, a number of useful results can be established. We focus first on deriving summary statistics for GSNCPs and next on how to simulate GSNCPs. In particular, results for first and second order moment measures, reduced Palm distributions, the J-function, simulation with or without edge effects, and conditional simulation of the intensity function driving a GSNCP are given. Our results are exemplified for special important cases of GSNCPs, and we discuss the relation...

  18. Generalised shot noise Cox processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

    2005-01-01

    We introduce a class of Cox cluster processes called generalised shot noise Cox processes (GSNCPs), which extends the definition of shot noise Cox processes (SNCPs) in two directions: the point process that drives the shot noise is not necessarily Poisson, and the kernel of the shot noise can be random. Thereby, a very large class of models for aggregated or clustered point patterns is obtained. Due to the structure of GSNCPs, a number of useful results can be established. We focus first on deriving summary statistics for GSNCPs and, second, on how to simulate such processes. In particular, results on first- and second-order moment measures, reduced Palm distributions, the J-function, simulation with or without edge effects, and conditional simulation of the intensity function driving a GSNCP are given. Our results are exemplified in important special cases of GSNCPs, and we discuss...
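    The two-directional generalisation described in the abstract can be conveyed with a small simulation sketch (Python; not from the paper, and all parameter values are illustrative assumptions): parents come from a non-Poisson driver, here a crude sequential hard-core thinning of a Poisson process, and each parent cluster gets its own random kernel bandwidth.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Parents: a non-Poisson driving process, obtained here by crude
    # sequential hard-core thinning of a Poisson process on [0, 1]^2.
    candidates = rng.uniform(size=(rng.poisson(30), 2))
    kept = []
    for p in candidates:
        if all(np.linalg.norm(p - q) >= 0.08 for q in kept):
            kept.append(p)

    # Offspring: each parent fires a "shot" with its own random Gaussian
    # kernel bandwidth, giving a generalised shot noise cluster pattern.
    offspring = []
    for p in kept:
        sigma = rng.uniform(0.01, 0.05)              # random kernel per parent
        pts = p + rng.normal(scale=sigma, size=(rng.poisson(5), 2))
        inside = pts[(pts >= 0).all(axis=1) & (pts <= 1).all(axis=1)]
        offspring.extend(inside)

    print(f"{len(kept)} parents, {len(offspring)} offspring points")
    ```

    With a Poisson driver and a fixed bandwidth this construction reduces to an ordinary (Thomas-type) shot noise Cox cluster process.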

  19. Young Australian Indigenous Students' Growing Pattern Generalisations: The Role of Gesture When Generalising

    Science.gov (United States)

    Miller, Jodie

    2014-01-01

    This paper explores how young Indigenous students (Year 2 and 3) generalise growing patterns. Piagetian clinical interviews were conducted to determine how students articulated growing pattern generalisations. Two case studies are presented displaying how students used gesture to support and articulate their generalisations of growing patterns.

  20. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Energy Technology Data Exchange (ETDEWEB)

    Pete Lowry

    2012-10-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  3. Wagner’s theory of generalised heaps

    CERN Document Server

    Hollings, Christopher D

    2017-01-01

    The theories of V. V. Wagner (1908-1981) on abstractions of systems of binary relations are presented here within their historical and mathematical contexts. This book contains the first translation from Russian into English of a selection of Wagner’s papers, the ideas of which are connected to present-day mathematical research. Along with a translation of Wagner’s main work in this area, his 1953 paper ‘Theory of generalised heaps and generalised groups,’ the book also includes translations of three short precursor articles that provide additional context for his major work. Researchers and students interested in both algebra (in particular, heaps, semiheaps, generalised heaps, semigroups, and groups) and differential geometry will benefit from the techniques offered by these translations, owing to the natural connections between generalised heaps and generalised groups, and the role played by these concepts in differential geometry. This book gives examples from present-day mathematics where ideas r...

  4. The latest results from source term research. Overview and outlook

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, Luis E. [Centro de Investigaciones Energeticas Medio Ambientales y Tecnologica (CIEMAT), Madrid (Spain); Haste, Tim [Centre d' Etudes de Cadarache, Paul-Lez-Durance (France). Institut de Radioprotection et de Surete Nucleaire (IRSN); Kaerkelae, Teemu [VTT Technical Research Centre of Finland Ltd, Espoo (Finland)

    2016-12-15

    Source term research has continued internationally for more than 30 years, increasing confidence in calculations of the potential radioactive release to the environment after a severe reactor accident. Important experimental data have been obtained, mainly under international frameworks such as OECD/NEA and EURATOM. Specifically, Phebus FP provides major insights into fission product release and transport. Results are included in severe accident analysis codes. Data from international projects are being interpreted with a view to further improvements in these codes. This paper synthesizes the recent main outcomes from source term research on these topics, and on source term mitigation. It highlights knowledge gaps remaining and discusses ways to proceed. Aside from this further knowledge-driven research, there is consensus on the need to assess the source term predictive ability of current system codes, taking account of scale-up from experiment to reactor conditions.

  5. Revised accident source terms for light-water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Soffer, L. [Nuclear Regulatory Commission, Washington, DC (United States)

    1995-02-01

    This paper presents revised accident source terms for light-water reactors incorporating the severe accident research insights gained in this area over the last 15 years. Current LWR accident source terms used for licensing date from 1962 and are contained in Regulatory Guides 1.3 and 1.4. These specify that 100% of the core inventory of noble gases and 25% of the iodine fission products are assumed to be instantaneously available for release from the containment. The chemical form of the iodine fission products is also assumed to be predominantly elemental iodine. These assumptions have strongly affected present nuclear air cleaning requirements by emphasizing rapid actuation of spray systems and filtration systems optimized to retain elemental iodine. A proposed revision of reactor accident source terms and some implications for nuclear air cleaning requirements was presented at the 22nd DOE/NRC Nuclear Air Cleaning Conference. A draft report was issued by the NRC for comment in July 1992. Extensive comments were received, with the most significant comments involving (a) release fractions for both volatile and non-volatile species in the early in-vessel release phase, (b) gap release fractions of the noble gases, iodine and cesium, and (c) the timing and duration of the release phases. The final source term report is expected to be issued in late 1994. Although the revised source terms are intended primarily for future plants, current nuclear power plants may request use of revised accident source term insights in licensing as well. This paper emphasizes additional information obtained since the 22nd Conference, including studies on fission product removal mechanisms, results obtained from improved severe accident code calculations, resolution of major comments, and their impact upon the revised accident source terms. Revised accident source terms for both BWRs and PWRs are presented.

  6. Private Placements as Sources of Long Term Funds for publicly ...

    African Journals Online (AJOL)

    Private Placements as Sources of Long Term Funds for publicly quoted firms in the Nigerian Capital Market. ... AFRREV IJAH: An International Journal of Arts and Humanities ... Abstract. Private placements are gradually becoming a means of raising long-term funds in the Nigerian capital market by publicly quoted companies.

  7. Bayesian source term determination with unknown covariance of measurements

    Science.gov (United States)

    Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav

    2017-04-01

    Determination of the source term of a release of hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimating the source term in the conventional linear inverse problem y = Mx, where the relationship between the vector of observations y and the unknown source term x is described by the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as the optimization problem min_x (y - Mx)^T R^(-1) (y - Mx) + x^T B^(-1) x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization takes B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix R of the likelihood is also unknown. We consider two potential choices for the structure of R: the first is a diagonal matrix, and the second is a locally correlated structure using information on the topology of the measuring network. Since inference in the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated by applying the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
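    The Tikhonov special case mentioned in the abstract (R the identity, B the identity scaled by 1/λ) admits a closed-form minimiser, which the following Python sketch illustrates. The SRS matrix, noise level and release vector here are invented toy values, not ETEX data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy problem (illustrative only): m sensors, n source-term elements.
    m, n = 40, 10
    M = rng.lognormal(sigma=1.0, size=(m, n))        # stand-in SRS matrix
    x_true = np.zeros(n)
    x_true[3] = 5.0                                  # a single release element
    y = M @ x_true + rng.normal(scale=0.1, size=m)   # noisy observations

    # With R = I and B = I/lam, the objective
    #   (y - Mx)^T R^(-1) (y - Mx) + x^T B^(-1) x
    # becomes ||y - Mx||^2 + lam ||x||^2, minimised in closed form:
    lam = 1e-2
    x_hat = np.linalg.solve(M.T @ M + lam * np.eye(n), M.T @ y)

    print(np.round(x_hat, 2))
    ```

    The estimate concentrates on the true release element; the Bayesian treatment in the abstract goes further by inferring R and B themselves rather than fixing them.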

  8. Utility view of the source term and air cleaning

    International Nuclear Information System (INIS)

    Littlefield, P.S.

    1985-01-01

    The utility view of the source term and air cleaning is discussed. The source term is made up of: (1) the noble gases, which there has been a tendency to ignore in the past because it was thought nothing could be done with them anyway; (2) the halogens, which have been dealt with at past Air Cleaning Conferences in terms of charcoal and other removal systems; and (3) the solid components of the source term, which particulate filters are designed to handle. Air cleaning systems consist of filters, adsorbers, containment sprays, suppression pools in boiling water reactors, and ice beds in ice condenser-equipped plants. The feasibility and cost of air cleaning systems are discussed.

  9. ASOURCE: Source Term Analysis Tool for Advanced Fuel Cycle

    International Nuclear Information System (INIS)

    Cho, Dong Keun; Kook, Dong Hak; Choi, Jong Won; Choi, Heui Joo; Jeong, Jong Tae

    2012-01-01

    In 2007, the 3rd Comprehensive Nuclear Energy Promotion Plan, passed at the 254th meeting of the Atomic Energy Commission, was announced as an R and D action plan for the development of an advanced fuel cycle adopting a sodium-cooled fast reactor (SFR) in connection with a pyroprocess for a sustainable stable energy supply and a reduction in the amount of spent fuel (SF). It is expected that this fuel cycle can greatly reduce the SF inventory through a recycling process in which transuranics (TRU) and long-lived nuclides are burned in the SFR and cesium and strontium are disposed of after sufficient interim storage. For the success of the R and D plan, there are several issues related to source term analysis: (a) generation of inflow and outflow source terms of mixed SF in each process for the design of the pyroprocess facility, (b) source terms of mixed radwaste in a canister for the design of storage and disposal systems, (c) overall inventory estimation for TRU and long-lived nuclides for the design of the SFR, and (d) best estimate source terms for the practical design of the interim storage facility for SFs. A source term evaluation for an SF or radwaste with a single irradiation profile can be easily accomplished with a conventional computation tool. However, source term assessment for a batch of SFs, or a mixture of radwastes generated from SFs with different irradiation profiles, a task that is essential to support the aforementioned activities, is not possible with the conventional tool. Therefore, a hybrid computing program for source term analysis to support the advanced fuel cycle was developed.

  10. Selection of models to calculate the LLW source term

    International Nuclear Information System (INIS)

    Sullivan, T.M.

    1991-10-01

    Performance assessment of a LLW disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). In turn, many of these physical processes are influenced by the design of the disposal facility (e.g., infiltration of water). The complexity of the problem and the absence of appropriate data prevent development of an entirely mechanistic representation of radionuclide release from a disposal facility. Typically, a number of assumptions, based on knowledge of the disposal system, are used to simplify the problem. This document provides a brief overview of disposal practices and reviews existing source term models as background for selecting appropriate models for estimating the source term. The selection rationale and the mathematical details of the models are presented. Finally, guidance is presented for combining the inventory data with appropriate mechanisms describing release from the disposal facility. 44 refs., 6 figs., 1 tab

  11. GENERALISATION OF SUBMARINE FEATURES ON NAUTICAL CHARTS

    Directory of Open Access Journals (Sweden)

    E. Guilbert

    2012-07-01

    On most large scale and middle scale maps, relief is represented by contours and spot heights. In order to adapt the representation to the scale, the terrain is generalised either by smoothing or filtering the terrain model or by simplifying the contours. However this approach is not applicable to nautical chart construction where terrain features are selected according to their importance for navigation. This paper presents an approach for the consideration of feature attributes in the generalisation of a set of contours with respect to nautical chart constraints. Features are defined by sets of contours and a set of generalisation operators applied to features is presented. The definitions are introduced in a multi-agent system in order to perform automatic generalisation of a contour set. Results are discussed on a case study and directions for future work are presented.

  12. Actinide Source Term Program, position paper. Revision 1

    International Nuclear Information System (INIS)

    Novak, C.F.; Papenguth, H.W.; Crafts, C.C.; Dhooge, N.J.

    1994-01-01

    The Actinide Source Term represents the quantity of actinides that could be mobilized within WIPP brines and could migrate with the brines away from the disposal room vicinity. This document presents the various proposed methods for estimating this source term, with a particular focus on defining these methods and evaluating the defensibility of the models for mobile actinide concentrations. The conclusions reached in this document are: the 92 PA "expert panel" model for mobile actinide concentrations is not defensible; and, although it is extremely conservative, the "inventory limits" model is the only existing defensible model for the actinide source term. The model effort in progress, "chemical modeling of mobile actinide concentrations", supported by a laboratory effort that is also in progress, is designed to provide a reasonable description of the system and be scientifically realistic and supplant the "inventory limits" model.

  13. Cloverleaf skull with generalised bone dysplasia

    Energy Technology Data Exchange (ETDEWEB)

    Kozlowski, K.; Warren, P.S.; Fisher, C.C.

    1985-09-01

    A case of cloverleaf skull with generalised bone dysplasia is reported. The authors believe that the bone dysplasia associated with cloverleaf skull is identical neither with thanatophoric dysplasia nor with achondroplasia. Until the identity of thanatophoric dysplasia and cloverleaf skull with generalised bone dysplasia is proved, the diseases should be looked upon as separate entities and the wording "thanatophoric dysplasia with cloverleaf skull" should be abolished.

  14. Directional Unfolded Source Term (DUST) for Compton Cameras.

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Dean J.; Horne, Steven M.; O'Brien, Sean; Thoreson, Gregory G.

    2018-03-01

    A Directional Unfolded Source Term (DUST) algorithm was developed to enable improved spectral analysis capabilities using data collected by Compton cameras. Achieving this objective required modification of the detector response function in the Gamma Detector Response and Analysis Software (GADRAS). Experimental data that were collected in support of this work include measurements of calibration sources at a range of separation distances and cylindrical depleted uranium castings.

  15. A Bayesian Algorithm for Assessing Uncertainty in Radionuclide Source Terms

    Science.gov (United States)

    Robins, Peter

    2015-04-01

    Inferring source term parameters for a radionuclide release is difficult, due to the large uncertainties in forward dispersion modelling as a consequence of imperfect knowledge pertaining to wind vector fields and turbulent diffusion in the Earth's atmosphere. Additional sources of error include the radionuclide measurements obtained from sensors. These measurements may either be subject to random fluctuations or are simple indications that the true, unobserved quantity is below a detection limit. Consequent large reconstruction uncertainties can render a "best" estimate meaningless. A Markov Chain Monte Carlo (MCMC) Bayesian Algorithm is presented that attempts to account for uncertainties in atmospheric transport modelling and radionuclide sensor measurements to quantify uncertainties in radionuclide release source term parameters. Prior probability distributions are created for likely release locations at existing nuclear facilities and seismic events. Likelihood models are constructed using CTBTO adjoint modelling output and probability distributions of sensor response. Samples from the resulting multi-isotope source term parameters posterior probability distribution are generated that can be used to make probabilistic statements about the source term. Examples are given of marginal probability distributions obtained from simulated sensor data. The consequences of errors in numerical weather prediction wind fields are demonstrated with a reconstruction of the Fukushima nuclear reactor accident from International Monitoring System radionuclide particulate sensor data.
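    The flavour of an MCMC source-term reconstruction like the one described above can be conveyed with a deliberately simplified one-parameter sketch (Python). The sensitivities, noise level and prior here are invented; no CTBTO adjoint fields or IMS data are used.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical 1-D illustration: infer a release magnitude q from noisy
    # sensor readings y_i = a_i * q + noise, mirroring the adjoint setup.
    a = rng.uniform(0.5, 2.0, size=20)        # assumed adjoint sensitivities
    q_true = 3.0
    y = a * q_true + rng.normal(scale=0.2, size=a.size)

    def log_post(q):
        if q < 0:                             # prior: release is non-negative
            return -np.inf
        return -0.5 * np.sum((y - a * q) ** 2) / 0.2 ** 2

    # Random-walk Metropolis sampler over the posterior of q.
    samples, q = [], 1.0
    lp = log_post(q)
    for _ in range(5000):
        prop = q + rng.normal(scale=0.1)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            q, lp = prop, lp_prop
        samples.append(q)

    post = np.array(samples[1000:])           # discard burn-in
    print(post.mean(), post.std())
    ```

    The posterior samples support probabilistic statements about the release magnitude, which is the point the abstract makes for the full multi-isotope, multi-location problem.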

  16. Flowsheets and source terms for radioactive waste projections

    International Nuclear Information System (INIS)

    Forsberg, C.W.

    1985-03-01

    Flowsheets and source terms used to generate radioactive waste projections in the Integrated Data Base (IDB) Program are given. Volumes of each waste type generated per unit product throughput have been determined for the following facilities: uranium mining, UF6 conversion, uranium enrichment, fuel fabrication, boiling-water reactors (BWRs), pressurized-water reactors (PWRs), and fuel reprocessing. Source terms for DOE/defense wastes have been developed. Expected wastes from typical decommissioning operations for each facility type have been determined. All wastes are also characterized by isotopic composition at time of generation and by general chemical composition. 70 references, 21 figures, 53 tables
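    The projection arithmetic implied by "volumes per unit product throughput" is simply a product of throughput and a per-facility unit factor. A sketch with invented unit factors (placeholders, not actual IDB values) might look like:

    ```python
    # Waste projection: volume = throughput x waste generated per unit throughput.
    # All factors below are illustrative placeholders, not IDB values.
    unit_waste = {                 # m^3 of waste per tonne of product (assumed)
        "mining": 0.5,
        "UF6 conversion": 0.1,
        "enrichment": 0.02,
        "fuel fabrication": 0.05,
    }
    throughput_t = 1000.0          # tonnes of product per year (assumed)

    projection = {k: v * throughput_t for k, v in unit_waste.items()}
    total = sum(projection.values())
    print(projection, total)
    ```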

  17. Common Calibration Source for Monitoring Long-term Ozone Trends

    Science.gov (United States)

    Kowalewski, Matthew

    2004-01-01

    Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and the limited lifetimes of satellite monitoring instruments demand that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), Ozone Mapping Instrument (OMI), Global Ozone Monitoring Experiment (GOME) (ESA), Scanning Imaging SpectroMeter for Atmospheric ChartographY (SCIAMACHY) (ESA), and others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone datasets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.

  18. Threshold corrections, generalised prepotentials and Eichler integrals

    Directory of Open Access Journals (Sweden)

    Carlo Angelantonj

    2015-08-01

    We continue our study of one-loop integrals associated to BPS-saturated amplitudes in N=2 heterotic vacua. We compute their large-volume behaviour, and express them as Fourier series in the complexified volume, with Fourier coefficients given in terms of Niebur–Poincaré series in the complex structure modulus. The closure of Niebur–Poincaré series under modular derivatives implies that such integrals derive from holomorphic prepotentials f_n, generalising the familiar prepotential of N=2 supergravity. These holomorphic prepotentials transform anomalously under T-duality, in a way characteristic of Eichler integrals. We use this observation to compute their quantum monodromies under the duality group. We extend the analysis to modular integrals with respect to Hecke congruence subgroups, which naturally arise in compactifications on non-factorisable tori and freely-acting orbifolds. In this case, we derive new explicit results including closed-form expressions for integrals involving the Γ0(N) Hauptmodul, a full characterisation of holomorphic prepotentials including their quantum monodromies, as well as concrete formulæ for holomorphic Yukawa couplings.

  19. Effect Displays in R for Generalised Linear Models

    Directory of Open Access Journals (Sweden)

    John Fox

    2003-07-01

    This paper describes the implementation in R of a method for tabular or graphical display of terms in a complex generalised linear model. By complex, I mean a model that contains terms related by marginality or hierarchy, such as polynomial terms, or main effects and interactions. I call these tables or graphs effect displays. Effect displays are constructed by identifying high-order terms in a generalised linear model. Fitted values under the model are computed for each such term. The lower-order "relatives" of a high-order term (e.g., main effects marginal to an interaction) are absorbed into the term, allowing the predictors appearing in the high-order term to range over their values. The values of other predictors are fixed at typical values: for example, a covariate could be fixed at its mean or median, a factor at its proportional distribution in the data, or at equal proportions in its several levels. Variations of effect displays are also described, including representation of terms higher-order to any appearing in the model.
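    The computation the abstract describes, fitted values for a focal term with the other predictors held at typical values, can be sketched outside R as well. The following minimal Python version uses a logistic GLM with assumed (not fitted) coefficients and no interaction, purely to illustrate the mechanics.

    ```python
    import numpy as np

    # Assumed logistic-GLM coefficients: logit(p) = b0 + b1*x + b2*z.
    b0, b1, b2 = -1.0, 0.8, 0.5

    rng = np.random.default_rng(2)
    z = rng.normal(size=200)           # a covariate observed in the data

    # Effect display for x: let x range over a grid while z is fixed at a
    # "typical value" (its mean), then map to the response scale.
    x_grid = np.linspace(-3, 3, 7)
    eta = b0 + b1 * x_grid + b2 * z.mean()   # linear predictor
    p = 1.0 / (1.0 + np.exp(-eta))           # inverse link (logistic)

    for xv, pv in zip(x_grid, p):
        print(f"x={xv:+.1f}  fitted p={pv:.3f}")
    ```

    The R implementation additionally handles factors, marginality rules and interactions; this sketch shows only the fix-at-typical-values step on the simplest model.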

  20. Perspectives on source terms based on early research and development

    International Nuclear Information System (INIS)

    Pressesky, A.J.

    1985-07-01

    This report presents an overview of the key documentation of the research and development programs relevant to the source term issue which were undertaken by the Atomic Energy Commission between 1950 and 1970. The source term is taken to be the amount, composition (physical and chemical), and timing of the projected release of radioactivity to the environment in the hypothetical event of a severe reactor accident in a light water reactor of the type currently being licensed, built and operated. The objective is to illuminate and provide perspectives on (a) the maturity of the technical data base and the analytical methodology, (b) the extent to which remaining conservatisms can be applied to compensate for uncertainties, (c) the purpose for which the technology and methodology will be used, and (d) the need to keep problems and uncertainties in proper perspective. Comments that can provide some context for the difficult programmatic choices to be made are included, and technical considerations that may be inadequately applied or neglected in some current source term calculations were studied. This review has not uncovered any significant technical considerations that have been omitted or are being inadequately treated in current source term analyses, except perhaps the contribution made to in-containment aerosols by coolant comminution upon escape at pressure from the reactor coolant system. 11 refs

  1. Radionuclide source term reconstruction at the feed materials production center

    International Nuclear Information System (INIS)

    Hill, C.A.

    1991-01-01

    In 1987 the US Department of Energy Feed Materials Production Center (FMPC) published the History of FMPC Radionuclide Discharges to define the total quantity of radionuclides emitted via the air pathway from the FMPC since operations began in 1951. This source term was based on historical records of measured dust collector and scrubber losses. Concurrently, the local population expressed concerns about potential health effects from living near the facility and requested that the Centers for Disease Control (CDC) perform an epidemiological study to determine the health effects from living near the FMPC. A more complete assessment of the source term was required for the CDC to determine whether an epidemiological study was warranted. The paper shows annual FMPC uranium emissions. The source term reconstruction required 9 months and ∼ 3.5 person-years. In all cases where several possible methods were available, effort was taken to ensure that the estimate would be conservatively high. The reported uranium losses represent 0.026% of the total uranium received at the FMPC from 1951 to 1987, and when an upper error bound is applied to the emission total, the losses represent 0.036% of the total uranium receipts. The resulting source term is considered by FMPC personnel to be as accurate a reconstruction as is possible using the limited historical data available

  2. Disposal Unit Source Term (DUST) data input guide

    International Nuclear Information System (INIS)

    Sullivan, T.M.

    1993-05-01

    Performance assessment of a low-level waste (LLW) disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). The computer code DUST (Disposal Unit Source Term) has been developed to model these processes. This document presents the models used to calculate release from a disposal facility, verification of the model, and instructions on the use of the DUST code. In addition to DUST, a preprocessor, DUSTIN, which helps the code user create input decks for DUST, and a post-processor, GRAFXT, which plots selected output files on the computer terminal, have been written. Use of these codes is also described
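The release mechanism the DUST abstract describes, an inventory depleted by wasteform leaching and radioactive decay, can be sketched with a minimal first-order model. This is an illustration only, not the DUST code itself, and the inventory, leach rate and half-life below are hypothetical:

```python
import math

def release_rate(inventory_bq, leach_rate, decay_const, t_years):
    """Release rate (Bq/yr) at time t for a single radionuclide, assuming
    first-order leaching from the wasteform plus radioactive decay.
    The remaining inventory decays with the combined rate constant."""
    remaining = inventory_bq * math.exp(-(leach_rate + decay_const) * t_years)
    return leach_rate * remaining

# Hypothetical case: 1e12 Bq of a nuclide with a 30-year half-life,
# leached at 1% per year.
decay = math.log(2) / 30.0
print(release_rate(1e12, 0.01, decay, t_years=50.0))
```

A full source-term code couples this to fluid flow, container degradation and transport; the sketch only shows how inventory, leaching and decay combine into a release rate.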

  3. Application of the source term code package to obtain a specific source term for the Laguna Verde Nuclear Power Plant

    International Nuclear Information System (INIS)

    Souto, F.J.

    1991-06-01

    The main objective of the project was to use the Source Term Code Package (STCP) to obtain a specific source term for those accident sequences deemed dominant as a result of probabilistic safety analyses (PSA) for the Laguna Verde Nuclear Power Plant (CNLV). The following programme has been carried out to meet this objective: (a) implementation of the STCP, (b) acquisition of specific data for CNLV to execute the STCP, and (c) calculations of specific source terms for accident sequences at CNLV. The STCP has been implemented and validated on CDC 170/815 and CDC 180/860 mainframes as well as on a MicroVAX 3800 system. In order to get a plant-specific source term, data on the CNLV including initial core inventory, burn-up, primary containment structures, and materials used for the calculations have been obtained. Because STCP does not explicitly model containment failure, dry well failure in the form of a catastrophic rupture has been assumed. One of the most significant sequences from the point of view of possible off-site risk is the loss of off-site power with failure of the diesel generators and simultaneous loss of high pressure core spray and reactor core isolation cooling systems. The probability for that event is approximately 4.5 × 10⁻⁶. This sequence has been analysed in detail and the release fractions of radioisotope groups are given in the full report. 18 refs, 4 figs, 3 tabs

  4. Generalisability of a composite student selection programme

    DEFF Research Database (Denmark)

    O'Neill, Lotte Dyhrberg; Korsholm, Lars; Wallstedt, Birgitta

    2009-01-01

    OBJECTIVES The reliability of individual non-cognitive admission criteria in medical education is controversial. Nonetheless, non-cognitive admission criteria appear to be widely used in selection to medicine to supplement the grades of qualifying examinations. However, very few studies have...... examined the overall test generalisability of composites of non-cognitive admission variables in medical education. We examined the generalisability of a composite process for selection to medicine, consisting of four variables: qualifications (application form information); written motivation (in essay...... format); general knowledge (multiple-choice test), and a semi-structured admission interview. The aim of this study was to estimate the generalisability of a composite selection. METHODS: Data from 307 applicants who participated in the admission to medicine in 2007 were available for analysis. Each...

  5. A study of idiopathic generalised epilepsy in an Irish population.

    LENUS (Irish Health Repository)

    Mullins, G M

    2012-02-03

    Idiopathic generalised epilepsy (IGE) is subdivided into syndromes based on clinical and EEG features. PURPOSE: The aim of this study was to characterise all cases of IGE with supportive EEG abnormalities in terms of gender differences, seizure types reported, IGE syndromes, family history of epilepsy and EEG findings. We also calculated the limited duration prevalence of IGE in our cohort. METHODS: Data on abnormal EEGs were collected retrospectively from two EEG databases at two tertiary referral centres for neurology. Clinical information was obtained from EEG request forms, standardised EEG questionnaires and medical notes of patients. RESULTS: Two hundred and twenty-three patients met our inclusion criteria, 89 (39.9%) male and 134 (60.1%) female. Tonic clonic seizures were the most common seizure type reported, 162 (72.65%) having a generalised tonic clonic seizure (GTCS) at some time. IGE with GTCS only (EGTCSA) was the most common syndrome in our cohort, being present in 94 patients (34 male, 60 female), with 42 (15 male, 27 female) patients diagnosed with juvenile myoclonic epilepsy (JME), 23 (9 male, 14 female) with juvenile absence epilepsy (JAE) and 20 (9 male, 11 female) with childhood absence epilepsy (CAE). EEG studies in all patients showed generalised epileptiform activity. CONCLUSIONS: More women than men were diagnosed with generalised epilepsy. Tonic clonic seizures were the most common seizure type reported. EGTCSA was the most frequent syndrome seen. Gender differences were evident for JAE and JME as previously reported, and for EGTCSA, which was not reported to date, and reached statistical significance for EGTCSA and JME.

  6. Phonatory sound sources in terms of Lagrangian Coherent Structures

    Science.gov (United States)

    McPhail, Michael; Krane, Michael

    2015-11-01

    Lagrangian Coherent Structures (LCS) are used to identify sound sources in phonation. Currently, it is difficult to causally relate changes in airflow topology from voice disorders to changes in voiced sound production. LCS reveals a flow's topology by decomposing the flow into regions of distinct dynamics. The aeroacoustic sources can then be written in terms of the motion of the boundaries of these distinct regions. Breaking the flow down into constituent parts shows how each distinct region contributes to sound production. This approach provides a framework for connecting changes in anatomy from a voice disorder to measurable changes in the resulting sound. The approach is presented for simulations of some canonical cases of vortex sound generation, and for a two-dimensional simulation of phonation. This work acknowledges NIH grant 2R01DC005642.

  7. Realistic minimum accident source terms - Evaluation, application, and risk acceptance

    International Nuclear Information System (INIS)

    Angelo, P. L.

    2009-01-01

    The evaluation, application, and risk acceptance of realistic minimum accident source terms can represent a complex and arduous undertaking, with a very high impact on design, construction cost, operations and maintenance, and integrated safety over the expected facility lifetime. At the 2005 Nuclear Criticality Safety Division (NCSD) Meeting in Knoxville, Tenn., two papers were presented that summarized the Y-12 effort that reduced the number of criticality accident alarm system (CAAS) detectors originally designed for the new Highly Enriched Uranium Materials Facility (HEUMF) from 258 to an eventual as-built number of 60. Part of that effort relied on determining a realistic minimum accident source term specific to the facility. Since that time, the rationale for an alternate minimum accident has been strengthened by an evaluation process that incorporates realism. A recent update to the HEUMF CAAS technical basis highlights the concepts presented here. (authors)

  8. Generalised Computability and Applications to Hybrid Systems

    DEFF Research Database (Denmark)

    Korovina, Margarita V.; Kudinov, Oleg V.

    2001-01-01

    We investigate the concept of generalised computability of operators and functionals defined on the set of continuous functions, firstly introduced in [9]. By working in the reals, with equality and without equality, we study properties of generalised computable operators and functionals. Also we...... propose an interesting application to formalisation of hybrid systems. We obtain some class of hybrid systems, which trajectories are computable in the sense of computable analysis. This research was supported in part by the RFBR (grants N 99-01-00485, N 00-01- 00810) and by the Siberian Branch of RAS (a...... grant for young researchers, 2000)...

  9. Basic repository source term and data sheet report: Lavender Canyon

    International Nuclear Information System (INIS)

    1988-01-01

    This report is one of a series describing studies undertaken in support of the US Department of Energy Civilian Radioactive Waste Management (CRWM) Program. This study contains the derivation of values for environmental source terms and resources consumed for a CRWM repository. Estimates include heavy construction equipment; support equipment; shaft-sinking equipment; transportation equipment; and consumption of fuel, water, electricity, and natural gas. Data are presented for construction and operation at an assumed site in Lavender Canyon, Utah. 3 refs; 6 tabs

  10. Basic repository source term and data sheet report: Davis Canyon

    International Nuclear Information System (INIS)

    1988-01-01

    This report is one of a series describing studies undertaken in support of the US Department of Energy Civilian Radioactive Waste Management (CRWM) Program. This study contains the derivation of values for environmental source terms and resources consumed for a CRWM repository. Estimates include heavy construction equipment; support equipment; shaft-sinking equipment; transportation equipment; and consumption of fuel, water, electricity, and natural gas. Data are presented for construction and operation at an assumed site in Davis Canyon, Utah. 6 tabs

  11. PFLOTRAN-RepoTREND Source Term Comparison Summary.

    Energy Technology Data Exchange (ETDEWEB)

    Frederick, Jennifer M

    2018-03-01

    Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.

  12. Balanced source terms for wave generation within the Hasselmann equation

    Directory of Open Access Journals (Sweden)

    V. Zakharov

    2017-10-01

    Full Text Available The new Zakharov–Resio–Pushkarev (ZRP) wind input source term (Zakharov et al., 2012) is examined for its theoretical consistency via numerical simulation of the Hasselmann equation. The results are compared to field experimental data, collected at different sites around the world, and to theoretical predictions based on self-similarity analysis. Consistent results are obtained for both the limited-fetch and duration-limited statements.

  13. Fission product source terms and engineered safety features

    International Nuclear Information System (INIS)

    Malinauskas, A.P.

    1984-01-01

    The author states that new, technically defensible, methodologies to establish realistic source term values for nuclear reactor accidents will soon be available. Although these methodologies will undoubtedly find widespread use in the development of accident response procedures, the author states that it is less clear that the industry is preparing to employ the newer results to develop a more rational approach to strategies for the mitigation of fission product releases. Questions concerning the performance of existing engineered safety systems are reviewed

  14. Exactly marginal deformations from exceptional generalised geometry

    Energy Technology Data Exchange (ETDEWEB)

    Ashmore, Anthony [Merton College, University of Oxford,Merton Street, Oxford, OX1 4JD (United Kingdom); Mathematical Institute, University of Oxford,Andrew Wiles Building, Woodstock Road, Oxford, OX2 6GG (United Kingdom); Gabella, Maxime [Institute for Advanced Study,Einstein Drive, Princeton, NJ 08540 (United States); Graña, Mariana [Institut de Physique Théorique, CEA/Saclay,91191 Gif-sur-Yvette (France); Petrini, Michela [Sorbonne Université, UPMC Paris 05, UMR 7589, LPTHE,75005 Paris (France); Waldram, Daniel [Department of Physics, Imperial College London,Prince Consort Road, London, SW7 2AZ (United Kingdom)

    2017-01-27

    We apply exceptional generalised geometry to the study of exactly marginal deformations of N=1 SCFTs that are dual to generic AdS{sub 5} flux backgrounds in type IIB or eleven-dimensional supergravity. In the gauge theory, marginal deformations are parametrised by the space of chiral primary operators of conformal dimension three, while exactly marginal deformations correspond to quotienting this space by the complexified global symmetry group. We show how the supergravity analysis gives a geometric interpretation of the gauge theory results. The marginal deformations arise from deformations of generalised structures that solve moment maps for the generalised diffeomorphism group and have the correct charge under the generalised Reeb vector, generating the R-symmetry. If this is the only symmetry of the background, all marginal deformations are exactly marginal. If the background possesses extra isometries, there are obstructions that come from fixed points of the moment maps. The exactly marginal deformations are then given by a further quotient by these extra isometries. Our analysis holds for any N=2 AdS{sub 5} flux background. Focussing on the particular case of type IIB Sasaki-Einstein backgrounds we recover the result that marginal deformations correspond to perturbing the solution by three-form flux at first order. In various explicit examples, we show that our expression for the three-form flux matches those in the literature and the obstruction conditions match the one-loop beta functions of the dual SCFT.

  15. On Generalisation of Polynomials in Complex Plane

    Directory of Open Access Journals (Sweden)

    Maslina Darus

    2010-01-01

    Full Text Available The generalised Bell and Laguerre polynomials of fractional-order in complex z-plane are defined. Some properties are studied. Moreover, we proved that these polynomials are univalent solutions for second order differential equations. Also, the Laguerre-type of some special functions are introduced.

  16. Acute generalised exanthematous pustulosis secondary to ...

    African Journals Online (AJOL)

    2012-11-02

    Superficial dermal vessels were mildly dilated and contained marginated neutrophils. Special stains for fungi and acid-fast bacilli were negative and no granulomas, dysplastic or malignant cells were found. A histopathological diagnosis of acute generalised exanthematous pustulosis (AGEP) was made.

  17. The oculocerebral syndrome in association with generalised ...

    African Journals Online (AJOL)

    A 14-year-old girl with generalised hypopigmentation, mental retardation, abnormal movements, and ocular anomalies is described. It is suggested that she represents a further case of oculocerebral albinism, a rare autosomal recessive condition. Reference is made to previous similar cases.

  18. Exactly marginal deformations from exceptional generalised geometry

    International Nuclear Information System (INIS)

    Ashmore, Anthony; Gabella, Maxime; Graña, Mariana; Petrini, Michela; Waldram, Daniel

    2017-01-01

    We apply exceptional generalised geometry to the study of exactly marginal deformations of N=1 SCFTs that are dual to generic AdS 5 flux backgrounds in type IIB or eleven-dimensional supergravity. In the gauge theory, marginal deformations are parametrised by the space of chiral primary operators of conformal dimension three, while exactly marginal deformations correspond to quotienting this space by the complexified global symmetry group. We show how the supergravity analysis gives a geometric interpretation of the gauge theory results. The marginal deformations arise from deformations of generalised structures that solve moment maps for the generalised diffeomorphism group and have the correct charge under the generalised Reeb vector, generating the R-symmetry. If this is the only symmetry of the background, all marginal deformations are exactly marginal. If the background possesses extra isometries, there are obstructions that come from fixed points of the moment maps. The exactly marginal deformations are then given by a further quotient by these extra isometries. Our analysis holds for any N=2 AdS 5 flux background. Focussing on the particular case of type IIB Sasaki-Einstein backgrounds we recover the result that marginal deformations correspond to perturbing the solution by three-form flux at first order. In various explicit examples, we show that our expression for the three-form flux matches those in the literature and the obstruction conditions match the one-loop beta functions of the dual SCFT.

  19. Generalised phase contrast: microscopy, manipulation and more

    DEFF Research Database (Denmark)

    Palima, Darwin; Glückstad, Jesper

    2010-01-01

    Generalised phase contrast (GPC) not only leads to more accurate phase imaging beyond thin biological samples, but serves as an enabling framework in developing tools over a wide spectrum of contemporary applications in optics and photonics, including optical trapping and micromanipulation, optic...

  20. Hyperscaling violating solutions in generalised EMD theory

    Directory of Open Access Journals (Sweden)

    Li Li

    2017-04-01

    Full Text Available This short note is devoted to deriving scaling but hyperscaling violating solutions in a generalised Einstein–Maxwell-Dilaton theory with an arbitrary number of scalars and vectors. We obtain analytic solutions in some special case and discuss the physical constraints on the allowed parameter range in order to have a well-defined holographic ground-state solution.

  1. Hyperscaling violating solutions in generalised EMD theory

    Energy Technology Data Exchange (ETDEWEB)

    Li, Li, E-mail: lil416@lehigh.edu [Crete Center for Theoretical Physics, Institute for Theoretical and Computational Physics, Department of Physics, University of Crete, 71003 Heraklion (Greece); Crete Center for Quantum Complexity and Nanotechnology, Department of Physics, University of Crete, 71003 Heraklion (Greece); Department of Physics, Lehigh University, Bethlehem, PA, 18018 (United States)

    2017-04-10

    This short note is devoted to deriving scaling but hyperscaling violating solutions in a generalised Einstein–Maxwell-Dilaton theory with an arbitrary number of scalars and vectors. We obtain analytic solutions in some special case and discuss the physical constraints on the allowed parameter range in order to have a well-defined holographic ground-state solution.

  2. Trace Metal Source Terms in Carbon Sequestration Environments

    Energy Technology Data Exchange (ETDEWEB)

    Karamalidis, Athanasios K; Torres, Sharon G; Hakala, J Alexandra; Shao, Hongbo; Cantrell, Kirk J; Carroll, Susan

    2012-02-05

    Carbon dioxide sequestration in deep saline and depleted oil geologic formations is feasible and promising; however, possible CO₂ or CO₂-saturated brine leakage to overlying aquifers may pose environmental and health impacts. The purpose of this study was to experimentally define trace metal source terms from the reaction of supercritical CO₂, storage reservoir brines, reservoir and cap rocks. Storage reservoir source terms for trace metals are needed to evaluate the impact of brines leaking into overlying drinking water aquifers. The trace metal release was measured from sandstones, shales, carbonates, evaporites, basalts and cements from the Frio, In Salah, Illinois Basin – Decatur, Lower Tuscaloosa, Weyburn-Midale, Bass Islands and Grand Ronde carbon sequestration geologic formations. Trace metal dissolution is tracked by measuring solution concentrations over time under conditions (e.g. pressures, temperatures, and initial brine compositions) specific to the sequestration projects. Existing metrics for Maximum Contaminant Levels (MCLs) for drinking water as defined by the U.S. Environmental Protection Agency (U.S. EPA) were used to categorize the relative significance of metal concentration changes in storage environments due to the presence of CO₂. Results indicate that Cr and Pb released from sandstone reservoir and shale cap rock exceed the MCLs by an order of magnitude, while Cd and Cu were at or below drinking water thresholds. In carbonate reservoirs, As exceeds the MCLs by an order of magnitude, while Cd, Cu, and Pb were at or below drinking water standards. Results from this study can be used as a reasonable estimate of the reservoir and caprock source term to further evaluate the impact of leakage on groundwater quality.
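The MCL screening described above amounts to comparing measured concentrations against regulatory thresholds. A sketch of that comparison follows; the leachate concentrations are hypothetical, while the MCL values are the published U.S. EPA drinking-water limits for these metals:

```python
# U.S. EPA drinking-water limits (mg/L); Cu and Pb are action levels.
MCL_MG_L = {"As": 0.010, "Cd": 0.005, "Cr": 0.100, "Cu": 1.3, "Pb": 0.015}

def exceedance_ratios(measured_mg_l):
    """Return measured/MCL ratio per metal; a ratio > 1 means the MCL is exceeded."""
    return {m: c / MCL_MG_L[m] for m, c in measured_mg_l.items() if m in MCL_MG_L}

sample = {"Cr": 1.2, "Pb": 0.18, "Cd": 0.004}  # hypothetical leachate values
ratios = exceedance_ratios(sample)
flagged = {m: r for m, r in ratios.items() if r > 1.0}
print(flagged)  # Cr and Pb exceed by roughly an order of magnitude; Cd does not
```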

  3. Background and Source Term Identification in Active Neutron Interrogation Methods

    Science.gov (United States)

    2011-03-24

    background source terms during active neutron interrogation. Table 5, Chemical Properties of Continental Crust, gives the average abundance of each component in the earth's crust (percent by weight): SiO2 60.6, Al2O3 15.9, CaO 6.4, MgO 4.7, Na2O 3.1, Fe 6.7, K2O 1.8, TiO2 0.7, P2O5 0.1.

  4. NRC source term assessment for incident response dose projections

    International Nuclear Information System (INIS)

    Easley, P.; Pasedag, W.

    1984-01-01

    The NRC provides advice and assistance to licensees and State and local authorities in responding to accidents. The TACT code supports this function by providing source term projections for two situations during early (15 to 60 minutes) accident response: (1) Core/containment damage is indicated, but there are no measured releases. Quantification of a predicted release permits emergency response before people are exposed. With TACT, response personnel can estimate releases based on fuel and cladding conditions, coolant boundary and containment integrity, and mitigative systems operability. For this type of estimate, TACT is intermediate between default assumptions and time-consuming mechanistic codes. (2) A combination of plant status and limited release data are available. For this situation, iterating between predictions based on known conditions and the measured releases gives reasonable confidence in supplemental source term information that is otherwise unavailable: nuclide mix, releases not monitored, and trending or abrupt changes. The assumptions and models used in TACT, and examples of its use, are given in this paper

  5. Analysis of the source term in the Chernobyl-4 accident

    International Nuclear Information System (INIS)

    Alonso, A.; Lopez Montero, J.V.; Pinedo Garrido, P.

    1990-01-01

    The report presents the analysis of the Chernobyl accident and of the phenomena with major influence on the source term, including the chemical effects of materials dumped over the reactor, carried out by the Chair of Nuclear Technology at Madrid University under a contract with the CEC. It also includes the comparison of the ratio (Cs-137/Cs-134) between measurements performed by Soviet authorities and countries belonging to the Community and OECD area. Chapter II contains a summary of both isotope measurements (Cs-134 and Cs-137), and their ratios, in samples of air, water, soil and agricultural and animal products collected by the Soviets in their report presented in Vienna (1986). Chapter III reports on the inventories of cesium isotopes in the core, while Chapter IV analyses the transient, especially the fuel temperature reached, as a way to deduce the mechanisms which took place in the cesium escape. The cesium source term is analyzed in Chapter V. Normal conditions have been considered, as well as the transient and the post-accidental period, including the effects of deposited materials. The conclusion of this study is that Chernobyl accidental sequence is specific of the RBMK type of reactors, and that in the Western world, basic research on fuel behaviour for reactivity transients has already been carried out

  6. Atucha-I source terms for sequences initiated by transients

    International Nuclear Information System (INIS)

    Baron, J.; Bastianelli, B.

    1997-01-01

    The present work is part of a study of expected source terms in the Atucha I nuclear power plant during severe accidents. Of the accident sequences with a significant probability of producing core damage, those initiated by operational transients have been identified as the most relevant. These sequences share a common characteristic: all of them result in the opening of the primary system safety valves, which leaves this path open for coolant loss. If these sequences progress to severe accidents, the same path will be used for the release of the radionuclides from the core, through the primary system, to the containment. Later in the severe accident sequence, the pressure vessel will fail and the corium will fall into the reactor cavity, interacting with the concrete. During these processes, more radioactive products will be released inside the containment. In the present work, a severe accident initiated by a station blackout is simulated from the point of view of the phenomenology of the behavior of the radioactive products, as they are transported in the piping, during the core-concrete interactions, and inside the containment buildings until containment failure. The final result is the source term into the atmosphere. (author) [es

  7. Use of source term uncoupled in radionuclide migration equations

    International Nuclear Information System (INIS)

    Silveira, Claudia Siqueira da; Lima, Zelmo Rodrigues de; Alvim, Antonio Carlos Marques

    2008-01-01

    Final repositories for high-level radioactive waste have been considered in deep, low-permeability and stable geological formations. A common problem is the modelling of radionuclide migration in a fractured rock. In this work, the physical system adopted consists of a rock matrix containing a single planar fracture situated in water-saturated porous rock. The partial differential equations that describe radionuclide transport were discretized using finite difference techniques; the following methods were adopted: Explicit Euler, Implicit Euler and Crank-Nicholson. For each of these methods, the advective term was discretized with the following numerical schemes: backward differences, centered differences and forward differences. A comparison was made to determine which temporal and spatial discretization gives the best result relative to a reference solution. The results obtained show that the Explicit Euler method with forward discretization of the advective term has good accuracy. Next, with the objective of improving the results of the Implicit Euler and Crank-Nicholson methods, the source term (the diffusive flux) was uncoupled. The results obtained were considered satisfactory by comparison with previous studies. (author)
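A minimal sketch of the kind of scheme compared in the abstract, explicit Euler in time with an upwind difference for advection, applied to a 1D advection-diffusion-decay equation; all parameters and the boundary condition are hypothetical:

```python
import numpy as np

def step_explicit_upwind(c, v, D, dx, dt, lam):
    """One explicit-Euler step of dc/dt = -v dc/dx + D d2c/dx2 - lam*c,
    using upwind (backward) differences for advection (valid for v > 0).
    Boundary values c[0] and c[-1] are held fixed."""
    adv = -v * (c[1:-1] - c[:-2]) / dx
    dif = D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2
    new = c.copy()
    new[1:-1] = c[1:-1] + dt * (adv + dif - lam * c[1:-1])
    return new

# Hypothetical parameters; dt respects both the advective (CFL) and
# diffusive stability limits of the explicit scheme.
nx, dx, v, D, lam = 101, 0.1, 1.0, 0.01, 1e-3
dt = 0.4 * min(dx / v, dx**2 / (2 * D))
c = np.zeros(nx)
c[0] = 1.0  # constant-concentration boundary at the fracture inlet
for _ in range(200):
    c = step_explicit_upwind(c, v, D, dx, dt, lam)
```

The explicit scheme is only stable when dt respects both dx/v and dx²/2D, which is why comparisons against implicit methods, stable at larger time steps, are of practical interest.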

  8. Quantum mechanics of a generalised rigid body

    International Nuclear Information System (INIS)

    Gripaios, Ben; Sutherland, Dave

    2016-01-01

    We consider the quantum version of Arnold’s generalisation of a rigid body in classical mechanics. Thus, we quantise the motion on an arbitrary Lie group manifold of a particle whose classical trajectories correspond to the geodesics of any one-sided-invariant metric. We show how the derivation of the spectrum of energy eigenstates can be simplified by making use of automorphisms of the Lie algebra and (for groups of type I) by methods of harmonic analysis. We show how the method can be extended to cosets, generalising the linear rigid rotor. As examples, we consider all connected and simply connected Lie groups up to dimension 3. This includes the universal cover of the archetypical rigid body, along with a number of new exactly solvable models. We also discuss a possible application to the topical problem of quantising a perfect fluid. (paper)

  9. Support vector machines and generalisation in HEP

    Science.gov (United States)

    Bevan, Adrian; Gamboa Goñi, Rodrigo; Hays, Jon; Stevenson, Tom

    2017-10-01

    We review the concept of Support Vector Machines (SVMs) and discuss examples of their use in a number of scenarios. Several SVM implementations have been used in HEP and we exemplify this algorithm using the Toolkit for Multivariate Analysis (TMVA) implementation. We discuss examples relevant to HEP including background suppression for H → τ⁺τ⁻ at the LHC with several different kernel functions. Performance benchmarking leads to the issue of generalisation of hyper-parameter selection. The avoidance of fine tuning (over training or over fitting) in MVA hyper-parameter optimisation, i.e. the ability to ensure generalised performance of an MVA that is independent of the training, validation and test samples, is of utmost importance. We discuss this issue and compare and contrast performance of hold-out and k-fold cross-validation. We have extended the SVM functionality and introduced tools to facilitate cross validation in TMVA and present results based on these improvements.
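The hold-out versus k-fold comparison discussed above can be illustrated with a small k-fold loop. For brevity this sketch scores a nearest-centroid toy classifier on synthetic data rather than an actual TMVA SVM; averaging over folds estimates generalised performance instead of performance on one lucky hold-out split:

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    for fold in np.array_split(idx, k):
        yield np.setdiff1d(idx, fold), fold

# Two hypothetical Gaussian classes in 2D, 50 events each.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(+1, 1, (50, 2))])
y = np.repeat([0, 1], 50)

scores = []
for tr, te in kfold_indices(len(y), k=5):
    # "Train": compute class centroids; "test": assign to nearer centroid.
    c0, c1 = X[tr][y[tr] == 0].mean(0), X[tr][y[tr] == 1].mean(0)
    pred = (np.linalg.norm(X[te] - c1, axis=1)
            < np.linalg.norm(X[te] - c0, axis=1)).astype(int)
    scores.append((pred == y[te]).mean())
print(np.mean(scores))  # average accuracy over the 5 folds
```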

  10. Quantum field theory in generalised Snyder spaces

    International Nuclear Information System (INIS)

    Meljanac, S.; Meljanac, D.; Mignemi, S.; Štrajn, R.

    2017-01-01

    We discuss the generalisation of the Snyder model that includes all possible deformations of the Heisenberg algebra compatible with Lorentz invariance and investigate its properties. We calculate perturbatively the law of addition of momenta and the star product in the general case. We also undertake the construction of a scalar field theory on these noncommutative spaces showing that the free theory is equivalent to the commutative one, like in other models of noncommutative QFT.

  11. Quantum field theory in generalised Snyder spaces

    Energy Technology Data Exchange (ETDEWEB)

    Meljanac, S.; Meljanac, D. [Rudjer Bošković Institute, Bijenička cesta 54, 10002 Zagreb (Croatia); Mignemi, S., E-mail: smignemi@unica.it [Dipartimento di Matematica e Informatica, Università di Cagliari, viale Merello 92, 09123 Cagliari (Italy); INFN, Sezione di Cagliari, Cittadella Universitaria, 09042 Monserrato (Italy); Štrajn, R. [Dipartimento di Matematica e Informatica, Università di Cagliari, viale Merello 92, 09123 Cagliari (Italy); INFN, Sezione di Cagliari, Cittadella Universitaria, 09042 Monserrato (Italy)

    2017-05-10

    We discuss the generalisation of the Snyder model that includes all possible deformations of the Heisenberg algebra compatible with Lorentz invariance and investigate its properties. We calculate perturbatively the law of addition of momenta and the star product in the general case. We also undertake the construction of a scalar field theory on these noncommutative spaces showing that the free theory is equivalent to the commutative one, like in other models of noncommutative QFT.

  12. Source term identification in atmospheric modelling via sparse optimization

    Science.gov (United States)

    Adam, Lukas; Branda, Martin; Hamburger, Thomas

    2015-04-01

    Inverse modelling plays an important role in identifying the amount of harmful substances released into the atmosphere during major incidents such as power plant accidents or volcano eruptions. Another possible application of inverse modelling lies in monitoring CO2 emission limits, where only observations at certain places are available and the task is to estimate the total releases at given locations. This gives rise to minimizing the discrepancy between the observations and the model predictions. There are two standard ways of solving such problems. In the first, the discrepancy is regularized by adding additional terms; such terms may include Tikhonov regularization, distance from a priori information or a smoothing term. The resulting, usually quadratic, problem is then solved via standard optimization solvers. The second approach assumes that the error term has a (normal) distribution and makes use of Bayesian modelling to identify the source term. Instead of following the above-mentioned approaches, we utilize techniques from the field of compressive sensing. Such techniques look for the sparsest solution (the solution with the smallest number of nonzeros) of a linear system, where a maximal allowed error term may be added to this system. Even though this field is well developed, with many possible solution techniques, most of them do not consider even the simplest constraints which are naturally present in atmospheric modelling; one such example is the nonnegativity of release amounts. We believe that the concept of a sparse solution is natural both in the problem of identifying the source location and in that of the time process of the source release. In the first case, it is usually assumed that there are only a few release points and the task is to find them. In the second case, the time window is usually much longer than the duration of the actual release. In both cases, the optimal solution should contain a large number of zeros, giving rise to the
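    A toy version of the sparse, non-negative recovery described above can be sketched as a projected iterative soft-thresholding scheme. The source-receptor matrix, parameter values and function names below are purely illustrative, not the authors' actual method:

```python
# Sparse, non-negative source recovery via projected ISTA
# (iterative soft-thresholding). The true release vector has a single
# non-zero entry, mimicking a single release point.

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def ista_nonneg(A, y, lam=0.01, step=0.1, iters=2000):
    """Minimise 0.5*||Ax - y||^2 + lam*sum(x) subject to x >= 0."""
    n = len(A[0])
    x = [0.0] * n
    At = list(zip(*A))  # transpose of A
    for _ in range(iters):
        r = [yi - ri for yi, ri in zip(y, matvec(A, x))]        # residual y - Ax
        g = [sum(a * b for a, b in zip(col, r)) for col in At]  # gradient A^T r
        # Gradient step, soft-threshold by lam, and project onto x >= 0.
        x = [max(0.0, xi + step * gi - step * lam) for xi, gi in zip(x, g)]
    return x

# Observations generated by a release of 2.0 units from source 1 only.
A = [[1.0, 0.5, 0.2],
     [0.3, 1.0, 0.4],
     [0.1, 0.6, 1.0]]
x_true = [0.0, 2.0, 0.0]
y = matvec(A, x_true)
x_hat = ista_nonneg(A, y)
```

    The `max(0.0, ...)` projection is the nonnegativity constraint the abstract points out is missing from most off-the-shelf compressive-sensing solvers; the soft-threshold term `step * lam` is what drives small components exactly to zero.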

  13. Running the source term code package in Elebra MX-850

    International Nuclear Information System (INIS)

    Guimaraes, A.C.F.; Goes, A.G.A.

    1988-01-01

    The source term code package (STCP) is one of the main tools applied in calculations of the behavior of fission products from nuclear power plants. It is a set of computer codes that assists in calculating the radioactive materials released from the metallic containment of power reactors to the environment during a severe reactor accident. The original version of STCP runs on SDC computer systems, but as it is written in FORTRAN 77, it is possible to run it on other systems such as IBM, Burroughs, Elebra, etc. The Elebra MX-8500 version of STCP contains five codes: MARCH 3, TRAPMELT, TCCA, VANESSA and NAVA. The example presented in this report considers a small LOCA in a PWR-type reactor. (M.I.)

  14. Influence of iodine chemistry on source term assessment

    International Nuclear Information System (INIS)

    Herranz Puebla, L. E.; Lopez Diez, I.; Rodriguez Maroto, J. J.; Martinez Lopez-Alcorocho, A.

    1991-01-01

    The major goals of a phenomenological analysis of the containment during a severe accident can be split into the following: to know the containment response to the different loads, and to predict accurately the fission product and aerosol behavior. In this report, the main results of the study of a hypothetical accident scenario, based on the LA-4 experiment of the LACE project, are presented. To do this, several codes have been coupled: CONTEMPT4/MOD5 (thermohydraulics), NAUA/MOD5 (aerosol physics) and IODE (iodine chemistry). It is demonstrated that the Source Term cannot be assessed with confidence if the chemical behaviour of some radionuclides is not taken into account. In particular, the influence on the iodine retention efficiency of the sump of variables such as pH has been proven. (Author) 12 refs

  15. The EC CAST project (carbon-14 source term)

    International Nuclear Information System (INIS)

    Williams, S. J.

    2015-01-01

    Carbon-14 is a key radionuclide in the assessment of the safety of underground geological disposal facilities for radioactive wastes. Carbon-14 can be released from waste packages in a variety of chemical forms, both organic and inorganic, and as dissolved or gaseous species. The EC CAST (CArbon-14 Source Term) project aims to develop understanding of the generation and release of carbon-14 from radioactive waste materials under conditions relevant to packaging and disposal. It focuses on the release of carbon-14 from irradiated metals (steels and zirconium alloys), from irradiated graphite and from spent ion-exchange resins. The CAST consortium brings together 33 partners. CAST commenced in October 2013 and this paper describes progress to March 2015. The main activities during this period were reviews of the current status of knowledge, the identification and acquisition of suitable samples and the design of experiments and analytical procedures. (authors)

  16. Potential Regulatory Use of New Accident Source Term Information

    International Nuclear Information System (INIS)

    Lee, Jay

    1986-01-01

    Accident release estimates have been used in the regulatory process for more than two decades. Many parts of the process are based upon release assumptions contained in the 1962 document 'Calculation of Distance Factors for Power and Test Reactor Sites' (TID-14844), which forms the basis for 10 CFR 100, as well as those based upon the more recent 1975 'Reactor Safety Study' (WASH-1400) risk estimates. Examples of regulatory uses of TID-14844 include containment performance, environmental qualification of equipment, air filtration and other fission product mitigation systems, accident monitoring onsite and offsite, and siting. Examples of regulatory uses of WASH-1400 release estimates include emergency planning, evaluating offsite impacts and risks for such uses as Environmental Impact Statements, assessing offsite contamination and recovery, evaluating standard plant designs, and investigating new regulatory requirements. In carrying out the Severe Accident Policy Implementation Program, the U.S. NRC staff expects to propose a number of changes to rules as well as other changes in regulatory practice. These changes could arise from research regarding radioactivity releases under severe accident conditions, as well as other insights expected to be gained through the evaluation of severe accidents. A number of changes in rules and regulatory practices can be expected from the improved understanding arising from this extensive work. The implementation of such changes may require a capability to perform source term calculations, selection of a regulatory principle or framework in connection with the evaluation of plants beyond the current design basis, development of new forms of source terms, and revision of the affected rules and other regulatory practices.

  17. Source term development for tritium at the Sheffield disposal site

    International Nuclear Information System (INIS)

    MacKenzie, D.R.; Barletta, R.E.; Smalley, J.F.; Kempf, C.R.; Davis, R.E.

    1984-01-01

    The Sheffield low-level radioactive waste disposal site, which ceased operation in 1978, has been the focus of modeling efforts by the NRC for the purpose of predicting long-term site behavior. To provide the NRC with the information required for its modeling effort, a study to define the source term for tritium in eight trenches at the Sheffield site has been undertaken. Tritium is of special interest since significant concentrations of the isotope have been found in groundwater samples taken at the site and at locations outside the site boundary. Previous estimates of tritium site inventory at Sheffield are in wide disagreement. In this study, the tritium inventory in the eight trenches was estimated by reviewing the radioactive shipping records (RSRs) for waste buried in these trenches. It has been found that the tritium shipped for burial at the site was probably higher than previously estimated. In the eight trenches surveyed, which amount to roughly one half the total volume and activity buried at Sheffield, approximately 2350 Ci of tritium from non-fuel cycle sources were identified. The review of RSRs also formed the basis for obtaining waste package descriptions and for contacting large waste generators to obtain more detailed information regarding these waste packages. As a result of this review and the selected generator contacts, the non-fuel cycle tritium waste was categorized. The tritium releases from each of these waste categories were modeled. The results of this modeling effort are presented for each of the eight trenches selected. 3 references, 2 figures

  18. 5.0. Depletion, activation, and spent fuel source terms

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    SCALE's general depletion, activation, and spent fuel source terms analysis capabilities are enabled through a family of modules related to the main ORIGEN depletion/irradiation/decay solver. The nuclide tracking in ORIGEN is based on the principle of explicitly modeling all available nuclides and transitions in the current fundamental nuclear data for decay and neutron-induced transmutation, and relies on fundamental cross section and decay data in ENDF/B-VII. Cross section data for materials and reaction processes not available in ENDF/B-VII are obtained from the JEFF-3.0/A special-purpose European activation library containing 774 materials and 23 reaction channels with 12,617 neutron-induced reactions below 20 MeV. Resonance cross section corrections in the resolved and unresolved range are performed using a continuous-energy treatment by data modules in SCALE. All nuclear decay data, fission product yields, and gamma-ray emission data are developed from ENDF/B-VII.1 evaluations. Decay data include all ground and metastable state nuclides with half-lives greater than 1 millisecond. Using these data sources, ORIGEN currently tracks 174 actinides, 1149 fission products, and 974 activation products. The purpose of this chapter is to describe the stand-alone capabilities and underlying methodology of ORIGEN, as opposed to the integrated depletion capability it provides in all coupled neutron transport/depletion sequences in SCALE, as described in other chapters.
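    As a sketch of the kind of transmutation arithmetic an ORIGEN-style solver performs, the analytic Bateman solution for a two-member decay chain can be written down directly. The decay constants below are arbitrary illustrative values, not evaluated nuclear data:

```python
# Bateman solution for a two-member decay chain A -> B -> (stable),
# the elementary building block of the coupled decay/transmutation
# systems that ORIGEN solves for thousands of nuclides at once.

import math

def bateman_two(n0, lam_a, lam_b, t):
    """Atoms of A and B at time t, starting from n0 atoms of pure A."""
    na = n0 * math.exp(-lam_a * t)
    nb = n0 * lam_a / (lam_b - lam_a) * (math.exp(-lam_a * t) - math.exp(-lam_b * t))
    return na, nb

lam_a, lam_b = 0.1, 0.02   # decay constants per unit time, illustrative
na, nb = bateman_two(1000.0, lam_a, lam_b, t=5.0)
```

    For long chains the closed form becomes unwieldy, which is why production codes work instead with the full transition matrix and matrix-exponential or substepping methods; the two-member case is just the smallest honest instance.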

  19. Generalised linear models for correlated pseudo-observations, with applications to multi-state models

    DEFF Research Database (Denmark)

    Andersen, Per Kragh; Klein, John P.; Rosthøj, Susanne

    2003-01-01

    Keywords: Generalised estimating equation; Generalised linear model; Jackknife pseudo-value; Logistic regression; Markov model; Multi-state model.

  20. Generalising the coupling between spacetime and matter

    Energy Technology Data Exchange (ETDEWEB)

    Carloni, Sante, E-mail: sante.carloni@gmail.com

    2017-03-10

    We explore the idea that the coupling between matter and spacetime is more complex than the one originally envisioned by Einstein. We propose that such coupling takes the form of a new fundamental tensor in the Einstein field equations. We then show that the introduction of this tensor can account for dark phenomenology in General Relativity, maintaining a weak field limit compatible with standard Newtonian gravitation. The same paradigm can be applied to any other theory of gravitation. We show, as an example, that in the context of conformal gravity a generalised coupling is able to solve compatibility issues between the matter and the gravitational sector.

  1. Generalising the coupling between spacetime and matter

    Directory of Open Access Journals (Sweden)

    Sante Carloni

    2017-03-01

    Full Text Available We explore the idea that the coupling between matter and spacetime is more complex than the one originally envisioned by Einstein. We propose that such coupling takes the form of a new fundamental tensor in the Einstein field equations. We then show that the introduction of this tensor can account for dark phenomenology in General Relativity, maintaining a weak field limit compatible with standard Newtonian gravitation. The same paradigm can be applied to any other theory of gravitation. We show, as an example, that in the context of conformal gravity a generalised coupling is able to solve compatibility issues between the matter and the gravitational sector.

  2. Generalised model for anisotropic compact stars

    Energy Technology Data Exchange (ETDEWEB)

    Maurya, S.K. [University of Nizwa, Department of Mathematical and Physical Sciences College of Arts and Science, Nizwa (Oman); Gupta, Y.K. [Raj Kumar Goel Institute of Technology, Department of Mathematics, Ghaziabad, Uttar Pradesh (India); Ray, Saibal [Government College of Engineering and Ceramic Technology, Department of Physics, Kolkata, West Bengal (India); Deb, Debabrata [Indian Institute of Engineering Science and Technology, Shibpur, Department of Physics, Howrah, West Bengal (India)

    2016-12-15

    In the present investigation an exact generalised model for anisotropic compact stars of embedding class 1 is sought within a general relativistic background. The generic solutions are verified by exploring different physical aspects, viz. energy conditions, mass-radius relation and stability of the models, in connection with their validity. It is observed that the model presented here for compact stars is compatible with all these physical tests and thus physically acceptable as far as the compact star candidates RXJ 1856-37, SAX J 1808.4-3658 (SS1) and SAX J 1808.4-3658 (SS2) are concerned. (orig.)

  3. Generalised empirical method for predicting surface subsidence

    International Nuclear Information System (INIS)

    Zhang, M.; Bhattacharyya, A.K.

    1994-01-01

    Based on a simplified strata parameter, i.e. the ratio of the total thickness of the strong rock beds in an overburden to the overall thickness of the overburden, a Generalised Empirical Method (GEM) is described for predicting the maximum subsidence and the shape of a complete transverse subsidence profile due to a single, completely extracted longwall panel. In the method, a nomogram for predicting the maximum surface subsidence is first developed from data collected from subsidence measurements worldwide. A method is then developed for predicting the shapes of complete transverse subsidence profiles for a horizontal seam and ground surface, and is verified by case studies. 13 refs., 9 figs., 2 tabs

  4. Source term analysis for a nuclear submarine accident

    International Nuclear Information System (INIS)

    Lewis, B.J.; Hugron, J.J.M.R.

    1999-01-01

    A source term analysis has been conducted to determine the activity release into the environment as a result of a large-break loss-of-coolant accident aboard a nuclear-powered submarine visiting a Canadian port. This best-estimate analysis considers the fractional release from the core, and fission product transport in the primary heat transport system, primary containment (i.e. reactor compartment) and submarine hull. Physical removal mechanisms such as vapour and aerosol deposition are treated in the calculation. Since a thermalhydraulic analysis indicated that the integrity of the reactor compartment is maintained, release from the reactor compartment will only occur by leakage; however, it is conservatively assumed that the secondary containment is not isolated for a 24-h period during which release occurs through an open hatch in the submarine hull. Consequently, during this period, the activity release into the atmosphere is estimated as 4.6 TBq, leading to a maximum individual dose equivalent of 0.5 mSv at 800 metres from the berthing location. This activity release is comparable to that obtained in the BEREX TSA study (for a similar accident scenario) but is four orders of magnitude less than that reported in the earlier Davis study where, unrealistically, no credit had been taken for the containment system or for any physical removal processes. (author)

  5. LMFBR source term experiments with rupture disk discharge under sodium

    International Nuclear Information System (INIS)

    Minges, J.; Schuetz, W.

    1993-05-01

    In the framework of the KfK research program FAUST, contributions are made to the assessment of the instantaneous source term in the case of an LMFBR loss-of-flow accident with expanding fuel or sodium vapour. The main goal of the program is to obtain information, mainly by experiment, on the retention capability of the primary sodium pool for fuel and fission products. For that purpose, it is necessary to investigate the interaction of bubble and aerosol behaviour after a pressure discharge, and the subsequent aerosol transport. After a series of water tests (FAUST-1), rupture disk discharge tests under 500 °C sodium at up to 3.81 MPa were performed during the phase FAUST-2 with the two test facilities 2A (about 2 liters of sodium) and 2B (about 200 liters of sodium). The discharge tests were performed with pressurized argon gas and admixtures of the simulation materials Cs, CsI, NaI, I2, SrO, and UO2. Cs was a liquid, I2 a vapour, and all other substances solid particles. Besides UO2, non-radioactive material was used (natural isotopes). The retention capability of liquid sodium is expressed by retention factors (RF). In general, RF is defined as the mass ratio of the discharged amount to the amount detected in the cover gas for the relevant species. From sampling immediately after the discharge, the instantaneous retention factors are deduced; from retarded sampling, the 'delayed factors' follow. High-pressure discharge creates two important removal mechanisms, namely impaction by inertia relative to the bubble oscillations, and wash-out by sedimentation of entrained sodium droplets. On the other hand, the retention of particles enclosed in a buoyantly rising bubble is significantly smaller. (orig.)
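    A retention factor in the sense defined above is a simple mass ratio; as a sketch, with purely illustrative numbers rather than FAUST measurements:

```python
# Retention factor RF = (mass discharged into the sodium pool) /
# (mass of the same species detected in the cover gas).
# A higher RF means better retention by the pool.

def retention_factor(discharged_mass, cover_gas_mass):
    """RF for one species; both masses in the same units (e.g. grams)."""
    return discharged_mass / cover_gas_mass

rf_instant = retention_factor(10.0, 0.02)   # sampling right after discharge
rf_delayed = retention_factor(10.0, 0.004)  # retarded sampling
```

    The delayed factor exceeds the instantaneous one whenever aerosol continues to settle out of the cover gas after the discharge, which is the behaviour the paper's two sampling times are designed to separate.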

  6. The influence of source term release parameters on health effects

    International Nuclear Information System (INIS)

    Jeong, Jong Tae; Ha, Jae Joo

    1998-08-01

    In this study, the influence of source term release parameters on health effects was examined. This is very useful in identifying the relative importance of release parameters and can be an important factor in developing a strategy for reducing offsite risks. The release parameters investigated in this study are release height, heat content, fuel burnup, release time, release duration, and warning time. The health effects affected by the change of release parameters are early fatalities, cancer fatalities, early injuries, cancer injuries, early fatality risk, population-weighted early fatality risk, population-weighted cancer fatality risk, effective whole-body population dose, population exceeding an early acute red bone marrow dose of 1.5 Sv, and the distance at which early fatalities are expected to occur. As release height increases, the values of early health effects such as early fatalities and injuries decrease; however, release height does not have a significant influence on late health effects. The values of both early and late health effects decrease as heat content increases. An increase in fuel burnup, i.e. an increase of core inventories, increases the late health effects but has little influence on the early health effects, although the number of early injuries increases as fuel burnup increases. An increase in release time shows a very similar influence on both the early and late health effects: as the release time increases to 2 hours, the values of health effects increase and then decrease rapidly. As release duration increases, the values of late health effects increase slightly, while the values of early health effects decrease. As warning time increases to 2 hours, the values of late health effects decrease and then show no further variation. The number of early injuries decreases rapidly as the warning time increases to 2 hours; however, the number of early fatalities and the early fatality risk increase as the warning time increases.

  7. Generalised structures for N=1 AdS backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Coimbra, André [Institut für Theoretische Physik & Center for Quantum Engineering and Spacetime Research,Leibniz Universität Hannover,Appelstraße 2, 30167 Hannover (Germany); Strickland-Constable, Charles [Institut de physique théorique, Université Paris Saclay, CEA, CNRS, Orme des Merisiers, F-91191 Gif-sur-Yvette (France)

    2016-11-16

    We expand upon a claim made in a recent paper [http://arxiv.org/abs/1411.5721] that generic minimally supersymmetric AdS backgrounds of warped flux compactifications of Type II and M theory can be understood as satisfying a straightforward weak integrability condition in the language of E_d(d) × ℝ^+ generalised geometry. Namely, they are spaces admitting a generalised G-structure set by the Killing spinor and with constant singlet generalised intrinsic torsion.

  8. Hanford tank residual waste - Contaminant source terms and release models

    International Nuclear Information System (INIS)

    Deutsch, William J.; Cantrell, Kirk J.; Krupka, Kenneth M.; Lindberg, Michael L.; Jeffery Serne, R.

    2011-01-01

    Highlights: → Residual waste from five Hanford spent fuel process storage tanks was evaluated. → Gibbsite is a common mineral in tanks with high Al concentrations. → Non-crystalline U-Na-C-O-P ± H phases are common in the U-rich residual. → Iron oxides/hydroxides have been identified in all residual waste samples. → Uranium release is highly dependent on waste and leachant compositions. - Abstract: Residual waste is expected to be left in 177 underground storage tanks after closure at the US Department of Energy's Hanford Site in Washington State, USA. In the long term, the residual wastes may represent a potential source of contamination to the subsurface environment. Residual materials that cannot be completely removed during the tank closure process are being studied to identify and characterize the solid phases and estimate the release of contaminants from these solids to water that might enter the closed tanks in the future. As of the end of 2009, residual waste from five tanks has been evaluated. Residual wastes from adjacent tanks C-202 and C-203 have high U concentrations of 24 and 59 wt.%, respectively, while residual wastes from nearby tanks C-103 and C-106 have low U concentrations of 0.4 and 0.03 wt.%, respectively. Aluminum concentrations are high (8.2-29.1 wt.%) in some tanks (C-103, C-106, and S-112) and relatively low ( 2 -saturated solution, or a CaCO3-saturated water. Uranium release concentrations are highly dependent on waste and leachant compositions, with dissolved U concentrations one or two orders of magnitude higher in the tests with high-U residual wastes, and also higher when leached with the CaCO3-saturated solution than with the Ca(OH)2-saturated solution. Technetium leachability is not as strongly dependent on the concentration of Tc in the waste, and it appears to be slightly more leachable by the Ca(OH)2-saturated solution than by the CaCO3-saturated solution. In general, Tc is much less leachable (<10 wt.% of the

  9. Asymptotic Behaviour of Total Generalised Variation

    KAUST Repository

    Papafitsoros, Konstantinos

    2015-01-01

    © Springer International Publishing Switzerland 2015. The recently introduced second-order total generalised variation functional TGV^2_{β,α} has been a successful regulariser for image processing purposes. Its definition involves two positive parameters α and β whose values determine the amount and the quality of the regularisation. In this paper we report on the behaviour of TGV^2_{β,α} in the cases where the parameters α and β, as well as their ratio β/α, become very large or very small. Among other results, we prove that for sufficiently symmetric two-dimensional data and large ratio β/α, TGV^2_{β,α} regularisation coincides with total variation (TV) regularisation.

  10. Acute generalised exanthematous pustulosis: An update

    Directory of Open Access Journals (Sweden)

    Abhishek De

    2018-01-01

    Full Text Available Acute generalised exanthematous pustulosis (AGEP) is a severe cutaneous adverse reaction and is attributed to drugs in more than 90% of cases. It is a rare disease, with an estimated incidence of 1–5 patients per million per year. The clinical manifestations are characterised by the rapid development of sterile pustular lesions, fever and leucocytosis. A number of drugs have been reported to be associated with AGEP, the most common being antibiotics. Histopathologically there are intraepidermal pustules and papillary dermal oedema with neutrophilic and eosinophilic infiltrations. Systemic involvement can be present in more severe cases. Early diagnosis with withdrawal of the causative drug is the most important step in the management. Treatment includes supportive care, avoidance of antibiotics and use of a potent topical steroid.

  11. Quantification of severe accident source terms of a Westinghouse 3-loop plant

    International Nuclear Information System (INIS)

    Lee Min; Ko, Y.-C.

    2008-01-01

    Integrated severe accident analysis codes are used to quantify the source terms of the representative sequences identified in PSA studies. The characteristics of these source terms depend on the detailed design of the plant and the accident scenario. A historical perspective on radioactive source terms is provided. The grouping of radionuclides in different source terms or source term quantification tools based on TID-14844, NUREG-1465, and WASH-1400 is compared. The radionuclide release phenomena and models adopted in the integrated severe accident analysis codes STCP and MAAP4 are described. In the present study, the severe accident source terms for risk quantification of the Maanshan Nuclear Power Plant of Taiwan Power Company are quantified using the MAAP 4.0.4 code. A methodology is developed to quantify the source terms of each source term category (STC) identified in the Level II PSA analysis of the plant. The characteristics of the source terms obtained are compared with other source terms. The plant analyzed employs a Westinghouse-designed 3-loop pressurized water reactor (PWR) with a large dry containment.

  12. A generalised groundwater flow equation using the concept of non ...

    African Journals Online (AJOL)

    head. This generalised law and the law of conservation of mass are then used to derive a new equation for groundwater flow. Numerical solutions of this equation for various fractional orders of the derivatives are compared with experimental data and the Barker generalised radial flow model for which a fractal dimension for ...
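    The abstract is truncated, but a fractional-order generalisation of Darcy's law of the kind it describes might take the following form. This is a sketch only: the Caputo-type derivative of order 0 < α ≤ 1 and the radial-flow geometry of dimension n are assumptions for illustration, not the paper's stated equations.

```latex
% Classical Darcy law for radial flow, and an assumed fractional-order
% generalisation of the hydraulic-head gradient (0 < \alpha \le 1):
q = -K \frac{\partial h}{\partial r}
\qquad\longrightarrow\qquad
q = -K \, \frac{\partial^{\alpha} h}{\partial r^{\alpha}}

% Combined with conservation of mass for radial flow of dimension n,
% this yields a generalised groundwater flow equation of the form
S_s \frac{\partial h}{\partial t}
  = \frac{1}{r^{\,n-1}} \frac{\partial}{\partial r}
    \left( r^{\,n-1} K \, \frac{\partial^{\alpha} h}{\partial r^{\alpha}} \right)
```

    Setting α = 1 recovers the classical radial flow equation, which is the limit against which a fractional model of this type would be compared.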

  13. A study of the one dimensional total generalised variation regularisation problem

    KAUST Repository

    Papafitsoros, Konstantinos

    2015-03-01

    © 2015 American Institute of Mathematical Sciences. In this paper we study the one-dimensional second order total generalised variation (TGV) regularisation problem with an L2 data fitting term. We examine the properties of this model and we calculate exact solutions using simple piecewise affine functions as data terms. We investigate how these solutions behave with respect to the TGV parameters and we verify our results using numerical experiments.

  14. Source term analysis for a RCRA mixed waste disposal facility

    International Nuclear Information System (INIS)

    Jordan, D.L.; Blandford, T.N.; MacKinnon, R.J.

    1996-01-01

    A Monte Carlo transport scheme was used to estimate the source strength resulting from potential releases from a mixed waste disposal facility. Infiltration rates were estimated using the HELP code, and transport through the facility was modeled using the DUST code, linked to a Monte Carlo driver
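    The scheme described, an infiltration estimate feeding a transport model through a Monte Carlo driver, can be caricatured in a few lines. The triangular infiltration distribution, the linear release model and all parameter values below are hypothetical stand-ins for the HELP/DUST calculations:

```python
# Toy Monte Carlo driver for a source-strength estimate: sample an
# uncertain infiltration rate, push it through a trivial linear release
# model, and summarise the resulting distribution.

import random
import statistics

def release_rate(infiltration, leachable_inventory=100.0, kd_factor=0.05):
    """Toy release model: release proportional to water flux through the waste."""
    return infiltration * kd_factor * leachable_inventory

def monte_carlo_source(n=10000, seed=42):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        # Infiltration (m/yr) sampled from a triangular distribution
        # (min 0.01, max 0.5, mode 0.1) standing in for HELP output.
        infil = rng.triangular(0.01, 0.5, 0.1)
        samples.append(release_rate(infil))
    mean = statistics.mean(samples)
    p95 = statistics.quantiles(samples, n=20)[18]  # ~95th percentile
    return mean, p95

mean_release, p95_release = monte_carlo_source()
```

    Reporting a high percentile alongside the mean is the usual point of running the driver: the source strength fed to a downstream transport model is often a conservative upper quantile rather than a best estimate.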

  15. Deformations of the generalised Picard bundle

    International Nuclear Information System (INIS)

    Biswas, I.; Brambila-Paz, L.; Newstead, P.E.

    2004-08-01

    Let X be a nonsingular algebraic curve of genus g ≥ 3, and let M_ξ denote the moduli space of stable vector bundles of rank n ≥ 2 and degree d with fixed determinant ξ over X, such that n and d are coprime. We assume that if g = 3 then n ≥ 4 and if g = 4 then n ≥ 3, and suppose further that n_0, d_0 are integers such that n_0 ≥ 1 and nd_0 + n_0 d > nn_0(2g − 2). Let E be a semistable vector bundle over X of rank n_0 and degree d_0. The generalised Picard bundle W_ξ(E) is by definition the vector bundle over M_ξ defined by the direct image p_{M_ξ *}(U_ξ ⊗ p_X^* E), where U_ξ is a universal vector bundle over X × M_ξ. We obtain an inversion formula allowing us to recover E from W_ξ(E) and show that the space of infinitesimal deformations of W_ξ(E) is isomorphic to H^1(X, End(E)). This construction gives a locally complete family of vector bundles over M_ξ parametrised by the moduli space M(n_0, d_0) of stable bundles of rank n_0 and degree d_0 over X. If (n_0, d_0) = 1 and W_ξ(E) is stable for all E ∈ M(n_0, d_0), the construction determines an isomorphism from M(n_0, d_0) to a connected component M_0 of a moduli space of stable sheaves over M_ξ. This applies in particular when n_0 = 1, in which case M_0 is isomorphic to the Jacobian J of X as a polarised variety. The paper as a whole is a generalisation of results of Kempf and Mukai on Picard bundles over J, and is also related to a paper of Tyurin on the geometry of moduli of vector bundles. (author)

  16. Determination of source terms in a degenerate parabolic equation

    International Nuclear Information System (INIS)

    Cannarsa, P; Tort, J; Yamamoto, M

    2010-01-01

    In this paper, we prove Lipschitz stability results for inverse source problems relative to parabolic equations. We use the method introduced by Imanuvilov and Yamamoto in 1998 based on Carleman estimates. What is new here is that we study a class of one-dimensional degenerate parabolic equations. In our model, the diffusion coefficient vanishes at one extreme point of the domain. Instead of the classical Carleman estimates obtained by Fursikov and Imanuvilov for non-degenerate equations, we use and extend some recent Carleman estimates for degenerate equations obtained by Cannarsa, Martinez and Vancostenoble. Finally, we obtain Lipschitz stability results for inverse source problems for our class of degenerate parabolic equations, both in the case of a boundary observation and in the case of a locally distributed observation.

  17. Comparison of the longitudinal coupling impedance from different source terms

    International Nuclear Information System (INIS)

    Al-Khateeb, A.M.; Hasse, R.W.; Boine-Frankenheim, O.

    2008-01-01

    The longitudinal coupling impedance and the transmission coefficient resulting from a thin ring and from a uniform disk are obtained analytically for a resistive cylindrical beam pipe of finite wall thickness. The impedances are derived and then compared with the well-known corresponding expression for perturbations on a uniform, coasting beam [A. Al-Khateeb, O. Boine-Frankenheim, R.W. Hasse, I. Hofmann, Phys. Rev. E 71 (2005) 026501]. The transmission coefficients from both sources are found to be exactly the same. Differences do appear in the expressions for the electromagnetic fields within the beam region, therefore leading to different coupling impedances. By applying the results to parameters relevant for the SIS-18 synchrotron at GSI, it is found that the formula from the ring source underestimates the space-charge impedance at all beam energies and shows a noticeable deviation from the disk formula at all frequencies. Although their mathematical expressions are different, the resistive-wall impedances from the two sources are found to be numerically equal. The space-charge impedances become equal asymptotically only in the so-called ultra-relativistic limit.

  18. Generalised derived limits for radioisotopes of plutonium

    International Nuclear Information System (INIS)

    Simmonds, J.R.; Harrison, N.T.; Linsley, G.S.

    1982-01-01

    Generalised Derived Limits (GDLs) are evaluated for plutonium isotopes in materials from the terrestrial and aquatic environments and for discharge to atmosphere. They are intended for use as convenient reference levels against which the results of environmental monitoring can be compared and atmospheric discharges assessed. GDLs are calculated using assumptions concerning the habits and location of the critical group of exposed individuals in the population. They are intended for use when the environmental contamination or discharge to atmosphere is less than about 5% of the GDL. If the level of environmental contamination or discharge to the atmosphere exceeds this percentage of the GDL it does not necessarily mean that the dose equivalents to members of the public are approaching the dose equivalent limit. It is rather an indication that it may be appropriate to obtain a more specific derived limit for the particular situation by reviewing the values of the parameters involved in the calculation. GDL values are specified for plutonium radionuclides in air, water, soil, sediments and various foodstuffs derived from the terrestrial and aquatic environments. GDLs are also given for plutonium radionuclides on terrestrial surfaces and for their discharge to atmosphere. (author)
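
    The screening role of GDLs described above, with the ~5% usage threshold, can be illustrated by a short sketch. The GDL value and measurements below are hypothetical, not taken from the report.

```python
# Screening check against a Generalised Derived Limit (GDL).
# The 5% threshold follows the usage rule described in the abstract;
# the GDL value and measurements below are purely illustrative.

SCREENING_FRACTION = 0.05  # GDLs are intended for use below ~5% of the GDL

def screen_against_gdl(measured, gdl):
    """Return a screening outcome for a measurement against a GDL."""
    if measured / gdl < SCREENING_FRACTION:
        return "below screening level"
    # Exceeding 5% does not mean dose limits are being approached; it
    # signals that a more specific derived limit may be appropriate.
    return "review site-specific derived limit"

# Hypothetical example: plutonium in soil, Bq/kg
print(screen_against_gdl(measured=12.0, gdl=1000.0))   # 1.2% of GDL
print(screen_against_gdl(measured=80.0, gdl=1000.0))   # 8% of GDL
```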

  19. Generalised derived limits for radioisotopes of iodine

    International Nuclear Information System (INIS)

    Hughes, J.S.; Haywood, S.M.; Simmonds, J.R.

    1984-04-01

    Generalised Derived Limits (GDLs) are evaluated for iodine-125, -129, -131, -132, -133, -134 and -135 in selected materials from the terrestrial and aquatic environments and for discharge to atmosphere. They are intended for use as convenient reference levels against which the results of environmental monitoring can be compared and atmospheric discharges assessed. GDLs are intended for use when the environmental contamination or discharge to atmosphere is less than about 5% of the GDL. If the level of environmental contamination or discharge to the atmosphere exceeds this percentage of the GDL it does not necessarily mean that the dose equivalents to members of the public are approaching the dose equivalent limit. It is rather an indication that it may be appropriate to obtain a more specific derived limit for the particular situation by reviewing the values of the parameters involved in the calculation. GDL values are specified for iodine radionuclides in water, soil, grass, sediments and various foodstuffs derived from the terrestrial and aquatic environments. GDLs are also given for iodine radionuclides on terrestrial surfaces and for their discharge to atmosphere. (author)

  20. CHALLENGES IN SOURCE TERM MODELING OF DECONTAMINATION AND DECOMMISSIONING WASTES.

    Energy Technology Data Exchange (ETDEWEB)

    SULLIVAN, T.M.

    2006-08-01

    Development of real-time predictive modeling to identify the dispersion and/or source(s) of airborne weapons of mass destruction, including chemical, biological, radiological, and nuclear material, in urban environments is needed to improve response to potential releases of these materials by either terrorist or accidental means. These models will also prove useful in defining airborne pollution dispersion in urban environments for pollution management/abatement programs. Predicting gas flow in an urban setting on a scale of less than a few kilometers is a complicated and challenging task due to the irregular flow paths that occur along streets and alleys and around buildings of different sizes and shapes, i.e., 'urban canyons'. In addition, air exchange between the outside and buildings and subway areas further complicates the situation. Transport models that are used to predict dispersion of WMD/CBRN materials or to backtrack the source of a release require high-density data and defensible parameterizations of urban processes. Errors in the data or in any of the parameter inputs or assumptions will lead to misidentification of the airborne spread or source release location(s). The need for these models to provide output in real time if they are to be useful for emergency response presents another challenge. To improve the ability of New York City's (NYC's) emergency management teams and first response personnel to protect the public during releases of hazardous materials, the New York City Urban Dispersion Program (UDP) has been initiated. This is a four-year research program conducted from 2004 through 2007. This paper discusses ground-level and subway perfluorocarbon tracer (PFT) release studies conducted in New York City. The studies released multiple tracers to study ground-level and vertical transport of contaminants. This paper discusses the results from these tests and how these results can be used

  1. Generalised Brown Clustering and Roll-up Feature Generation

    DEFF Research Database (Denmark)

    Derczynski, Leon; Chester, Sean

    2016-01-01

    Brown clustering is an established technique, used in hundreds of computational linguistics papers each year, to group word types that have similar distributional information. It is unsupervised and can be used to create powerful word representations for machine learning. Despite its improbable … active set size. Moreover, the generalisation permits a novel approach to feature selection from Brown clusters: we show that the standard approach of shearing the Brown clustering output tree at arbitrary bitlengths is lossy, and that features should instead be chosen by rolling up Generalised Brown … hierarchies. The generalisation and corresponding feature generation are more principled, challenging the way Brown clustering is currently understood and applied.
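
    The "shearing" approach criticized above can be sketched as follows: standard practice derives features by truncating each word's Brown-hierarchy bitstring at fixed prefix lengths. The bitstrings and prefix lengths here are illustrative, not taken from a real clustering.

```python
# Conventional Brown-cluster feature generation by "shearing": truncate each
# word's hierarchy bitstring at fixed prefix lengths. The paper argues this
# scheme is lossy compared with rolling up a Generalised Brown hierarchy.
# Bitstrings below are invented for illustration.

def shear_features(bitstring, lengths=(4, 6, 10, 20)):
    """Prefix features at fixed bitlengths, the standard (lossy) scheme."""
    return {f"p{k}={bitstring[:k]}" for k in lengths if len(bitstring) >= k}

brown_paths = {"monday": "001100", "tuesday": "001101", "bank": "1011"}
print(shear_features(brown_paths["monday"]))   # prefixes of length 4 and 6
print(shear_features(brown_paths["bank"]))     # only the length-4 prefix
```

    Note that "monday" and "tuesday" collapse to the same length-4 prefix feature, while anything encoded below the chosen bitlengths is discarded, which is the information loss the roll-up approach is designed to avoid.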

  2. Conditioning of disused sealed sources in countries without disposal facility: Short term gain - long term pain

    International Nuclear Information System (INIS)

    Benitez-Navarro, J.C.; Salgado-Mojena, M.

    2002-01-01

    Owing to the considerable development in managing disused sealed radioactive sources (DSRS), the limited availability of disposal practices for them, and the new recommendations for the use of the borehole disposal concept, it was felt that a paper reviewing the existing recommendations could be a starting point for discussion on the retrievability of the sources. Even though no international consensus exists on an acceptable solution to the challenge of disposing of disused sealed sources, the 'Best Available Technology' recommended to developing countries for managing most of them has included cementation of the sources. Waste packages prepared in this way do not allow any flexibility to accommodate possible future disposal requirements. Therefore, the 'Wait and See' approach could also be recommended for managing not only sources with long-lived radionuclides and high activity, but probably all kinds of existing disused sealed sources. The general aim of the current paper is to identify and review the current recommendations for managing disused sealed sources and to reflect on the most convenient management schemes for disused sealed radioactive sources in Member States without disposal capacities (Latin America, Africa). The risk that cemented DSRS could be incompatible with future disposal requirements was taken into account. (author)

  3. Energy sources policies in terms of environment [Turkey

    Energy Technology Data Exchange (ETDEWEB)

    Ozkan, Safak Gokhan [Etibank Research Centre, Izmir (Turkey); Tuncer, Gungor; Ipekoglu, Bedri [Istanbul Univ., Mining Engineering Dept., Istanbul (Turkey)

    1998-09-01

    The energy sources available in Turkey are reviewed. Currently most of the country's power is generated in hydroelectric and coal-fired power plants, but this is not adequate to meet energy demand. Indigenous oil and gas deposits are small on a global scale, but Turkey has 54% of the world's thorium deposits. Nuclear technology is only at the research stage at present, however, so nuclear power is not an immediate possibility. The present reliance on hydro and coal will therefore continue, with efforts to enhance petroleum and natural gas production and to develop all other existing resources, such as solar, geothermal and biomass, where some potential exists. In addition, the need to promote energy conservation and energy efficiency is seen as a priority. (UK)

  4. Source terms derived from analyses of hypothetical accidents, 1950-1986

    International Nuclear Information System (INIS)

    Stratton, W.R.

    1987-01-01

    This paper reviews the history of reactor accident source term assumptions. After the Three Mile Island accident, a number of theoretical and experimental studies re-examined possible accident sequences and source terms. Some of these results are summarized in this paper

  5. Considerations about the implementation of alternative source terms: nuclear safety and plant operational benefits

    International Nuclear Information System (INIS)

    Rodriguez H, A.; Barcenas R, M.; Ortiz V, J.

    2010-10-01

    In this paper, several aspects of the implementation of an alternative source term for the analysis of the radiological consequences of design basis accidents in nuclear power plants are discussed. First, the rationale for implementing an alternative source term is discussed. The topics studied then start by considering the current methodology and regulation applied to determine the original source term. Next, the basis of a new methodology for determining a different source term is discussed, for example the elimination of excessively conservative assumptions. As a consequence of adopting an alternative source term, operational benefits are expected from the relaxation of regulatory requirements established in the plant technical specifications. Other key issues considered in this work are the use of engineered safety features to minimize the iodine release during an accident, and technical requirements regarding the safe operation of the emergency filtering system for the main control room, in order to protect the reactor operation personnel. Finally, a discussion is presented on the impact on risk assessment when using an alternative source term, noting that the adoption of a new source term does not by itself have an impact on plant risk, but it does affect the radiological consequences. Nevertheless, a detailed review of technical specification changes that could induce some risk should be considered. As conclusions of this work, recommendations are presented for the licensing process of an alternative source term. (Author)

  6. Source term estimation during incident response to severe nuclear power plant accidents. Draft

    International Nuclear Information System (INIS)

    McKenna, T.J.; Giitter, J.

    1987-01-01

    The various methods of estimating radionuclide release to the environment (source terms) as a result of an accident at a nuclear power reactor are discussed. The major factors affecting potential radionuclide releases off site (source terms) as a result of nuclear power plant accidents are described. The quantification of these factors based on plant instrumentation also is discussed. A range of accident conditions from those within the design basis to the most severe accidents possible are included in the text. A method of gross estimation of accident source terms and their consequences off site is presented. The goal is to present a method of source term estimation that reflects the current understanding of source term behavior and that can be used during an event. (author)

  7. Source term estimation during incident response to severe nuclear power plant accidents

    International Nuclear Information System (INIS)

    McKenna, T.J.; Glitter, J.G.

    1988-10-01

    This document presents a method of source term estimation that reflects the current understanding of source term behavior and that can be used during an event. The various methods of estimating radionuclide release to the environment (source terms) as a result of an accident at a nuclear power reactor are discussed. The major factors affecting potential radionuclide releases off site (source terms) as a result of nuclear power plant accidents are described. The quantification of these factors based on plant instrumentation also is discussed. A range of accident conditions from those within the design basis to the most severe accidents possible are included in the text. A method of gross estimation of accident source terms and their consequences off site is presented. 39 refs., 48 figs., 19 tabs

  8. Prospects of renewable energy sources in India: Prioritization of alternative sources in terms of Energy Index

    International Nuclear Information System (INIS)

    Jha, Shibani K.; Puppala, Harish

    2017-01-01

    The growing energy demand of a progressing civilization governs the exploitation of various renewable sources over conventional sources. Wind, solar, hydro, biomass, and waste & bagasse are the renewable sources available in India. A reliable non-conventional geothermal source is also available in India, but it is restricted to direct heat applications. This study archives the status of renewable alternatives in India. The techno-economic factors and environmental aspects associated with each of these alternatives are discussed. The study focuses on prioritizing the renewable sources based on a parameter introduced as the Energy Index. This index is evaluated using cumulative scores obtained for each of the alternatives. The cumulative score is obtained by evaluating each alternative over a range of eleven environmental and techno-economic criteria following the Fuzzy Analytical Hierarchy Process. The eleven criteria considered in the study are carbon dioxide (CO2) emissions, sulphur dioxide (SO2) emissions, nitrogen oxide (NOx) emissions, land requirement, current energy cost, potential future energy cost, turnkey investment, capacity factor, energy efficiency, design period and water consumption. It is concluded from the study that the geothermal source is the most preferable alternative, with the highest Energy Index. Hydro, wind, biomass and solar sources are the subsequently preferred alternatives. - Highlights: • Fuzzy AHP is used to obtain a cumulative score for each renewable alternative. • The cumulative score is normalized by the highest score of the ideal source. • The Energy Index shows how good a renewable alternative is. • A priority order is obtained for the alternatives based on the Energy Index. • Geothermal is the most preferable source, followed by hydro, wind, biomass and solar.
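
    The Energy Index ranking described above can be illustrated with a simplified numerical sketch. The criteria, weights and scores below are hypothetical; the study itself derives weights from Fuzzy AHP pairwise comparisons over eleven criteria rather than assigning them directly.

```python
# Simplified sketch of an "Energy Index": a weighted cumulative score per
# renewable alternative, normalised by the best score. Weights and scores
# are hypothetical, rescaled to [0, 1] with higher = better.

def energy_index(scores, weights):
    """Cumulative weighted score per alternative, normalised to [0, 1]."""
    cumulative = {alt: sum(w * s for w, s in zip(weights, vals))
                  for alt, vals in scores.items()}
    best = max(cumulative.values())
    return {alt: round(c / best, 3) for alt, c in cumulative.items()}

# Three illustrative criteria: emissions (pre-inverted so higher is better),
# energy cost (likewise inverted), and capacity factor.
weights = [0.4, 0.35, 0.25]
scores = {
    "geothermal": [0.9, 0.8, 0.9],
    "hydro":      [0.8, 0.9, 0.6],
    "solar":      [0.7, 0.5, 0.4],
}
ranking = energy_index(scores, weights)
print(ranking)  # the best alternative normalises to an index of 1.0
```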

  9. Generalised zeta-function regularization for scalar one-loop effective action

    OpenAIRE

    Cognola, Guido; Zerbini, Sergio

    2004-01-01

    The one-loop effective action for a scalar field defined in an ultrastatic space-time, where non-standard logarithmic terms are present in the asymptotic heat-kernel expansion, is investigated by a generalisation of zeta-function regularisation. It is shown that additional divergences may appear at one-loop level. The one-loop renormalisability of the model is discussed and the one-loop renormalisation group equations are derived.

  10. An environmental generalised Luenberger-Hicks-Moorsteen productivity indicator and an environmental generalised Hicks-Moorsteen productivity index.

    Science.gov (United States)

    Abad, A

    2015-09-15

    The purpose of this paper is to introduce an environmental generalised productivity indicator and its ratio-based counterpart. The innovative environmental generalised total factor productivity measures inherit the basic structure of both Hicks-Moorsteen productivity index and Luenberger-Hicks-Moorsteen productivity indicator. This methodological contribution shows that these new environmental generalised total factor productivity measures yield the earlier standard Hicks-Moorsteen index and Luenberger-Hicks-Moorsteen indicator, as well as environmental performance index, as special cases. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Supersymmetric backgrounds, the Killing superalgebra, and generalised special holonomy

    Energy Technology Data Exchange (ETDEWEB)

    Coimbra, André [Institut des Hautes Études Scientifiques, Le Bois-Marie,35 route de Chartres, F-91440 Bures-sur-Yvette (France); Strickland-Constable, Charles [Institut des Hautes Études Scientifiques, Le Bois-Marie,35 route de Chartres, F-91440 Bures-sur-Yvette (France); Institut de physique théorique, Université Paris Saclay, CEA, CNRS,Orme des Merisiers, F-91191 Gif-sur-Yvette (France)

    2016-11-10

    We prove that, for M theory or type II, generic Minkowski flux backgrounds preserving N supersymmetries in dimensions D ≥ 4 correspond precisely to integrable generalised G_N structures, where G_N is the generalised structure group defined by the Killing spinors. In other words, they are the analogues of special holonomy manifolds in E_d(d) × ℝ^+ generalised geometry. In establishing this result, we introduce the Kosmann-Dorfman bracket, a generalisation of Kosmann's Lie derivative of spinors. This allows us to write down the internal sector of the Killing superalgebra, which takes a rather simple form and whose closure is the key step in proving the main result. In addition, we find that the eleven-dimensional Killing superalgebra of these backgrounds is necessarily the supertranslational part of the N-extended super-Poincaré algebra.

  12. Exceptional generalised geometry for massive IIA and consistent reductions

    Energy Technology Data Exchange (ETDEWEB)

    Cassani, Davide; Felice, Oscar de; Petrini, Michela [LPTHE, Sorbonne Universités UPMC Paris 06, CNRS,4 place Jussieu, F-75005, Paris (France); Strickland-Constable, Charles [Institut de physique théorique, Université Paris Saclay, CEA, CNRS,Orme des Merisiers, F-91191 Gif-sur-Yvette (France); Institut des Hautes Études Scientifiques, Le Bois-Marie,35 route de Chartres, F-91440 Bures-sur-Yvette (France); Waldram, Daniel [Department of Physics, Imperial College London,Prince Consort Road, London, SW7 2AZ (United Kingdom)

    2016-08-10

    We develop an exceptional generalised geometry formalism for massive type IIA supergravity. In particular, we construct a deformation of the generalised Lie derivative, which generates the type IIA gauge transformations as modified by the Romans mass. We apply this new framework to consistent Kaluza-Klein reductions preserving maximal supersymmetry. We find a generalised parallelisation of the exceptional tangent bundle on S^6, and from this reproduce the consistent truncation ansatz and embedding tensor leading to dyonically gauged ISO(7) supergravity in four dimensions. We also discuss closely related hyperboloid reductions, yielding a dyonic ISO(p, 7−p) gauging. Finally, while for vanishing Romans mass we find a generalised parallelisation on S^d, d = 4, 3, 2, leading to a maximally supersymmetric reduction with gauge group SO(d+1) (or larger), we provide evidence that an analogous reduction does not exist in the massive theory.

  13. Generalised Scherk-Schwarz reductions from gauged supergravity

    Science.gov (United States)

    Inverso, Gianluca

    2017-12-01

    A procedure is described to construct generalised Scherk-Schwarz uplifts of gauged supergravities. The internal manifold, fluxes, and consistent truncation Ansatz are all derived from the embedding tensor of the lower-dimensional theory. We first describe the procedure to construct generalised Leibniz parallelisable spaces where the vector components of the frame are embedded in the adjoint representation of the gauge group, as specified by the embedding tensor. This allows us to recover the generalised Scherk-Schwarz reductions known in the literature and to prove a no-go result for the uplift of ω-deformed SO(p, q) gauged maximal supergravities. We then extend the construction to arbitrary generalised Leibniz parallelisable spaces, which turn out to be torus fibrations over manifolds in the class above.

  14. Rational first integrals of geodesic equations and generalised hidden symmetries

    International Nuclear Information System (INIS)

    Aoki, Arata; Houri, Tsuyoshi; Tomoda, Kentaro

    2016-01-01

    We discuss novel generalisations of Killing tensors, which are introduced by considering rational first integrals of geodesic equations. We introduce the notion of inconstructible generalised Killing tensors, which cannot be constructed from ordinary Killing tensors. Moreover, we introduce inconstructible rational first integrals, which are constructed from inconstructible generalised Killing tensors, and provide a method for checking the inconstructibility of a rational first integral. Using the method, we show that the rational first integral of the Collinson–O'Donnell solution is not inconstructible. We also provide several examples of metrics admitting an inconstructible rational first integral in two and four dimensions, by using the Maciejewski–Przybylska system. Furthermore, we attempt to generalise other hidden symmetries such as Killing–Yano tensors. (paper)

  15. Towards a 'pointless' generalisation of Yang-Mills theory

    International Nuclear Information System (INIS)

    Chan Hongmo; Tsou Sheungtsun

    1989-05-01

    We examine some generalisations in physical concepts of gauge theories, leading towards a scenario corresponding to non-commutative geometry, where the concept of locality loses its usual meaning of being associated with points on a base manifold and becomes intertwined with the concept of internal symmetry, suggesting thereby a gauge theory of extended objects. Examples are given where such generalised gauge structures can be realised, in particular that of string theory. (author)

  16. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)

    2012-09-15

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radionuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. These include issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP be connected to a fast-running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)
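
    The BBN idea, mapping plant observations to probabilities over pre-calculated source terms, can be sketched with a toy two-node network. The states, prior, and likelihoods below are invented for illustration and are not taken from RASTEP.

```python
# Toy Bayesian update in the spirit of RASTEP: a hidden accident state
# determines which pre-calculated source term applies, and a plant
# observation updates the probabilities over those states.
# All states, priors and likelihoods here are invented for illustration.

PRIOR = {"intact": 0.7, "vessel_breach": 0.2, "containment_fail": 0.1}

# P(observation = "high containment dose rate" | accident state)
LIKELIHOOD = {"intact": 0.01, "vessel_breach": 0.30, "containment_fail": 0.95}

def posterior(prior, likelihood):
    """Bayes update: P(state | evidence) is proportional to
    P(evidence | state) * P(state), renormalised over all states."""
    joint = {s: prior[s] * likelihood[s] for s in prior}
    z = sum(joint.values())
    return {s: round(p / z, 3) for s, p in joint.items()}

post = posterior(PRIOR, LIKELIHOOD)
print(post)  # probability mass shifts toward containment failure
```

    A full BBN chains many such updates across plant signals, which is why signal validation (handling faulty instrument readings) matters so much in the project.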

  17. Recent advances in the source term area within the SARNET European severe accident research network

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, L.E., E-mail: luisen.herranz@ciemat.es [Centro de Investigaciones Energeticas Medio Ambientales y Tecnologica, CIEMAT, Avda. Complutense 40, E-28040 Madrid (Spain); Haste, T. [Institut de Radioprotection et de Sûreté Nucléaire, IRSN, BP 3, F-13115 St Paul lez Durance Cedex (France); Kärkelä, T. [VTT Technical Research Centre of Finland, P.O. Box 1000, FI-02044 VTT Espoo (Finland)

    2015-07-15

    Highlights: • Main achievements of source term research in SARNET are given. • Emphasis on the radiologically important iodine and ruthenium fission products. • Conclusions on FP release, transport in the RCS and containment behaviour. • Significance of large-scale integral experiments to validate the analyses used. • A thorough list of the most recent references on source term research results. - Abstract: Source term has been one of the main research areas addressed within the SARNET network during the 7th EC Framework Programme of EURATOM. The entire source term domain was split into three major areas: the oxidising impact on source term, iodine chemistry in the reactor coolant system and containment, and data and code assessment. The present paper synthesises the main technical outcomes stemming from the SARNET FWP7 project in the area of source term and includes an extensive list of references in which deeper insights into specific issues may be found. In addition, based on an analysis of the current state of the art, an outlook on future source term research is presented, in which major changes in the research environment are discussed (i.e., the end of the Phébus FP project; the end of the SARNET projects; and the launch of HORIZON 2020). Most probably, research projects will be streamlined towards release and transport under oxidising conditions, containment chemistry, existing and innovative filtered venting systems, and other topics. These will be in addition to a number of projects that have been completed or are ongoing under different national and international frameworks, such as VERDON, CHIP and EPICUR, started under the International Source Term Programme (ISTP); the OECD/CSNI programmes BIP, BIP2, STEM, THAI and THAI2; and the French national programme MIRE. The experimental PASSAM project under the 7th EC Framework Programme, focused on source term mitigation systems, is highlighted as a good example of a project addressing potential enhancement of safety systems.

  18. Development of source term evaluation method for Korean Next Generation Reactor(III)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Geon Jae; Park, Jin Baek; Lee, Yeong Il; Song, Min Cheonl; Lee, Ho Jin [Korea Advanced Institue of Science and Technology, Taejon (Korea, Republic of)

    1998-06-15

    This project investigated the irradiation characteristics of MOX fuel, methods to predict nuclide concentrations in the primary and secondary coolant for a core containing 100% MOX fuel, and the development of a source term evaluation tool. In this study, several source term prediction methods are evaluated. The detailed contents of this project are: an evaluation of models for nuclide concentration in the reactor coolant system; an evaluation of the primary and secondary coolant concentrations of a reference nuclear power plant using purely MOX fuel; and a suggested source term prediction method for an NPP with a core using MOX fuel.

  19. Source-term reevaluation for US commercial nuclear power reactors: a status report

    International Nuclear Information System (INIS)

    Herzenberg, C.L.; Ball, J.R.; Ramaswami, D.

    1984-12-01

    Only results that had been discussed publicly, had been published in the open literature, or were available in preliminary reports as of September 30, 1984, are included here. More than 20 organizations are participating in source-term programs, which have been undertaken to examine severe accident phenomena in light-water power reactors (including the chemical and physical behavior of fission products under accident conditions), update and reevaluate source terms, and resolve differences between predictions and observations of radiation releases and related phenomena. Results from these source-term activities have been documented in over 100 publications to date

  20. Source term model evaluations for the low-level waste facility performance assessment

    Energy Technology Data Exchange (ETDEWEB)

    Yim, M.S.; Su, S.I. [North Carolina State Univ., Raleigh, NC (United States)

    1995-12-31

    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.
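
    A minimal example of the kind of source term model being compared is a first-order fractional release rate with concurrent radioactive decay. The rate constant and inventory below are hypothetical, and codes such as DUST or NEFTRAN-II implement considerably more mechanisms (diffusion, dissolution, container failure), so this is only a sketch of the simplest case.

```python
import math

# Minimal first-order source term sketch: release from a waste form with
# fractional release rate k (1/yr) and radioactive decay constant lam (1/yr).
# Inventory and release rate are hypothetical, for illustration only.

def released_activity(a0, k, lam, t):
    """Cumulative activity released to the facility boundary by time t.
    Remaining inventory decays as exp(-(k + lam) t), so the release rate is
    k * a0 * exp(-(k + lam) t); integrating from 0 to t gives the result."""
    return a0 * k / (k + lam) * (1.0 - math.exp(-(k + lam) * t))

HALF_LIFE_C14 = 5730.0                 # carbon-14 half-life, years
lam = math.log(2) / HALF_LIFE_C14      # decay constant, 1/yr
a0, k = 1.0e9, 1.0e-3                  # hypothetical Bq inventory, 1/yr rate

print(f"{released_activity(a0, k, lam, 100.0):.3e} Bq released in 100 yr")
```

    Sensitivity analysis of the kind described in the abstract would then vary k (and other transport parameters) to identify which inputs dominate the predicted release.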

  1. Estimation of Source Term Behaviors in SBO Sequence in a Typical 1000MWth PWR and Comparison with Other Source Term Results

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Woon; Han, Seok Jung; Ahn, Kwang Il; Fynan, Douglas; Jung, Yong Hoon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    Since the Three Mile Island (TMI, 1979), Chernobyl (1986) and Fukushima Daiichi (2011) accidents, the assessment of radiological source term effects on the environment has been a key concern of nuclear safety. In the Fukushima Daiichi accident, a long-term SBO (station blackout) accident occurred. Using worst-case assumptions, as in the Fukushima accident, on the accident sequences and on the availability of safety systems, the thermal-hydraulic behaviour, core relocation and environmental source term behaviour are estimated for a long-term SBO accident in the OPR-1000 reactor. MELCOR code version 1.8.6 is used in this analysis. The source term results estimated in this study are compared with previous studies and with the results for the Fukushima accident estimated in the UNSCEAR-2013 report. This study estimated that 11% of the iodine and 2% of the cesium can be released to the environment. The UNSCEAR-2013 report estimated that 2-8% of the iodine and 1-3% of the cesium were released to the environment. The results are thus similar in terms of the release fractions of iodine and cesium to the environment.

  2. Accident source terms for boiling water reactors with high burnup cores.

    Energy Technology Data Exchange (ETDEWEB)

    Gauntt, Randall O.; Powers, Dana Auburn; Leonard, Mark Thomas

    2007-11-01

    The primary objective of this report is to provide the technical basis for development of recommendations for updates to the NUREG-1465 Source Term for BWRs that will extend its applicability to accidents involving high burnup (HBU) cores. However, a secondary objective is to re-examine the fundamental characteristics of the prescription for fission product release to containment described by NUREG-1465. This secondary objective is motivated by an interest in understanding the extent to which research into the release and behaviour of radionuclides under accident conditions has altered best-estimate calculations of the integral response of BWRs to severe core damage sequences and the resulting radiological source terms to containment. This report, therefore, documents specific results of fission product source term analyses that will form the basis for the HBU supplement to NUREG-1465. However, commentary is also provided on observed differences between the composite results of the source term calculations performed here and those reflected in NUREG-1465 itself.

  3. Relation between source term and emergency planning for nuclear power plants

    International Nuclear Information System (INIS)

    Shi Zhongqi; Yang Ling

    1992-01-01

    Some background information on the severe accidents and source terms related to nuclear power plant emergency planning is presented. The new source term information in NUREG-0956 and NUREG-1150, and possible changes in emergency planning requirements in the U.S.A., are briefly described. A principle is suggested for selecting source terms when establishing emergency planning policy, together with a method for determining the Emergency Planning Zone (EPZ) size in China. Based on the research results on (1) the EPZ size of PWR nuclear power plants being built in China, and (2) the impact of reactor size and selected source terms on the EPZ size, it is concluded that the suggested principle and method are suitable and feasible for PWR nuclear power plants in China

  4. Selected source term topics. Report to CSNI by an OECD/NEA Group of experts

    International Nuclear Information System (INIS)

    1987-04-01

    CSNI Report 136 summarizes the results of the work performed by the Group of Experts on the Source Term and Environmental Consequences (PWG4) during the period extending from 1983 and 1986. This report is complementary to Part 1, 'Technical Status of the Source Term' of CSNI Report 135, 'Report to CSNI on Source Term Assessment, Containment atmosphere control systems, and accident consequences'; it considers in detail a number of very specific issues thought to be important in the source term area. It consists of: an executive summary (prepared by the Chairman of the Group), a section on conclusions and recommendations, and five technical chapters (fission product chemistry in the primary circuit of a LWR during severe accidents; resuspension/re-entrainment of aerosols in LWRs following a meltdown accident; iodine chemistry under severe accident conditions; effects of combustion, steam explosions and pressurized melt ejection on fission product behaviour; radionuclide removal by pool scrubbing), a technical annex and two appendices

  5. Consideration of emergency source terms for pebble-bed high temperature gas-cooled reactor

    International Nuclear Information System (INIS)

    Tao, Liu; Jun, Zhao; Jiejuan, Tong; Jianzhu, Cao

    2009-01-01

    Being the last barrier in the nuclear power plant defense-in-depth strategy, emergency planning (EP) is an integrated project. One of the key elements in this process is emergency source term selection. Emergency source terms for light water reactor (LWR) nuclear power plants (NPPs) have been introduced in many technical documents, and advanced NPP emergency planning has been attracting attention recently. Commercial deployment of advanced NPPs is underway worldwide, and a pebble-bed high-temperature gas-cooled reactor (HTGR) power plant, regarded as representative of advanced NPPs, is under construction in China. This paper draws suggestions from our investigation: the discussion of advanced NPP EP is summarized first, then the characteristics of the pebble-bed HTGR relating to EP are described, and finally PSA insights on emergency source term selection and current pebble-bed HTGR emergency source term suggestions are proposed

  6. Reassessment of the technical bases for estimating source terms. Draft report for comment

    International Nuclear Information System (INIS)

    Silberberg, M.; Mitchell, J.A.; Meyer, R.O.; Pasedag, W.F.; Ryder, C.P.; Peabody, C.A.; Jankowski, M.W.

    1985-07-01

    NUREG-0956 describes the NRC staff and contractor efforts to reassess and update the agency's analytical procedures for estimating accident source terms for nuclear power plants. The effort included development of a new source term analytical procedure - a set of computer codes - that is intended to replace the methodology of the Reactor Safety Study (WASH-1400) and to be used in reassessing the use of TID-14844 assumptions (10 CFR 100). NUREG-0956 describes the development of these codes, the demonstration of the codes to calculate source terms for specific cases, the peer review of this work, some perspectives on the overall impact of new source terms on plant risks, the plans for related research projects, and the conclusions and recommendations resulting from the effort

  7. Source terms: an investigation of uncertainties, magnitudes, and recommendations for research. [PWR; BWR]

    Energy Technology Data Exchange (ETDEWEB)

    Levine, S.; Kaiser, G. D.; Arcieri, W. C.; Firstenberg, H.; Fulford, P. J.; Lam, P. S.; Ritzman, R. L.; Schmidt, E. R.

    1982-03-01

    The purpose of this document is to assess the state of knowledge and expert opinions that exist about fission product source terms from potential nuclear power plant accidents. This is so that recommendations can be made for research and analyses which have the potential to reduce the uncertainties in these estimated source terms and to derive improved methods for predicting their magnitudes. The main reasons for writing this report are to indicate the major uncertainties involved in defining realistic source terms that could arise from severe reactor accidents, to determine which factors would have the most significant impact on public risks and emergency planning, and to suggest research and analyses that could result in the reduction of these uncertainties. Source terms used in the conventional consequence calculations in the licensing process are not explicitly addressed.

  8. Impact of source terms on distances to which reactor accident consequences occur

    International Nuclear Information System (INIS)

    Ostmeyer, R.M.

    1982-01-01

    Estimates of the distances over which reactor accident consequences might occur are important for development of siting criteria and for emergency response planning. This paper summarizes the results of a series of CRAC2 calculations performed to estimate these distances. Because of the current controversy concerning the magnitude of source terms for severe accidents, the impact of source term reductions upon distance estimates is also examined

  9. libmpdata++ 0.1: a library of parallel MPDATA solvers for systems of generalised transport equations

    Science.gov (United States)

    Jaruga, A.; Arabas, S.; Jarecka, D.; Pawlowska, H.; Smolarkiewicz, P. K.; Waruszewski, M.

    2014-11-01

    This paper accompanies the first release of libmpdata++, a C++ library implementing the Multidimensional Positive-Definite Advection Transport Algorithm (MPDATA). The library offers basic numerical solvers for systems of generalised transport equations. The solvers are forward-in-time, conservative and non-linearly stable. The libmpdata++ library covers the basic second-order-accurate formulation of MPDATA, its third-order variant, the infinite-gauge option for variable-sign fields and a flux-corrected transport extension to guarantee non-oscillatory solutions. The library is equipped with a non-symmetric variational elliptic solver for implicit evaluation of pressure gradient terms. All solvers offer parallelisation through domain decomposition using shared-memory parallelisation. The paper describes the library programming interface, and serves as a user guide. Supported options are illustrated with benchmarks discussed in the MPDATA literature. Benchmark descriptions include code snippets as well as quantitative representations of simulation results. Examples of applications include: homogeneous transport in one, two and three dimensions in Cartesian and spherical domains; shallow-water system compared with analytical solution (originally derived for a 2-D case); and a buoyant convection problem in an incompressible Boussinesq fluid with interfacial instability. All the examples are implemented out of the library tree. Regardless of the differences in the problem dimensionality, right-hand-side terms, boundary conditions and parallelisation approach, all the examples use the same unmodified library, which is a key goal of libmpdata++ design. The design, based on the principle of separation of concerns, prioritises the user and developer productivity. The libmpdata++ library is implemented in C++, making use of the Blitz++ multi-dimensional array containers, and is released as free/libre and open-source software.
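The core MPDATA idea the library implements (a donor-cell pass followed by antidiffusive corrective passes) can be illustrated with a minimal 1-D constant-velocity sketch. This is a sketch of the algorithm's structure under simplifying assumptions (periodic domain, scalar Courant number, positive-definite field), not the libmpdata++ C++ API:

```python
import numpy as np

def upwind_flux(a, b, c):
    # Donor-cell flux for Courant number c between cell values a (left) and b (right).
    return max(c, 0.0) * a + min(c, 0.0) * b

def mpdata_step(psi, c, n_corr=1, eps=1e-15):
    """One MPDATA step on a periodic 1-D grid for Courant number c (|c| <= 1)."""
    psi = psi.copy()
    n = len(psi)
    cour = np.full(n, c)          # Courant number at the right edge of each cell
    for _ in range(1 + n_corr):   # first pass is upwind, later passes are corrective
        flx = np.array([upwind_flux(psi[i], psi[(i + 1) % n], cour[i])
                        for i in range(n)])
        psi_new = psi - (flx - np.roll(flx, 1))
        # antidiffusive pseudo-velocity built from the just-updated field
        grad = (np.roll(psi_new, -1) - psi_new) / (np.roll(psi_new, -1) + psi_new + eps)
        cour = (np.abs(cour) - cour**2) * grad
        psi = psi_new
    return psi
```

The first pass is plain first-order upwind; each corrective pass advects with a pseudo-velocity derived from the current iterate, which recovers second-order accuracy while the flux form keeps the scheme exactly conservative.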

  10. libmpdata++ 1.0: a library of parallel MPDATA solvers for systems of generalised transport equations

    Science.gov (United States)

    Jaruga, A.; Arabas, S.; Jarecka, D.; Pawlowska, H.; Smolarkiewicz, P. K.; Waruszewski, M.

    2015-04-01

    This paper accompanies the first release of libmpdata++, a C++ library implementing the multi-dimensional positive-definite advection transport algorithm (MPDATA) on regular structured grid. The library offers basic numerical solvers for systems of generalised transport equations. The solvers are forward-in-time, conservative and non-linearly stable. The libmpdata++ library covers the basic second-order-accurate formulation of MPDATA, its third-order variant, the infinite-gauge option for variable-sign fields and a flux-corrected transport extension to guarantee non-oscillatory solutions. The library is equipped with a non-symmetric variational elliptic solver for implicit evaluation of pressure gradient terms. All solvers offer parallelisation through domain decomposition using shared-memory parallelisation. The paper describes the library programming interface, and serves as a user guide. Supported options are illustrated with benchmarks discussed in the MPDATA literature. Benchmark descriptions include code snippets as well as quantitative representations of simulation results. Examples of applications include homogeneous transport in one, two and three dimensions in Cartesian and spherical domains; a shallow-water system compared with analytical solution (originally derived for a 2-D case); and a buoyant convection problem in an incompressible Boussinesq fluid with interfacial instability. All the examples are implemented out of the library tree. Regardless of the differences in the problem dimensionality, right-hand-side terms, boundary conditions and parallelisation approach, all the examples use the same unmodified library, which is a key goal of libmpdata++ design. The design, based on the principle of separation of concerns, prioritises the user and developer productivity. The libmpdata++ library is implemented in C++, making use of the Blitz++ multi-dimensional array containers, and is released as free/libre and open-source software.

  11. libmpdata++ 1.0: a library of parallel MPDATA solvers for systems of generalised transport equations

    Directory of Open Access Journals (Sweden)

    A. Jaruga

    2015-04-01

    This paper accompanies the first release of libmpdata++, a C++ library implementing the multi-dimensional positive-definite advection transport algorithm (MPDATA) on regular structured grid. The library offers basic numerical solvers for systems of generalised transport equations. The solvers are forward-in-time, conservative and non-linearly stable. The libmpdata++ library covers the basic second-order-accurate formulation of MPDATA, its third-order variant, the infinite-gauge option for variable-sign fields and a flux-corrected transport extension to guarantee non-oscillatory solutions. The library is equipped with a non-symmetric variational elliptic solver for implicit evaluation of pressure gradient terms. All solvers offer parallelisation through domain decomposition using shared-memory parallelisation. The paper describes the library programming interface, and serves as a user guide. Supported options are illustrated with benchmarks discussed in the MPDATA literature. Benchmark descriptions include code snippets as well as quantitative representations of simulation results. Examples of applications include homogeneous transport in one, two and three dimensions in Cartesian and spherical domains; a shallow-water system compared with analytical solution (originally derived for a 2-D case); and a buoyant convection problem in an incompressible Boussinesq fluid with interfacial instability. All the examples are implemented out of the library tree. Regardless of the differences in the problem dimensionality, right-hand-side terms, boundary conditions and parallelisation approach, all the examples use the same unmodified library, which is a key goal of libmpdata++ design. The design, based on the principle of separation of concerns, prioritises the user and developer productivity. The libmpdata++ library is implemented in C++, making use of the Blitz++ multi-dimensional array containers, and is released as free/libre and open-source software.

  12. Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code

    Science.gov (United States)

    Waithe, Kenrick A.

    2005-01-01

    A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparing with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well compared to the two-dimensional plate using a steady mass flow boundary condition, which was used to simulate a steady micro jet. The model was also compared to two three-dimensional flat plate cases using a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of velocity distribution were made before and after the jet and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or several steady micro jets. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.

  13. Backup Sourcing Decisions for Coping with Supply Disruptions under Long-Term Horizons

    Directory of Open Access Journals (Sweden)

    Jing Hou

    2016-01-01

    This paper studies a buyer’s inventory control problem under a long-term horizon. The buyer has one major supplier that is prone to disruption risks and one backup supplier with higher wholesale price. Two kinds of sourcing methods are available for the buyer: single sourcing with/without contingent supply and dual sourcing. In contingent sourcing, the backup supplier is capacitated and/or has yield uncertainty, whereas in dual sourcing the backup supplier has an incentive to offer output flexibility during disrupted periods. The buyer’s expected cost functions and the optimal base-stock levels using each sourcing method under a long-term horizon are obtained, respectively. The effects of three risk parameters, disruption probability, contingent capacity or uncertainty, and backup flexibility, are examined using comparative studies and numerical computations. Four sourcing methods, namely, single sourcing with contingent supply, dual sourcing, and single sourcing from either of the two suppliers, are also compared. These findings can be used as a valuable guideline for companies to select an appropriate sourcing strategy under supply disruption risks.
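The trade-off behind a base-stock policy under disruptions can be made concrete with a small Monte Carlo sketch. This is a simplified lost-sales, single-supplier variant with hypothetical parameters, not the paper's exact model:

```python
import random

def average_cost(base_stock, disrupt_p, demand=10, hold=1.0, short=9.0,
                 periods=20000, seed=0):
    """Long-run average cost per period of a base-stock policy when the single
    supplier is disrupted (delivers nothing) with probability disrupt_p."""
    rng = random.Random(seed)
    inv, total = base_stock, 0.0
    for _ in range(periods):
        if rng.random() >= disrupt_p:      # supplier available: order up to base stock
            inv = base_stock
        inv -= demand
        total += hold * max(inv, 0) + short * max(-inv, 0)
        inv = max(inv, 0)                  # unmet demand is lost
    return total / periods

# Grid search for the cheapest base-stock level at a 20% disruption probability.
best = min(range(10, 61, 10), key=lambda s: average_cost(s, disrupt_p=0.2))
```

Raising `disrupt_p` or the shortage penalty pushes the optimal base stock above the per-period demand, which is the safety-stock effect the paper quantifies analytically.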

  14. Reducing the generalised Sudoku problem to the Hamiltonian cycle problem

    Directory of Open Access Journals (Sweden)

    Michael Haythorpe

    2016-12-01

    The generalised Sudoku problem with N symbols is known to be NP-complete, and hence is equivalent to any other NP-complete problem, even for the standard restricted version where N is a perfect square. In particular, generalised Sudoku is equivalent to the, classical, Hamiltonian cycle problem. A constructive algorithm is given that reduces generalised Sudoku to the Hamiltonian cycle problem, where the resultant instance of Hamiltonian cycle problem is sparse, and has O(N^3) vertices. The Hamiltonian cycle problem instance so constructed is a directed graph, and so a (known) conversion to undirected Hamiltonian cycle problem is also provided so that it can be submitted to the best heuristics. A simple algorithm for obtaining the valid Sudoku solution from the Hamiltonian cycle is provided. Techniques to reduce the size of the resultant graph are also discussed.
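The paper's reduction targets Hamiltonian-cycle heuristics; for comparison, the generalised N×N problem (N a perfect square) can also be attacked directly with a plain backtracking sketch, exponential in the worst case, consistent with NP-completeness:

```python
import math

def solve_sudoku(grid):
    """Backtracking solver for a generalised N x N Sudoku, N a perfect square.
    Empty cells are 0; fills grid in place and returns True if solvable."""
    n = len(grid)
    b = int(math.isqrt(n))            # side length of a box
    def ok(r, c, v):
        if any(grid[r][j] == v for j in range(n)): return False
        if any(grid[i][c] == v for i in range(n)): return False
        br, bc = b * (r // b), b * (c // b)
        return all(grid[br + i][bc + j] != v for i in range(b) for j in range(b))
    for r in range(n):
        for c in range(n):
            if grid[r][c] == 0:
                for v in range(1, n + 1):
                    if ok(r, c, v):
                        grid[r][c] = v
                        if solve_sudoku(grid):
                            return True
                        grid[r][c] = 0
                return False          # no symbol fits this cell
    return True                       # no empty cell left
```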

  15. Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K. [Lloyd's Register Consulting AB, Sundbyberg (Sweden)

    2013-10-15

    The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radio-nuclides). The output is a set of possible source terms with associated probabilities. One major issue concerns the integration of probabilistic and deterministic analyses: the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method where experts' beliefs can be included in a systematic way when defining the conditional probability tables (CPTs) in the BBN. The proposed method includes expert judgement in a systematic way when defining the CPTs of a BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)
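The BBN mechanics behind such a tool can be sketched with a three-node toy network evaluated by full enumeration. The structure and all probabilities below are illustrative assumptions, not RASTEP data:

```python
# Toy three-node BBN: CoreDamage -> ContainmentFailure -> LargeRelease.
# All probabilities are hypothetical, for illustration only.
p_cd = 0.01                               # P(core damage)
p_cf_given = {True: 0.1, False: 0.001}    # P(containment failure | core damage)
p_lr_given = {True: 0.9, False: 0.01}     # P(large release | containment failure)

def joint(cd, cf, lr):
    """Joint probability of one assignment of the three Boolean nodes."""
    p = p_cd if cd else 1 - p_cd
    p *= p_cf_given[cd] if cf else 1 - p_cf_given[cd]
    p *= p_lr_given[cf] if lr else 1 - p_lr_given[cf]
    return p

def posterior_cd_given_lr():
    """P(core damage | large release observed), by full enumeration."""
    num = sum(joint(True, cf, True) for cf in (True, False))
    den = sum(joint(cd, cf, True) for cd in (True, False) for cf in (True, False))
    return num / den

post = posterior_cd_given_lr()
```

In RASTEP the same conditional-probability machinery runs over a far larger network, with the CPT entries elicited from experts as the abstract describes.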

  16. Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report

    International Nuclear Information System (INIS)

    Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K.

    2013-10-01

    The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radio-nuclides). The output is a set of possible source terms with associated probabilities. One major issue concerns the integration of probabilistic and deterministic analyses: the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method where experts' beliefs can be included in a systematic way when defining the conditional probability tables (CPTs) in the BBN. The proposed method includes expert judgement in a systematic way when defining the CPTs of a BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)

  17. Source term determination from subcritical multiplication measurements at Koral-1 reactor

    International Nuclear Information System (INIS)

    Blazquez, J.B.; Barrado, J.M.

    1978-01-01

    By using an AmBe neutron source, two independent procedures have been established for the zero-power experimental fast reactor Coral-1 in order to measure the source term which appears in the point kinetics equations. In the first, the source term is measured when the reactor is just critical with the source, taking advantage of the wide range of the linear approach to critical for Coral-1. In the second, the measurement is made in the subcritical state by making use of the previously calibrated control rods. Several applications are also included, such as the measurement of the detector dead time, the determination of the reactivity of small samples, and the shape of the neutron importance of the source. (author)
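In the steady subcritical state the point kinetics balance reduces to 0 = (ρ/Λ)n + S, so the source term follows directly from the measured neutron level and a known (negative) reactivity. A minimal sketch with illustrative numbers (not Coral-1 data):

```python
# Steady-state point kinetics with an external source: 0 = (rho/Lambda)*n + S,
# hence S = -rho * n / Lambda. All numbers below are illustrative only.
def source_term(rho, n, Lambda):
    """Effective source term S (neutrons/s) from the steady subcritical
    neutron level n, reactivity rho < 0 and generation time Lambda (s)."""
    if rho >= 0:
        raise ValueError("a steady subcritical state requires rho < 0")
    return -rho * n / Lambda

S = source_term(rho=-0.005, n=2.0e6, Lambda=1.0e-4)  # -> 1.0e8
```

Inverting the same relation, n = SΛ/(−ρ), is the subcritical multiplication formula used when the calibrated rods supply ρ and the detector supplies n.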

  18. Thoracic involvement in generalised lymphatic anomaly (or lymphangiomatosis

    Directory of Open Access Journals (Sweden)

    Francesca Luisi

    2016-06-01

    Full Text Available Generalised lymphatic anomaly (GLA, also known as lymphangiomatosis, is a rare disease caused by congenital abnormalities of lymphatic development. It usually presents in childhood but can also be diagnosed in adults. GLA encompasses a wide spectrum of clinical manifestations ranging from single-organ involvement to generalised disease. Given the rarity of the disease, most of the information regarding it comes from case reports. To date, no clinical trials concerning treatment are available. This review focuses on thoracic GLA and summarises possible diagnostic and therapeutic approaches.

  19. The oculocerebral syndrome in association with generalised ...

    African Journals Online (AJOL)

    The pregnancy with the proband was uncomplicated. Delivery was at term and was normal; birth weight was 2500 g. There was physiological jaundice requiring no therapy. Psychomotor development was markedly delayed. She sat unaided at 9 months and walked at 2 years. Single words were uttered at around 6 years ...

  20. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard

    2008-03-15

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment

  1. Review of radionuclide source terms used for performance-assessment analyses

    International Nuclear Information System (INIS)

    Barnard, R.W.

    1993-06-01

    Two aspects of the radionuclide source terms used for total-system performance assessment (TSPA) analyses have been reviewed. First, a detailed radionuclide inventory (i.e., one in which the reactor type, decay, and burnup are specified) is compared with the standard source-term inventory used in prior analyses. The latter assumes a fixed ratio of pressurized-water reactor (PWR) to boiling-water reactor (BWR) spent fuel, at specific amounts of burnup and at 10-year decay. TSPA analyses have been used to compare the simplified source term with the detailed one. The TSPA-91 analyses did not show a significant difference between the source terms. Second, the radionuclides used in source terms for TSPA aqueous-transport analyses have been reviewed to select ones that are representative of the entire inventory. It is recommended that two actinide decay chains be included (the 4n+2 "uranium" and 4n+3 "actinium" decay series), since these include several radionuclides that have potentially important release and dose characteristics. In addition, several fission products are recommended for the same reason. The choice of radionuclides should be influenced by other parameter assumptions, such as the solubility and retardation of the radionuclides
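Tracking such decay chains comes down to the Bateman equations. A two-member sketch (hypothetical decay constants, parent → daughter → stable, assuming distinct decay constants) illustrates the inventory bookkeeping:

```python
import math

def bateman2(n1_0, lam1, lam2, t):
    """Atoms of parent and daughter at time t for the chain 1 -> 2 -> (stable),
    starting from n1_0 parent atoms and no daughter (Bateman solution).
    Assumes lam1 != lam2; decay constants in 1/time-unit, t in the same unit."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2
```

Longer chains such as the 4n+2 and 4n+3 series repeat the same pattern with one exponential term per ancestor, which is why inventory codes evaluate them recursively.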

  2. Outlook of Source Term Research: A Critical View from the Latest Results

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, L.E.; Haste, T.; Kärkelä, T.

    2015-07-01

    Research on source terms has been ongoing for several years all over the world. This paper briefly synthesizes the main recent outcomes of source term research. It highlights remaining knowledge gaps and discusses ways to proceed to address those considered high priority. Fission product release under oxidizing conditions and/or from fuel configurations other than rod-like geometries is highlighted, particularly for MOX fuel. Transport through the circuit needs further research on high-temperature chemistry and on processes resulting in a late in-containment source term, like revaporization. Needs regarding containment issues seem to be well addressed in forthcoming projects under the OECD framework (i.e., BIP3, THAI3 and STEM2), and the only open issue left out would be pool scrubbing, which is being partially tackled within the EC-PASSAM project. A similar situation to the containment exists in the source term mitigation area, in which the PASSAM and the French national MIRE projects are underway. Aside from further knowledge-driven research, there is a consensus on the need to assess the source term predictability of current system codes and to build up an international experimental platform that contributes to keeping the current research capability. (Author)

  3. Long-term Lightcurves of M31 X-ray Sources

    Science.gov (United States)

    Kong, A. K. H.; Garcia, M. R.; Di Stefano, R.; Murray, S. S.; Primini, F. A.

    2001-09-01

    M31 has been monitored by a Chandra GTO program, which used both the HRC and ACIS from 1999 to 2001, and a GO program, which used ACIS-S during 2000-2001. We report here the lightcurves of X-ray sources in M31 during the past two years. X-ray sources in M31, as in our Galaxy, exhibit variability on long timescales (days to months). Some sources also show spectral state transitions, analogous to the soft/hard states of Galactic sources (like Cyg X-1 and GX 339-4). We present a statistical overview and also focus on some particularly interesting classes of sources. Luminous (> 1038 ergs s-1) X-ray sources in M31 are often associated with globular clusters (Di Stefano et al. 2001); some of these luminous sources are variables, and the lightcurves provide clues to the nature of the compact object. We present data on a few intriguing globular cluster sources. We also present the spectral and time evolution of the X-ray transient near the nucleus (CXOGMP J004242.0+411608; Garcia et al. 2000), which has finally turned off after more than one year in outburst. We compare the long-term X-ray lightcurves of M31 sources to those of Galactic sources (obtained by RXTE/ASM). Such a comparison allows us to probe the nature of M31 X-ray sources.

  4. Finite Element Solutions for the Space Fractional Diffusion Equation with a Nonlinear Source Term

    Directory of Open Access Journals (Sweden)

    Y. J. Choi

    2012-01-01

    We consider finite element Galerkin solutions for the space fractional diffusion equation with a nonlinear source term. Existence, stability, and order of convergence of approximate solutions for the backward Euler fully discrete scheme have been discussed, as well as for the semidiscrete scheme. The analytical convergence orders are obtained as O(k + h^γ̃), where γ̃ is a constant depending on the order of the fractional derivative. Numerical computations are presented, which confirm the theoretical results when the equation has a linear source term. When the equation has a nonlinear source term, numerical results show that the diffusivity depends on the order of the fractional derivative, as we expect.
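The paper uses a Galerkin FEM; the discretisation idea can be conveyed with a simpler finite-difference analogue: a shifted Grünwald-Letnikov approximation of the space-fractional operator, backward Euler in time, and the (possibly nonlinear) source term lagged explicitly. Everything below is a sketch under those assumptions, not the authors' scheme:

```python
import numpy as np

def gl_weights(alpha, m):
    """First m shifted Gruenwald-Letnikov weights for order alpha."""
    g = np.empty(m)
    g[0] = 1.0
    for k in range(1, m):
        g[k] = g[k - 1] * (k - 1 - alpha) / k
    return g

def frac_diffusion_be(u0, alpha, d, h, dt, steps, source=lambda u: 0.0 * u):
    """Backward Euler / shifted Gruenwald-Letnikov scheme for
    u_t = d * D^alpha_x u + f(u), zero Dirichlet boundaries, 1 < alpha < 2.
    The source term f is treated explicitly (lagged one time step)."""
    n = len(u0)
    g = gl_weights(alpha, n + 1)
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(0, min(n, i + 2)):     # lower triangle plus one superdiagonal
            A[i, j] = d / h**alpha * g[i - j + 1]
    M = np.eye(n) - dt * A                    # implicit (backward Euler) operator
    u = u0.astype(float).copy()
    for _ in range(steps):
        u = np.linalg.solve(M, u + dt * source(u))
    return u
```

The one-sided weight stencil reflects the non-local, non-symmetric character of the space-fractional derivative, which is also why the paper's convergence order depends on the fractional order through γ̃.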

  5. Review of the accident source terms for aluminide fuel: Application to the BR2 reactor

    International Nuclear Information System (INIS)

    Joppen, F.

    2005-01-01

    A major safety review of the BR2, a materials test reactor, is to be conducted in 2006. One of the subjects selected for the safety review is the definition of source terms for emergency planning, and in particular the development of accident scenarios. For nuclear power plants, the behaviour of fuel under accident conditions is a well-studied subject. In the case of non-power reactors this basic knowledge is rather scarce. The usefulness of information from power plant fuels is limited due to the differences in fuel type, power level and thermohydraulic conditions. Initial investigation indicates that using data from power plant fuel leads to an overestimation of the source terms. Further research on this subject could be very useful for the research reactor community, in order to define more realistic source terms and to improve emergency preparedness. (author)

  6. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, D.; Brunett, A.; Passerini, S.; Grelle, A.; Bucknor, M.

    2017-06-26

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  7. ITER safety task NID-5a: ITER tritium environmental source terms - safety analysis basis

    International Nuclear Information System (INIS)

    Natalizio, A.; Kalyanam, K.M.

    1994-09-01

    The Canadian Fusion Fuels Technology Project (CFFTP) contributes to ITER task NID-5a, Initial Tritium Source Term. This safety analysis basis constitutes the first part of the work for establishing tritium source terms and is intended to solicit comments and obtain agreement. The analysis objective is to provide an early estimate of tritium environmental source terms for the events to be analyzed. Events that would result in the loss of tritium are: a Loss of Coolant Accident (LOCA), a vacuum vessel boundary breach, a torus exhaust line failure, a fuelling machine process boundary failure, a fuel processing system process boundary failure, a water detritiation system process boundary failure and an isotope separation system process boundary failure. 9 figs

  8. A study of numerical methods for hyperbolic conservation laws with stiff source terms

    Science.gov (United States)

    Leveque, R. J.; Yee, H. C.

    1988-01-01

    The proper modeling of nonequilibrium gas dynamics is required in certain regimes of hypersonic flow. For inviscid flow this gives a system of conservation laws coupled with source terms representing the chemistry. Often a wide range of time scales is present in the problem, leading to numerical difficulties as in stiff systems of ordinary differential equations. Stability can be achieved by using implicit methods, but other numerical difficulties are observed. The behavior of typical numerical methods on a simple advection equation with a parameter-dependent source term was studied. Two approaches to incorporate the source term were utilized: MacCormack type predictor-corrector methods with flux limiters, and splitting methods in which the fluid dynamics and chemistry are handled in separate steps. Various comparisons over a wide range of parameter values were made. In the stiff case where the solution contains discontinuities, incorrect numerical propagation speeds are observed with all of the methods considered. This phenomenon is studied and explained.
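    The incorrect propagation speed described in this record can be reproduced in a few lines. The sketch below is an illustrative reconstruction (not the authors' code, and with made-up parameters): it advects a step profile with first-order upwind, then applies the infinitely stiff limit of a LeVeque–Yee-type source term, in which each cell relaxes to the nearest stable equilibrium. The split scheme then moves the front exactly one cell per step, regardless of the true advection speed.

```python
# Model problem: u_t + u_x = -mu * u (u - 1)(u - 1/2), in the stiff limit mu -> infinity.
# Fractional-step (splitting) scheme: upwind advection, then the source step,
# which in the stiff limit snaps each cell to the nearest stable equilibrium (0 or 1).

def advect_upwind(u, c):
    """One first-order upwind step for u_t + u_x = 0, Courant number c <= 1, inflow u = 1."""
    return [1.0] + [u[i] - c * (u[i] - u[i - 1]) for i in range(1, len(u))]

def stiff_source(u):
    """mu -> infinity limit of the reaction step: project each cell onto an equilibrium."""
    return [1.0 if v > 0.5 else 0.0 for v in u]

nx, c, steps = 100, 0.8, 40
u = [1.0 if i < 20 else 0.0 for i in range(nx)]   # step profile, front at cell 20

for _ in range(steps):
    u = stiff_source(advect_upwind(u, c))

front = next(i for i, v in enumerate(u) if v < 0.5)
# The exact front would sit near cell 20 + 0.8 * 40 = 52, but the projection step
# promotes the smeared value (0.8 > 0.5) to 1 every step, so the numerical front
# travels at one cell per step and lands at cell 60: the spurious propagation speed.
```

The numerical dissipation of the advection step creates intermediate (nonequilibrium) values, and the stiff source step then rounds them the wrong way, which is exactly the mechanism the abstract identifies.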

  9. A note on a generalisation of Weyl's theory of gravitation

    International Nuclear Information System (INIS)

    Dereli, T.; Tucker, R.W.

    1982-01-01

    A scale-invariant gravitational theory due to Bach and Weyl is generalised by the inclusion of space-time torsion. The difference between the arbitrary and zero torsion constrained variations of the Weyl action is elucidated. Conformal rescaling properties of the gravitational fields are discussed. A new class of classical solutions with torsion is presented. (author)

  10. A ten-year histopathological study of generalised lymphadenopathy ...

    African Journals Online (AJOL)

    This study was undertaken to examine the histopathology of generalised lymphadenopathy in India, as well as the demographics of the study population. Method: This study was conducted over a period of 10 years (August 1997-July 2007), of which eight years were retrospective, from August 1997-July 2005, and two years ...

  11. A ten-year histopathological study of generalised lymphadenopathy ...

    African Journals Online (AJOL)

    2010-07-31

    non-Hodgkin's lymphoma, and 18 cases (7.37%) were metastatic malignancy. Conclusion: In this study, the most common cause of generalised lymphadenopathy was granulomatous lymphadenitis, followed by reactive lymphadenitis. Among the neoplastic lesions, metastatic malignancy accounted for ...

  12. Gait analysis of adults with generalised joint hypermobility

    DEFF Research Database (Denmark)

    Simonsen, Erik B; Tegner, Heidi; Alkjær, Tine

    2012-01-01

    BACKGROUND: The majority of adults with Generalised Joint Hypermobility experience symptoms such as pain and joint instability, which is likely to influence their gait pattern. Accordingly, the purpose of the present project was to perform a biomechanical gait analysis on a group of patients...

  13. Generalisation of language and knowledge models for corpus analysis

    OpenAIRE

    Loss, Anton

    2012-01-01

    This paper takes a new look at language and knowledge modelling for corpus linguistics. Using ideas of Chaitin, a line of argument is made against language/knowledge separation in Natural Language Processing. A simplistic model that generalises approaches to language and knowledge is proposed. One hypothetical consequence of this model is Strong AI.

  14. Travelling wave solutions of (2 + 1)-dimensional generalised time ...

    Indian Academy of Sciences (India)

    Youwei Zhang

    2018-02-09

    Keywords. Time-fractional Hirota equation; fractional complex transform; complete discrimination system; tanh-expansion; travelling wave. PACS Nos 02.30.Jr; 05.45.Yv; 04.20.Jb. 1. Introduction. We consider the solution of the (2 + 1)-dimensional generalised time-fractional Hirota equation: i∂_t^α u + u_{xy} ...

  15. Equilibrium points in the generalised photogravitational non-planar ...

    African Journals Online (AJOL)

    We generalised the photogravitational non-planar restricted three body problem by considering the smaller primary as an oblate spheroid. With both the primaries radiating, we located the equilibrium points which lie outside the orbital plane, contrary to the classical case. Besides finding the equations of motion of the ...

  16. Page 1 Compactification of generalised Jacobians 425 Next we ...

    Indian Academy of Sciences (India)

    Compactification of generalised Jacobians 425. Next we study the infinitesimal deformation of torsion-free sheaves. Let X be a projective integral Gorenstein curve (a curve X as in Proposition III.1.7, above, is easily seen to be Gorenstein). Let F be a torsion-free coherent O_X-Module and F., an infinitesimal deformation of F ...

  17. Adapting Metacognitive Therapy to Children with Generalised Anxiety Disorder

    DEFF Research Database (Denmark)

    Esbjørn, Barbara Hoff; Normann, Nicoline; Reinholdt-Dunne, Marie Louise

    2015-01-01

    -c) with generalised anxiety disorder (GAD) and create suggestions for an adapted manual. The adaptation was based on the structure and techniques used in MCT for adults with GAD. However, the developmental limitations of children were taken into account. For instance, therapy was aided with worksheets, practical...

  18. Young Indigenous Students en Route to Generalising Growing Patterns

    Science.gov (United States)

    Miller, Jodie

    2016-01-01

    This paper presents a hypothesised learning trajectory for a Year 3 Indigenous student en route to generalising growing patterns. The trajectory emerged from data collected across a teaching experiment (students n = 18; including a pre-test and three 45-minute mathematics lessons) and clinical interviews (n = 3). A case study of one student is…

  19. Generalised time functions and finiteness of the Lorentzian distance

    OpenAIRE

    Rennie, Adam; Whale, Ben E.

    2014-01-01

    We show that finiteness of the Lorentzian distance is equivalent to the existence of generalised time functions with gradient uniformly bounded away from light cones. To derive this result we introduce new techniques to construct and manipulate achronal sets. As a consequence of these techniques we obtain a functional description of the Lorentzian distance extending the work of Franco and Moretti.

  20. The long-term problems of contaminated land: Sources, impacts and countermeasures

    Energy Technology Data Exchange (ETDEWEB)

    Baes, C.F. III

    1986-11-01

    This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed "broken arrows").

  1. Use of source term code package in the ELEBRA MX-850 system

    International Nuclear Information System (INIS)

    Guimaraes, A.C.F.; Goes, A.G.A.

    1988-12-01

    The implementation of the source term code package in the ELEBRA MX-850 system is presented. The source term is formed when radioactive materials generated in the nuclear fuel leak toward the containment and from the containment to the external environment. The version implemented in the ELEBRA system is composed of five codes: MARCH 3, TRAPMELT 3, THCCA, VANESA and NAVA. The original example case was used: a small-break LOCA in a PWR-type reactor. A sensitivity study for the TRAPMELT 3 code was carried out, modifying the TIME STEP to estimate the CPU processing time for executing the original example case. (M.C.K.) [pt

  2. The long-term problems of contaminated land: Sources, impacts and countermeasures

    International Nuclear Information System (INIS)

    Baes, C.F. III.

    1986-11-01

    This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed "broken arrows")

  3. A study of numerical methods for hyperbolic conservation laws with stiff source terms

    Science.gov (United States)

    Leveque, R. J.; Yee, H. C.

    1990-01-01

    In the present study of the behavior of typical numerical methods in the case of a model advection equation having a parameter-dependent source term, two approaches to the incorporation of the source terms are used: MacCormack-type predictor-corrector methods with flux limiters, and splitting methods in which the fluid dynamics and chemistry are handled in separate steps. The latter are found to perform slightly better. The model scalar equation is used to show that the incorrectness of the propagation speeds of discontinuities observed in the stiff case is due to the introduction of nonequilibrium values through numerical dissipation in the advection step.

  4. Generalisability of an online randomised controlled trial: an empirical analysis.

    Science.gov (United States)

    Wang, Cheng; Mollan, Katie R; Hudgens, Michael G; Tucker, Joseph D; Zheng, Heping; Tang, Weiming; Ling, Li

    2018-02-01

    Investigators increasingly use online methods to recruit participants for randomised controlled trials (RCTs). However, the extent to which participants recruited online represent populations of interest is unknown. We evaluated how generalisable an online RCT sample is to men who have sex with men in China. Inverse probability of sampling weights (IPSW) and the G-formula were used to examine the generalisability of an online RCT using model-based approaches. Online RCT data and national cross-sectional study data from China were analysed to illustrate the process of quantitatively assessing generalisability. The RCT (identifier NCT02248558) randomly assigned participants to a crowdsourced or health marketing video for promotion of HIV testing. The primary outcome was self-reported HIV testing within 4 weeks, with a non-inferiority margin of -3%. In the original online RCT analysis, the estimated difference in proportions of HIV tested between the two arms (crowdsourcing and health marketing) was 2.1% (95% CI, -5.4% to 9.7%). The hypothesis that the crowdsourced video was not inferior to the health marketing video to promote HIV testing was not demonstrated. The IPSW and G-formula estimated differences were -2.6% (95% CI, -14.2 to 8.9) and 2.7% (95% CI, -10.7 to 16.2), with both approaches also not establishing non-inferiority. Conducting generalisability analysis of an online RCT is feasible. Examining the generalisability of online RCTs is an important step before an intervention is scaled up. NCT02248558. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
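    The inverse probability of sampling weights (IPSW) idea used in this trial analysis can be illustrated with a toy calculation (all numbers below are made up, not taken from the study): each trial participant is weighted by the inverse of their probability of having been sampled into the trial, so the reweighted trial estimate matches the composition of the target population.

```python
# Toy IPSW: the outcome depends on a binary covariate; the online trial
# over-represents the covariate = 1 group relative to the target population.
population = [(0, 0.2)] * 70 + [(1, 0.6)] * 30   # (covariate, outcome) pairs; pop mean = 0.32
trial      = [(0, 0.2)] * 30 + [(1, 0.6)] * 70   # over-samples covariate = 1; naive mean = 0.48

# Sampling probability of group g is proportional to (trial share / population share),
# so the inverse-probability weight is (population share / trial share), up to a constant.
w = {0: 70 / 30, 1: 30 / 70}

naive = sum(y for _, y in trial) / len(trial)
ipsw = sum(w[g] * y for g, y in trial) / sum(w[g] for g, _ in trial)
# naive is biased toward the over-sampled group; ipsw recovers the population mean.
```

In the actual study the sampling probabilities were estimated by a model rather than known, but the reweighting step is the same.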

  5. Formulation of a generalised switching CFAR with application to X-band maritime surveillance radar.

    Science.gov (United States)

    Weinberg, Graham V

    2015-01-01

    A generalisation of a switching based detector is examined, allowing the construction of such detectors for target detection in any clutter model of interest. Such detectors are important in radar signal processing because they are robust solutions to the management of interference. Although formulated in general terms, the theory is applied to the design of a switching constant false alarm rate detector for X-band maritime surveillance radar. It is shown that such a detector manages the problem of interference better than standard detection processes.

  6. FURTHER GENERALISATIONS OF THE KUMMER-SCHWARZ EQUATION: ALGEBRAIC AND SINGULARITY PROPERTIES

    Directory of Open Access Journals (Sweden)

    R Sinuvasan

    2017-12-01

    The Kummer–Schwarz equation, 2y'y''' − 3(y'')^2 = 0, has a generalisation, (n − 1)y^(n−2)y^(n) − n(y^(n−1))^2 = 0, which shares many properties with the parent form in terms of symmetry and singularity. All equations of the class are integrable in closed form. Here we introduce a new class, (n + q − 2)y^(n−2)y^(n) − (n + q − 1)(y^(n−1))^2 = 0, which has different integrability and singularity properties.
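    As a quick sanity check of the parent (n = 3) member, recall the classical fact that Möbius functions have vanishing Schwarzian derivative, so y = 1/(x + 1) should satisfy 2y'y''' − 3(y'')^2 = 0. The sketch below (my check, not from the paper) verifies this with closed-form derivatives.

```python
# Closed-form derivatives of y = 1/(x + 1):
def d1(x): return -1.0 / (x + 1) ** 2
def d2(x): return  2.0 / (x + 1) ** 3
def d3(x): return -6.0 / (x + 1) ** 4

def residual(x):
    """Left-hand side of the Kummer-Schwarz equation 2 y' y''' - 3 (y'')^2."""
    return 2.0 * d1(x) * d3(x) - 3.0 * d2(x) ** 2

# Both terms equal 12/(x+1)^6, so the residual vanishes (up to rounding).
max_residual = max(abs(residual(x)) for x in (0.0, 0.5, 1.7, 3.0))
```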

  7. Using Reactive Transport Modeling to Evaluate the Source Term at Yucca Mountain

    Energy Technology Data Exchange (ETDEWEB)

    Y. Chen

    2001-12-19

    The conventional approach of source-term evaluation for performance assessment of nuclear waste repositories uses speciation-solubility modeling tools and assumes pure phases of radioelements control their solubility. This assumption may not reflect reality, as most radioelements (except for U) may not form their own pure phases. As a result, solubility limits predicted using the conventional approach are several orders of magnitude higher than the concentrations of radioelements measured in spent fuel dissolution experiments. This paper presents the author's attempt to use a non-conventional approach to evaluate the source term of radionuclide release for Yucca Mountain. Based on the general reactive-transport code AREST-CT, a model for spent fuel dissolution and secondary phase precipitation has been constructed. The model accounts for both equilibrium and kinetic reactions. Its predictions have been compared against laboratory experiments and natural analogues. It is found that without calibrations, the simulated results match laboratory and field observations very well in many aspects. More important is the fact that no contradictions between them have been found. This provides confidence in the predictive power of the model. Based on the concept of Np incorporated into uranyl minerals, the model not only predicts a lower Np source-term than that given by conventional Np solubility models, but also produces results which are consistent with laboratory measurements and observations. Moreover, two hypotheses, whether Np enters tertiary uranyl minerals or not, have been tested by comparing model predictions against laboratory observations; the results favor the former. It is concluded that this non-conventional approach to source term evaluation not only eliminates over-conservatism in the conventional solubility approach to some extent, but also gives a realistic representation of the system of interest, which is a prerequisite for truly understanding the long-term

  8. Evaluation of Long-term Performance of Enhanced Anaerobic Source Zone Bioremediation using mass flux

    Science.gov (United States)

    Haluska, A.; Cho, J.; Hatzinger, P.; Annable, M. D.

    2017-12-01

    Chlorinated ethene DNAPL source zones in groundwater act as potential long-term sources of contamination as they dissolve, yielding concentrations well above MCLs and posing an on-going public health risk. Enhanced bioremediation has been applied to treat many source zones with significant promise, but the long-term sustainability of this technology has not been thoroughly assessed. This study evaluated the long-term effectiveness of enhanced anaerobic source zone bioremediation at chloroethene-contaminated sites to determine if the treatment prevented contaminant rebound and removed NAPL from the source zone. Long-term performance was evaluated based on achieving MCL-based contaminant mass fluxes in parent compound concentrations during different monitoring periods. Groundwater concentration-versus-time data were compiled for six sites, and post-remedial contaminant mass flux data were then measured using passive flux meters at wells both within and down-gradient of the source zone. Post-remedial mass flux data were then combined with pre-remedial water quality data to estimate pre-remedial mass flux. This information was used to characterize a DNAPL dissolution source strength function, such as the Power Law Model and the Equilibrium Streamtube Model. The six sites characterized for this study were (1) Former Charleston Air Force Base, Charleston, SC; (2) Dover Air Force Base, Dover, DE; (3) Treasure Island Naval Station, San Francisco, CA; (4) Former Raritan Arsenal, Edison, NJ; (5) Naval Air Station, Jacksonville, FL; and (6) Former Naval Air Station, Alameda, CA. Contaminant mass fluxes decreased for all the sites by the end of the post-treatment monitoring period and rebound was limited within the source zone. Post-remedial source strength function estimates suggest that decreases in contaminant mass flux will continue to occur at these sites, but a mass flux based on MCL levels may never be exceeded. Thus, site clean-up goals should be evaluated as order
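    The Power Law source-strength model named in this record relates the flux-averaged source concentration to the remaining DNAPL mass, C/C0 = (M/M0)^Γ. A minimal depletion sketch (all parameter values hypothetical and in arbitrary units, chosen only to show the monotonic flux decline):

```python
# Power-law DNAPL source-depletion model.
C0, M0, gamma = 10.0, 1000.0, 1.0   # initial concentration, initial mass, power-law exponent
Q, dt = 5.0, 1.0                    # water flux through the source zone, time step

M, history = M0, []
for _ in range(200):
    C = C0 * (M / M0) ** gamma      # source strength: C/C0 = (M/M0)**gamma
    M = max(M - Q * C * dt, 0.0)    # mass balance: dissolution removes Q*C*dt of mass
    history.append(C)
# With gamma = 1 the decline is exponential; gamma > 1 front-loads the flux reduction,
# gamma < 1 sustains higher flux until the mass is nearly exhausted.
```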

  9. Short-Term Memory Stages in Sign vs. Speech: The Source of the Serial Span Discrepancy

    Science.gov (United States)

    Hall, Matthew L.; Bavelier, Daphne

    2011-01-01

    Speakers generally outperform signers when asked to recall a list of unrelated verbal items. This phenomenon is well established, but its source has remained unclear. In this study, we evaluate the relative contribution of the three main processing stages of short-term memory--perception, encoding, and recall--in this effect. The present study…

  10. Radioiodine source term and its potential impact on the use of potassium iodide

    International Nuclear Information System (INIS)

    Malinauskas, A.P.

    1982-01-01

    Information is presented concerning chemical forms of fission product iodine in the primary circuit; chemical forms of fission product iodine in the containment building; summary of iodine chemistry in light water reactor accidents; and impact of the radioiodine source term on the potassium iodide issue

  11. Model description for calculating the source term of the Angra 1 environmental control system

    International Nuclear Information System (INIS)

    Oliveira, L.F.S. de; Amaral Neto, J.D.; Salles, M.R.

    1988-01-01

    This work presents the model used for evaluation of source term released from Angra 1 Nuclear Power Plant in case of an accident. After that, an application of the model for the case of a Fuel Assembly Drop Accident Inside the Fuel Handling Building during reactor refueling is presented. (author) [pt

  12. Added Value of uncertainty Estimates of SOurce term and Meteorology (AVESOME)

    DEFF Research Database (Denmark)

    Sørensen, Jens Havskov; Schönfeldt, Fredrik; Sigg, Robert

    In the early phase of a nuclear accident, two large sources of uncertainty exist: one related to the source term and one associated with the meteorological data. Operational methods are being developed in AVESOME for quantitative estimation of uncertainties in atmospheric dispersion prediction … resulting from uncertainties in assessments of both the release of radionuclides from the accident and their dispersion. Previously, due to lack of computer power, such methods could not be applied to operational real-time decision support. However, with modern supercomputing facilities, available e… uncertainty in atmospheric dispersion model forecasting stemming from both the source term and the meteorological data is examined. Ways to implement the uncertainties of forecasting in DSSs, and the impacts on real-time emergency management, are described. The proposed methodology allows for efficient real…

  13. Severe accident source term characteristics for selected Peach Bottom sequences predicted by the MELCOR Code

    Energy Technology Data Exchange (ETDEWEB)

    Carbajo, J.J. [Oak Ridge National Lab., TN (United States)

    1993-09-01

    The purpose of this report is to compare in-containment source terms developed for NUREG-1159, which used the Source Term Code Package (STCP), with those generated by MELCOR to identify significant differences. For this comparison, two short-term depressurized station blackout sequences (with a dry cavity and with a flooded cavity) and a Loss-of-Coolant Accident (LOCA) concurrent with complete loss of the Emergency Core Cooling System (ECCS) were analyzed for the Peach Bottom Atomic Power Station (a BWR-4 with a Mark I containment). The results indicate that for the sequences analyzed, the two codes predict similar total in-containment release fractions for each of the element groups. However, the MELCOR/CORBH Package predicts significantly longer times for vessel failure and reduced energy of the released material for the station blackout sequences (when compared to the STCP results). MELCOR also calculated smaller releases into the environment than STCP for the station blackout sequences.

  14. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term - Trial Calculation

    International Nuclear Information System (INIS)

    Grabaskas, David

    2016-01-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena with regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  15. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Bucknor, Matthew [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Jerden, James [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Brunett, Acacia J. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Denman, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Engineering Division; Clark, Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Engineering Division; Denning, Richard S. [Consultant, Columbus, OH (United States)

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena with regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  16. Generalising the logistic map through the q-product

    International Nuclear Information System (INIS)

    Pessoa, R W S; Borges, E P

    2011-01-01

    We investigate a generalisation of the logistic map as x_{n+1} = 1 − a x_n ⊗_{q_map} x_n (−1 ≤ x_n ≤ 1, 0 < a ≤ 2), where ⊗_q denotes the q-product and the usual logistic map is recovered as q_map → 1. The generalisation of this (and other) algebraic operators has been widely used within the nonextensive statistical mechanics context (see C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, NY, 2009). We focus the analysis on q_map > 1 at the edge of chaos, particularly at the first critical point a_c, which depends on the value of q_map. Bifurcation diagrams, sensitivity to initial conditions, fractal dimension and rate of entropy growth are evaluated at a_c(q_map), and connections with nonextensive statistical mechanics are explored.
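    The q-product underlying this generalisation is the deformed product of nonextensive statistics, x ⊗_q y = [x^(1−q) + y^(1−q) − 1]^(1/(1−q)) for positive arguments, with the conventional cutoff to 0 when the bracket is non-positive; it recovers the ordinary product as q → 1. A minimal sketch of the operator (not the authors' code):

```python
def q_product(x, y, q):
    """Tsallis q-product for x, y > 0; the ordinary product is recovered as q -> 1."""
    if q == 1.0:
        return x * y
    base = x ** (1.0 - q) + y ** (1.0 - q) - 1.0
    if base <= 0.0:
        return 0.0                          # cutoff convention for a negative bracket
    return base ** (1.0 / (1.0 - q))

# q close to 1 approximates the ordinary product, e.g. q_product(2.0, 3.0, 1.001) ~ 6;
# the generalised map then iterates x_{n+1} = 1 - a * (x_n (x)_q x_n) on [-1, 1]
# (the paper's handling of negative x_n is not reproduced here).
```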

  17. Object recognition and generalisation during habituation in horses

    DEFF Research Database (Denmark)

    Christensen, Janne Winther; Zharkikh, Tjatjana; Chovaux, Elodie

    2011-01-01

    The ability of horses to habituate to frightening stimuli greatly increases safety in the horse–human relationship. A recent experiment suggested, however, that habituation to frightening visual stimuli is relatively stimulus-specific in horses and that shape and colour are important factors for object generalisation (Christensen et al., 2008). In a series of experiments, we aimed to further explore the ability of horses (n = 30, 1 and 2-year-old mares) to recognise and generalise between objects during habituation. TEST horses (n = 15) were habituated to a complex object, composed of five simple objects of varying shape and colour, whereas CONTROL horses (n = 15) were habituated to the test arena, but not to the complex object. In the first experiment, we investigated whether TEST horses subsequently reacted less to i) simple objects that were previously part of the complex object (i…

  18. Inventory and source term evaluation of Russian nuclear power plants for marine applications

    International Nuclear Information System (INIS)

    Reistad, O.; Oelgaard, P.L.

    2006-04-01

    This report discusses inventory and source term properties in regard to operation and possible releases due to accidents from Russian marine reactor systems. The first part of the report discusses relevant accidents on the basis of both Russian and western sources. The overview shows that certain vessels were much more accident-prone than others; in addition, there has been a noteworthy reduction in accidents over the last two decades. However, in recent years new types of incidents, such as collisions, have occurred more frequently. The second part of the study considers in detail the most important factors for the source term: reactor operational characteristics and the radionuclide inventory. While the Russian icebreakers have been operated on a similar basis to commercial power plants, the submarines have different power cyclograms, which result in considerably lower values for fission product inventory. Theoretical values for radionuclide inventory are compared with computed results using the modelling tool HELIOS. Regarding the inventory of transuranic elements, the results of the calculations are discussed in detail for selected vessels. Criticality accidents, loss-of-cooling accidents and sinking accidents are considered, based on actual experience with these types of accident and on theoretical considerations, and source terms for these accidents are discussed in the last chapter. (au)

  19. Inventory and source term evaluation of Russian nuclear power plants for marine applications

    Energy Technology Data Exchange (ETDEWEB)

    Reistad, O. [Norwegian Radiation Protection Authority (Norway); Oelgaard, P.L. [Risoe National Lab. (Denmark)

    2006-04-15

    This report discusses inventory and source term properties in regard to operation and possible releases due to accidents from Russian marine reactor systems. The first part of the report discusses relevant accidents on the basis of both Russian and western sources. The overview shows that certain vessels were much more accident-prone than others; in addition, there has been a noteworthy reduction in accidents over the last two decades. However, in recent years new types of incidents, such as collisions, have occurred more frequently. The second part of the study considers in detail the most important factors for the source term: reactor operational characteristics and the radionuclide inventory. While the Russian icebreakers have been operated on a similar basis to commercial power plants, the submarines have different power cyclograms, which result in considerably lower values for fission product inventory. Theoretical values for radionuclide inventory are compared with computed results using the modelling tool HELIOS. Regarding the inventory of transuranic elements, the results of the calculations are discussed in detail for selected vessels. Criticality accidents, loss-of-cooling accidents and sinking accidents are considered, based on actual experience with these types of accident and on theoretical considerations, and source terms for these accidents are discussed in the last chapter. (au)

  20. Learning and Generalisation in Neural Networks with Local Preprocessing

    OpenAIRE

    Kutsia, Merab

    2007-01-01

    We study the learning and generalisation ability of a specific two-layer feed-forward neural network and compare its properties to those of a simple perceptron. The input patterns are mapped nonlinearly onto a hidden layer, much larger than the input layer, and this mapping is either fixed or may result from an unsupervised learning process. Such preprocessing of initially uncorrelated random patterns results in correlated patterns in the hidden layer. The hidden-to-output mapping of the net...
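    The architecture described, a fixed nonlinear random expansion into a large hidden layer followed by a trained output unit, can be sketched as follows. This is a toy reconstruction under stated assumptions (±1 random projections with a sign nonlinearity, classical perceptron learning on the hidden layer, and teacher-generated labels so the hidden-layer task is guaranteed separable); it is not the network studied in the thesis.

```python
import random

random.seed(0)
N, K, P = 10, 51, 20          # input size, hidden size (K >> N, odd for a nonzero margin), patterns

# Fixed random preprocessing: hidden unit j computes sign(J[j] . x).
J = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(K)]

def preprocess(x):
    return [1 if sum(Jj[i] * x[i] for i in range(N)) >= 0 else -1 for Jj in J]

def predict(w, h):
    return 1 if sum(wi * hi for wi, hi in zip(w, h)) >= 0 else -1

# Random input patterns; labels come from a teacher acting on the hidden layer,
# so a separating hidden-to-output weight vector exists by construction.
patterns = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(P)]
hidden = [preprocess(x) for x in patterns]
teacher = [random.choice((-1, 1)) for _ in range(K)]
labels = [1 if sum(t * h for t, h in zip(teacher, hj)) >= 0 else -1 for hj in hidden]

# Classical perceptron learning on the hidden representation (convergence is
# guaranteed for separable data, so the loop always reaches zero errors).
w = [0.0] * K
for _ in range(5000):
    errors = 0
    for hj, y in zip(hidden, labels):
        if predict(w, hj) != y:
            w = [wi + y * hi for wi, hi in zip(w, hj)]
            errors += 1
    if errors == 0:
        break

train_errors = sum(1 for hj, y in zip(hidden, labels) if predict(w, hj) != y)
```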

  1. Rare case of generalised aggressive periodontitis in the primary dentition

    OpenAIRE

    Spoerri, A; Signorelli, C; Erb, J; van Waes, H; Schmidlin, P R

    2014-01-01

    BACKGROUND Generalised aggressive periodontitis (AP) in the prepubescent age is an exceptionally rare disease in the primary dentition of otherwise healthy children. Characteristics of AP are gingival inflammation, deep periodontal pockets, bone loss, tooth mobility and even tooth loss. The most common way of treating this disease is the extraction of all the involved primary teeth. CASE REPORT A 4-year-old girl presented with signs of severe gingival inflammation. Clinical examination rev...

  2. Generalisation of geographic information cartographic modelling and applications

    CERN Document Server

    Mackaness, William A; Sarjakoski, L Tiina

    2011-01-01

    Theoretical and Applied Solutions in Multi-Scale Mapping. Users have come to expect instant access to up-to-date geographical information, with global coverage--presented at widely varying levels of detail, as digital and paper products; customisable data that can readily be combined with other geographic information. These requirements present an immense challenge to those supporting the delivery of such services (National Mapping Agencies (NMA), Government Departments, and private business). Generalisation of Geographic Information: Cartographic Modelling and Applications provides detailed review

  3. Ex vivo determined experimental correction factor for the ultrasonic source term in the bioheat equation.

    Science.gov (United States)

    Cortela, Guillermo A; Pereira, Wagner C A; Negreira, Carlos A

    2018-01-01

    The objective of this work is to propose an effective absorption coefficient (α_effec) as an empirical correction factor in the source term of the bioheat equation. The temperature rise in biological tissue due to ultrasound insonification is produced by energy absorption. Usually, the ultrasonic absorption coefficient (α_A) is used as a source term in the bioheat equation to quantify the temperature rise, and the effect of scattering is disregarded. The coefficient α_effec includes the scattering contribution as an additional absorption term and should allow us to make a better estimation of the thermal dose (TD), which is important for clinical applications. We simulated the bioheat equation with the source term considering α_A or α_effec, and with heating provided by therapeutic ultrasound (1 MHz, 2.0 W cm⁻²) for about 5.5 min (temperature range 36-46 °C). Experimental data were obtained in similar heating conditions for a bovine muscle tissue (ex vivo) and temperature curves were measured at depths of 7, 30, 35, 40 and 45 mm. The TD values from the experimental temperature curves at each depth were compared with the numerical solution of the bioheat equation with the classical and corrected source terms. The highest percentage difference between simulated and experimental TD was 42.5% when assuming the classical α_A, and 8.7% for the corrected α_effec. The results show that the effective absorption coefficient is a feasible parameter to improve the classical bioheat transfer model, especially for depths larger than the mean free propagation path. Copyright © 2017 Elsevier B.V. All rights reserved.
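    The correction described above amounts to swapping the absorption coefficient used in the heating source term of the bioheat equation. A minimal 1-D explicit finite-difference sketch (all tissue parameters, coefficient values, and the plane-wave source form are illustrative assumptions, not the paper's values) shows how a larger effective coefficient raises the predicted near-field temperature:

    ```python
    import numpy as np

    def bioheat_1d(alpha, I0=2.0e4, n=100, L=0.05, dt=0.05, t_end=60.0):
        """Explicit FD solution of a simplified 1-D bioheat equation.

        Source term q(x) = 2*alpha*I0*exp(-2*alpha*x): ultrasound heating
        with intensity attenuated by the absorption coefficient alpha [Np/m].
        Perfusion is neglected and boundaries are held at 36 deg C.
        """
        rho, c = 1050.0, 3600.0      # tissue density [kg/m^3], specific heat [J/(kg K)]
        k = 0.5                      # thermal conductivity [W/(m K)]
        dx = L / (n - 1)
        x = np.linspace(0.0, L, n)
        T = np.full(n, 36.0)         # initial tissue temperature [deg C]
        q = 2.0 * alpha * I0 * np.exp(-2.0 * alpha * x)  # absorbed power density [W/m^3]
        for _ in range(int(t_end / dt)):
            lap = np.zeros(n)
            lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
            T[1:-1] += dt * (k * lap[1:-1] + q[1:-1]) / (rho * c)
        return x, T

    # A higher effective absorption coefficient gives a larger temperature rise
    _, T_classic = bioheat_1d(alpha=4.0)    # classical absorption coefficient
    _, T_effec = bioheat_1d(alpha=6.0)      # effective coefficient (adds scattering)
    ```

    The thermal dose would then be accumulated from these temperature curves; the point of the sketch is only that the source term scales directly with the chosen coefficient.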

  4. A Generalised Approach to Petri Nets and Algebraic Specifications

    International Nuclear Information System (INIS)

    Sivertsen, Terje

    1998-02-01

    The present report represents a continuation of the work on Petri nets and algebraic specifications. The reported research has focused on generalising the approach introduced in HWR-454, with the aim of facilitating the translation of a wider class of Petri nets into algebraic specifications. This includes autonomous Petri nets with increased descriptive power, as well as non-autonomous Petri nets allowing the modelling of systems (1) involving extensive data processing; (2) with transitions synchronized on external events; (3) whose evolutions are time dependent. The generalised approach has the important property of being modular in the sense that the translated specifications can be gradually extended to include data processing, synchronization, and timing. The report also discusses the relative merits of state-based and transition-based specifications, and includes a non-trivial case study involving automated proofs of a large number of interrelated theorems. The examples in the report illustrate the use of the new HRP Prover. Of particular importance in this context is the automatic transformation between state-based and transition-based specifications. It is expected that the approach introduced in HWR-454 and generalised in the present report will prove useful in future work on the combination of a wide variety of specification techniques

  5. A well-balanced scheme for Ten-Moment Gaussian closure equations with source term

    Science.gov (United States)

    Meena, Asha Kumari; Kumar, Harish

    2018-02-01

    In this article, we consider the Ten-Moment equations with source term, which occurs in many applications related to plasma flows. We present a well-balanced second-order finite volume scheme. The scheme is well-balanced for general equation of state, provided we can write the hydrostatic solution as a function of the space variables. This is achieved by combining hydrostatic reconstruction with contact preserving, consistent numerical flux, and appropriate source discretization. Several numerical experiments are presented to demonstrate the well-balanced property and resulting accuracy of the proposed scheme.

  6. Diffusion-dispersion limits for multidimensional scalar conservation laws with source terms

    Science.gov (United States)

    Kwon, Young-Sam

    In this paper we consider conservation laws with diffusion and dispersion terms. We study the convergence of approximate solutions to conservation laws with source terms. The proof is based on Hwang and Tzavaras's new approach [Seok Hwang, Athanasios E. Tzavaras, Kinetic decomposition of approximate solutions to conservation laws: Application to relaxation and diffusion-dispersion approximations, Comm. Partial Differential Equations 27 (5-6) (2002) 1229-1254] and the kinetic formulation developed by Lions, Perthame, and Tadmor [P.-L. Lions, B. Perthame, E. Tadmor, A kinetic formulation of multidimensional scalar conservation laws and related equations, J. Amer. Math. Soc. 7 (1) (1994) 169-191].

  7. Overview of waste isolation safety assessment program and description of source term characterization task at PNL

    International Nuclear Information System (INIS)

    Bradley, D.

    1977-01-01

    A project is being conducted to develop and illustrate the methods and obtain the data necessary to assess the safety of long-term disposal of high-level radioactive waste in geologic formations. The methods and data will initially focus on generic geologic isolation systems but will ultimately be applied to the long-term safety assessment of specific candidate sites that are selected in the NWTS Program. The activities of the Waste Isolation Safety Assessment Program (WISAP) are divided into six tasks: (1) Safety Assessment Concepts and Methods, (2) Disruptive Event Analysis, (3) Source Characterization, (4) Transport Modeling, (5) Transport Data, and (6) Societal Acceptance

  8. Final report of the inter institutional project ININ-CNSNS 'Source Terms specific for the CNLV'

    International Nuclear Information System (INIS)

    Anaya M, R.A.

    1991-02-01

    The purpose of the inter-institutional project ININ-CNSNS 'Source Terms Specific for the CNLV' is to implement the 'Source Term Code Package' (STCP) on the CYBER computer (CDC 180-830) of the ININ, to perform the corresponding operation tests using the data of the sample problem, and finally to release the package once analysis of the results shows this to be appropriate. This report presents the results of the simulation of the sequences 'Loss of external power' (station blackout) and 'Total loss of AC power with failure of the RCIC and success of the HPCS', both with data from the Laguna Verde plant. (Author)

  9. Source terms for analysis of accidents at a high level waste repository

    International Nuclear Information System (INIS)

    Mubayi, V.; Davis, R.E.; Youngblood, R.

    1989-01-01

    This paper describes an approach to identifying source terms from possible accidents during the preclosure phase of a high-level nuclear waste repository. A review of the literature on repository safety analyses indicated that source term estimation is in a preliminary stage, largely based on judgement-based scoping analyses. The approach developed here was to partition the accident space into domains defined by certain threshold values of temperature and impact energy density which may arise in potential accidents and specify release fractions of various radionuclides, present in the waste form, in each domain. Along with a more quantitative understanding of accident phenomenology, this approach should help in achieving a clearer perspective on scenarios important to preclosure safety assessments of geologic repositories. 18 refs., 3 tabs

  10. The SSI TOOLBOX Source Term Model SOSIM - Screening for important radionuclides and parameter sensitivity analysis

    International Nuclear Information System (INIS)

    Avila Moreno, R.; Barrdahl, R.; Haegg, C.

    1995-05-01

    The main objective of the present study was to carry out a screening and a sensitivity analysis of the SSI TOOLBOX source term model SOSIM. This model is a part of the SSI TOOLBOX for radiological impact assessment of the Swedish disposal concept for high-level waste, KBS-3. The outputs of interest for this purpose were: the total released fraction, the time of total release, the time and value of maximum release rate, and the dose rates after direct releases to the biosphere. The source term equations were derived, and simple equations and methods were proposed for their calculation. A literature survey was performed in order to determine a characteristic variation range and a nominal value for each model parameter. In order to reduce the model uncertainties the authors recommend a change in the initial boundary condition for solution of the diffusion equation for highly soluble nuclides. 13 refs

  11. Progress on source term evaluation of accidental events in the experimental fusion installation ITER

    Energy Technology Data Exchange (ETDEWEB)

    Virot, F.; Barrachin, M.; Vola, D.

    2015-10-15

    Highlights: • Progress of the IRSN R&D activities related to the safety assessment of the ITER installation. • Simulation of an accidental scenario with the ASTEC code: loss of coolant in port cell and in vacuum vessel. • Location and chemical speciation of beryllium dusts and tritium. - Abstract: The French “Institut de Radioprotection et de Sûreté Nucléaire” (IRSN), in support of the French nuclear safety authority, performs the safety analyses of the ITER experimental installation. We present progress in the R&D activities related to a better evaluation of the source term in the event of an accident in this installation. These improvements are illustrated by an evaluation of the source term of a LOCA transient with the dedicated ASTEC code.

  12. New Source Term Model for the RESRAD-OFFSITE Code Version 3

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States); Gnanapragasam, Emmanuel [Argonne National Lab. (ANL), Argonne, IL (United States); Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Shih-Yew [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-06-01

    This report documents the new source term model developed and implemented in Version 3 of the RESRAD-OFFSITE code. This new source term model includes: (1) "first order release with transport" option, in which the release of the radionuclide is proportional to the inventory in the primary contamination and the user-specified leach rate is the proportionality constant, (2) "equilibrium desorption release" option, in which the user specifies the distribution coefficient which quantifies the partitioning of the radionuclide between the solid and aqueous phases, and (3) "uniform release" option, in which the radionuclides are released from a constant fraction of the initially contaminated material during each time interval and the user specifies the duration over which the radionuclides are released.
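    The first and third release options above can be sketched in a few lines. The functions below are simplified stand-ins for illustration, not the RESRAD-OFFSITE implementation, and the leach rate, release duration, and decay constant are arbitrary example values:

    ```python
    import numpy as np

    def first_order_release(q0, lam_leach, lam_decay, t):
        """First-order release: rate proportional to remaining inventory.

        The inventory is depleted by both leaching and radioactive decay,
        so Q(t) = q0 * exp(-(lam_leach + lam_decay) * t) and the release
        rate is R(t) = lam_leach * Q(t).
        """
        q = q0 * np.exp(-(lam_leach + lam_decay) * t)
        return lam_leach * q

    def uniform_release(q0, duration, lam_decay, t):
        """Uniform release: a constant fraction of the initially
        contaminated material is released per unit time over `duration`,
        corrected for decay of the radionuclide."""
        rate = np.where(t <= duration, q0 / duration, 0.0)
        return rate * np.exp(-lam_decay * t)

    t = np.linspace(0.0, 100.0, 201)              # time grid [yr]
    r1 = first_order_release(1.0, 0.05, 0.01, t)  # unit inventory, leach rate 0.05/yr
    r2 = uniform_release(1.0, 50.0, 0.01, t)      # released uniformly over 50 yr
    ```

    The "equilibrium desorption" option would instead derive the aqueous concentration from the distribution coefficient Kd at each time step.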

  13. Bayesian source term estimation of atmospheric releases in urban areas using LES approach.

    Science.gov (United States)

    Xue, Fei; Kikumoto, Hideki; Li, Xiaofeng; Ooka, Ryozo

    2018-05-05

    The estimation of source information from limited measurements of a sensor network is a challenging inverse problem, which can be viewed as an assimilation process of the observed concentration data and the predicted concentration data. When dealing with releases in built-up areas, the predicted data are generally obtained by the Reynolds-averaged Navier-Stokes (RANS) equations, which yields building-resolving results; however, RANS-based models are outperformed by large-eddy simulation (LES) in the predictions of both airflow and dispersion. Therefore, it is important to explore the possibility of improving the estimation of the source parameters by using the LES approach. In this paper, a novel source term estimation method is proposed based on LES approach using Bayesian inference. The source-receptor relationship is obtained by solving the adjoint equations constructed using the time-averaged flow field simulated by the LES approach based on the gradient diffusion hypothesis. A wind tunnel experiment with a constant point source downwind of a single building model is used to evaluate the performance of the proposed method, which is compared with that of the existing method using a RANS model. The results show that the proposed method reduces the errors of source location and releasing strength by 77% and 28%, respectively. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Analytical source term optimization for radioactive releases with approximate knowledge of nuclide ratios

    Science.gov (United States)

    Hofman, Radek; Seibert, Petra; Kovalets, Ivan; Andronopoulos, Spyros

    2015-04-01

    We are concerned with source term retrieval in the case of an accident at a nuclear power plant with off-site consequences. The goal is to optimize atmospheric dispersion model inputs using inverse modeling of gamma dose rate measurements (instantaneous or time-integrated). These are the most abundant type of measurements provided by various radiation monitoring networks across Europe and available continuously in near-real time. Usually, the source term of an accidental release comprises a mixture of nuclides. Unfortunately, gamma dose rate measurements do not provide direct information on the source term composition; however, physical properties of the respective nuclides (deposition properties, decay half-life) can yield some insight. In the method presented, we assume that nuclide ratios are known at least approximately, e.g. from nuclide-specific observations or from reactor inventory and assumptions on the accident type. The source term can be in multiple phases, each being characterized by constant nuclide ratios. The method is an extension of a well-established source term inversion approach based on the optimization of an objective function (minimization of a cost function). This function has two quadratic terms: the mismatch between model and measurements weighted by an observation error covariance matrix, and the deviation of the solution from a first guess weighted by the first-guess error covariance matrix. For simplicity, both error covariance matrices are approximated as diagonal. Analytical minimization of the cost function leads to a linear system of equations. Possible negative parts of the solution are iteratively removed by means of first-guess error variance reduction. Nuclide ratios enter the problem in the form of additional linear equations, where the deviations from prescribed ratios are weighted by factors; the corresponding error variance allows us to control how strongly we want to impose the prescribed ratios. This introduces some freedom into the
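    The cost-function structure described in this abstract (observation mismatch plus first-guess deviation, both with diagonal covariances) can be minimised analytically. The sketch below uses an invented two-segment source observed through a made-up three-sensor source-receptor matrix; the variance-reduction loop for negative components follows the idea stated above, with an arbitrary shrink factor:

    ```python
    import numpy as np

    def analytic_source_inversion(M, y, xb, R_var, B_var):
        """Minimise J(x) = (Mx-y)^T R^-1 (Mx-y) + (x-xb)^T B^-1 (x-xb).

        With diagonal covariances, grad J = 0 gives the linear system
        (M^T R^-1 M + B^-1) x = M^T R^-1 y + B^-1 xb.  Negative components
        are removed iteratively by shrinking their first-guess error
        variance, pulling them back toward the (non-negative) first guess.
        """
        Rinv = np.diag(1.0 / R_var)
        B_var = B_var.astype(float).copy()
        for _ in range(50):
            Binv = np.diag(1.0 / B_var)
            A = M.T @ Rinv @ M + Binv
            b = M.T @ Rinv @ y + Binv @ xb
            x = np.linalg.solve(A, b)
            neg = x < 0.0
            if not neg.any():
                return x
            B_var[neg] *= 0.1        # tighten the first guess where x < 0
        return np.clip(x, 0.0, None)

    # Two release segments seen by three sensors (all numbers illustrative)
    M = np.array([[1.0, 0.2], [0.5, 1.0], [0.1, 0.8]])
    x_true = np.array([2.0, 0.5])
    y = M @ x_true                   # perfect synthetic observations
    x = analytic_source_inversion(M, y, xb=np.ones(2),
                                  R_var=np.full(3, 1e-4), B_var=np.ones(2))
    ```

    With accurate observations (small observation error variance) the retrieved x is pulled toward the true source despite the uninformative first guess.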

  15. The influence of damping and source terms on solutions of nonlinear wave equations

    Directory of Open Access Journals (Sweden)

    Mohammad A. Rammaha

    2007-11-01

    Full Text Available We discuss in this paper some recent development in the study of nonlinear wave equations. In particular, we focus on those results that deal with wave equations that feature two competing forces. One force is a damping term and the other is a strong source. Our central interest here is to analyze the influence of these forces on the long-time behavior of solutions.

  16. Source-term model for the SYVAC3-NSURE performance assessment code

    International Nuclear Information System (INIS)

    Rowat, J.H.; Rattan, D.S.; Dolinar, G.M.

    1996-11-01

    Radionuclide contaminants in wastes emplaced in disposal facilities will not remain in those facilities indefinitely. Engineered barriers will eventually degrade, allowing radioactivity to escape from the vault. The radionuclide release rate from a low-level radioactive waste (LLRW) disposal facility, the source term, is a key component in the performance assessment of the disposal system. This report describes the source-term model that has been implemented in Ver. 1.03 of the SYVAC3-NSURE (Systems Variability Analysis Code generation 3-Near Surface Repository) code. NSURE is a performance assessment code that evaluates the impact of near-surface disposal of LLRW through the groundwater pathway. The source-term model described here was developed for the Intrusion Resistant Underground Structure (IRUS) disposal facility, which is a vault that is to be located in the unsaturated overburden at AECL's Chalk River Laboratories. The processes included in the vault model are roof and waste package performance, and diffusion, advection and sorption of radionuclides in the vault backfill. The model presented here was developed for the IRUS vault; however, it is applicable to other near-surface disposal facilities. (author). 40 refs., 6 figs
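    Sorption in the vault backfill is one of the processes named above that chiefly slows radionuclide escape in models of this kind. A generic sketch of the linear-sorption retardation calculation (parameter names and values are illustrative, not those of SYVAC3-NSURE):

    ```python
    def retarded_velocity(v, rho_b, theta, kd):
        """Radionuclide migration velocity through porous backfill.

        Linear sorption with distribution coefficient Kd [m^3/kg] gives
        the retardation factor R = 1 + rho_b * Kd / theta, where rho_b is
        the bulk density [kg/m^3] and theta the water-filled porosity.
        The nuclide moves at the pore-water velocity divided by R.
        """
        R = 1.0 + rho_b * kd / theta
        return v / R

    # A sorbing nuclide (Kd = 0.01 m^3/kg) moves roughly 60x slower than
    # the water in this illustrative backfill
    v_water = 0.1                                    # pore velocity [m/yr]
    v_nuc = retarded_velocity(v_water, rho_b=1600.0, theta=0.27, kd=0.01)
    ```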

  17. Back-calculation of source terms by hybrid genetic algorithm in nuclear power plant accident

    International Nuclear Information System (INIS)

    Ning Shasha; Kuai Linping

    2012-01-01

    To address the issue of nuclear accident consequence assessment and source term inversion, which is of common concern at home and abroad, a hybrid genetic algorithm combined with a puff model was used to back-calculate the source terms, including the release rate and the location. Comparison of the genetic algorithm-Nelder Mead (GA-NM) method with the genetic algorithm-pattern search (GA-PS), genetic algorithm (GA) and Nelder Mead (NM) methods shows that the GA-NM method not only combines the advantages of the GA and NM methods, but also compensates for the shortcomings of the two algorithms. The inverted value can exactly match the expected one. The dispersion model module, GA module and NM module can be combined straightforwardly, and the code used to combine them is very simple, so the GA-NM method is widely applicable. As the calculation of the GA and NM modules is not costly, the GA-NM method can be used for rapid estimation of nuclear power plant source terms. (authors)
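    A hybrid of this kind can be sketched with a toy forward model: a crude GA stage narrows the search globally, then Nelder-Mead refines the best individual locally. The Gaussian "puff" below, the sensor layout, and all GA settings are invented for illustration; the paper's puff model and tuning will differ:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def gaussian_puff(q, xs, ys, sx, sy, sigma=200.0):
        """Toy ground-level concentration from a source of strength q at (xs, ys)."""
        r2 = (sx - xs) ** 2 + (sy - ys) ** 2
        return q * np.exp(-r2 / (2.0 * sigma ** 2))

    rng = np.random.default_rng(0)
    true_q, true_x, true_y = 5.0, 300.0, -150.0      # hidden source parameters
    sx, sy = np.meshgrid(np.linspace(-1000, 1000, 5), np.linspace(-1000, 1000, 5))
    obs = gaussian_puff(true_q, true_x, true_y, sx, sy)  # synthetic sensor data

    def cost(p):
        return float(np.sum((gaussian_puff(p[0], p[1], p[2], sx, sy) - obs) ** 2))

    # GA stage (sketch): elitist selection, uniform crossover, Gaussian mutation
    pop = rng.uniform([0.0, -1000.0, -1000.0], [10.0, 1000.0, 1000.0], size=(60, 3))
    for _ in range(40):
        fitness = np.array([cost(p) for p in pop])
        elite = pop[np.argsort(fitness)[:20]]         # keep the fittest third
        parents = elite[rng.integers(0, 20, size=(60, 2))]
        mask = rng.random((60, 3)) < 0.5              # uniform crossover
        pop = np.where(mask, parents[:, 0], parents[:, 1])
        pop += rng.normal(0.0, [0.2, 20.0, 20.0], size=(60, 3))  # mutation
    best = min(pop, key=cost)

    # NM stage: local refinement starting from the best GA individual
    res = minimize(cost, best, method="Nelder-Mead")
    ```

    The division of labour is the point: the GA is robust to the multimodal cost surface, while Nelder-Mead converges quickly once inside the right basin.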

  18. The Chernobyl reactor accident source term: development of a consensus view

    International Nuclear Information System (INIS)

    Devell, L.; Guntay, S.; Powers, D.A.

    1995-11-01

    Ten years after the reactor accident at Chernobyl, a great deal more data is available concerning the events, phenomena, and processes that took place. The purpose of this document is to examine what is known about the radioactive materials released during the accident, a task that is substantially more difficult than it might first appear to be. The Chernobyl station, like other nuclear power plants, was not instrumented to characterize a disastrous accident. The accident was peculiar in the sense that radioactive materials were released, at least initially, in an exceptionally energetic plume and were transported far from the reactor site. Release of radioactivity from the plant continued for several days. Characterization of the contamination caused by the releases of radioactivity has had a much lower priority than remediation of the contamination. Consequently, an assessment of the Chernobyl accident source term must rely to a significant extent on inferential evidence. The assessment presented here begins with an examination of the core inventories of radioactive materials. In subsequent sections of the report, the magnitude and timing of the releases of radioactivity are described. Then, the composition, chemical forms, and physical forms of the releases are discussed. A number of more recent publications and results from scientists in Russia and elsewhere have significantly improved the understanding of the Chernobyl source term. Because of the special features of the reactor design and the peculiarities of the Chernobyl accident, the source term for the Chernobyl accident is of limited applicability to the safety analysis of other types of reactors

  19. Modelling and simulation the radioactive source-term of fission products in PWR type reactors

    International Nuclear Information System (INIS)

    Porfirio, Rogilson Nazare da Silva

    1996-01-01

    The source-term is defined with the purpose of quantifying all radioactive nuclides released from a nuclear reactor in the case of accidents. Nowadays the source-term is limited to the coolant of the primary circuit of reactors and may be measured or modelled with computer codes such as the TFP code developed in this work. The calculational process is based on the linear chain technique used in the CINDER-2 code. The TFP code considers three forms of fission product release from the fuel pellet: recoil, knockout and migration. The release from the gap to the coolant fluid is determined from the ratio between the activity measured in the coolant and the calculated activity in the gap. Using the operational data of the SURRY-1 reactor, the TFP code was run to obtain the source-term of this reactor. From the measured activities the reliability of the model and the employed computational logic was verified. The accuracy of the calculated quantities, compared with the measured data, was considered satisfactory. (author)
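    The linear-chain technique mentioned (as used in CINDER-2) rests on the Bateman solution for a chain of successive decays. A compact sketch for a hypothetical two-member chain, with invented half-lives and assuming distinct decay constants:

    ```python
    import numpy as np

    def bateman_activities(n0, lam, t):
        """Activities of each member of a linear decay chain at time t.

        n0: initial atoms of the chain head; lam: decay constants [1/day]
        of each member, assumed distinct.  N_i(t) is the classic Bateman
        solution: a prefix product of the upstream decay constants times
        a sum of exponentials with pairwise-difference denominators.
        """
        lam = np.asarray(lam, dtype=float)
        n = len(lam)
        atoms = np.zeros(n)
        for i in range(n):
            prefix = np.prod(lam[:i])            # lambda_1 * ... * lambda_{i-1}
            for j in range(i + 1):
                # product over p != j (p = 1..i) of (lambda_p - lambda_j)
                denom = np.prod(np.delete(lam[:i + 1], j) - lam[j])
                atoms[i] += n0 * prefix * np.exp(-lam[j] * t) / denom
        return lam * atoms                       # activity A = lambda * N

    # Hypothetical chain: parent (T1/2 = 8 d) -> daughter (T1/2 = 2 d)
    lam = np.log(2.0) / np.array([8.0, 2.0])
    act = bateman_activities(1.0e20, lam, t=4.0)  # activities after 4 days
    ```

    A full fission-product calculation resolves the inventory into many such linear chains and sums the contributions.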

  20. Development of the source term PIRT based on findings during Fukushima Daiichi NPPs accident

    Energy Technology Data Exchange (ETDEWEB)

    Suehiro, Shoichi, E-mail: suehiro-shouichi@tepsys.co.jp [TEPCO SYSTEMS Co., 2-37-28 Eitai, Koto-Ku, Tokyo 135-0034 (Japan); Sugimoto, Jun [Kyoto University, Yoshida Sakyo, Kyoto 606-8501 (Japan); Hidaka, Akihide [Japan Atomic Energy Agency, Tokai-mura, Naka-gun, Ibaraki 319-1195 (Japan); Okada, Hidetoshi [The Institute of Applied Energy, 14-2 Nishi-Shimbashi 1-Chome, Minato-ku, Tokyo 105-0003 (Japan); Mizokami, Shinya [Tokyo Electric Power Company, 1-3 Uchisaiwai-cho 1-Chome, Chiyoda-ku, Tokyo 100-8560 (Japan); Okamoto, Koji [The University of Tokyo, 2-22 Shirakata, Tokai-mura, Ibaraki 319-1188 (Japan)

    2015-05-15

    Highlights: • We developed the source term PIRT based on findings during the Fukushima accident. • The FoM is the masses or fractions of radionuclides released into the environment. • 68 phenomena were identified as influencing to the FoM. • Radionuclide release from molten fuel had the highest score in the early phase. • MCCI, iodine chemistry, and chemical form had the highest score in the later phase. - Abstract: Research Expert Committee on Evaluation of Severe Accident of AESJ (Atomic Energy Society of Japan) has developed thermal hydraulic PIRT (Phenomena Identification and Ranking Table) and source term (ST) PIRT based on findings during the Fukushima Daiichi NPPs accident. These PIRTs aim to explore the debris distribution and the current condition in the NPPs with high accuracy and to extract higher priority from the aspect of the sophistication of the analytical technology to predict the severe accident phenomena by the analytical codes. The ST PIRT is divided into 3 phases for time domain and 9 categories for spatial domain. The 68 phenomena have been extracted and the importance from the viewpoint of the source term has been ranked through brainstorming and discussions among experts. The present paper describes the developed ST PIRT list and summarizes the high ranked phenomena in each phase.

  1. Low-level waste disposal performance assessments - Total source-term analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wilhite, E.L.

    1995-12-31

    Disposal of low-level radioactive waste at Department of Energy (DOE) facilities is regulated by DOE. DOE Order 5820.2A establishes policies, guidelines, and minimum requirements for managing radioactive waste. Requirements for disposal of low-level waste emplaced after September 1988 include providing reasonable assurance of meeting stated performance objectives by completing a radiological performance assessment. Recently, the Defense Nuclear Facilities Safety Board issued Recommendation 94-2, "Conformance with Safety Standards at Department of Energy Low-Level Nuclear Waste and Disposal Sites." One of the elements of the recommendation is that low-level waste performance assessments do not include the entire source term because low-level waste emplaced prior to September 1988, as well as other DOE sources of radioactivity in the ground, are excluded. DOE has developed and issued guidance for preliminary assessments of the impact of including the total source term in performance assessments. This paper will present issues resulting from the inclusion of all DOE sources of radioactivity in performance assessments of low-level waste disposal facilities.

  2. An appreciation of the events, models and data used for LMFBR radiological source term estimations

    International Nuclear Information System (INIS)

    Keir, D.; Clough, P.N.

    1989-01-01

    In this report, the events, models and data currently available for analysis of accident source terms in liquid metal cooled fast neutron reactors are reviewed. The types of hypothetical accidents considered are the low probability, more extreme types of severe accident, involving significant degradation of the core and which may lead to the release of radionuclides. The base case reactor design considered is a commercial scale sodium pool reactor of the CDFR type. The feasibility of an integrated calculational approach to radionuclide transport and speciation (such as is used for LWR accident analysis) is explored. It is concluded that there is no fundamental obstacle, in terms of scientific data or understanding of the phenomena involved, to such an approach. However this must be regarded as a long-term goal because of the large amount of effort still required to advance development to a stage comparable with LWR studies. Particular aspects of LMFBR severe accident phenomenology which require attention are the behaviour of radionuclides during core disruptive accident bubble formation and evolution, and during the less rapid sequences of core melt under sodium. The basic requirement for improved thermal hydraulic modelling of core, coolant and structural materials, in these and other scenarios, is highlighted as fundamental to the accuracy and realism of source term estimations. The coupling of such modelling to that of radionuclide behaviour is seen as the key to future development in this area

  3. The LLNL Nevada Test Side underground radionuclide source-term inventory

    Energy Technology Data Exchange (ETDEWEB)

    Wild, J.F.; Goishi, W.; Meadows, J.W. [and others]

    1995-03-01

    The potential for the contamination of ground water beneath the Nevada Test Site (NTS) by nuclear testing has long been recognized. The United States has conducted underground nuclear weapons testing at NTS since 1957, and a considerable amount of radioactive material has been deposited in the subsurface by this work. As a part of the U.S. Department of Energy Nevada Operations Office's Underground Test Area Operable Unit (UGTA OP), the Lawrence Livermore National Laboratory (LLNL) has compiled an inventory of radionuclides produced by underground LLNL weapons tests from 1957 through 1992. It is well known that some groundwater at NTS has been contaminated by radionuclides from weapons testing. Nearly one-third of the nuclear tests were conducted near or beneath the pre-test static water level (SWL). An important responsibility of the UGTA OP is to assess the migration potential of contaminants beneath the NTS and surrounding lands. Except for tritium (³H), which is capable of migration with water as molecular HTO, the ability of radionuclides to migrate significant distances from their source is presently thought to be very low. However, before this potential for migration can be fully assessed, the quantity of existing contaminants must be carefully estimated. The inventory of the radionuclide source term provides an upper limit on the availability of radionuclides for migration. However, an accurate assessment of risk to the public depends on more than an inventory of radionuclides remaining from underground testing. An estimate of the hydrologic source term, consisting of radionuclides dissolved in or transported by ground water, must complement the radionuclide source term.

  4. User's Manual for the SOURCE1 and SOURCE2 Computer Codes: Models for Evaluating Low-Level Radioactive Waste Disposal Facility Source Terms (Version 2.0)

    Energy Technology Data Exchange (ETDEWEB)

    Icenhour, A.S.; Tharp, M.L.

    1996-08-01

    The SOURCE1 and SOURCE2 computer codes calculate source terms (i.e. radionuclide release rates) for performance assessments of low-level radioactive waste (LLW) disposal facilities. SOURCE1 is used to simulate radionuclide releases from tumulus-type facilities. SOURCE2 is used to simulate releases from silo-, well-, well-in-silo-, and trench-type disposal facilities. The SOURCE codes (a) simulate the degradation of engineered barriers and (b) provide an estimate of the source term for LLW disposal facilities. This manual summarizes the major changes that have been effected since the codes were originally developed.

  5. Inverse modeling of the Chernobyl source term using atmospheric concentration and deposition measurements

    Directory of Open Access Journals (Sweden)

    N. Evangeliou

    2017-07-01

    Full Text Available This paper describes the results of an inverse modeling study for the determination of the source term of the radionuclides ¹³⁴Cs, ¹³⁷Cs and ¹³¹I released after the Chernobyl accident. The accident occurred on 26 April 1986 in the Former Soviet Union and released about 10¹⁹ Bq of radioactive materials that were transported as far away as the USA and Japan. Thereafter, several attempts to assess the magnitude of the emissions were made that were based on the knowledge of the core inventory and the levels of the spent fuel. More recently, when modeling tools were further developed, inverse modeling techniques were applied to the Chernobyl case for source term quantification. However, because radioactivity is a sensitive topic for the public and attracts a lot of attention, high-quality measurements, which are essential for inverse modeling, were not made available except for a few sparse activity concentration measurements far from the source and far from the main direction of the radioactive fallout. For the first time, we apply Bayesian inversion of the Chernobyl source term using not only activity concentrations but also deposition measurements from the most recent public data set. These observations refer to a data rescue attempt that started more than 10 years ago, with the final goal of providing available measurements to anyone interested. Regarding our inverse modeling results, emissions of ¹³⁴Cs were estimated to be 80 PBq, or 30-50 % higher than what was previously published. Of the released amount of ¹³⁴Cs, about 70 PBq were deposited all over Europe. Similar to ¹³⁴Cs, emissions of ¹³⁷Cs were estimated as 86 PBq, on the same order as previously reported results. Finally, ¹³¹I emissions of 1365 PBq were found, which are about 10 % less than the prior total releases. The inversion pushes the injection heights of the three radionuclides to higher altitudes (up to about 3 km) than previously assumed (≈ 2.2 km) in order

  6. Inverse modeling of the Chernobyl source term using atmospheric concentration and deposition measurements

    Science.gov (United States)

    Evangeliou, Nikolaos; Hamburger, Thomas; Cozic, Anne; Balkanski, Yves; Stohl, Andreas

    2017-07-01

This paper describes the results of an inverse modeling study for the determination of the source term of the radionuclides 134Cs, 137Cs and 131I released after the Chernobyl accident. The accident occurred on 26 April 1986 in the Former Soviet Union and released about 10^19 Bq of radioactive materials that were transported as far away as the USA and Japan. Thereafter, several attempts to assess the magnitude of the emissions were made that were based on the knowledge of the core inventory and the levels of the spent fuel. More recently, when modeling tools were further developed, inverse modeling techniques were applied to the Chernobyl case for source term quantification. However, because radioactivity is a sensitive topic for the public and attracts a lot of attention, high-quality measurements, which are essential for inverse modeling, were not made available except for a few sparse activity concentration measurements far from the source and far from the main direction of the radioactive fallout. For the first time, we apply Bayesian inversion of the Chernobyl source term using not only activity concentrations but also deposition measurements from the most recent public data set. These observations refer to a data rescue attempt that started more than 10 years ago, with a final goal to provide available measurements to anyone interested. Regarding our inverse modeling results, emissions of 134Cs were estimated to be 80 PBq, or 30-50 % higher than what was previously published. From the released amount of 134Cs, about 70 PBq were deposited all over Europe. Similar to 134Cs, emissions of 137Cs were estimated as 86 PBq, on the same order as previously reported results. Finally, 131I emissions of 1365 PBq were found, which are about 10 % less than the prior total releases. The inversion pushes the injection heights of the three radionuclides to higher altitudes (up to about 3 km) than previously assumed (≈ 2.2 km) in order to better match both concentration
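As a rough illustration of the inversion underlying such source-term estimates, the sketch below solves a regularized least-squares problem relating emissions to observations through a source-receptor sensitivity matrix. This is a toy stand-in for the paper's Bayesian method; the matrix, prior, and all numerical values are invented.

```python
import numpy as np

# Toy source-receptor sensitivity matrix M (n_obs x n_emission_periods):
# M[i, j] = modeled concentration at observation i per unit emission in
# period j. All values are illustrative, not from the study.
rng = np.random.default_rng(0)
n_obs, n_src = 50, 5
M = rng.uniform(0.0, 1.0, size=(n_obs, n_src))
x_true = np.array([10.0, 40.0, 25.0, 5.0, 0.0])    # "true" emissions (PBq)
y = M @ x_true + rng.normal(0.0, 0.5, size=n_obs)  # noisy observations

# Tikhonov-regularized least squares toward a prior estimate x_prior,
# a simple stand-in for the Bayesian posterior mode with Gaussian errors:
#   minimize ||y - M x||^2 + lam * ||x - x_prior||^2
lam = 0.1
x_prior = np.full(n_src, 15.0)
A = M.T @ M + lam * np.eye(n_src)
b = M.T @ y + lam * x_prior
x_post = np.linalg.solve(A, b)

total_release = x_post.sum()
print(x_post, total_release)
```

With dense, well-distributed observations the data term dominates the prior, which is the regime the deposition measurements are meant to provide.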

  7. ANSI N14.5 source term licensing of spent-fuel transport cask containment

    International Nuclear Information System (INIS)

    Seager, K.D.; Reardon, P.C.; James, R.J.; Foadian, H.; Rashid, Y.R.

    1993-01-01

    American National Standards Institute (ANSI) standard N14.5 states that ''compliance with package containment requirements shall be demonstrated either by determination of the radioactive contents release rate or by measurement of a tracer material leakage rate.'' The maximum permissible leakage rate from the transport cask is equal to the maximum permissible release rate divided by the time-averaged volumetric concentration of suspended radioactivity within the cask. The development of source term methodologies at Sandia National Laboratories (SNL) provides a means to determine the releasable radionuclide concentrations within spent-fuel transport casks by estimating the probability of cladding breach, quantifying the amount of radioactive material released into the cask interior from the breached fuel rods, and quantifying the amount of radioactive material within the cask due to other sources. These methodologies are implemented in the Source Term Analyses for Containment Evaluations (STACE) software. In this paper, the maximum permissible leakage rates for the normal and hypothetical accident transport conditions defined by 10 CFR 71 are estimated using STACE for a given cask design, fuel assembly, and initial conditions. These calculations are based on defensible analysis techniques that credit multiple release barriers, including the cladding and the internal cask walls
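The ANSI N14.5 relation quoted above, in which the maximum permissible leakage rate equals the maximum permissible release rate divided by the time-averaged volumetric concentration of suspended radioactivity, can be sketched directly. The numbers below are illustrative only, not from any cask safety analysis.

```python
def max_permissible_leak_rate(release_rate_bq_per_s, concentration_bq_per_cm3):
    """ANSI N14.5 relation: L_max = R_max / C.

    release_rate_bq_per_s    -- maximum permissible release rate (Bq/s)
    concentration_bq_per_cm3 -- time-averaged concentration of suspended
                                radioactivity in the cask cavity (Bq/cm^3)
    Returns the maximum permissible leakage rate (cm^3/s).
    """
    if concentration_bq_per_cm3 <= 0:
        raise ValueError("concentration must be positive")
    return release_rate_bq_per_s / concentration_bq_per_cm3

# Illustrative numbers only:
L_max = max_permissible_leak_rate(release_rate_bq_per_s=1.0e-6,
                                  concentration_bq_per_cm3=2.0e3)
print(L_max)  # ≈ 5e-10 cm^3/s
```

A methodology like STACE supplies the concentration term by estimating the releasable radionuclide inventory inside the cask.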

  8. Long-term observations of x-ray sources: The aquila-serpens-scutum region

    Energy Technology Data Exchange (ETDEWEB)

    Priedhorsky, W.C.; Terrell, J.

    1984-05-15

We present long-term (1969-1976) observations of galactic X-ray sources in the Aquila-Serpens-Scutum region. Data were obtained by the 3-12 keV detector on the Vela 5B satellite. The time histories of nine sources were derived from sky maps of this confused region: Scutum X-1, 4U 1823-00, 4U 1915-05, Aquila X-1, Serpens X-1, 4U 1907+09, A1850-087, 4U 1901+03, and 4U 1957+11. These observations reveal new long-term variations for several sources. Aql X-1, which averages about one eruption per year, is shown to have an underlying cycle of 122-125 days, with an rms phase walk of 10% per cycle. A regular 199 day period is observed from the vicinity of 4U 1915-05. This period appears too long to reconcile with the standard model of that system as a low-mass binary with a 50 minute period. The OB system 4U 1907+09 shows a possible 41.6 day period, with substantial phase jitter. We discuss these variations as possible precessions in binary systems.

  9. The Multimedia Environmental Pollutant Assessment System (MEPAS)®: Source-term release formulations

    International Nuclear Information System (INIS)

    Streile, G.P.; Shields, K.D.; Stroh, J.L.; Bagaasen, L.M.; Whelan, G.; McDonald, J.P.; Droppo, J.G.; Buck, J.W.

    1996-11-01

    This report is one of a series of reports that document the mathematical models in the Multimedia Environmental Pollutant Assessment System (MEPAS). Developed by Pacific Northwest National Laboratory for the US Department of Energy, MEPAS is an integrated impact assessment software implementation of physics-based fate and transport models in air, soil, and water media. Outputs are estimates of exposures and health risk assessments for radioactive and hazardous pollutants. Each of the MEPAS formulation documents covers a major MEPAS component such as source-term, atmospheric, vadose zone/groundwater, surface water, and health exposure/health impact assessment. Other MEPAS documentation reports cover the sensitivity/uncertainty formulations and the database parameter constituent property estimation methods. The pollutant source-term release component is documented in this report. MEPAS simulates the release of contaminants from a source, transport through the air, groundwater, surface water, or overland pathways, and transfer through food chains and exposure pathways to the exposed individual or population. For human health impacts, risks are computed for carcinogens and hazard quotients for noncarcinogens. MEPAS is implemented on a desktop computer with a user-friendly interface that allows the user to define the problem, input the required data, and execute the appropriate models for both deterministic and probabilistic analyses

  10. Accident source terms for light-water nuclear power plants using high-burnup or MOX fuel.

    Energy Technology Data Exchange (ETDEWEB)

    Salay, Michael (U.S. Nuclear Regulatory Commission, Washington, D.C.); Gauntt, Randall O.; Lee, Richard Y. (U.S. Nuclear Regulatory Commission, Washington, D.C.); Powers, Dana Auburn; Leonard, Mark Thomas

    2011-01-01

Representative accident source terms patterned after the NUREG-1465 Source Term have been developed for high burnup fuel in BWRs and PWRs and for MOX fuel in a PWR with an ice-condenser containment. These source terms have been derived using nonparametric order statistics to develop distributions for the timing of radionuclide release during four accident phases and for release fractions of nine chemical classes of radionuclides as calculated with the MELCOR 1.8.5 accident analysis computer code. The accident phases are those defined in the NUREG-1465 Source Term - gap release, in-vessel release, ex-vessel release, and late in-vessel release. Important differences between the accident source terms derived here and the NUREG-1465 Source Term are not attributable to either fuel burnup or use of MOX fuel. Rather, differences among the source terms are due predominantly to improved understanding of the physics of core meltdown accidents. Heat losses from the degrading reactor core prolong the process of in-vessel release of radionuclides. Improved understanding of the chemistries of tellurium and cesium under reactor accident conditions changes the predicted behavior of these radioactive elements relative to what was assumed in the derivation of the NUREG-1465 Source Term. An additional radionuclide chemical class has been defined to account for release of cesium as cesium molybdate, which enhances molybdenum release relative to other metallic fission products.
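The nonparametric order-statistics approach described above amounts to building empirical distributions of release fractions directly from a set of code calculations, without assuming a parametric form. A minimal sketch with invented release-fraction values (not MELCOR output):

```python
import numpy as np

# Hypothetical release fractions of one chemical class (e.g., the cesium
# group) from a set of accident simulations; the values are invented.
release_fractions = np.array([0.02, 0.05, 0.08, 0.11, 0.03,
                              0.07, 0.15, 0.04, 0.09, 0.06])

# Distribution-free summary via empirical order statistics: sort the
# sample and read off percentiles by linear interpolation.
sorted_rf = np.sort(release_fractions)
median = np.percentile(release_fractions, 50)
p95 = np.percentile(release_fractions, 95)
print(sorted_rf[0], median, p95)
```

With n runs, classical order-statistic bounds also quantify the confidence that the sample maximum exceeds a given population percentile, which is why this technique suits small ensembles of expensive code calculations.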

  11. On quantization, the generalised Schroedinger equation and classical mechanics

    International Nuclear Information System (INIS)

    Jones, K.R.W.

    1991-01-01

A ψ-dependent linear functional operator was defined which solves the problem of quantization in non-relativistic quantum mechanics. Weyl ordering is implemented automatically and permits derivation of many of the quantum-to-classical correspondences. The parameter λ provides a natural C∞ deformation of the dynamical structure of quantum mechanics via a non-linear integro-differential 'Generalised Schroedinger Equation', admitting an infinite family of soliton solutions. All these solutions are presented and it is shown that this equation gives an exact dynamic and energetic reproduction of classical mechanics with the correct measurement-theoretic limit. 23 refs

  12. Generalised extreme value statistics and sum of correlated variables

    OpenAIRE

    Bertin, Eric; Clusel, Maxime

    2006-01-01

    To appear in J.Phys.A; We show that generalised extreme value statistics -the statistics of the k-th largest value among a large set of random variables- can be mapped onto a problem of random sums. This allows us to identify classes of non-identical and (generally) correlated random variables with a sum distributed according to one of the three (k-dependent) asymptotic distributions of extreme value statistics, namely the Gumbel, Frechet and Weibull distributions. These classes, as well as t...

  13. Building Abelian Functions with Generalised Baker-Hirota Operators

    Directory of Open Access Journals (Sweden)

    Matthew England

    2012-06-01

Full Text Available We present a new systematic method to construct Abelian functions on Jacobian varieties of plane algebraic curves. The main tool used is a symmetric generalisation of the bilinear operator defined in the work of Baker and Hirota. We give explicit formulae for the multiple applications of the operators, use them to define infinite sequences of Abelian functions of a prescribed pole structure and deduce the key properties of these functions. We apply the theory to the two canonical curves of genus three, presenting new explicit examples of vector space bases of Abelian functions. These reveal previously unseen similarities between the theories of functions associated to curves of the same genus.

  14. Generalised Hermite–Gaussian beams and mode transformations

    International Nuclear Information System (INIS)

    Wang, Yi; Chen, Yujie; Zhang, Yanfeng; Chen, Hui; Yu, Siyuan

    2016-01-01

    Generalised Hermite–Gaussian modes (gHG modes), an extended notion of Hermite–Gaussian modes (HG modes), are formed by the summation of normal HG modes with a characteristic function α, which can be used to unite conventional HG modes and Laguerre–Gaussian modes (LG modes). An infinite number of normalised orthogonal modes can thus be obtained by modulation of the function α. The gHG mode notion provides a useful tool in analysis of the deformation and transformation phenomena occurring in propagation of HG and LG modes with astigmatic perturbation. (paper)

  15. Limits of the generalised Tomimatsu-Sato gravitational fields

    International Nuclear Information System (INIS)

    Cosgrove, C.M.

    1977-01-01

In a previous paper (Cosgrove, J. Phys. A 10:1481 (1977)), the author presented a new three-parameter family of exact asymptotically flat stationary axisymmetric vacuum solutions of Einstein's equations which contains the solutions of Kerr and Tomimatsu-Sato (TS) as special cases. In this paper, two interesting special cases of the previous family which must be constructed by a limiting process are considered. These are interpreted as a 'rotating Curzon metric' and a 'generalised extreme Kerr metric'. In addition, approximate forms for the original metrics are given for the cases of slow rotation and small deformation. (author)

  16. Glucose production in pregnant women at term gestation. Sources of glucose for human fetus.

    Science.gov (United States)

    Kalhan, S C; D'Angelo, L J; Savin, S M; Adam, P A

    1979-03-01

The effects of pregnancy and diabetes on systemic glucose production rates and the sources of glucose for the human fetus in utero were evaluated in five normal, four gestationally diabetic, and one insulin-dependent diabetic subject undergoing elective caesarean section at term gestation. Five normal nonpregnant women were studied for comparison. Systemic glucose production rates were measured with stable tracer [1-(13)C]glucose according to the prime-constant rate infusion technique. Even though the plasma glucose concentration during normal pregnancy had declined as compared with the nonpregnant subjects (P fetus at term gestation. The decline in glucose concentration could be the result of an increase in apparent volume of distribution of glucose. Systemic glucose production rates in well-controlled, gestationally diabetic subjects were similar to those in normal pregnant subjects (2.07+/-0.53 vs. 2.42+/-0.51 mg/kg.min). The sources of glucose for the human fetus at term gestation were evaluated by comparing (a) natural variation in (13)C:(12)C ratio of plasma glucose and (b) enriched (13)C:(12)C ratio of plasma glucose during [1-(13)C]glucose infusion in maternal and fetal blood at delivery in both normal and diabetic subjects. These data showed that the fetal glucose pool was in equilibrium with the maternal glucose pool in both normal and diabetic subjects, indicating that a brief maternal fast did not initiate systemic glucose production in the human fetus. A materno-fetal gradient was observed for beta-hydroxybutyrate.
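The prime-constant rate infusion technique mentioned above yields the systemic glucose appearance rate from the tracer infusion rate and enrichments at isotopic steady state. Below is a minimal sketch of the standard steady-state tracer-dilution relation for stable isotopes, with invented illustrative numbers rather than the study's data.

```python
def glucose_ra(tracer_infusion_rate, infusate_enrichment, plasma_enrichment):
    """Steady-state tracer dilution (stable-isotope form):

        Ra = F * (E_infusate / E_plasma - 1)

    tracer_infusion_rate -- F, tracer infusion rate (mg/kg/min)
    infusate_enrichment  -- tracer enrichment of the infusate (fraction)
    plasma_enrichment    -- plasma tracer enrichment at plateau (fraction)
    Returns the systemic glucose appearance rate, mg/kg/min.
    """
    return tracer_infusion_rate * (infusate_enrichment / plasma_enrichment - 1.0)

# Invented illustrative values, chosen to land near the ~2.4 mg/kg/min
# production rate the abstract reports for normal pregnant subjects:
ra = glucose_ra(0.05, 0.99, 0.02)
print(round(ra, 3))  # ≈ 2.425 mg/kg/min
```

The priming dose in the protocol only shortens the time to reach the plateau enrichment at which this relation holds.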

  17. Long-term storage life of light source modules by temperature cycling accelerated life test

    International Nuclear Information System (INIS)

    Sun Ningning; Tan Manqing; Li Ping; Jiao Jian; Guo Xiaofeng; Guo Wentao

    2014-01-01

    Light source modules are the most crucial and fragile devices that affect the life and reliability of the interferometric fiber optic gyroscope (IFOG). While the light emitting chips were stable in most cases, the module packaging proved to be less satisfactory. In long-term storage or the working environment, the ambient temperature changes constantly and thus the packaging and coupling performance of light source modules are more likely to degrade slowly due to different materials with different coefficients of thermal expansion in the bonding interface. A constant temperature accelerated life test cannot evaluate the impact of temperature variation on the performance of a module package, so the temperature cycling accelerated life test was studied. The main failure mechanism affecting light source modules is package failure due to solder fatigue failure including a fiber coupling shift, loss of cooling efficiency and thermal resistor degradation, so the Norris-Landzberg model was used to model solder fatigue life and determine the activation energy related to solder fatigue failure mechanism. By analyzing the test data, activation energy was determined and then the mean life of light source modules in different storage environments with a continuously changing temperature was simulated, which has provided direct reference data for the storage life prediction of IFOG. (semiconductor devices)
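The Norris-Landzberg model used above relates solder fatigue life under accelerated temperature cycling to life under field (storage) cycling through an acceleration factor. The sketch below is a generic implementation; the default constants are commonly quoted values for SnPb solder (m ≈ 1/3, n ≈ 1.9, Ea ≈ 0.122 eV) and are assumptions here, not parameters from the paper.

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def norris_landzberg_af(f_field, f_test, dT_field, dT_test,
                        Tmax_field, Tmax_test,
                        m=1.0 / 3.0, n=1.9, Ea=0.122):
    """Acceleration factor AF = N_field / N_test for thermal cycling.

    f    -- cycling frequency (cycles per day)
    dT   -- temperature swing per cycle (K)
    Tmax -- peak cycle temperature (K)
    Default constants are commonly quoted SnPb solder values (assumed).
    """
    return ((f_field / f_test) ** m
            * (dT_test / dT_field) ** n
            * math.exp((Ea / K_B) * (1.0 / Tmax_field - 1.0 / Tmax_test)))

# Illustrative comparison: a -40/+85 C lab cycle (swing 125 K, 24 cycles/day)
# versus gentle storage swings of 20 K peaking at 35 C, one cycle per day.
af = norris_landzberg_af(f_field=1, f_test=24, dT_field=20.0, dT_test=125.0,
                         Tmax_field=308.15, Tmax_test=358.15)
print(round(af, 1))
```

Multiplying the cycles-to-failure observed in the accelerated test by AF gives the projected number of field cycles, from which a storage-life estimate follows.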

  18. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.

  19. Basic repository source term and data sheet report: Deaf Smith County

    International Nuclear Information System (INIS)

    1987-01-01

    This report is one of a series describing studies undertaken in support of the US Department of Energy Civilian Radioactive Waste Management (CRWM) Program. This study contains the derivation of values for environmental source terms and resources consumed for a CRWM repository. Estimates include heavy construction equipment; support equipment; shaft-sinking equipment; transportation equipment; and consumption of fuel, water, electricity, and natural gas. Data are presented for construction and operation at an assumed site in Deaf Smith County, Texas. 2 refs., 6 tabs

  20. Basic repository source term and data sheet report, Cypress Creek Dome: Draft

    International Nuclear Information System (INIS)

    1988-01-01

    This report is one of a series describing studies undertaken in support of the US Department of Energy Civilian Radioactive Waste Management (CRWM) Program. This study contains the derivation of values for environmental source terms and resources consumed for a CRWM repository. Estimates include heavy construction equipment; support equipment; shaft-sinking equipment; transportation equipment; and consumption of fuel, water, electricity, and natural gas. Data are presented for construction and operation at an assumed site in Cypress Creek Dome, Mississippi. 2 refs., 6 tabs

  1. Source term analysis for a criticality accident in metal production line glove boxes

    International Nuclear Information System (INIS)

    Nguyen, D.H.

    1991-06-01

A recent development in criticality accident analysis is the deterministic calculation of the transport of fission products and actinides through the barriers of the physical facility. Knowledge of the redistribution of the materials inside the facility will help determine the reentry and clean-up procedures. The amount of radioactive material released to the environment is the source term for dispersion calculations. We have used an integrated computer model to determine the release of fission products to the environment from a hypothetical criticality event in a glove box of the metal production line (MPL) at the Lawrence Livermore National Laboratory (LLNL)

  2. The source term and waste optimization of molten salt reactors with processing

    International Nuclear Information System (INIS)

    Gat, U.; Dodds, H.L.

    1993-01-01

    The source term of a molten salt reactor (MSR) with fuel processing is reduced by the ratio of processing time to refueling time as compared to solid fuel reactors. The reduction, which can be one to two orders of magnitude, is due to removal of the long-lived fission products. The waste from MSRs can be optimized with respect to its chemical composition, concentration, mixture, shape, and size. The actinides and long-lived isotopes can be separated out and returned to the reactor for transmutation. These features make MSRs more acceptable and simpler in operation and handling
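The scaling argument above, that the source term of a processed MSR is reduced by the ratio of processing time to refueling time, can be illustrated with a trivial calculation. The interval lengths below are invented for illustration.

```python
def msr_source_term_reduction(processing_time_days, refueling_interval_days):
    """Reduction factor for the removable long-lived fission-product
    inventory: per the abstract's argument, the source term scales roughly
    with processing_time / refueling_time relative to a solid-fuel reactor."""
    return processing_time_days / refueling_interval_days

# Illustrative: a 10-day on-line processing cycle versus an 18-month
# (~548-day) solid-fuel refueling interval.
factor = msr_source_term_reduction(10.0, 548.0)
print(factor)  # ≈ 0.018, i.e. roughly a factor-of-50 smaller inventory
```

This is where the "one to two orders of magnitude" reduction quoted in the abstract comes from: plausible processing cycles are 10 to 100 times shorter than solid-fuel refueling intervals.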

  3. The uranium source-term mineralogy and geochemistry at the Broubster natural analogue site, Caithness

    International Nuclear Information System (INIS)

    Milodowski, A.E.; Pearce, J.M.; Basham, I.R.; Hyslop, E.K.

    1991-01-01

    The British Geological Survey (BGS) has been conducting a coordinated research programme at the Broubster natural analogue site in Caithness, north Scotland. This work on a natural radioactive geochemical system has been carried out with the aim of improving our confidence in using predictive models of radionuclide migration in the geosphere. This report is one of a series being produced and it concentrates on the mineralogical characterization of the uranium distribution in the limestone unit considered as the 'source-term' in the natural analogue model

  4. A source term and risk calculations using level 2+PSA methodology

    International Nuclear Information System (INIS)

    Park, S. I.; Jea, M. S.; Jeon, K. D.

    2002-01-01

The scope of Level 2+ PSA includes the assessment of the dose risk associated with exposures to the radioactive nuclides escaping from nuclear power plants during severe accidents. The establishment of a database for the exposure dose in Korean nuclear power plants may contribute to preparing accident management programs and periodic safety reviews. In this study the ORIGEN, MELCOR and MACCS codes were employed to produce an integrated framework to assess the radiation source term risk. The framework was applied to a reference plant. Using IPE results, the dose rate for the reference plant was calculated quantitatively

  5. Description of apparatus for determining radiological source terms of nuclear fuels

    International Nuclear Information System (INIS)

    Baldwin, D.L.; Woodley, R.E.; Holt, F.E.; Archer, D.V.; Steele, R.T.; Whitkop, P.G.

    1985-01-01

New apparatus have been designed, built and are currently being employed to measure the release of volatile fission products from irradiated nuclear fuel. The system is capable of measuring radiological source terms, particularly for cesium-137, cesium-134, iodine-129 and krypton-85, in various atmospheres at temperatures up to 1200 °C. The design allows a rapid transient heatup from ambient to full temperature, a hold at maximum temperature for a specified period, and rapid cooldown. Released fission products are measured as deposition on a platinum thermal gradient tube or in a filter/charcoal trap. Noble gases pass through to a multi-channel gamma analyzer. 1 ref., 4 figs

  6. Adiabatic energization in the ring current and its relation to other source and loss terms

    Science.gov (United States)

    Liemohn, M. W.; Kozyra, J. U.; Clauer, C. R.; Khazanov, G. V.; Thomsen, M. F.

    2002-04-01

    The influence of adiabatic energization and deenergization effects, caused by particle drift in radial distance, on ring current growth rates and loss lifetimes is investigated. Growth and loss rates from simulation results of four storms (5 June 1991, 15 May 1997, 19 October 1998, and 25 September 1998) are examined and compared against the y component of the solar wind electric field (Ey,sw). Energy change rates with and without the inclusion of adiabatic energy changes are considered to isolate the influence of this mechanism in governing changes of ring current strength. It is found that the influence of adiabatic drift effects on the energy change rates is very large when energization and deenergization are considered separately as gain and loss mechanisms, often about an order of magnitude larger than all other source or loss terms combined. This is true not only during storm times, when the open drift path configuration of the hot ions dominates the physics of the ring current, but also during quiet times, when the small oscillation in L of the closed trajectories creates a large source and loss of energy each drift orbit. However, the net energy change from adiabatic drift is often smaller than other source and loss processes, especially during quiet times. Energization from adiabatic drift dominates ring current growth only during portions of the main phase of storms. Furthermore, the net-adiabatic energization is often positive, because some particles are lost in the inner magnetosphere before they can adiabatically deenergize. It is shown that the inclusion of only this net-adiabatic drift effect in the total source rate or loss lifetime (depending on the sign of the net-adiabatic energization) best matches the observed source and loss values from empirical Dst predictor methods (that is, for consistency, these values should be compared between the calculation methods). While adiabatic deenergization dominates the loss timescales for all Ey,sw values

  7. Wave propagation speeds and source term influences in single and integral porosity shallow water equations

    Directory of Open Access Journals (Sweden)

    Ilhan Özgen

    2017-10-01

Full Text Available In urban flood modeling, so-called porosity shallow water equations (PSWEs), which conceptually account for unresolved structures, e.g., buildings, are a promising approach to addressing high CPU times associated with state-of-the-art explicit numerical methods. The PSWE can be formulated with a single porosity term, referred to as the single porosity shallow water model (SP model), which accounts for both the reduced storage in the cell and the reduced conveyance, or with two porosity terms: one accounting for the reduced storage in the cell and another accounting for the reduced conveyance. The latter form is referred to as an integral or anisotropic porosity shallow water model (AP model). The aim of this study was to analyze the differences in wave propagation speeds of the SP model and the AP model and the implications for numerical model results. First, augmented Roe-type solutions were used to assess the influence of the source terms appearing in both models. It is shown that different source terms have different influences on the stability of the models. Second, four computational test cases were presented and the numerical models were compared. It is observed in the eigenvalue-based analysis as well as in the computational test cases that the models converge if the conveyance porosity in the AP model is close to the storage porosity. If the porosity values differ significantly, the AP model yields different wave propagation speeds and numerical fluxes from those of the SP model. In this study, the ratio between the conveyance and storage porosities was determined to be the most significant parameter.
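As a toy illustration of the convergence behaviour described above, the sketch below compares 1-D shallow-water characteristic speeds for the SP model with an AP-like variant in which the conveyance/storage porosity ratio rescales the advective velocity. The rescaling is an illustrative assumption, not the paper's exact Roe eigenstructure.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def sp_wave_speeds(u, h):
    """Characteristic speeds of the 1-D shallow water equations, u ± sqrt(gh).
    In the single porosity (SP) model the porosity cancels from the
    homogeneous part, leaving the classical eigenvalues."""
    c = math.sqrt(G * h)
    return u - c, u + c

def ap_wave_speeds(u, h, phi_storage, phi_conveyance):
    """Toy anisotropic-porosity (AP) variant: the conveyance/storage ratio
    rescales the advective velocity (an illustrative assumption only)."""
    psi = phi_conveyance / phi_storage
    c = math.sqrt(G * h)
    return psi * u - c, psi * u + c

# When conveyance porosity equals storage porosity the two models agree:
print(sp_wave_speeds(1.0, 2.0))
print(ap_wave_speeds(1.0, 2.0, 0.8, 0.8))
# When they differ, the AP wave speeds depart from the SP ones:
print(ap_wave_speeds(1.0, 2.0, 0.8, 0.4))
```

This mirrors the paper's finding in miniature: the models coincide when the two porosities are close and diverge as their ratio moves away from one.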

  8. SARNET. Severe Accident Research Network - key issues in the area of source term

    International Nuclear Information System (INIS)

    Giordano, P.; Micaelli, J.C.; Haste, T.; Herranz, L.

    2005-01-01

About fifty European organisations integrate in SARNET (a Network of Excellence of the EU 6th Framework Programme) their research capacities in order to better resolve the most important remaining uncertainties and safety issues concerning existing and future Nuclear Power Plants (NPPs) under hypothetical Severe Accident (SA) conditions. Wishing to maintain a long-lasting cooperation, they conduct three types of activities: integrating activities, spreading of excellence and jointly executed research. This paper summarises the main results obtained by the network after the first year, giving more prominence to those from jointly executed research in the Source Term area. Integrating activities have been performed through different means: the ASTEC integral computer code for severe accident transient modelling, the development of PSA2 methodologies, the setting of a structure for definition of evolving R and D priorities, and the development of a web-network of databases that hosts experimental data. Such activities have been facilitated by the development of an Advanced Communication Tool. Concerning spreading of excellence, educational courses covering Severe Accident Analysis Methodology and Level 2 PSA have been set up, to be given in early 2006. A detailed text book on Severe Accident Phenomenology has been designed and agreed amongst SARNET members. A mobility programme for students and young researchers is being developed, some detachments are already completed or in progress, and examples are quoted. Jointly executed research activities concern key issues grouped in the Corium, Containment and Source Term areas. In Source Term, behaviour of the highly radio-toxic ruthenium under oxidising conditions (like air ingress) for HBU and MOX fuel has been investigated. First modelling proposals for ASTEC have been made for oxidation of fuel and of ruthenium. Experiments on transport of highly volatile oxide ruthenium species have been performed. Reactor

  9. Regulatory Technology Development Plan Sodium Fast Reactor. Mechanistic Source Term Development

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David S. [Argonne National Lab. (ANL), Argonne, IL (United States); Brunett, Acacia Joann [Argonne National Lab. (ANL), Argonne, IL (United States); Bucknor, Matthew D. [Argonne National Lab. (ANL), Argonne, IL (United States); Sienicki, James J. [Argonne National Lab. (ANL), Argonne, IL (United States); Sofu, Tanju [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-02-28

Construction and operation of a nuclear power installation in the U.S. requires licensing by the U.S. Nuclear Regulatory Commission (NRC). A vital part of this licensing process and integrated safety assessment entails the analysis of a source term (or source terms) that represents the release of radionuclides during normal operation and accident sequences. Historically, nuclear plant source term analyses have utilized deterministic, bounding assessments of the radionuclides released to the environment. Significant advancements in technical capabilities and the knowledge state have enabled the development of more realistic analyses such that a mechanistic source term (MST) assessment is now expected to be a requirement of advanced reactor licensing. This report focuses on the state of development of an MST for a sodium fast reactor (SFR), with the intent of aiding in the process of MST definition by qualitatively identifying and characterizing the major sources and transport processes of radionuclides. Due to common design characteristics among current U.S. SFR vendor designs, a metal-fuel, pool-type SFR has been selected as the reference design for this work, with all phenomenological discussions geared toward this specific reactor configuration. This work also aims to identify the key gaps and uncertainties in the current knowledge state that must be addressed for SFR MST development. It is anticipated that this knowledge state assessment can enable the coordination of technology and analysis tool development discussions such that any knowledge gaps may be addressed. Sources of radionuclides considered in this report include releases originating both in-vessel and ex-vessel, including in-core fuel, primary sodium and cover gas cleanup systems, and spent fuel movement and handling. Transport phenomena affecting various release groups are identified and qualitatively discussed, including fuel pin and primary coolant retention, and behavior in the cover gas and

  10. A source term estimation method for a nuclear accident using atmospheric dispersion models

    DEFF Research Database (Denmark)

    Kim, Minsik; Ohba, Ryohji; Oura, Masamichi

    2015-01-01

    The objective of this study is to develop an operational source term estimation (STE) method applicable to a nuclear accident like the incident that occurred at the Fukushima Dai-ichi nuclear power station in 2011. The new STE method presented here is based on data from atmospheric dispersion models and short-range observational data around the nuclear power plants. The accuracy of this method is validated with data from a wind tunnel study that involved a tracer gas release from a scaled model experiment at Tokai Daini nuclear power station in Japan. The methodology developed and validated through the effort described in this manuscript is then used to estimate the release rate of radioactive material from the Fukushima Dai-ichi nuclear power station.

  11. Projected Source Terms for Potential Sabotage Events Related to Spent Fuel Shipments

    International Nuclear Information System (INIS)

    Luna, R.E.; Neuhauser, K.S.; Vigil, M.G.

    1999-01-01

    Two major studies, one sponsored by the U.S. Department of Energy and the other by the U.S. Nuclear Regulatory Commission, were conducted in the late 1970s and early 1980s to provide information and source terms for an optimally successful act of sabotage on spent fuel casks typical of those available for use. This report applies the results of those studies and additional analysis to derive potential source terms for certain classes of sabotage events on spent fuel casks and spent fuel typical of those which could be shipped in the early decades of the 21st century. In addition to updating the cask and spent fuel characteristics used in the analysis, two release mechanisms not included in the earlier works were identified and evaluated. As would be expected, inclusion of these additional release mechanisms resulted in a somewhat higher total release from the postulated sabotage events. Although health effects from estimated releases were addressed in the earlier study conducted for the U.S. Department of Energy, they have not been addressed in this report. The results from this report may be used to estimate health effects.

  12. Source terms; isolation and radiological consequences of carbon-14 waste in the Swedish SFR repository

    International Nuclear Information System (INIS)

    Hesboel, R.; Puigdomenech, I.; Evans, S.

    1990-01-01

    The source term, isolation capacity, and long-term radiological exposure of 14C from the Swedish underground repository for low- and intermediate-level waste (SFR) are assessed. The prospective amount of 14C in the repository is assumed to be 5 TBq. Spent ion-exchange resins will be the dominant source of 14C. The pore water in the concrete repository is expected to maintain a pH of >10.5 for a period of at least 10^6 y. The cement matrix of the repository will retain most of the 14CO3^2- initially present. Bacterial production of CO2 and CH4 from degradation of ion-exchange resins and bitumen may contribute to 14C release to the biosphere. However, CH4 contributes only to a small extent to the overall carbon loss from freshwater ecosystems. The individual doses to local and regional individuals peaked at 5x10^-3 and 8x10^-4 μSv y^-1, respectively, at about 2.4x10^4 years. A total leakage of 8.4 GBq of 14C from the repository will cause a total collective dose commitment of 1.1 manSv, or 130 manSv TBq^-1. (authors)
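    As a quick consistency check, the quoted collective dose commitment can be reproduced from the stated leakage and the per-TBq factor (a sketch; the variable names are illustrative, and the only inputs are the numbers given above):

```python
# Consistency check: a total leakage of 8.4 GBq of 14C at a collective dose
# commitment of 130 manSv per TBq should reproduce the quoted 1.1 manSv.
leakage_tbq = 8.4e9 / 1e12            # 8.4 GBq expressed in TBq
dose_per_tbq = 130.0                  # manSv per TBq released
collective_dose = leakage_tbq * dose_per_tbq
print(round(collective_dose, 2))      # → 1.09, consistent with the quoted 1.1 manSv
```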

  13. Chernobyl radiocesium in freshwater fish: Long-term dynamics and sources of variation

    International Nuclear Information System (INIS)

    Sundbom, M.

    2002-01-01

    The aim of this thesis was to investigate both the long-term temporal pattern and the sources of individual variation of radiocesium in freshwater fish. The basis for the study is time series of 137Cs activity concentrations in fish from three lakes in the area north-west of Uppsala, Sweden, that received considerable amounts of 137Cs from Chernobyl in May 1986. The lakes were Lake Ekholmssjoen, Lake Flatsjoen and Lake Siggeforasjoen, all small forest lakes, but with different morphometric and chemical characteristics. The data were collected regularly, usually several times per year, during 1986-2000, using consistent methods. More than 7600 fish individuals from 7 species, covering wide size ranges and feeding habits, were analysed for 137Cs. For each fish, the length, weight, sex, and often the stomach content were recorded. The evaluation of long-term trends was based on data from all three lakes, while the study on sources of variation evaluated data from Lake Flatsjoen only. (au)

  14. Generalised tetanus in a 2-week-old foal: use of physiotherapy to aid recovery.

    Science.gov (United States)

    Mykkänen, A K; Hyytiäinen, H K; McGowan, C M

    2011-11-01

    A 2-week-old Estonian Draft foal presented with signs of severe generalised tetanus, recumbency and inability to drink. The suspected source of infection was the umbilicus. Medical treatment was administered, including tetanus antitoxin, antimicrobial therapy and phenobarbital to control tetanic spasms. In addition, an intensive physiotherapy program was carried out during the recovery period. Techniques designed for syndromes involving upper motor neuron spasticity in humans were applied. Exercises aimed at weight-bearing and mobility were executed with the help of a walking-frame. The foal made a complete recovery. To our knowledge, this is the first report of the use of physiotherapy in the treatment of tetanus in horses. © 2011 The Authors. Australian Veterinary Journal © 2011 Australian Veterinary Association.

  15. Control configuration selection for bilinear systems via generalised Hankel interaction index array

    DEFF Research Database (Denmark)

    Shaker, Hamid Reza; Tahavori, Maryamsadat

    2015-01-01

    Decentralised and partially decentralised control strategies are very popular in practice. To come up with a suitable decentralised or partially decentralised control structure, it is important to select the appropriate input and output pairs for control design. This procedure is called control configuration selection. An iterative method for solving the generalised Sylvester equation is proposed, and the generalised cross-gramian is used to form the generalised Hankel interaction index array, which is used for control configuration selection of MIMO bilinear processes. Most importantly, since for each element of the generalised Hankel interaction index array just one generalised Sylvester equation needs to be solved, the proposed control configuration selection method is computationally more efficient than its gramian-based counterparts.

  16. Long-term dust aerosol production from natural sources in Iceland.

    Science.gov (United States)

    Dagsson-Waldhauserova, Pavla; Arnalds, Olafur; Olafsson, Haraldur

    2017-02-01

    Iceland is a volcanic island in the North Atlantic Ocean with a maritime climate. In spite of the moist climate, large areas have limited vegetation cover: >40% of Iceland is classified as having considerable to very severe erosion, and 21% of Iceland is volcanic sandy desert. Not only do natural emissions from these sources, driven by strong winds, affect regional air quality in Iceland ("Reykjavik haze"), but dust particles are at times transported >1000 km over the Atlantic and Arctic Oceans. The aim of this paper is to place the Icelandic dust production area into international perspective, present the long-term frequency of dust storm events in northeast Iceland, and estimate dust aerosol concentrations during reported dust events. Meteorological observations with dust presence codes and related visibility were used to identify the frequency of, and long-term changes in, dust production in northeast Iceland. There were on average 16.4 days annually with reported dust observations at weather stations within the northeastern erosion area, indicating extreme dust plume activity and erosion within the northeastern deserts, even though the area is covered with snow during the major part of winter. During the 2000s the highest occurrence of dust events in six decades was reported. We have measured saltation and aeolian transport during dust/volcanic ash storms in Iceland, which include some of the most intense wind erosion events ever measured. Icelandic dust affects the ecosystems over much of Iceland and causes regional haze. It is likely to affect the ecosystems of the oceans around Iceland, and it lowers the albedo of the Icelandic glaciers, increasing melt-off due to global warming. The study indicates that Icelandic dust may contribute to Arctic air pollution. Long-term records of meteorological dust observations from northeast Iceland indicate the frequency of dust events from Icelandic deserts. The research covers a 60-year period and

  17. Challenges in defining a radiologic and hydrologic source term for underground nuclear test centers, Nevada Test Site, Nye County, Nevada

    International Nuclear Information System (INIS)

    Smith, D.K.

    1995-06-01

    The compilation of a radionuclide inventory for long-lived radioactive contaminants residual from nuclear testing provides a partial measure of the radiologic source term at the Nevada Test Site. The radiologic source term also includes potentially mobile short-lived radionuclides excluded from the inventory. The radiologic source term for tritium is known with accuracy and is equivalent to the hydrologic source term within the saturated zone. Definition of the total hydrologic source term for fission and activation products that have high activities for decades following underground testing involves knowledge and assumptions which are presently unavailable. Systematic investigation of the behavior of fission products, activation products and actinides under saturated or partially saturated conditions is imperative to define a representative total hydrologic source term. This is particularly important given the heterogeneous distribution of radionuclides within testing centers. Data quality objectives which emphasize a combination of measurements and credible estimates of the hydrologic source term are a priority for near-field investigations at the Nevada Test Site.

  18. Source-term development for a contaminant plume for use by multimedia risk assessment models

    International Nuclear Information System (INIS)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.

    1999-01-01

    Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments for use at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world, Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.

  19. Embolic strokes of undetermined source in young adults: baseline characteristics and long-term outcome.

    Science.gov (United States)

    Martinez-Majander, N; Aarnio, K; Pirinen, J; Lumikari, T; Nieminen, T; Lehto, M; Sinisalo, J; Kaste, M; Tatlisumak, T; Putaala, J

    2018-03-01

    Embolic strokes of undetermined source (ESUS) are a recent entity, not yet thoroughly investigated in young stroke patients. The clinical characteristics and long-term risks of vascular events and all-cause mortality between young-onset ESUS and other aetiological subgroups were compared. Patients with ESUS were identified amongst the 1008 patients aged 15-49 years with first-ever ischaemic stroke in Helsinki Young Stroke Registry, and primary end-points were defined as recurrent stroke, composite vascular events and all-cause mortality. Cumulative 15-year risks for each end-point were analysed with life tables and adjusted risks were based on Cox proportional hazard analyses. Of the 971 eligible patients, 203 (20.9%) were classified as ESUS. They were younger (median age 40 years, interquartile range 32-46 vs. 45 years, 39-47), more often female (43.3% vs. 35.7%) and had fewer cardiovascular risk factors than other modified TOAST groups. With a median follow-up time of 10.1 years, ESUS patients had the second lowest cumulative risk of recurrent stroke and composite vascular events and lowest mortality compared to other TOAST groups. Large-artery atherosclerosis and small vessel disease carried significantly higher risk for recurrent stroke than did ESUS, whilst no difference appeared between cardioembolism from high-risk sources and ESUS. In our cohort, ESUS patients were younger and had milder cardiovascular risk factor burden and generally better long-term outcome compared to other causes of young-onset stroke. The comparable risk of recurrent stroke between ESUS and high-risk sources of cardioembolism might suggest similarities in their pathophysiology. © 2017 EAN.

  20. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Metal Fuel Radionuclide Release

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David [Argonne National Lab. (ANL), Argonne, IL (United States); Bucknor, Matthew [Argonne National Lab. (ANL), Argonne, IL (United States); Jerden, James [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-02-01

    The development of an accurate and defensible mechanistic source term will be vital for the future licensing efforts of metal fuel, pool-type sodium fast reactors. To assist in the creation of a comprehensive mechanistic source term, the current effort sought to estimate the release fraction of radionuclides from metal fuel pins to the primary sodium coolant during fuel pin failures at a variety of temperature conditions. These release estimates were based on the findings of an extensive literature search, which reviewed past experimentation and reactor fuel damage accidents. Data sources for each radionuclide of interest were reviewed to establish release fractions, along with possible release dependencies, and the corresponding uncertainty levels. Although the current knowledge base is substantial, and radionuclide release fractions were established for the elements deemed important for the determination of offsite consequences following a reactor accident, gaps were found pertaining to several radionuclides. First, there is uncertainty regarding the transport behavior of several radionuclides (iodine, barium, strontium, tellurium, and europium) during metal fuel irradiation to high burnup levels. The migration of these radionuclides within the fuel matrix and bond sodium region can greatly affect their release during pin failure incidents. Post-irradiation examination of existing high burnup metal fuel can likely resolve this knowledge gap. Second, data regarding the radionuclide release from molten high burnup metal fuel in sodium is sparse, which makes the assessment of radionuclide release from fuel melting accidents at high fuel burnup levels difficult. This gap could be addressed through fuel melting experimentation with samples from the existing high burnup metal fuel inventory.

  1. Homogenization of the Brush Problem with a Source Term in L^1

    Science.gov (United States)

    Gaudiello, Antonio; Guibé, Olivier; Murat, François

    2017-07-01

    We consider a domain which has the form of a brush in 3D or the form of a comb in 2D, i.e. an open set which is composed of cylindrical vertical teeth distributed over a fixed basis. All the teeth have a similar fixed height; their cross sections can vary from one tooth to another and are not supposed to be smooth; moreover the teeth can be adjacent, i.e. they can share parts of their boundaries. The diameter of every tooth is supposed to be less than or equal to ɛ, and the asymptotic volume fraction of the teeth (as ɛ tends to zero) is supposed to be bounded from below away from zero, but no periodicity is assumed on the distribution of the teeth. In this domain we study the asymptotic behavior (as ɛ tends to zero) of the solution of a second order elliptic equation with a zeroth order term which is bounded from below away from zero, when the homogeneous Neumann boundary condition is satisfied on the whole of the boundary. First, we revisit the problem where the source term belongs to L^2. This is a classical problem, but our homogenization result takes place in a geometry which is more general than the ones which have been considered before. Moreover we prove a corrector result which is new. Then, we study the case where the source term belongs to L^1. Working in the framework of renormalized solutions and introducing a definition of renormalized solutions for degenerate elliptic equations where only the vertical derivative is involved (such a definition is new), we identify the limit problem and prove a corrector result.

  2. The Langevin and generalised Langevin approach to the dynamics of atomic, polymeric and colloidal systems

    CERN Document Server

    Snook, Ian

    2007-01-01

    The Langevin and Generalised Langevin Approach to the Dynamics of Atomic, Polymeric and Colloidal Systems is concerned with the description of aspects of the theory and use of so-called random processes to describe the properties of atomic, polymeric and colloidal systems in terms of the dynamics of the particles in the system. It provides derivations of the basic equations, the development of numerical schemes to solve them on computers, and illustrations of application to typical systems. Extensive appendices are given to enable the reader to carry out computations to illustrate many of the points made in the main body of the book.
    * Starts from fundamental equations
    * Gives up-to-date illustration of the application of these techniques to typical systems of interest
    * Contains extensive appendices including derivations, equations to be used in practice, and elementary computer codes

  3. DUSTMS-D: DISPOSAL UNIT SOURCE TERM - MULTIPLE SPECIES - DISTRIBUTED FAILURE DATA INPUT GUIDE.

    Energy Technology Data Exchange (ETDEWEB)

    SULLIVAN, T.M.

    2006-01-01

    Performance assessment of a low-level waste (LLW) disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). Many of these physical processes are influenced by the design of the disposal facility (e.g., how the engineered barriers control infiltration of water). The complexity of the problem and the absence of appropriate data prevent development of an entirely mechanistic representation of radionuclide release from a disposal facility. Typically, a number of assumptions, based on knowledge of the disposal system, are used to simplify the problem. This has been done and the resulting models have been incorporated into the computer code DUST-MS (Disposal Unit Source Term-Multiple Species). The DUST-MS computer code is designed to model water flow, container degradation, release of contaminants from the wasteform to the contacting solution, and transport through the subsurface media. Water flow through the facility over time is modeled using tabular input. Container degradation models include three types of failure rates: (a) instantaneous (all containers in a control volume fail at once), (b) uniformly distributed failures (containers fail at a linear rate between a specified starting and ending time), and (c) Gaussian failure rates (containers fail at a rate determined by a mean failure time, standard deviation and Gaussian distribution).
    Wasteform release models include four release mechanisms: (a) rinse with partitioning (inventory is released instantly upon container failure subject to equilibrium partitioning (sorption) with
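    The three container failure-rate models described in this record can be sketched as cumulative failure fractions over time. This is an illustrative reimplementation, not DUST-MS source code; the function name and default parameters are hypothetical:

```python
import math

def failed_fraction(t, mode, t_start=0.0, t_end=1.0, mean=0.5, sd=0.1):
    """Cumulative fraction of failed containers at time t under the three
    failure-rate models described for DUST-MS (illustrative only)."""
    if mode == "instantaneous":
        # (a) all containers in the control volume fail at once at t_start
        return 1.0 if t >= t_start else 0.0
    if mode == "uniform":
        # (b) containers fail at a linear rate between t_start and t_end
        if t <= t_start:
            return 0.0
        if t >= t_end:
            return 1.0
        return (t - t_start) / (t_end - t_start)
    if mode == "gaussian":
        # (c) failure times follow a normal distribution; the normal CDF,
        # written via the error function, gives the cumulative failed fraction
        return 0.5 * (1.0 + math.erf((t - mean) / (sd * math.sqrt(2.0))))
    raise ValueError(f"unknown failure model: {mode}")

# Halfway through a uniform failure window, half of the containers have failed.
print(failed_fraction(0.5, "uniform"))   # → 0.5
```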

  4. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    Science.gov (United States)

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for the operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactor safety assessments, and the estimates available at present are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms, starting from a broad information gathering. The large number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.

  5. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants

    Directory of Open Access Journals (Sweden)

    L.A. Poggi

    2016-10-01

    Full Text Available An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for the operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactor safety assessments, and the estimates available at present are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms, starting from a broad information gathering. The large number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.

  6. A note on variational multiscale methods for high-contrast heterogeneous porous media flows with rough source terms

    KAUST Repository

    Calo, Victor M.

    2011-09-01

    In this short note, we discuss variational multiscale methods for solving porous media flows in high-contrast heterogeneous media with rough source terms. Our objective is to separate, as much as possible, subgrid effects induced by the media properties from those due to heterogeneous source terms. For this reason, enriched coarse spaces designed for high-contrast multiscale problems are used to represent the effects of heterogeneities of the media. Furthermore, rough source terms are captured via auxiliary correction equations that appear in the formulation of variational multiscale methods [23]. These auxiliary equations are localized and one can use additive or multiplicative constructions for the subgrid corrections as discussed in the current paper. Our preliminary numerical results show that one can capture the effects due to both spatial heterogeneities in the coefficients (such as permeability field) and source terms (e.g., due to singular well terms) in one iteration. We test the cases for both smooth source terms and rough source terms and show that with the multiplicative correction, the numerical approximations are more accurate compared to the additive correction. © 2010 Elsevier Ltd.

  7. Optimising, generalising and integrating educational practice using neuroscience

    Science.gov (United States)

    Colvin, Robert

    2016-07-01

    Practical collaboration at the intersection of education and neuroscience research is difficult because the combined discipline encompasses both the activity of microscopic neurons and the complex social interactions of teachers and students in a classroom. Taking a pragmatic view, this paper discusses three education objectives to which neuroscience can be effectively applied: optimising, generalising and integrating instructional techniques. These objectives are characterised by: (1) being of practical importance; (2) building on existing education and cognitive research; and (3) being infeasible to address based on behavioural experiments alone. The focus of the neuroscientific aspect of collaborative research should be on the activity of the brain before, during and after learning a task, as opposed to performance of a task. The objectives are informed by literature that highlights possible pitfalls with educational neuroscience research, and are described with respect to the static and dynamic aspects of brain physiology that can be measured by current technology.

  8. Generalised pruritus as a presentation of Graves' disease

    Directory of Open Access Journals (Sweden)

    Tan CE

    2013-05-01

    Full Text Available Pruritus is a lesser known symptom of hyperthyroidism, particularly in autoimmune thyroid disorders. This is a case report of a 27-year-old woman who presented with generalised pruritus at a primary care clinic. Incidental findings of tachycardia and a goiter led to the investigations of her thyroid status. The thyroid function test revealed elevated serum free T4 and suppressed thyroid stimulating hormone levels. The anti-thyroid antibodies were positive. She was diagnosed with Graves’ disease and treated with carbimazole until her symptoms subsided. Graves’ disease should be considered as an underlying cause for patients presenting with pruritus. A thorough history and complete physical examination are crucial in making an accurate diagnosis. Underlying causes must be determined before treating the symptoms.

  9. Generalised two target localisation using passive monopulse radar

    KAUST Repository

    Jardak, Seifallah

    2017-04-07

    The simultaneous lobing technique, also known as the monopulse technique, has been widely used for fast target localisation and tracking purposes. Many works have focused on accurately localising one or two targets lying within a narrow beam centred around the monopulse antenna boresight. In this study, a new approach is proposed which uses the outputs of four antennas to rapidly localise two point targets present in the hemisphere. If both targets have the same elevation angle, the proposed scheme cannot detect them; to detect such targets, a second set of antennas is required. To detect two targets at generalised locations, the antenna array is divided into multiple overlapping sets, each of four antennas. Two algorithms are proposed to combine the outputs from multiple sets and improve the detection performance. Simulation results show that the algorithm is able to localise both targets with <2° mean square error in azimuth and elevation.

  10. Visceral obesity and psychosocial stress: a generalised control theory model

    Science.gov (United States)

    Wallace, Rodrick

    2016-07-01

    The linking of control theory and information theory via the Data Rate Theorem and its generalisations allows for the construction of necessary-conditions statistical models of body mass regulation in the context of interaction with a complex dynamic environment. By focusing on the stress-related induction of central obesity via failure of HPA axis regulation, we explore implications for strategies of prevention and treatment. It rapidly becomes evident that individual-centred biomedical reductionism is an inadequate paradigm. Without mitigation of HPA axis or related dysfunctions arising from social pathologies of power imbalance, economic insecurity, and so on, it is unlikely that permanent changes in visceral obesity for individuals can be maintained without constant therapeutic effort, an expensive - and likely unsustainable - public policy.

  11. The second critical density and anisotropic generalised condensation

    Directory of Open Access Journals (Sweden)

    M. Beau

    2010-01-01

    Full Text Available In this letter we discuss the relevance of 3D Perfect Bose gas (PBG) condensation in extremely elongated vessels for the study of anisotropic condensate coherence and the "quasi-condensate". To this end we analyze the case of exponentially anisotropic (van den Berg) boxes, where there are two critical densities ρc < ρm for a generalised Bose-Einstein Condensation (BEC). Here ρc is the standard critical density for the PBG. We consider three examples of anisotropic geometry: slabs, squared beams and "cigars", to demonstrate that the "quasi-condensate" which exists in the domain ρc < ρ < ρm is in fact the van den Berg-Lewis-Pulé generalised condensation (vdBLP-GC) of type III, with no macroscopic occupation of any mode. We show that for the slab geometry the second critical density ρm is a threshold between the quasi-two-dimensional (quasi-2D) condensate and the three-dimensional (3D) regime, where the "quasi-condensate" coexists with the standard one-mode BEC. On the other hand, in the case of squared beam and "cigar" geometries, the critical density ρm separates the quasi-1D and 3D regimes. We calculate the value of the difference between ρc and ρm (and between the corresponding critical temperatures Tc and Tm) to show that the observed spatial anisotropy of the condensate coherence can be described by a critical exponent γ(T) related to the anisotropic ODLRO. We compare our calculations with physical results for extremely elongated traps that manifest a "quasi-condensate".

  12. Generalised partition functions: inferences on phase space distributions

    Directory of Open Access Journals (Sweden)

    R. A. Treumann

    2016-06-01

    Full Text Available It is demonstrated that the statistical mechanical partition function can be used to construct various different forms of phase space distributions. This indicates that its structure is not restricted to the Gibbs–Boltzmann factor prescription, which is based on counting statistics. With the widely used replacement of the Boltzmann factor by a generalised Lorentzian (also known as the q-deformed exponential function, where κ = 1∕|q − 1|, with κ, q ∈ R), both the kappa-Bose and kappa-Fermi partition functions are obtained in quite a straightforward way, from which the conventional Bose and Fermi distributions follow for κ → ∞. For κ ≠ ∞ these are subject to the restriction that they can be used only at temperatures far from zero. They thus, as shown earlier, have little value for quantum physics. This is reasonable, because physical κ systems imply strong correlations which are absent at zero temperature, where, apart from stochastics, all dynamical interactions are frozen. In the classical large-temperature limit one obtains physically reasonable κ distributions which depend on energy and momentum as well as on chemical potential. Looking for other functional dependencies, we examine whether Bessel functions can be used for obtaining valid distributions. Again, and for the same reason, no Fermi and Bose distributions exist in the low-temperature limit. However, a classical Bessel–Boltzmann distribution can be constructed, which is a Bessel-modified Lorentzian distribution. Whether it makes any physical sense remains an open question; this is not investigated here. The choice of Bessel functions is motivated solely by their convergence properties and not by reference to any physical demands. This result suggests that the Gibbs–Boltzmann partition function is fundamental not only to Gibbs–Boltzmann but also to a large class of generalised Lorentzian distributions as well as to the
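    The κ = 1∕|q − 1| relation above can be made concrete numerically. The sketch below (an illustration using the standard q-deformed exponential, not code from the record) shows the generalised-Lorentzian analogue of the Boltzmann factor converging to exp(−E/T) as κ → ∞.

```python
import numpy as np

def q_exponential(x, q):
    """q-deformed exponential exp_q(x) = [1 + (1 - q) x]^(1/(1 - q)),
    which reduces to exp(x) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    # exp_q is defined only where the base is positive; zero elsewhere.
    return np.where(base > 0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

def kappa_factor(E, T, kappa):
    """Generalised-Lorentzian analogue of the Boltzmann factor exp(-E/T),
    with q = 1 + 1/kappa so that kappa = 1/|q - 1|."""
    q = 1.0 + 1.0 / kappa
    return q_exponential(-E / T, q)

E = np.linspace(0.0, 5.0, 6)
boltzmann = np.exp(-E)
for kappa in (2.0, 10.0, 1000.0):
    deviation = np.max(np.abs(kappa_factor(E, 1.0, kappa) - boltzmann))
    print(f"kappa = {kappa:7.1f}  max deviation from exp(-E): {deviation:.2e}")
```

    The deviation shrinks monotonically with growing κ, which is the κ → ∞ limit the abstract refers to.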

  13. An anisotropic elastoplastic constitutive formulation generalised for orthotropic materials

    Science.gov (United States)

    Mohd Nor, M. K.; Ma'at, N.; Ho, C. S.

    2018-03-01

    This paper presents a finite strain constitutive model to predict the complex elastoplastic deformation behaviour that involves very high pressures and shockwaves in orthotropic materials, using an anisotropic Hill's yield criterion by means of evolving structural tensors. The yield surface of this hyperelastic-plastic constitutive model is aligned uniquely within the principal stress space due to the combination of the Mandel stress tensor and a new generalised orthotropic pressure. The formulation is developed in the isoclinic configuration and allows for a unique treatment of elastic and plastic orthotropy. An isotropic hardening is adopted to define the evolution of plastic orthotropy. An important feature of the proposed hyperelastic-plastic constitutive model is the introduction of an anisotropic effect in the Mie-Gruneisen equation of state (EOS). The formulation is further combined with the Grady spall failure model to predict spall failure in the materials. The proposed constitutive model is implemented as a new material model in the Lawrence Livermore National Laboratory (LLNL)-DYNA3D code of UTHM's version, named Material Type 92 (Mat92). The combination of the proposed stress tensor decomposition and the Mie-Gruneisen EOS requires some modifications in the code to reflect the formulation of the generalised orthotropic pressure. The validation approach is also presented in this paper for guidance purposes. The ψ tensor used to define the alignment of the adopted yield surface is first validated. This is followed by internal validations of the elastic isotropic, elastic orthotropic and elastic-plastic orthotropic parts of the proposed formulation, before a comparison against a range of plate impact test data at impact velocities of 234, 450 and 895 m/s is performed. A good agreement is obtained in each test.

  14. Distributed source term analysis, a new approach to nuclear material inventory verification

    CERN Document Server

    Beddingfield, D H

    2002-01-01

    The Distributed Source-Term Analysis (DSTA) technique is a new approach to measuring in-process material holdup that is a significant departure from traditional holdup measurement methodology. The DSTA method is a means of determining the mass of nuclear material within a large, diffuse volume using passive neutron counting. The DSTA method is a more efficient approach than traditional methods of holdup measurement and inventory verification. The time spent in performing DSTA measurement and analysis is a fraction of that required by traditional techniques. The error ascribed to a DSTA survey result is generally less than that from traditional methods. Also, the negative bias ascribed to gamma-ray methods is greatly diminished because the DSTA method uses neutrons, which are more penetrating than gamma rays.

  15. Microbial characterization for the Source-Term Waste Test Program (STTP) at Los Alamos

    Energy Technology Data Exchange (ETDEWEB)

    Leonard, P.A.; Strietelmeier, B.A.; Pansoy-Hjelvik, M.E.; Villarreal, R.

    1999-04-01

    The effects of microbial activity on the performance of the proposed underground nuclear waste repository, the Waste Isolation Pilot Plant (WIPP) at Carlsbad, New Mexico are being studied at Los Alamos National Laboratory (LANL) as part of an ex situ large-scale experiment. Actual actinide-containing waste is being used to predict the effect of potential brine inundation in the repository in the distant future. The study conditions are meant to simulate what might exist should the underground repository be flooded hundreds of years after closure as a result of inadvertent drilling into brine pockets below the repository. The Department of Energy (DOE) selected LANL to conduct the Actinide Source-Term Waste Test Program (STTP) to confirm the predictive capability of computer models being developed at Sandia National Laboratory.

  16. Low-level radioactive waste source term model development and testing: Topical report

    International Nuclear Information System (INIS)

    Sullivan, T.M.; Kempf, C.R.; Suen, C.J.; Mughabghab, S.M.

    1988-08-01

    The Low-Level Waste Source Term Evaluation Project has the objective to develop a system model capable of predicting radionuclide release rates from a shallow land burial facility. The previous topical report for this project discussed the framework and methodology for developing a system model and divided the problem into four compartments: water flow, container degradation, waste form leaching, and radionuclide transport. Each of these compartments is described by submodels which will be coupled into the system model. From February 1987 to March 1988, computer models have been selected to predict water flow (FEMWATER) and radionuclide transport (FEMWASTE) and separate models have been developed to predict pitting corrosion of steel containers and leaching from porous waste forms contained in corrodible containers. This report discusses each of the models in detail and presents results obtained from applying the models to shallow land burial trenches over a range of expected conditions. 68 refs., 34 figs., 14 tabs

  17. A Mechanistic Source Term Calculation for a Metal Fuel Sodium Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2017-06-26

    A mechanistic source term (MST) calculation attempts to realistically assess the transport and release of radionuclides from a reactor system to the environment during a specific accident sequence. The U.S. Nuclear Regulatory Commission (NRC) has repeatedly stated its expectation that advanced reactor vendors will utilize an MST during the U.S. reactor licensing process. As part of a project to examine possible impediments to sodium fast reactor (SFR) licensing in the U.S., an analysis was conducted regarding the current capabilities to perform an MST for a metal fuel SFR. The purpose of the project was to identify and prioritize any gaps in current computational tools, and the associated database, for the accurate assessment of an MST. The results of the study demonstrate that an SFR MST is possible with current tools and data, but several gaps exist that may lead to possibly unacceptable levels of uncertainty, depending on the goals of the MST analysis.

  18. Improved thermal source term generation capability for use in performance assessment and system studies

    International Nuclear Information System (INIS)

    King, J.; Rhodes, C.

    1994-01-01

    This paper describes work performed by the Civilian Radioactive Waste Management System (CRWMS) Management and Operating (M&O) Contractor to improve spent nuclear fuel (SNF) waste stream characterization for system studies. It discusses how these new capabilities may be exploited for thermal source term generation for use in repository performance assessment modeling. SNF historical discharges have been exhaustively tracked, and significant effort has gone into capturing, verifying, and electronically managing spent fuel inventory data. Future discharge projections are produced annually by the Energy Information Administration (EIA) using sophisticated computer models. The output of these models is coupled with annually updated SNF historical discharges to produce what is referred to as the "reactor database." This database and related data are published in a variety of ways, including on magnetic media, for consistent use by analysts or other interested parties

  19. Source-term Determination and Dose-rates Evaluation For RTP Fuel Transfer Cask Design

    International Nuclear Information System (INIS)

    Mohamad Hairie Rabir

    2016-01-01

    This paper describes MCNPX calculations of the gamma-ray dose rates arising from two TRIGA spent-fuel transfer cask candidates for the PUSPATI TRIGA Reactor (RTP). Calculations of radiation fields at several points along the axial length of the casks are reported. These external dose rates are reported for primary gamma rays (arising from fission and activation products in the fuel). The photon shielding calculations and the source term were evaluated with MCNPX. From the calculation, it was found that the maximum surface dose rates of the fuel transfer cask for the two candidate diameters (41 cm and 55 cm) were within the limit (2 mSv/hr). (author)

  20. Methods to prevent the source term of methyl iodide during a core melt accident

    Energy Technology Data Exchange (ETDEWEB)

    Karhu, A. [VTT Energy (Finland)

    1999-11-01

    The purpose of this literature review is to gather available information on methods to prevent a source term of methyl iodide during a core melt accident. The most widely studied methods for nuclear power plants include impregnated carbon filters and alkaline additives and sprays. It is indicated that some deficiencies of these methods may emerge. More reactive impregnants and additives could bring a great improvement. As a new method in the field of nuclear applications, the potential of transition metals to decompose methyl iodide is introduced in this review. This area would require additional research, which could elucidate the remaining questions about the reactions. The ionization of gaseous methyl iodide by corona-discharge reactors is also briefly described. (au)

  1. Microbial characterization for the Source-Term Waste Test Program (STTP) at Los Alamos

    International Nuclear Information System (INIS)

    Leonard, P.A.; Strietelmeier, B.A.; Pansoy-Hjelvik, M.E.; Villarreal, R.

    1999-01-01

    The effects of microbial activity on the performance of the proposed underground nuclear waste repository, the Waste Isolation Pilot Plant (WIPP) at Carlsbad, New Mexico are being studied at Los Alamos National Laboratory (LANL) as part of an ex situ large-scale experiment. Actual actinide-containing waste is being used to predict the effect of potential brine inundation in the repository in the distant future. The study conditions are meant to simulate what might exist should the underground repository be flooded hundreds of years after closure as a result of inadvertent drilling into brine pockets below the repository. The Department of Energy (DOE) selected LANL to conduct the Actinide Source-Term Waste Test Program (STTP) to confirm the predictive capability of computer models being developed at Sandia National Laboratory

  2. A source term estimation method for a nuclear accident using atmospheric dispersion models

    DEFF Research Database (Denmark)

    Kim, Minsik; Ohba, Ryohji; Oura, Masamichi

    2015-01-01

    The objective of this study is to develop an operational source term estimation (STE) method applicable to a nuclear accident like the incident that occurred at the Fukushima Dai-ichi nuclear power station in 2011. The new STE method presented here is based on data from atmospheric dispersion models and short-range observational data around the nuclear power plants. The accuracy of this method is validated with data from a wind tunnel study that involved a tracer gas release from a scaled model experiment at Tokai Daini nuclear power station in Japan. We then use the methodology developed and validated through the effort described in this manuscript to estimate the release rate of radioactive material from the Fukushima Dai-ichi nuclear power station.

  3. Radiological consequence evaluation of DBAs with alternative source term method for a Chinese PWR

    International Nuclear Information System (INIS)

    Li, J.X.; Cao, X.W.; Tong, L.L.; Huang, G.F.

    2012-01-01

    Highlights: ► Radiological consequence evaluation of DBAs with the alternative source term method for a Chinese 900 MWe PWR has been investigated. ► Six typical DBA sequences are analyzed. ► The doses at the control room, EAB and outer boundary of the LPZ are acceptable. ► The differences between the AST method and the TID-14844 method are investigated. - Abstract: Since a large amount of fission products may be released into the environment during accident progression in nuclear power plants (NPPs), posing a potential hazard to the public, the radiological consequences should be evaluated so that the hazard can be alleviated. In most Chinese NPPs the method of TID-14844, in which whole-body and thyroid dose criteria are employed, is currently adopted to evaluate the radiological consequences of design-basis accidents (DBAs). However, because the total effective dose equivalent is employed as the dose criterion in the alternative radiological source term (AST) method, it is necessary to evaluate the radiological consequences for DBAs with the AST method and to discuss the differences between the two methods. Using an integral safety analysis code, an analytical model of the 900 MWe pressurized water reactor (PWR) is built and the radiological consequences of DBAs at the control room (CR), exclusion area boundary (EAB) and low population zone (LPZ) are analyzed, including LOCA and non-LOCA DBAs such as the fuel handling accident (FHA), rod ejection accident (REA), main steam line break (MSLB), steam generator tube rupture (SGTR) and locked rotor accident (LRA), following the guidance of RG 1.183. The results show that the doses in the CR, EAB and LPZ are acceptable compared with the dose criteria in RG 1.183, and the differences between the AST method and the TID-14844 method are also discussed.

  4. An artificial neural network approach to reconstruct the source term of a nuclear accident

    International Nuclear Information System (INIS)

    Giles, J.; Palma, C. R.; Weller, P.

    1997-01-01

    This work makes use of one of the main features of artificial neural networks, which is their ability to 'learn' from sets of known input and output data. Indeed, a trained artificial neural network can be used to make predictions on the input data when the output is known, and this feedback process enables one to reconstruct the source term from field observations. With this aim, artificial neural networks have been trained using the projections of a segmented plume atmospheric dispersion model at fixed points, simulating a set of gamma detectors located outside the perimeter of a nuclear facility. The resulting set of artificial neural networks was used to determine the release fraction and rate for each of the noble gases, iodines and particulate fission products that could originate from a nuclear accident. Model projections were made using a large data set consisting of effective release height, release fraction of noble gases, iodines and particulate fission products, atmospheric stability, wind speed and wind direction. The model computed nuclide-specific gamma dose rates. The locations of the detectors were chosen taking into account both building shine and wake effects, and varied in distance between 800 and 1200 m from the reactor. The inputs to the artificial neural networks consisted of the measurements from the detector array, atmospheric stability, wind speed and wind direction; the outputs comprised a set of release fractions and heights. Once trained, the artificial neural networks were used to reconstruct the source term from the detector responses for data sets not used in training. The preliminary results are encouraging and show that the noble gas and particulate fission product release fractions are well determined
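    The train-on-forward-model, invert-from-observations workflow described above can be sketched in miniature. In the sketch below a random response matrix stands in for the dispersion-model projections and a least-squares linear inverse stands in for the trained neural network; all names and numbers are hypothetical, chosen only to illustrate the idea.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 8 detectors respond to 3 release fractions
# (noble gases, iodines, particulates). In the record the forward model
# is a segmented-plume dispersion code and the inverse map is a trained
# neural network; a random response matrix and a linear inverse stand in.
n_detectors, n_sources = 8, 3
A = rng.uniform(0.1, 1.0, size=(n_detectors, n_sources))  # assumed response matrix

# "Training set": forward-model projections for many known source terms,
# with a little measurement noise added to the detector readings.
X_src = rng.uniform(0.0, 1.0, size=(500, n_sources))
Y_det = X_src @ A.T + 0.01 * rng.normal(size=(500, n_detectors))

# Fit the inverse mapping: detector readings -> release fractions.
W, *_ = np.linalg.lstsq(Y_det, X_src, rcond=None)

# Reconstruct an unseen source term from its detector responses.
true_src = np.array([0.4, 0.1, 0.7])
estimate = (A @ true_src) @ W
print("true:", true_src, "estimate:", estimate)
```

    A real application would replace the linear inverse with a nonlinear network, since plume dispersion is not linear in release height or atmospheric stability.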

  5. Source term estimation for small sized HTRs: status and further needs - a german approach

    International Nuclear Information System (INIS)

    Moormann, R.; Schenk, W.; Verfondern, K.

    2000-01-01

    The main results of German studies on source term estimation for small pebble-bed HTRs, with their strict safety demands, are outlined. Core heat-up events are no longer dominant for modern high-quality fuel, but fission product transport during water ingress accidents (steam cycle plants) and depressurization is relevant, mainly due to remobilization of fission products which were plated out in the course of normal operation or became dust-borne. An important lack of knowledge was identified concerning data on plate-out under normal operation, as well as on the behaviour of dust-borne activity as a whole. Improved knowledge in this field is also important for maintenance/repair and design/shielding. For core heat-up events the influence of burn-up on temperature-induced fission product release has to be measured for future high burn-up fuel. Also, transport mechanisms out of the He circuit into the environment require further examination. For water/steam ingress events, mobilization of plated-out fission products by steam or water has to be considered in detail, along with steam interaction with kernels of particles with defective coatings. For source terms of depressurization, a more detailed knowledge of the flow pattern and shear forces on the various surfaces is necessary. In order to improve the knowledge on plate-out and dust in normal operation and to generate specimens for experimental remobilization studies, planning/design of plate-out/dust examination facilities which could be added to the next generation of HTRs (HTR10, HTTR) is proposed. For severe air ingress and reactivity accidents, the behaviour of future advanced fuel elements has to be experimentally tested. (authors)

  6. Source term and activation calculations for the new TR-FLEX cyclotron for medical applications at HZDR

    Energy Technology Data Exchange (ETDEWEB)

    Konheiser, Joerg [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany). Reactor Safety; Ferrari, A. [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany). Inst. of Radiation Physics; Magin, A. [Karlsruher Institut fuer Technologie (KIT), Karlsruhe (Germany); Naumann, B. [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany). Dept. of Radiation Protection and Safety; Mueller, S.E.

    2017-06-01

    The neutron source terms for a proton beam hitting an ¹⁸O-enriched water target were calculated with the radiation transport programs MCNP6 and FLUKA and were compared to source terms for exclusive ¹⁸O(p,n)¹⁸F production. To validate the radiation fields obtained in the simulations, an experimental program has been started using activation samples originally used in reactor dosimetry.

  7. Uncertainties in source term calculations generated by the ORIGEN2 computer code for Hanford Production Reactors

    International Nuclear Information System (INIS)

    Heeb, C.M.

    1991-03-01

    The ORIGEN2 computer code is the primary calculational tool for computing isotopic source terms for the Hanford Environmental Dose Reconstruction (HEDR) Project. The ORIGEN2 code computes the amounts of radionuclides that are created or remain in spent nuclear fuel after neutron irradiation and radioactive decay have occurred as a result of nuclear reactor operation. ORIGEN2 was chosen as the primary code for these calculations because it is widely used and accepted by the nuclear industry, both in the United States and the rest of the world. Its comprehensive library of over 1,600 nuclides includes any possible isotope of interest to the HEDR Project. It is important to evaluate the uncertainties expected from use of ORIGEN2 in the HEDR Project because these uncertainties may have a pivotal impact on the final accuracy and credibility of the results of the project. There are three primary sources of uncertainty in an ORIGEN2 calculation: basic nuclear data uncertainty in neutron cross sections, radioactive decay constants, energy per fission, and fission product yields; calculational uncertainty due to input data; and code uncertainties (i.e., numerical approximations, and neutron spectrum-averaged cross-section values from the code library). 15 refs., 5 figs., 5 tabs
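    The three uncertainty sources identified above are independent, so a first-order overall figure is conventionally obtained by combining the component relative uncertainties in quadrature. A minimal sketch (the percentage values are assumed placeholders, not HEDR results):

```python
import math

# Independent relative uncertainties combine in quadrature:
# u_total = sqrt(u1^2 + u2^2 + u3^2). Values below are illustrative only.
components = {
    "nuclear data": 0.05,   # cross sections, decay constants, yields (assumed 5 %)
    "input data": 0.03,     # irradiation history, power levels (assumed 3 %)
    "code/library": 0.02,   # numerical approximations, spectrum-averaged data (assumed 2 %)
}
total = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined relative uncertainty: {total:.4f}")  # ~0.0616
```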

  8. Long-term monitoring on environmental disasters using multi-source remote sensing technique

    Science.gov (United States)

    Kuo, Y. C.; Chen, C. F.

    2017-12-01

    Environmental disasters are extreme events within the earth's system that cause deaths and injuries to humans, as well as damage and loss of valuable assets such as buildings, communication systems, farmlands and forests. Disaster management requires a large amount of multi-temporal spatial data, and multi-source remote sensing data with different spatial, spectral and temporal resolutions are widely applied to environmental disaster monitoring. With multi-source and multi-temporal high-resolution images, we conduct rapid, systematic and continuous observations of economic damage and environmental disasters on earth, based on three monitoring platforms: remote sensing, UAS (Unmanned Aircraft Systems) and ground investigation. The advantages of UAS technology include great mobility, real-time availability and more flexible operation under varying weather conditions. The system can produce long-term spatial distribution information on environmental disasters, obtaining high-resolution remote sensing data and field verification data in key monitoring areas. It also supports the prevention and control of ocean pollution, illegally disposed waste and pine pests at different scales. Meanwhile, digital photogrammetry can be applied, using the camera interior and exterior orientation parameters, to produce Digital Surface Model (DSM) data. The latest terrain environment information is derived from the DSM data and can be used as a reference for disaster recovery in the future.

  9. Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool

    International Nuclear Information System (INIS)

    Madni, I.K.; Eltawila, F.

    1994-01-01

    MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC called ''MELCOR Verification, Benchmarking, and Applications,'' whose aim is to provide independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool. The scope of this program is to perform quality control verification on all released versions of MELCOR, to benchmark MELCOR against more mechanistic codes and experimental data from severe fuel damage tests, and to evaluate the ability of MELCOR to simulate long-term severe accident transients in commercial LWRs, by applying the code to model both BWRs and PWRs. Under this program, BNL provided input to the NRC-sponsored MELCOR Peer Review, and is currently contributing to the MELCOR Cooperative Assessment Program (MCAP). This paper presents a summary of MELCOR assessment efforts at BNL and their contribution to NRC goals with respect to MELCOR

  10. Effect of Fuel Structure Materials on Radiation Source Term in Reactor Core Meltdown

    International Nuclear Information System (INIS)

    Jeong, Hae Sun; Ha, Kwang Soon

    2014-01-01

    The fission product (radiation source) release from the reactor core into the containment must be evaluated to guarantee the safety of a Nuclear Power Plant (NPP) under a hypothetical accident involving a core meltdown. The initial core inventory is used as the starting point of all radiological consequences and affects the subsequent results of the accident assessment. Hence, a proper evaluation of the inventory can be regarded as one of the most important parts of the entire accident analysis procedure. The inventory of fission products is typically evaluated on the basis of the uranium material (e.g., UO2 and USi2) loaded in the nuclear fuel assembly, excluding structure materials such as the end fittings, grids, and some kinds of springs. However, the structure materials are continually activated by the neutrons generated from nuclear fission, and some of their nuclides (e.g., ¹⁴C and ⁶⁰Co) can significantly influence the accident assessment. During a severe core accident, the structure components can also be melted, as their melting points are relatively lower than that of the uranium material. A series of calculations was performed using the ORIGEN-S module in the SCALE 6.1 package code system. The total activity in each part of the structure materials was specifically analyzed from these calculations. The fission product inventory is generally evaluated based on the uranium material of the fuel only, even though the structure components of the assembly are continually activated by the neutrons generated from nuclear fission. In this study, the activation calculation of the fuel structure materials was performed for the initial source term assessment in a reactor core meltdown accident. As a result, the lower end fitting and the upper plenum contribute greatly to the total activity, apart from the cladding material. The nuclides ⁵⁶Mn, ⁵¹Cr, ⁵⁵Fe, ⁵⁸Co, ⁵⁴Mn, and ⁶⁰Co are found to mainly affect the activity. This result

  11. A Source Term for Wave Attenuation by Sea Ice in WAVEWATCH III(registered trademark): IC4

    Science.gov (United States)

    2017-06-07

    Naval Research Laboratory, Stennis Space Center, MS 39529-5004. NRL/MR/7320--17-9726. Approved for public release; distribution is unlimited. A Source Term for Wave Attenuation by Sea Ice in WAVEWATCH III®: IC4. Clarence O. Collins III and W. Erick Rogers, Naval Research Laboratory, Oceanography Division.

  12. Long-term Satellite Observations of Asian Dust Storm: Source, Pathway, and Interannual Variability

    Science.gov (United States)

    Hsu, N. Christina

    2008-01-01

    between Deep Blue retrievals of aerosol optical thickness and those directly from AERONET sunphotometers over desert and semi-desert regions. New Deep Blue products will allow scientists to determine quantitatively the aerosol properties near sources using high spatial resolution measurements from SeaWiFS and MODIS-like instruments. Long-term satellite measurements (1998 - 2007) from SeaWiFS will be utilized to investigate the interannual variability of source, pathway, and dust loading associated with the Asian dust storm outbreaks. In addition, monthly averaged aerosol optical thickness during the springtime from SeaWiFS will also be compared with the MODIS Deep Blue products.

  13. Classical and quantum parts in Madelung variables: Splitting the source term of the Einstein equation into classical and quantum parts

    Directory of Open Access Journals (Sweden)

    Biró T.S.

    2014-01-01

    Full Text Available Postulating a particular quantum correction to the source term in the classical Einstein equation we identify the conformal content of the above action and obtain classical gravitation for massive particles, but with a cosmological term representing off-mass-shell contribution to the energy-momentum tensor.

  14. A Generalisation, a Simplification and some Applications of Paillier's Probabilistic Public-Key System

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Jurik, Mads Johan

    2001-01-01

    We propose a generalisation of Paillier's probabilistic public key system, in which the expansion factor is reduced and which allows one to adjust the block length of the scheme even after the public key has been fixed, without losing the homomorphic property. We show that the generalisation is as s...

  15. Short-term X-ray variability of the globular cluster source 4U 1820 - 30 (NGC 6624)

    Science.gov (United States)

    Stella, L.; Kahn, S. M.; Grindlay, J. E.

    1984-01-01

    Analytical techniques for improved identification of the temporal and spectral variability properties of globular cluster and galactic bulge X-ray sources are described in terms of their application to a large set of observations of the source 4U 1820 - 30 in the globular cluster NGC 6624. The autocorrelation function, cross-correlations, time skewness function, erratic periodicities, and pulse trains are examined. The results are discussed in terms of current models with particular emphasis on recent accretion disk models. It is concluded that the analyzed observations provide the first evidence for shot-noise variability in a globular cluster X-ray source.

  16. Fission Product Transport and Source Terms in HTRs: Experience from AVR Pebble Bed Reactor

    Directory of Open Access Journals (Sweden)

    Rainer Moormann

    2008-01-01

    Full Text Available Fission products deposited in the coolant circuit outside of the active core play a dominant role in source term estimations for advanced small pebble bed HTRs, particularly in design basis accidents (DBA). The deposited fission products may be released in depressurization accidents because present pebble bed HTR concepts abstain from a gas-tight containment. Contamination of the circuit also hinders maintenance work. Experiments performed from 1972 to 1988 on the AVR, an experimental pebble bed HTR, allow for a deeper insight into fission product transport behavior. The activity deposition per coolant pass was lower than expected and was influenced by fission product chemistry and by the presence of carbonaceous dust. The latter also led to inconsistencies between Cs plate-out experiments in the laboratory and in the AVR. The deposition behavior of Ag was in line with present models. Dust as an activity carrier is of safety relevance because of its mobility and its sorption capability for fission products. All metal surfaces in pebble bed reactors were covered by a carbonaceous dust layer. Dust in the AVR was produced by abrasion in amounts of about 5 kg/y. Additional dust sources in the AVR were oil ingress and peeling of fuel element surfaces due to an air ingress. The dust has a size of about 1 μm, consists mainly of graphite, is partly remobilized by flow perturbations, and deposits with time constants of 1 to 2 hours. In future reactors, efficient filtering via a gas-tight containment is required, because accidents with fast depressurization induce dust mobilization. Enhanced core temperatures in normal operation, as in the AVR, and broken fuel pebbles have to be considered, as well as inflammable dust concentrations in the gas phase.

  17. Revision of earthquake hypocentre locations in global bulletin data sets using source-specific station terms

    Science.gov (United States)

    Nooshiri, Nima; Saul, Joachim; Heimann, Sebastian; Tilmann, Frederik; Dahm, Torsten

    2017-02-01

    Global earthquake locations are often associated with very large systematic travel-time residuals even for clear arrivals, especially for regional and near-regional stations in subduction zones because of their strongly heterogeneous velocity structure. Travel-time corrections can drastically reduce travel-time residuals at regional stations and, in consequence, improve the relative location accuracy. We have extended the shrinking-box source-specific station terms technique to regional and teleseismic distances and adopted the algorithm for probabilistic, nonlinear, global-search location. We evaluated the potential of the method to compute precise relative hypocentre locations on a global scale. The method has been applied to two specific test regions using existing P- and pP-phase picks. The first data set consists of 3103 events along the Chilean margin and the second one comprises 1680 earthquakes in the Tonga-Fiji subduction zone. Pick data were obtained from the GEOFON earthquake bulletin, produced using data from all available, global station networks. A set of timing corrections varying as a function of source position was calculated for each seismic station. In this way, we could correct the systematic errors introduced into the locations by the inaccuracies in the assumed velocity structure without explicitly solving for a velocity model. Residual statistics show that the median absolute deviation of the travel-time residuals is reduced by 40-60 per cent at regional distances, where the velocity anomalies are strong. Moreover, the spread of the travel-time residuals decreased by ˜20 per cent at teleseismic distances (>28°). Furthermore, strong variations in initial residuals as a function of recording distance are smoothed out in the final residuals. The relocated catalogues exhibit less scattered locations in depth and sharper images of the seismicity associated with the subducting slabs. Comparison with a high-resolution local catalogue reveals that
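The core of the station-term idea in this abstract can be sketched in a few lines: a station's term is a robust average (here the median) of its travel-time residuals over events near a trial source region, and subtracting that term removes the systematic bias that an inaccurate velocity model introduces. The sketch below uses synthetic data and hypothetical names, and shows a single iteration; the actual shrinking-box algorithm repeatedly relocates events and shrinks the averaging radius.

```python
import numpy as np

def station_terms(residuals, station_ids, event_ids, event_xy, center, radius):
    """Source-specific station terms: for each station, the median travel-time
    residual over picks from events inside a box of `radius` around `center`
    (one iteration of the shrinking-box scheme; the full algorithm shrinks
    the radius and relocates events between iterations)."""
    near = np.linalg.norm(event_xy[event_ids] - center, axis=1) <= radius
    terms = {}
    for sta in np.unique(station_ids):
        mask = near & (station_ids == sta)
        terms[sta] = float(np.median(residuals[mask])) if mask.any() else 0.0
    return terms

# Synthetic demo: station 1 carries a +1.5 s systematic residual (e.g. a slab
# anomaly on its ray paths); its station term recovers the bias, which is then
# subtracted from its picks before relocation.
rng = np.random.default_rng(0)
n_ev = 50
event_xy = rng.uniform(-10.0, 10.0, size=(n_ev, 2))
event_ids = np.repeat(np.arange(n_ev), 2)        # every event seen at 2 stations
station_ids = np.tile([0, 1], n_ev)
residuals = np.where(station_ids == 1, 1.5, 0.0) + rng.normal(0.0, 0.1, 2 * n_ev)

terms = station_terms(residuals, station_ids, event_ids, event_xy,
                      center=np.zeros(2), radius=20.0)
corrected = residuals - np.array([terms[s] for s in station_ids])
```

After the correction, station 1's residuals scatter around zero, which is what lets the relocation step produce the sharper, less scattered hypocentres described in the abstract.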

  18. Uncertainty analysis methods for quantification of source terms using a large computer code

    International Nuclear Information System (INIS)

    Han, Seok Jung

    1997-02-01

    Quantification of uncertainties in source term estimations by a large computer code, such as MELCOR and MAAP, is an essential process in current probabilistic safety assessments (PSAs). The main objectives of the present study are (1) to investigate the applicability of a combined procedure of the response surface method (RSM), based on input determined from a statistical design, and the Latin hypercube sampling (LHS) technique for the uncertainty analysis of CsI release fractions under a hypothetical severe accident sequence of a station blackout at the Young-Gwang nuclear power plant, using the MAAP3.0B code as a benchmark problem; and (2) to propose a new measure of uncertainty importance based on distributional sensitivity analysis. On the basis of the results obtained in the present work, the RSM is recommended as a principal tool for overall uncertainty analysis in source term quantification, while the LHS is used in the calculation of standardized regression coefficients (SRC) and standardized rank regression coefficients (SRRC) to determine the subset of the most important input parameters in the final screening step and to check the cumulative distribution functions (cdfs) obtained by the RSM. Verification of the response surface model for sufficient accuracy is a prerequisite for the reliability of the final results obtained by the combined procedure proposed in the present work. In the present study a new measure has been developed that utilizes the metric distance between cumulative distribution functions (cdfs). The measure has been evaluated for three different cases of distributions in order to assess its characteristics: in the first two cases the distributions are known analytically, while in the third the distribution is unknown. The first case uses symmetric analytical distributions; the second consists of two asymmetric distributions with non-zero skewness.
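The pairing of Latin hypercube sampling with standardized regression coefficients described in this abstract can be illustrated with a minimal sketch (a hypothetical toy model, not the MAAP benchmark): LHS stratifies each input dimension so every equal-probability bin is sampled exactly once, a first-order response surface is fitted by least squares, and the SRCs then rank input importance on a unit-free scale.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """n samples in [0, 1]^d, stratified so every dimension has exactly one
    point in each of n equal-probability bins."""
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n
    for j in range(d):
        rng.shuffle(u[:, j])      # decorrelate the strata across dimensions
    return u

def standardized_regression_coefficients(X, y):
    """Fit a first-order response surface y ~ b0 + X.b by least squares and
    return SRC_i = b_i * std(x_i) / std(y), a unit-free importance measure."""
    A = np.column_stack([np.ones(len(X)), X])
    beta = np.linalg.lstsq(A, y, rcond=None)[0][1:]
    return beta * X.std(axis=0) / y.std()

# Toy response: y depends strongly on x0 and weakly on x1.
rng = np.random.default_rng(42)
X = latin_hypercube(200, 2, rng)
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0.0, 0.05, 200)
src = standardized_regression_coefficients(X, y)   # src[0] >> src[1]
```

In the study's workflow this screening step identifies the few inputs worth carrying into the final uncertainty propagation; SRRC is the same computation applied to rank-transformed data.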

  19. Long-term change in the source contribution to surface ozone over Japan

    Science.gov (United States)

    Nagashima, Tatsuya; Sudo, Kengo; Akimoto, Hajime; Kurokawa, Junichi; Ohara, Toshimasa

    2017-07-01

    The relative contributions of various source regions to the long-term (1980-2005) increasing trend in surface ozone (O3) over Japan were estimated by a series of tracer-tagging simulations using a global chemical transport model. The model simulated the observed increasing trend in surface O3, including its seasonal variation and geographical features, in Japan well and demonstrated the relative roles of different source regions in forming this trend. Most of the increasing trend in surface O3 over Japan ( ˜ 97 %) that was simulated was explained as the sum of trends in contributions of different regions to photochemical O3 production. The increasing trend in O3 produced in China accounted for 36 % of the total increasing trend and those in the other northeast Asian regions (the Korean Peninsula, coastal regions in East Asia, and Japan) each accounted for about 12-15 %. Furthermore, the contributions of O3 created in the entire free troposphere and in western, southern, and southeastern Asian regions also increased, and their increasing trends accounted for 16 and 7 % of the total trend, respectively. The impact of interannual variations in climate, in methane concentration, and in emission of O3 precursors from different source regions on the relative contributions of O3 created in each region estimated above was also investigated. The variation of climate and the increase in methane concentration together caused the increase of photochemical O3 production in several regions, and represented about 19 % of the total increasing trend in surface O3 over Japan. The increase in emission of O3 precursors in China caused an increase of photochemical O3 production not only in China itself but also in the other northeast Asian regions and accounted for about 46 % of the total increase in surface O3 over Japan. Similarly, the relative impact of O3 precursor emission changes in the Korean Peninsula and Japan were estimated as about 16 and 4 % of the total increasing trend

  20. Major Differences of in-Containment source term behaviour between LWRs and LMFBRs

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, L. E.; Garcia, M.

    2010-07-01

    The characterization and behaviour of in-containment nuclear aerosols and fission products in case of a postulated accident is of fundamental importance for assessing the radiological consequences and for setting up filtering systems and even reactor components. Even though there are some commonalities regarding accident scenarios in different reactor types, there are major differences that would drastically affect the source term to the environment. This is particularly true when Light Water Reactors (LWRs) and Liquid Metal Fast Breeder Reactors (LMFBRs) are considered. Nuclear safety studies of LWRs have traditionally placed huge emphasis on Loss Of Coolant Accidents (LOCAs). The reason is the high pressure in the coolant system and the large number of complex phenomena involved in case of system depressurization. Given the low pressure of the LMFBR coolant system and the reactor architecture itself, particularly in Sodium Fast Reactors (SFRs), attention has been focused on highly unlikely scenarios identified as Hypothetical Core Disruptive Accidents (HCDAs). In this type of accident, sodium voiding and fuel relocation lead to high reactivity insertion with substantial energy release, leading to severe core damage, failure of the reactor pressure vessel and sodium-concrete interactions.

  1. Long-term effects of lead poisoning on bone mineralization in vultures exposed to ammunition sources

    Energy Technology Data Exchange (ETDEWEB)

    Gangoso, Laura [Department of Conservation Biology, Estacion Biologica de Donana, C.S.I.C., Avda Ma Luisa s/n, 41013 Sevilla (Spain)], E-mail: laurag@ebd.csic.es; Alvarez-Lloret, Pedro [Department of Mineralogy and Petrology, University of Granada, Avda Fuentenueva s/n, 18002 Granada (Spain)], E-mail: pedalv@ugr.es; Rodriguez-Navarro, Alejandro A.B. [Department of Mineralogy and Petrology, University of Granada, Avda Fuentenueva s/n, 18002 Granada (Spain)], E-mail: anava@ugr.es; Mateo, Rafael [Instituto de Investigacion en Recursos Cinegeticos, IREC (CSIC, UCLM, JCCM), Ronda de Toledo s/n, 13071 Ciudad Real (Spain)], E-mail: Rafael.Mateo@uclm.es; Hiraldo, Fernando [Department of Conservation Biology, Estacion Biologica de Donana, C.S.I.C., Avda Ma Luisa s/n, 41013 Sevilla (Spain)], E-mail: hiraldo@ebd.csic.es; Donazar, Jose Antonio [Department of Conservation Biology, Estacion Biologica de Donana, C.S.I.C., Avda Ma Luisa s/n, 41013 Sevilla (Spain)], E-mail: donazar@ebd.csic.es

    2009-02-15

    Long-lived species are particularly susceptible to bioaccumulation of lead in bone tissues. In this paper we gain insights into the sublethal effects of lead contamination on Egyptian vultures (Neophron percnopterus). Our approach was based on the comparison of two populations (Canary Islands and Iberian Peninsula) differing in exposure to the ingestion of lead ammunition. Blood lead levels were higher in the island population (Canary Islands range: 5.10-1780 µg L⁻¹, n = 137; Iberian Peninsula range: 5.60-217.30 µg L⁻¹, n = 32), showing clear seasonal trends, peaking during the hunting season. Moreover, males were more susceptible to lead accumulation than females. Bone lead concentration increased with age, reflecting a bioaccumulation effect. Bone composition was significantly altered by this contaminant: the mineralization degree decreased as lead concentration increased. These results demonstrate the existence of long-term effects of lead poisoning, which may be of importance in the declines of threatened populations of long-lived species exposed to this contaminant. - Bone lead accumulation decreases the degree of bone mineralization in vultures exposed to ammunition sources.

  2. Long-term effects of lead poisoning on bone mineralization in vultures exposed to ammunition sources

    International Nuclear Information System (INIS)

    Gangoso, Laura; Alvarez-Lloret, Pedro; Rodriguez-Navarro, Alejandro A.B.; Mateo, Rafael; Hiraldo, Fernando; Donazar, Jose Antonio

    2009-01-01

    Long-lived species are particularly susceptible to bioaccumulation of lead in bone tissues. In this paper we gain insights into the sublethal effects of lead contamination on Egyptian vultures (Neophron percnopterus). Our approach was based on the comparison of two populations (Canary Islands and Iberian Peninsula) differing in exposure to the ingestion of lead ammunition. Blood lead levels were higher in the island population (Canary Islands range: 5.10-1780 µg L⁻¹, n = 137; Iberian Peninsula range: 5.60-217.30 µg L⁻¹, n = 32), showing clear seasonal trends, peaking during the hunting season. Moreover, males were more susceptible to lead accumulation than females. Bone lead concentration increased with age, reflecting a bioaccumulation effect. Bone composition was significantly altered by this contaminant: the mineralization degree decreased as lead concentration increased. These results demonstrate the existence of long-term effects of lead poisoning, which may be of importance in the declines of threatened populations of long-lived species exposed to this contaminant. - Bone lead accumulation decreases the degree of bone mineralization in vultures exposed to ammunition sources

  3. Overview of plant specific source terms and their impact on risk

    International Nuclear Information System (INIS)

    Desaedeleer, G.

    2004-01-01

    Probabilistic risk assessment and safety assessment focus on systems and measures to prevent core meltdown, and they integrate many aspects of design and operation. They provide a mapping of initiating-event frequencies onto plant damage states through plant systems analysis, utilize fault tree and event tree logic models, and may include 'external event' analyses such as fire, flood, wind and seismic events. Percent contributions of sequences to the core damage frequency are shown for the following plants, taken as examples: ZION, EDISON, OCONEE 3, SEABROOK, SIZEWELL B, MILLSTONE 3, RINGHALS 2. The presentation includes a comparison of the following initiating-event frequencies: loss of off-site power; small LOCA; large LOCA; steam generator tube rupture; loss of feedwater; turbine trip; reactor trip. Consequence analysis deals with dispersion and depletion of radioactivity in the atmosphere, health effects, and factors in the off-site emergency plan, analyzed with codes that address the weather conditions; it provides a mapping of source terms and risk diagrams for early fatalities and for latent cancer fatalities
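At its simplest, the event-tree logic mentioned above reduces to summing sequence frequencies: each accident sequence contributes its initiating-event frequency multiplied by the conditional failure probabilities along its branches, and a sequence's percent contribution is its share of the total core damage frequency. A sketch with purely illustrative numbers (not taken from any of the cited plant studies):

```python
from math import prod

# Hypothetical, illustrative numbers only: an accident sequence's frequency is
# the initiating-event frequency (per reactor-year) times the conditional
# branch failure probabilities along its event-tree path; the core damage
# frequency (CDF) is the sum over sequences.
sequences = {
    "small LOCA + high-pressure injection fails":      (3e-3, [1e-3]),
    "loss of offsite power + EDGs fail + no recovery": (5e-2, [2e-2, 1e-1]),
    "turbine trip + feedwater + feed-and-bleed fail":  (1.0,  [1e-3, 1e-2]),
}

cdf = sum(f * prod(p) for f, p in sequences.values())
contributions = {name: f * prod(p) / cdf for name, (f, p) in sequences.items()}
```

Ranking `contributions` reproduces the kind of "percent contribution of sequences to the core damage frequency" tabulation the abstract refers to.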

  4. Effect of hypoiodous acid volatility on the iodine source term in reactor accidents

    Energy Technology Data Exchange (ETDEWEB)

    Routamo, T. [Imatran Voima Oy, Vantaa (Finland)

    1996-12-01

    A FORTRAN code ACT WATCH has been developed to establish an improved understanding of essential radionuclide behaviour mechanisms, especially related to iodine chemistry, in reactor accidents. The accident scenarios calculated in this paper are based on the Loss of Coolant accident at the Loviisa Nuclear Power Plant. The effect of different airborne species, especially HIO, on the iodine source term has been studied. The main cause of the high HIO release in the system modelled is the increase of the I₂ hydrolysis rate along with the temperature increase, which accelerates HIO production. Due to the high radiation level near the reactor core, I₂ is produced from I⁻ very rapidly. High temperature in the reactor coolant causes I₂ to be transformed into HIO, and through the boiling of the coolant volatile I₂ and HIO are transferred efficiently into the gas phase. High filtration efficiency for particulate iodine causes the I⁻ release to be much lower than those of I₂ and HIO. (author) 15 figs., 1 tab., refs.

  5. Probabilistic Dose Assessment from SB-LOCA Accident in Ujung Lemahabang Using TMI-2 Source Term

    Directory of Open Access Journals (Sweden)

    Sunarko

    2017-01-01

    Full Text Available Probabilistic dose assessment and mapping for a nuclear accident condition are performed for the Ujung Lemahabang site in the Muria Peninsula region in Indonesia. The source term is obtained from inverse modeling of the Three-Mile Island unit 2 (TMI-2) PWR-type SB-LOCA reactor accident. The effluent consisted of Xe-133, Kr-88, I-131, and Cs-137 released from a 50 m stack. A Lagrangian Particle Dispersion Method (LPDM) and a 3-dimensional mass-consistent wind field are employed to obtain the surface-level time-integrated air concentration and the spatial distribution of the ground-level total dose in dry conditions. Site-specific meteorological data are obtained from hourly records collected during the Site Feasibility Study period in Ujung Lemahabang. The effluent is released from a height of 50 meters at a uniform rate during a 6-hour period, and the dose is integrated over this period in a neutrally stable atmospheric condition. The maximum dose noted is below the regulatory limit of 1 mSv, and the radioactive plume spreads mostly to the W-SW inland and to the N-NE from the proposed plant towards the Java Sea. This paper has demonstrated for the first time a probabilistic analysis method for assessing possible spatial dose distribution, using a hypothetical release and a set of meteorological data for the Ujung Lemahabang region.
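A zeroth-order sketch of the Lagrangian particle idea used above: particles are advected by the mean wind plus independent Gaussian turbulent velocity increments, and the resulting cloud gives the plume's downwind drift and crosswind spread. This is a deliberately simplified random walk with made-up parameters, not the study's LPDM, which also uses a mass-consistent 3-D wind field and site meteorology.

```python
import numpy as np

def lpdm_cloud(n_particles, n_steps, dt, wind, sigma_turb, rng):
    """Minimal Lagrangian particle dispersion sketch: each particle moves with
    the mean wind plus an independent Gaussian turbulent velocity every step
    (a zeroth-order random walk; a real LPDM adds velocity memory, stability
    classes, deposition and terrain)."""
    pos = np.zeros((n_particles, 2))          # (x, y) in metres, source at origin
    for _ in range(n_steps):
        pos += (wind + rng.normal(0.0, sigma_turb, pos.shape)) * dt
    return pos

# One hour of transport in a steady (3, 1) m/s wind, 60 s time steps.
rng = np.random.default_rng(3)
cloud = lpdm_cloud(5000, 60, 60.0, wind=np.array([3.0, 1.0]),
                   sigma_turb=0.8, rng=rng)
# The cloud centroid drifts to wind * time; binning particle positions onto a
# grid would give the time-integrated air concentration field used for dose.
```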

  6. Effect of hypoiodous acid volatility on the iodine source term in reactor accidents

    International Nuclear Information System (INIS)

    Routamo, T.

    1996-01-01

    A FORTRAN code ACT WATCH has been developed to establish an improved understanding of essential radionuclide behaviour mechanisms, especially related to iodine chemistry, in reactor accidents. The accident scenarios calculated in this paper are based on the Loss of Coolant accident at the Loviisa Nuclear Power Plant. The effect of different airborne species, especially HIO, on the iodine source term has been studied. The main cause of the high HIO release in the system modelled is the increase of the I₂ hydrolysis rate along with the temperature increase, which accelerates HIO production. Due to the high radiation level near the reactor core, I₂ is produced from I⁻ very rapidly. High temperature in the reactor coolant causes I₂ to be transformed into HIO, and through the boiling of the coolant volatile I₂ and HIO are transferred efficiently into the gas phase. High filtration efficiency for particulate iodine causes the I⁻ release to be much lower than those of I₂ and HIO. (author) 15 figs., 1 tab., refs

  7. Source Term Analysis of the Irradiated Graphite in the Core of HTR-10

    Directory of Open Access Journals (Sweden)

    Xuegang Liu

    2017-01-01

    Full Text Available The high temperature gas-cooled reactor (HTGR) has potential utilization due to its featured characteristics such as inherent safety and wide diversity of utilization. One distinct difference between the HTGR and the traditional pressurized water reactor (PWR) is the large inventory of graphite in the core acting as reflector, moderator, or structural material. Some radionuclides are generated in the graphite during the period of irradiation, which play significant roles in reactor safety, environmental release, waste disposal, and so forth. Based on the actual operation of the 10 MW pebble bed high temperature gas-cooled reactor (HTR-10) at Tsinghua University, China, an experimental study on source term analysis of the irradiated graphite has been done. An irradiated graphite sphere was randomly collected from the core of HTR-10 as the sample in this study. This paper focuses on the analytical procedure and the establishment of the analytical methodology, including the sample collection, graphite sample preparation, and analytical parameters. The results reveal that Co-60, Cs-137, Eu-152, and Eu-154 are the major γ contributors, while H-3 and C-14 are the dominating β-emitting nuclides in the postirradiation graphite material of HTR-10. The distribution profiles of the above four nuclides are also presented.

  8. LMFBR source term experiments in the Fuel Aerosol Simulant Test (FAST) facility

    Energy Technology Data Exchange (ETDEWEB)

    Petrykowski, J.C.; Longest, A.W.

    1985-01-01

    The transport of uranium dioxide (UO₂) aerosol through liquid sodium was studied in a series of ten experiments in the Fuel Aerosol Simulant Test (FAST) facility at Oak Ridge National Laboratory (ORNL). The experiments were designed to provide a mechanistic basis for evaluating the radiological source term associated with a postulated, energetic core disruptive accident (CDA) in a liquid metal fast breeder reactor (LMFBR). Aerosol was generated by capacitor-discharge vaporization of UO₂ pellets which were submerged in a sodium pool under an argon cover gas. Measurements of the pool and cover gas pressures were used to study the transport of aerosol contained by vapor bubbles within the pool. Samples of cover gas were filtered to determine the quantity of aerosol released from the pool. The depth at which the aerosol was generated was found to be the most critical parameter affecting release. The largest release was observed in the baseline experiment, where the sample was vaporized above the sodium pool. In the nine 'under-sodium' experiments aerosol was generated beneath the surface of the pool at depths varying from 30 to 1060 mm. The mass of aerosol released from the pool was found to be a very small fraction of the original specimen. It appears that the bulk of the aerosol was contained by bubbles which collapsed within the pool. 18 refs., 11 figs., 4 tabs.

  9. Superstatistical generalised Langevin equation: non-Gaussian viscoelastic anomalous diffusion

    Science.gov (United States)

    Ślęzak, Jakub; Metzler, Ralf; Magdziarz, Marcin

    2018-02-01

    Recent advances in single particle tracking and supercomputing techniques demonstrate the emergence of normal or anomalous, viscoelastic diffusion in conjunction with non-Gaussian distributions in soft, biological, and active matter systems. We here formulate a stochastic model based on a generalised Langevin equation in which non-Gaussian shapes of the probability density function and normal or anomalous diffusion have a common origin, namely a random parametrisation of the stochastic force. We perform a detailed analysis demonstrating how various types of parameter distributions for the memory kernel result in exponential, power law, or power-log law tails of the memory functions. The studied system is also shown to exhibit a further unusual property: the velocity has a Gaussian one point probability density but non-Gaussian joint distributions. This behaviour is reflected in the relaxation from a Gaussian to a non-Gaussian distribution observed for the position variable. We show that our theoretical results are in excellent agreement with stochastic simulations.
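The paper's mechanism, a random parametrisation of the stochastic force producing non-Gaussian statistics from conditionally Gaussian dynamics, can be demonstrated in its simplest memoryless limit: giving each trajectory its own diffusivity drawn from an exponential distribution turns the Gaussian ensemble into a Laplace one. A minimal sketch with illustrative parameters, using white noise in place of the paper's memory kernel:

```python
import numpy as np

# Superstatistical sketch: every trajectory draws its own diffusivity D from
# an exponential distribution; conditionally on D each path is ordinary
# Gaussian Brownian motion, but the ensemble displacement PDF becomes the
# exponential mixture of Gaussians, i.e. a Laplace distribution.
rng = np.random.default_rng(1)
n_paths, n_steps, dt = 20000, 100, 0.01

D = rng.exponential(1.0, size=n_paths)                    # random parametrisation
noise = rng.normal(size=(n_paths, n_steps))
x = (np.sqrt(2.0 * D[:, None] * dt) * noise).sum(axis=1)  # displacement at t = 1

# Excess kurtosis: 0 for a Gaussian ensemble, 3 for a Laplace one.
excess_kurtosis = ((x - x.mean()) ** 4).mean() / x.var() ** 2 - 3.0
```

Each single path here has Gaussian increments, mirroring the paper's point that the one-point velocity density can be Gaussian while the ensemble (and joint) statistics are not.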

  10. Interpretation of human pointing by African elephants: generalisation and rationality.

    Science.gov (United States)

    Smet, Anna F; Byrne, Richard W

    2014-11-01

    Factors influencing the abilities of different animals to use cooperative social cues from humans are still unclear, in spite of long-standing interest in the topic. One of the few species that have been found successful at using human pointing is the African elephant (Loxodonta africana); despite few opportunities for learning about pointing, elephants follow a pointing gesture in an object-choice task, even when the pointing signal and experimenter's body position are in conflict, and when the gesture itself is visually subtle. Here, we show that the success of captive African elephants at using human pointing is not restricted to situations where the pointing signal is sustained until the time of choice: elephants followed human pointing even when the pointing gesture was withdrawn before they had responded to it. Furthermore, elephants rapidly generalised their response to a type of social cue they were unlikely to have seen before: pointing with the foot. However, unlike young children, they showed no sign of evaluating the 'rationality' of this novel pointing gesture according to its visual context: that is, whether the experimenter's hands were occupied or not.

  11. Effects of Community African Drumming on Generalised Anxiety in Teenagers

    Directory of Open Access Journals (Sweden)

    David Akombo

    2013-07-01

    Full Text Available The purpose of this study was to test the effects of community music projects (CMPs), such as after-school African drumming circles, on academic performance and generalised anxiety in adolescents. Adolescents from a junior high school (7th, 8th, and 9th graders, age range 12-14) in the State of Utah (USA) participated in the study. A one-sample t-test found a significant difference in reading scores (df = 4, p = .004). A paired-samples t-test found a significant relationship between the maths trait anxiety score pre-intervention and the total state anxiety score pre-test (df = 4, p = .033). A paired-samples t-test found a significant relationship between the reading trait anxiety score post-intervention and the total state anxiety score post-test (df = 4, p = .030). This research demonstrates the effectiveness of community music such as drumming for reducing anxiety and also for improving academic performance in adolescents. CMPs are recommended as a non-invasive intervention modality for adolescents.

  12. Generalised block bootstrap and its use in meteorology

    Directory of Open Access Journals (Sweden)

    L. Varga

    2017-06-01

    Full Text Available In an earlier paper, Rakonczai et al. (2014) emphasised the importance of investigating the effective sample size in the case of autocorrelated data. The simulations were based on the block bootstrap methodology. However, the discreteness of the usual block size did not allow for exact calculations. In this paper we propose a new generalisation of the block bootstrap methodology, which allows for any positive real number as the expected block size. We relate it to the existing optimisation procedures and apply it to a temperature data set. Our other focus is on statistical tests, where quite often the actual sample size plays an important role, even in the case of relatively large samples. This is especially the case for copulas, which are used for investigating the dependencies among data sets. As in quite a few real applications the time dependence cannot be neglected, we investigated the effect of this phenomenon on the test statistic used. The critical value can be computed by the proposed new block bootstrap simulation, where the block size is determined by fitting a VAR model to the observations. The results are illustrated for models of the temperature data used.
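One established way to realise a non-integer expected block size, in the spirit of the generalisation described above, is to draw geometric block lengths, as in the Politis-Romano stationary bootstrap. The paper's own construction differs in detail, so the sketch below is an analogy rather than its implementation:

```python
import numpy as np

def stationary_bootstrap(x, expected_block, rng):
    """One stationary-bootstrap resample (Politis & Romano 1994): block lengths
    are geometric with mean `expected_block`, so any positive real expected
    block size is admissible. Blocks wrap around the series circularly."""
    n = len(x)
    p = 1.0 / expected_block          # per-step probability of starting a new block
    idx = np.empty(n, dtype=int)
    idx[0] = rng.integers(n)
    for t in range(1, n):
        # with probability p jump to a random start, otherwise continue the block
        idx[t] = rng.integers(n) if rng.random() < p else (idx[t - 1] + 1) % n
    return x[idx]

# Resample an autocorrelated series with an expected block size of 12.5.
rng = np.random.default_rng(7)
series = np.sin(np.arange(200) / 10.0) + rng.normal(0.0, 0.2, 200)
resample = stationary_bootstrap(series, expected_block=12.5, rng=rng)
```

Repeating the resampling many times and recomputing the test statistic on each replicate yields the bootstrap critical values the abstract refers to, with the expected block size chosen, e.g., from a fitted VAR model.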

  13. Rare case of generalised aggressive periodontitis in the primary dentition.

    Science.gov (United States)

    Spoerri, A; Signorelli, C; Erb, J; van Waes, H; Schmidlin, P R

    2014-12-01

    Generalised aggressive periodontitis (AP) in the prepubescent age is an exceptionally rare disease in the primary dentition of otherwise healthy children. Characteristics of AP are gingival inflammation, deep periodontal pockets, bone loss, tooth mobility and even tooth loss. The most common way of treating this disease is the extraction of all the involved primary teeth. A 4-year-old girl presented with signs of severe gingival inflammation. Clinical examination revealed deep pockets, increased tooth mobility and bone loss. Microbiological testing revealed the presence of a typical periopathogenic flora consisting of Aggregatibacter actinomycetemcomitans and the typical members of the red complex (Porphyromonas gingivalis, Prevotella intermedia and Treponema denticola). The patient underwent tooth extraction of all primary teeth except the primary canines, followed by thorough root debridement and treatment with systemic antibiotics (amoxicillin plus metronidazole). Regular clinical and microbiological examinations over 4 years showed no signs of recurrence of a periodontitis, even in the erupted permanent teeth. Early diagnosis and consequent early treatment of aggressive periodontitis can stop the disease and therefore avoid the development of a periodontal disease in the permanent dentition. A close collaboration between specialists of different disciplines is required for a favourable outcome.

  14. Sketching the pion's valence-quark generalised parton distribution

    Directory of Open Access Journals (Sweden)

    C. Mezrag

    2015-02-01

    Full Text Available In order to learn effectively from measurements of generalised parton distributions (GPDs, it is desirable to compute them using a framework that can potentially connect empirical information with basic features of the Standard Model. We sketch an approach to such computations, based upon a rainbow-ladder (RL truncation of QCD's Dyson–Schwinger equations and exemplified via the pion's valence dressed-quark GPD, Hπv(x,ξ,t. Our analysis focuses primarily on ξ=0, although we also capitalise on the symmetry-preserving nature of the RL truncation by connecting Hπv(x,ξ=±1,t with the pion's valence-quark parton distribution amplitude. We explain that the impulse-approximation used hitherto to define the pion's valence dressed-quark GPD is generally invalid owing to omission of contributions from the gluons which bind dressed-quarks into the pion. A simple correction enables us to identify a practicable improvement to the approximation for Hπv(x,0,t, expressed as the Radon transform of a single amplitude. Therewith we obtain results for Hπv(x,0,t and the associated impact-parameter dependent distribution, qπv(x,|b→⊥|, which provide a qualitatively sound picture of the pion's dressed-quark structure at a hadronic scale. We evolve the distributions to a scale ζ=2 GeV, so as to facilitate comparisons in future with results from experiment or other nonperturbative methods.

  15. Semi-implicit and fully implicit shock-capturing methods for hyperbolic conservation laws with stiff source terms

    Science.gov (United States)

    Yee, H. C.; Shinn, Judy L.

    1987-01-01

    Some numerical aspects of finite-difference algorithms for nonlinear multidimensional hyperbolic conservation laws with stiff nonhomogeneous (source) terms are discussed. If the stiffness is entirely dominated by the source term, a semi-implicit shock-capturing method is proposed, provided that the Jacobian of the source terms possesses certain properties. The proposed semi-implicit method can be viewed as a variant of the Bussing and Murman point-implicit scheme with a more appropriate numerical dissipation for the computation of strong shock waves. However, if the stiffness is not solely dominated by the source terms, a fully implicit method would be a better choice. The situation is complicated in problems of more than one dimension, and the presence of stiff source terms further complicates the solution procedures for alternating direction implicit (ADI) methods. Several alternatives are discussed. The primary motivation for constructing these schemes was to address thermally and chemically nonequilibrium flows in the hypersonic regime. Due to the unique structure of the eigenvalues and eigenvectors for fluid flows of this type, the computation can be simplified, thus providing a more efficient solution procedure than one might have anticipated.
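The point-implicit idea can be made concrete on a scalar model problem: treat the convection term explicitly and the stiff relaxation source implicitly, so the source contributes a stabilising denominator instead of a time-step restriction. A sketch in which the model equation and parameters are illustrative, not taken from the paper:

```python
import numpy as np

def point_implicit_step(u, a, dx, dt, k, u_eq):
    """One step for u_t + a u_x = -k (u - u_eq), a > 0, periodic BCs:
    explicit first-order upwind convection plus an implicit (point-implicit)
    stiff relaxation source. Solving the implicit source locally gives
      u^{n+1} = (u^n + dt*(conv + k*u_eq)) / (1 + dt*k),
    so dt is limited by the CFL condition only, not by the stiff scale 1/k."""
    conv = -a * (u - np.roll(u, 1)) / dx
    return (u + dt * (conv + k * u_eq)) / (1.0 + dt * k)

# Stiff relaxation: k*dt = 10, far beyond the explicit stability limit
# k*dt < 2, yet the solution relaxes cleanly onto u_eq = 1.
u = np.linspace(0.0, 2.0, 50)
for _ in range(5):
    u = point_implicit_step(u, a=1.0, dx=0.1, dt=1e-3, k=1e4, u_eq=1.0)
```

The scheme in the paper does the analogous thing for systems, inverting (I - dt * dS/dU) locally at each grid point, which is why properties of the source Jacobian matter.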

  16. High-order scheme for the source-sink term in a one-dimensional water temperature model.

    Directory of Open Access Journals (Sweden)

    Zheng Jing

    Full Text Available The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation, which can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined-coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method were assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in excellent agreement with measured data.
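The two-step splitting described above can be sketched on the 1-D heat equation: advance the source-sink term first, then apply a Crank-Nicolson diffusion step. This minimal version substitutes a plain forward-Euler source update for the paper's high-order scheme and a dense solve for a tridiagonal one:

```python
import numpy as np

def cn_diffusion_step(T, alpha, dz, dt):
    """Crank-Nicolson step for T_t = alpha * T_zz with zero-flux (insulated)
    ends: solve A T_new = B T_old with r = alpha*dt/(2*dz^2)."""
    n = len(T)
    r = alpha * dt / (2.0 * dz ** 2)
    A = np.eye(n) * (1.0 + 2.0 * r)
    B = np.eye(n) * (1.0 - 2.0 * r)
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = -r
        B[i, i + 1] = B[i + 1, i] = r
    # mirror ghost nodes give the zero-flux boundary condition
    A[0, 1] = A[-1, -2] = -2.0 * r
    B[0, 1] = B[-1, -2] = 2.0 * r
    return np.linalg.solve(A, B @ T)

def split_step(T, S, alpha, dz, dt):
    """Two-step operator splitting as in the abstract: advance the source-sink
    term S (here by forward Euler, in place of the paper's high-order scheme),
    then diffuse with Crank-Nicolson."""
    return cn_diffusion_step(T + dt * S, alpha, dz, dt)

# Sanity check: a uniform column with uniform heating warms uniformly, so one
# split step leaves the profile flat and adds dt * S degrees.
T0 = np.full(20, 10.0)
T1 = split_step(T0, S=np.full(20, 2.0), alpha=1e-3, dz=0.5, dt=60.0)
```

In practice the depth-dependent source profile S(z) carries the absorbed solar radiation, which is exactly the term whose discretisation the paper improves.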

  17. Fearing shades of grey: individual differences in fear responding towards generalisation stimuli.

    Science.gov (United States)

    Arnaudova, Inna; Krypotos, Angelos-Miltiadis; Effting, Marieke; Kindt, Merel; Beckers, Tom

    2017-09-01

    Individual differences in fear generalisation have been proposed to play a role in the aetiology and/or maintenance of anxiety disorders, but few data are available to directly support that claim. The research that is available has focused mostly on generalisation of peripheral and central physiological fear responses. Far less is known about the generalisation of avoidance, the behavioural component of fear. In two experiments, we evaluated how neuroticism, a known vulnerability factor for anxiety, modulates an array of fear responses, including avoidance tendencies, towards generalisation stimuli (GS). Participants underwent differential fear conditioning, in which one conditioned stimulus (CS+) was repeatedly paired with an aversive outcome (shock; unconditioned stimulus, US), whereas another was not (CS-). Fear generalisation was observed across measures in Experiment 1 (US expectancy and evaluative ratings) and Experiment 2 (US expectancy, evaluative ratings, skin conductance, startle responses, safety behaviours), with overall highest responding to the CS+, lowest to the CS- and intermediate responding to the GSs. Neuroticism had very little impact on fear generalisation (but did affect GS recognition rates in Experiment 1), in line with the idea that fear generalisation is largely an adaptive process.

  18. Long-term trends in California mobile source emissions and ambient concentrations of black carbon and organic aerosol.

    Science.gov (United States)

    McDonald, Brian C; Goldstein, Allen H; Harley, Robert A

    2015-04-21

    A fuel-based approach is used to assess long-term trends (1970-2010) in mobile source emissions of black carbon (BC) and organic aerosol (OA, including both primary emissions and secondary formation). The main focus of this analysis is the Los Angeles Basin, where a long record of measurements is available to infer trends in ambient concentrations of BC and organic carbon (OC), with OC used here as a proxy for OA. Mobile source emissions and ambient concentrations have decreased similarly, reflecting the importance of on- and off-road engines as sources of BC and OA in urban areas. In 1970, the on-road sector accounted for ∼90% of total mobile source emissions of BC and OA (primary + secondary). Over time, as on-road engine emissions have been controlled, the relative importance of off-road sources has grown. By 2010, off-road engines were estimated to account for 37 ± 20% and 45 ± 16% of total mobile source contributions to BC and OA, respectively, in the Los Angeles area. This study highlights both the success of efforts to control on-road emission sources, and the importance of considering off-road engine and other VOC source contributions when assessing long-term emission and ambient air quality trends.

  19. Radiation Protection Aspects of Primary Water Chemistry and Source-term Management Report

    International Nuclear Information System (INIS)

    2014-04-01

    Since the beginning of the 1990s, occupational exposures at nuclear power plants have decreased substantially, reflecting the efforts of nuclear operators worldwide to reach and maintain occupational exposure as low as reasonably achievable (ALARA) in accordance with international recommendations and national regulations. These efforts have focused on both technical and organisational aspects. According to many radiation protection experts, one of the key factors in reaching this goal is the management of the primary system water chemistry and the ability to limit the dissemination of radioactivity within the system. This underlines the importance for radiation protection staff of working closely with chemistry staff (as well as operations staff), and thus of having sufficient knowledge to understand the links between chemistry and the generation of radiation fields. This report was prepared with the primary objective of providing such knowledge to 'non-chemists'. The publication focuses on three topics: water chemistry, source-term management and remediation techniques. One key objective of the report is to summarise current knowledge on these topics and to clearly address the related radiation protection issues. With that in mind, the report prepared by the EGWC was also reviewed by radiation protection experts. In order to address various designs, PWRs, VVERs, PHWRs and BWRs are covered within the document. Additionally, available information on current operating units and lessons learnt is set alongside the choices that have been made for the design of new plants. Chapter 3 of this report addresses current practices in primary chemistry management for the different designs: 'how to limit activity in the primary circuit and to minimise contamination'. General information is provided on activation, corrosion and the transport of activated materials in the primary circuit (background on radiation field generation). Primary chemistry aspects that

  20. Inverse analysis and regularisation in conditional source-term estimation modelling

    Science.gov (United States)

    Labahn, Jeffrey W.; Devaud, Cecile B.; Sipkens, Timothy A.; Daun, Kyle J.

    2014-05-01

    Conditional Source-term Estimation (CSE) obtains the conditional species mass fractions by inverting a Fredholm integral equation of the first kind. In the present work, a Bayesian framework is used to compare two different regularisation methods: zeroth-order temporal Tikhonov regularisation and first-order spatial Tikhonov regularisation. The objectives of the current study are: (i) to elucidate the ill-posedness of the inverse problem; (ii) to understand the origin of the perturbations in the data and quantify their magnitude; (iii) to quantify the uncertainty in the solution using different priors; and (iv) to determine the regularisation method best suited to this problem. A singular value decomposition shows that the current inverse problem is ill-posed. Perturbations to the data may be caused by the use of a discrete mixture fraction grid for calculating the mixture fraction PDF. The magnitude of the perturbations is estimated using a box filter and the uncertainty in the solution is determined from the width of the credible intervals. The width of the credible intervals is significantly reduced by the inclusion of a smoothing prior, and the recovered solution is in better agreement with the exact solution. The credible intervals for temporal and spatial smoothing are shown to be similar. Credible intervals for temporal smoothing depend on the solution from the previous time step, and a smooth solution is not guaranteed. For spatial smoothing, the credible intervals do not depend on a previous solution and better predict the solution at higher mixture fraction values. These characteristics make spatial smoothing a promising alternative method for recovering a solution from the CSE inversion process.
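
The ill-posedness and the effect of zeroth- and first-order Tikhonov regularisation can be illustrated on a generic discretised first-kind Fredholm equation. This is a sketch with a synthetic Gaussian kernel, not the CSE equations themselves:

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretised first-kind Fredholm equation A x = b, with A built from
# a smooth (Gaussian) kernel so its singular values decay rapidly --
# the hallmark of an ill-posed inversion.
n = 60
t = np.linspace(0.0, 1.0, n)
A = np.exp(-(t[:, None] - t[None, :]) ** 2 / 0.01) * (t[1] - t[0])

x_true = np.sin(np.pi * t)                        # smooth "exact" solution
b = A @ x_true + 1e-4 * rng.standard_normal(n)    # perturbed data

# The SVD exposes the ill-posedness: a huge condition number.
U, s, Vt = np.linalg.svd(A)
print("condition number: %.1e" % (s[0] / s[-1]))

# Naive inversion divides the noise by tiny singular values ...
x_naive = Vt.T @ ((U.T @ b) / s)

# ... whereas zeroth-order Tikhonov, min ||Ax-b||^2 + lam ||x||^2,
# damps the small singular components,
lam = 1e-6
x_tik0 = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# and first-order (smoothing) Tikhonov penalises differences instead.
D = np.diff(np.eye(n), axis=0)
x_tik1 = np.linalg.solve(A.T @ A + lam * (D.T @ D), A.T @ b)
```

Both regularised solutions stay close to the exact one while the naive inversion is swamped by amplified noise, mirroring the role of the priors discussed above.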

  1. Generalisability theory analyses of concept mapping assessment scores in a problem-based medical curriculum.

    Science.gov (United States)

    Kassab, Salah E; Fida, Mariam; Radwan, Ahmed; Hassan, Adla B; Abu-Hijleh, Marwan; O'Connor, Brian P

    2016-07-01

    In problem-based learning (PBL), students construct concept maps that integrate different concepts related to the PBL case and are guided by the learning needs generated in small-group tutorials. Although an instrument to measure students' concept maps in PBL programmes has been developed, the psychometric properties of this instrument have not yet been assessed. This study evaluated the generalisability of and sources of variance in medical students' concept map assessment scores in a PBL context. Medical students (Year 4, n = 116) were asked to construct three integrated concept maps in which the content domain of each map was to be focused on a PBL clinical case. Concept maps were independently evaluated by four raters based on five criteria: valid selection of concepts; hierarchical arrangement of concepts; degree of integration; relationship to the context of the problem, and degree of student creativity. Generalisability theory was used to compute the reliability of the concept map scores. The dependability coefficient, which indicates the reliability of scores across the measured facets for making absolute decisions, was 0.814. Students' concept map scores (universe scores) accounted for the largest proportion of total variance (47%) across all score comparisons. Rater differences accounted for 10% of total variance, and the student × rater interaction accounted for 25% of total variance. The variance attributable to differences in the content domain of the maps was negligible (2%). The remaining 16% of the variance reflected unexplained sources of error. Results from the D study suggested that a dependability level of 0.80 can be achieved by using three raters who each score two concept map domains, or by using five raters who each score only one concept map domain. This study demonstrated that concept mapping assessment scores of medical students in PBL have high reliability. Results suggested that greater improvements in dependability might be made
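
The D-study projection described above can be sketched as follows, assuming a fully crossed person × rater × domain design and lumping the unreported interaction components into the residual (an illustrative reconstruction, not the authors' analysis):

```python
def dependability(var, n_raters, n_domains):
    """Phi (dependability) coefficient for absolute decisions in a
    fully crossed person x rater x domain design, projected to a
    D study with n_raters raters and n_domains map domains."""
    abs_error = (var["rater"] / n_raters
                 + var["domain"] / n_domains
                 + var["person_x_rater"] / n_raters
                 + var["residual"] / (n_raters * n_domains))
    return var["person"] / (var["person"] + abs_error)

# Variance proportions reported above (arbitrary units).
v = {"person": 47.0, "rater": 10.0, "person_x_rater": 25.0,
     "domain": 2.0, "residual": 16.0}

print(round(dependability(v, 4, 3), 3))   # original design -> 0.814
```

With four raters each scoring three map domains this reproduces the reported dependability of 0.814; the quoted alternative designs (e.g. three raters, two domains) presumably rest on a different allocation than this fully crossed sketch.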

  2. The long-term relationships among China's energy consumption sources and adjustments to its renewable energy policy

    International Nuclear Information System (INIS)

    Zou Gaolu

    2012-01-01

    To reduce the share of coal and oil in its primary energy consumption, China promotes the development of renewable energy resources. I have analysed the long-term relationships among China's primary energy consumption sources. Changes in coal consumption lead those in the consumption of other energy sources in the long term. Coal and oil substitute for each other equally. The long-term elasticities of China's coal consumption relative to its hydroelectricity consumption were greater than one and nearly equal during the two sample periods. Therefore, increased hydroelectricity consumption did not imply a reduction in coal consumption. China holds abundant hydroelectric, wind and solar energy potential. China must prevent an excessive escalation of its economy and the resultant energy demand to realise a meaningful substitution of coal with hydroelectricity. Moreover, China must develop and use wind and solar energy sources. Natural gas can be a good substitute for coal, given its moderate price growth and affordable price levels. - Highlights: ► Coal consumption changes lead those of other energy sources in the long term. ► Coal and oil fuels substitute for each other equally. ► Increased hydroelectricity consumption has not meant lower coal consumption. ► Wind, solar and natural gas are China's promising energy sources.
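
A long-term elasticity of this kind is the slope of a log-log regression of one consumption series on another; a minimal sketch with hypothetical series (not the paper's data):

```python
import numpy as np

# Hypothetical annual consumption series (illustrative only, not the
# paper's data), both growing at roughly the same exponential rate.
hydro = np.array([60.0, 68.0, 75.0, 85.0, 96.0, 110.0, 124.0, 140.0])
coal = np.array([980.0, 1100.0, 1230.0, 1400.0, 1560.0, 1780.0, 2010.0, 2280.0])

# The long-term elasticity is the slope of log(coal) on log(hydro);
# a value near or above one means coal grows at least proportionally
# with hydro, i.e. hydro is not displacing coal.
elasticity, intercept = np.polyfit(np.log(hydro), np.log(coal), 1)
```

An elasticity near one, as here, illustrates the paper's point that rising hydroelectricity consumption need not translate into lower coal consumption.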

  3. Classical r-matrices for the generalised Chern–Simons formulation of 3d gravity

    Science.gov (United States)

    Osei, Prince K.; Schroers, Bernd J.

    2018-04-01

    We study the conditions for classical r-matrices to be compatible with the generalised Chern–Simons action for 3d gravity. Compatibility means solving the classical Yang–Baxter equations with a prescribed symmetric part for each of the real Lie algebras and bilinear pairings arising in the generalised Chern–Simons action. We give a new construction of r-matrices via a generalised complexification and derive a non-linear set of matrix equations determining the most general compatible r-matrix. We exhibit new families of solutions and show that they contain some known r-matrices for special parameter values.

  4. The development of a realistic source term for sodium-cooled fast reactors : assessment of current status and future needs.

    Energy Technology Data Exchange (ETDEWEB)

    LaChance, Jeffrey L.; Phillips, Jesse; Parma, Edward J., Jr.; Olivier, Tara Jean; Middleton, Bobby D.

    2011-06-01

    Sodium-cooled fast reactors (SFRs) continue to be proposed and designed throughout the United States and the world. Although the number of SFRs actually operating has declined substantially since the 1980s, a significant interest in advancing these types of reactor systems remains. Of the many issues associated with the development and deployment of SFRs, one of high regulatory importance is the source term to be used in the siting of the reactor. A substantial amount of modeling and experimental work has been performed over the past four decades on accident analysis, sodium coolant behavior, and radionuclide release for SFRs. The objective of this report is to aid in determining the gaps and issues related to the development of a realistic, mechanistically derived source term for SFRs. This report will allow the reader to become familiar with the severe accident source term concept and gain a broad understanding of the current status of the models and experimental work. Further, this report will allow insight into future work, in terms of both model development and experimental validation, which is necessary in order to develop a realistic source term for SFRs.

  5. Interference effects of neutral MSSM Higgs bosons with a generalised narrow-width approximation

    International Nuclear Information System (INIS)

    Fuchs, Elina

    2014-11-01

    Mixing effects in the MSSM Higgs sector can give rise to a sizeable interference between the neutral Higgs bosons. On the other hand, factorising a more complicated process into production and decay parts by means of the narrow-width approximation (NWA) simplifies the calculation. The standard NWA, however, does not account for interference terms. Therefore, we introduce a generalisation of the NWA (gNWA) which allows for a consistent treatment of interference effects between nearly mass-degenerate particles. Furthermore, we apply the gNWA at the tree and 1-loop level to an example process where the neutral Higgs bosons h and H are produced in the decay of a heavy neutralino and subsequently decay into a fermion pair. The h-H propagator mixing is found to agree well with the approximation of Breit-Wigner propagators times finite wave-function normalisation factors, both leading to a significant interference contribution. The factorisation of the interference term based on on-shell matrix elements reproduces the full interference result within a precision of better than 1% for the considered process. The gNWA also enables the inclusion of contributions beyond the 1-loop order into the most precise prediction.

  6. Generalised ballooning theory of two-dimensional tokamak modes

    Science.gov (United States)

    Abdoul, P. A.; Dickinson, D.; Roach, C. M.; Wilson, H. R.

    2018-02-01

    In this work, using solutions from a local gyrokinetic flux-tube code combined with higher order ballooning theory, a new analytical approach is developed to reconstruct the global linear mode structure with associated global mode frequency. In addition to the isolated mode (IM), which usually peaks on the outboard mid-plane, the higher order ballooning theory has also captured other types of less unstable global modes: (a) the weakly asymmetric ballooning theory (WABT) predicts a mixed mode (MM) that undergoes a small poloidal shift away from the outboard mid-plane, (b) a relatively more stable general mode (GM) balloons on the top (or bottom) of the tokamak plasma. In this paper, an analytic approach is developed to combine these disconnected analytical limits into a single generalised ballooning theory. This is used to investigate how an IM behaves under the effect of sheared toroidal flow. For small values of flow an IM initially converts into a MM where the results of WABT are recaptured, and eventually, as the flow increases, the mode asymptotically becomes a GM on the top (or bottom) of the plasma. This may be an ingredient in models for understanding why in some experimental scenarios, instead of large edge localised modes (ELMs), small ELMs are observed. Finally, our theory can have other important consequences, especially for calculations involving Reynolds stress driven intrinsic rotation through the radial asymmetry in the global mode structures. Understanding the intrinsic rotation is significant because external torque in a plasma the size of ITER is expected to be relatively low.

  7. Reducing plant radiation fields by source term reduction - tracking cobalt and antimony to their sources at Gentilly-2

    International Nuclear Information System (INIS)

    Gauthier, P.; Guzonas, D.A.

    2006-01-01

    Gentilly-2 NGS is experiencing high radiation fields in the fuelling machine vaults. These high fields make maintenance outages more expensive and their management more complicated. As part of the station refurbishment project, a task group was created to identify the cause of the high fields and make recommendations to prevent their recurrence in the second (post-refurbishment) operating cycle. To identify the root cause of the problem, the task group decided to analyse the primary heat transport system (PHTS), the fuel handling system and their inter-relation. Gentilly-2 has had to manage a problem unique (within CANDU) arising from antimony released from the main heat transport pump seals. Antimony deposits on in-core surfaces, becomes activated, and can subsequently be released, especially under oxidizing coolant conditions. It then becomes incorporated into the magnetite deposits on PHTS piping, including the steam generators and inlet feeders. Gentilly-2 has focused a great deal of effort on managing antimony over the last 15 years. As a result of these initiatives, radioantimony fields have been quite effectively managed since 1997, resulting in a decrease in their relative contribution to the total fields. The decrease in radioantimony fields highlighted the significant contribution of Co-60 activity; the high levels of both radioantimony and Co-60 differentiate Gentilly-2 from other CANDU 6 plants. Two types of Co-59 sources are present in the CANDU PHTS. High-surface-area materials such as steam generator tubes and feeder pipes contain trace concentrations of Co-59 as an impurity, which can be released by corrosion. Low-surface-area materials such as Stellites contain high concentrations of Co-59 that can be released as either corrosion or wear products. After assessing potential cobalt sources, the task group concluded that PHTS materials were not likely the origin of the high Co-60 fields. The major PHTS components identified as cobalt sources have

  8. Application of a generalisation of the Kohn variational method to the calculation of cross sections for low-energy positron-hydrogen-molecule scattering

    International Nuclear Information System (INIS)

    Armour, E.A.G.

    1984-01-01

    The phaseshift corresponding to the lowest partial wave and the associated approximation to the total cross section are calculated for low-energy positron-hydrogen-molecule scattering using a generalisation of the Kohn variational method. The trial wavefunction is expressed in terms of confocal elliptical coordinates. Except at incident positron energies below about 2 eV, reasonable agreement with experiment is obtained below the positronium formation threshold at 8.63 eV. (author)

  9. Combination of generalised neurofibromatosis (Recklinghausen's disease) and agenesia of the corpus callosum

    International Nuclear Information System (INIS)

    Toedt, C.; Hoetzinger, H.; Salbeck, R.; Beyer, H.K.

    1989-01-01

    Whereas generalised neurofibromatosis is a relatively frequent disease, its combined occurrence with agenesis of the corpus callosum is extremely rare and probably a chance coincidence. (orig.) [de

  10. Dosimetric quantities and basic data for the evaluation of generalised derived limits

    International Nuclear Information System (INIS)

    Harrison, N.T.; Simmonds, J.R.

    1980-12-01

    The procedures, dosimetric quantities and basic data to be used for the evaluation of Generalised Derived Limits (GDLs) in environmental materials and of Generalised Derived Limits for discharges to atmosphere are described. The dosimetric considerations and the appropriate intake rates for both children and adults are discussed. In most situations in the nuclear industry and in those institutions, hospitals and laboratories which use relatively small quantities of radioactive material, the Generalised Derived Limits provide convenient reference levels against which the results of environmental monitoring can be compared, and atmospheric discharges can be assessed. They are intended for application when the environmental contamination or discharge to atmosphere is less than about 5% of the Generalised Derived Limit; above this level, it will usually be necessary to undertake a more detailed site-specific assessment. (author)

  11. Generalised brain edema and brain infarct in ergotamine abuse: Visualization by CT, MR and angiography

    International Nuclear Information System (INIS)

    Toedt, C.; Hoetzinger, H.; Salbeck, R.; Beyer, H.K.

    1989-01-01

    Abuse of ergotamine can cause generalised brain edema and brain infarctions, which can be visualized by CT, MR and angiography. The cause, however, can only be established from the patient's history. (orig.) [de

  12. Emotion recognition training using composite faces generalises across identities but not all emotions.

    Science.gov (United States)

    Dalili, Michael N; Schofield-Toloza, Lawrence; Munafò, Marcus R; Penton-Voak, Ian S

    2017-08-01

    Many cognitive bias modification (CBM) tasks use facial expressions of emotion as stimuli. Some tasks use unique facial stimuli, while others use composite stimuli, given evidence that emotion is encoded prototypically. However, CBM using composite stimuli may be identity- or emotion-specific, and may not generalise to other stimuli. We investigated the generalisability of effects using composite faces in two experiments. Healthy adults in each study were randomised to one of four training conditions: two stimulus-congruent conditions, where same faces were used during all phases of the task, and two stimulus-incongruent conditions, where faces of the opposite sex (Experiment 1) or faces depicting another emotion (Experiment 2) were used after the modification phase. Our results suggested that training effects generalised across identities. However, our results indicated only partial generalisation across emotions. These findings suggest effects obtained using composite stimuli may extend beyond the stimuli used in the task but remain emotion-specific.

  13. Application of natural generalised inverse technique in reconstruction of gravity anomalies due to a fault

    Digital Repository Service at National Institute of Oceanography (India)

    Rao, M.M.M.; Murty, T.V.R.; Murthy, K.S.R.; Vasudeva, R.Y.

    has been performed to build Generalised Inverse Operator (GIO) and it is operated on the observed anomaly with reference to the calculated anomaly to update model parameters. Data and model resolution matrices are computed to check the correctness...

  14. Source term derivation and radiological safety analysis for the TRICO II research reactor in Kinshasa

    Energy Technology Data Exchange (ETDEWEB)

    Muswema, J.L., E-mail: jeremie.muswem@unikin.ac.cd [Faculty of Science, University of Kinshasa, P.O. Box 190, KIN XI (Congo, The Democratic Republic of the); Ekoko, G.B. [Faculty of Science, University of Kinshasa, P.O. Box 190, KIN XI (Congo, The Democratic Republic of the); Lukanda, V.M. [Faculty of Science, University of Kinshasa, P.O. Box 190, KIN XI (Congo, The Democratic Republic of the); Democratic Republic of the Congo' s General Atomic Energy Commission, P.O. Box AE1 (Congo, The Democratic Republic of the); Lobo, J.K.-K. [Faculty of Science, University of Kinshasa, P.O. Box 190, KIN XI (Congo, The Democratic Republic of the); Darko, E.O. [Radiation Protection Institute, Ghana Atomic Energy Commission, P.O. Box LG 80, Legon, Accra (Ghana); Boafo, E.K. [University of Ontario Institute of Technology, 2000 Simcoe St. North, Oshawa, ONL1 H7K4 (Canada)

    2015-01-15

    Highlights: • Atmospheric dispersion modeling for two credible accidents of the TRIGA Mark II research reactor in Kinshasa (TRICO II) was performed. • Radiological safety analysis after the postulated initiating events (PIE) was also carried out. • The Karlsruhe KORIGEN and the HotSpot Health Physics codes were used to achieve the objectives of this study. • All the values of effective dose obtained following the accident scenarios were below the regulatory limits for reactor staff members and the public, respectively. - Abstract: The source term from the 1 MW TRIGA Mark II research reactor core of the Democratic Republic of the Congo was derived in this study. Atmospheric dispersion modeling followed by radiation dose calculations was performed for two postulated accident scenarios. The source term was derived from an inventory of peak radioisotope activities in the core, computed with the Karlsruhe version of the isotope generation code KORIGEN. The atmospheric dispersion modeling was performed with the HotSpot code, which yielded the radiation dose profile around the site using meteorological parameters specific to the area under study. The two accident scenarios were selected from accident analyses for TRIGA and TRIGA-fueled reactors: destruction of the fuel element with the highest activity release, and a plane crash on the reactor building as the worst-case scenario. The deterministic effects of these scenarios are used to update the Safety Analysis Report (SAR) of the reactor; in its current version, these scenarios are not yet incorporated. Site-specific meteorological conditions were collected from two meteorological stations: one installed within the Atomic Energy Commission and another at the National Meteorological Agency (METTELSAT), not far from the site. Results show that in both accident scenarios, radiation doses remain within the limits, far below the recommended maximum effective
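
For illustration, the centreline ground-level dilution factor χ/Q of a generic Gaussian plume with ground reflection can be computed as below. This is a textbook sketch with hypothetical numbers, not HotSpot's actual implementation:

```python
import math

def ground_level_chi_q(sigma_y, sigma_z, u, h):
    """Centreline ground-level dilution factor chi/Q (s/m^3) for a
    continuous point release: Gaussian plume with full ground
    reflection, effective release height h (m) and wind speed u (m/s);
    sigma_y, sigma_z (m) are the dispersion parameters at the
    downwind distance of interest."""
    return (1.0 / (math.pi * sigma_y * sigma_z * u)
            * math.exp(-h ** 2 / (2.0 * sigma_z ** 2)))

# Hypothetical numbers only: sigmas roughly representative of a few
# hundred metres downwind in near-neutral conditions.
chi_q = ground_level_chi_q(sigma_y=36.0, sigma_z=18.3, u=2.0, h=10.0)
release_rate = 1.0e9                  # Bq/s, hypothetical
concentration = release_rate * chi_q  # Bq/m^3 at the receptor
```

Multiplying the air concentration by a breathing rate and an inhalation dose coefficient then gives the dose rate at the receptor, which is the kind of quantity compared against regulatory limits above.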

  15. Source term derivation and radiological safety analysis for the TRICO II research reactor in Kinshasa

    International Nuclear Information System (INIS)

    Muswema, J.L.; Ekoko, G.B.; Lukanda, V.M.; Lobo, J.K.-K.; Darko, E.O.; Boafo, E.K.

    2015-01-01

    Highlights: • Atmospheric dispersion modeling for two credible accidents of the TRIGA Mark II research reactor in Kinshasa (TRICO II) was performed. • Radiological safety analysis after the postulated initiating events (PIE) was also carried out. • The Karlsruhe KORIGEN and the HotSpot Health Physics codes were used to achieve the objectives of this study. • All the values of effective dose obtained following the accident scenarios were below the regulatory limits for reactor staff members and the public, respectively. - Abstract: The source term from the 1 MW TRIGA Mark II research reactor core of the Democratic Republic of the Congo was derived in this study. Atmospheric dispersion modeling followed by radiation dose calculations was performed for two postulated accident scenarios. The source term was derived from an inventory of peak radioisotope activities in the core, computed with the Karlsruhe version of the isotope generation code KORIGEN. The atmospheric dispersion modeling was performed with the HotSpot code, which yielded the radiation dose profile around the site using meteorological parameters specific to the area under study. The two accident scenarios were selected from accident analyses for TRIGA and TRIGA-fueled reactors: destruction of the fuel element with the highest activity release, and a plane crash on the reactor building as the worst-case scenario. The deterministic effects of these scenarios are used to update the Safety Analysis Report (SAR) of the reactor; in its current version, these scenarios are not yet incorporated. Site-specific meteorological conditions were collected from two meteorological stations: one installed within the Atomic Energy Commission and another at the National Meteorological Agency (METTELSAT), not far from the site. Results show that in both accident scenarios, radiation doses remain within the limits, far below the recommended maximum effective

  16. Determination of a source term for a time fractional diffusion equation with an integral type over-determining condition

    Directory of Open Access Journals (Sweden)

    Timurkhan S. Aleroev

    2013-12-01

    Full Text Available We consider a linear heat equation involving a fractional derivative in time, with a nonlocal boundary condition. We determine a source term independent of the space variable, and the temperature distribution for a problem with an over-determining condition of integral type. We prove the existence and uniqueness of the solution, and its continuous dependence on the data.

  17. Nonradioactive Environmental Emissions Chemical Source Term for the Double Shell Tank (DST) Vapor Space During Waste Retrieval Operations

    Energy Technology Data Exchange (ETDEWEB)

    MAY, T.H.

    2000-04-21

    A nonradioactive chemical vapor space source term for tanks on the Phase 1 and the extended Phase 1 delivery, storage, and disposal mission was determined. Operations modeled included mixer pump operation and DST waste transfers. Concentrations of ammonia, specific volatile organic compounds, and quantitative volumes of aerosols were estimated.

  18. Use of WIMS-E lattice code for prediction of the transuranic source term for spent fuel dose estimation

    International Nuclear Information System (INIS)

    Schwinkendorf, K.N.

    1996-01-01

    A recent source term analysis has shown a discrepancy between ORIGEN2 transuranic isotopic production estimates and those produced with the WIMS-E lattice physics code. Excellent agreement between relevant experimental measurements and WIMS-E was shown, thus exposing an error in the cross section library used by ORIGEN2

  19. Diagnosis and prognosis of the source term by the French Safety Institut during an emergency on a PWR

    International Nuclear Information System (INIS)

    Chauliac, C.; Janot, L.; Jouzier, A.; Rague, B.

    1992-01-01

    The French approach for the diagnosis and the prognosis of the source term during an accident on a PWR is presented and the tools which have been developed to implement this approach at the Institute for Nuclear Protection and Safety (IPSN) are described. (author). 2 refs, 3 figs

  20. Bayesian Inference for Source Term Estimation: Application to the International Monitoring System Radionuclide Network

    Science.gov (United States)

    2014-10-01

    Laboratories (CRL) medical isotope production facility. The sampling of the resulting posterior distribution of the source parameters is undertaken... The International Monitoring System radionuclide network was used for Case 2; the location of the Xe-133 tracer source was at Chalk River Laboratories... The International Monitoring System (IMS) consists of a comprehensive network of seismic, hydroacoustic, infrasound, and...
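
A minimal sketch of the Bayesian approach, with a toy linear forward model and a random-walk Metropolis sampler standing in for the actual IMS inversion (all numbers hypothetical):

```python
import math
import random

random.seed(1)

# Toy forward model: the activity measured at three stations is the
# source strength q (arbitrary units) times known dilution factors.
dilution = [2.0e-3, 5.0e-4, 1.0e-4]
q_true = 40.0
sigma = 0.005                          # measurement noise (same units)
obs = [q_true * d + random.gauss(0.0, sigma) for d in dilution]

def log_post(q):
    """Log-posterior: flat prior on q > 0, Gaussian likelihood."""
    if q <= 0.0:
        return -math.inf
    return -0.5 * sum((o - q * d) ** 2 for o, d in zip(obs, dilution)) / sigma ** 2

# Random-walk Metropolis sampling of the posterior over q.
samples, q, lp = [], 10.0, log_post(10.0)
for _ in range(20000):
    prop = q + random.gauss(0.0, 2.0)
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:
        q, lp = prop, lp_prop
    samples.append(q)

post = samples[5000:]                  # discard burn-in
mean_q = sum(post) / len(post)
```

The retained samples approximate the posterior over the source strength; in the real application the parameter vector also includes source location and release time, and the forward model is an atmospheric transport calculation rather than fixed dilution factors.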

  1. Benchmarking the New RESRAD-OFFSITE Source Term Model with DUST-MS and GoldSim - 13377

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, J.J.; Kamboj, S.; Gnanapragasam, E.; Yu, C. [Argonne National Laboratory, Argonne, IL 60439 (United States)

    2013-07-01

    RESRAD-OFFSITE is a computer code developed by Argonne National Laboratory under the sponsorship of U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC). It is designed on the basis of RESRAD (onsite) code, a computer code designated by DOE and NRC for evaluating soil-contaminated sites for compliance with human health protection requirements pertaining to license termination or environmental remediation. RESRAD-OFFSITE has enhanced capabilities of modeling radionuclide transport to offsite locations and calculating potential radiation exposure to offsite receptors. Recently, a new source term model was incorporated into RESRAD-OFFSITE to enhance its capability further. This new source term model allows simulation of radionuclide releases from different waste forms, in addition to the soil sources originally considered in RESRAD (onsite) and RESRAD-OFFSITE codes. With this new source term model, a variety of applications can be achieved by using RESRAD-OFFSITE, including but not limited to, assessing the performance of radioactive waste disposal facilities. This paper presents the comparison of radionuclide release rates calculated by the new source term model of RESRAD-OFFSITE versus those calculated by DUST-MS and GoldSim, respectively. The focus of comparison is on the release rates of radionuclides from the bottom of the contaminated zone that was assumed to contain radioactive source materials buried in soil. The transport of released contaminants outside of the primary contaminated zone is beyond the scope of this paper. Overall, the agreement between the RESRAD-OFFSITE results and the DUST-MS and GoldSim results is fairly good, with all three codes predicting identical or similar radionuclide release profiles over time. Numerical dispersion in the DUST-MS and GoldSim results was identified as potentially contributing to the disagreement in the release rates. In general, greater discrepancy in the release rates was found for short

  2. Benchmarking the New RESRAD-OFFSITE Source Term Model with DUST-MS and GoldSim - 13377

    International Nuclear Information System (INIS)

    Cheng, J.J.; Kamboj, S.; Gnanapragasam, E.; Yu, C.

    2013-01-01

    RESRAD-OFFSITE is a computer code developed by Argonne National Laboratory under the sponsorship of U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC). It is designed on the basis of RESRAD (onsite) code, a computer code designated by DOE and NRC for evaluating soil-contaminated sites for compliance with human health protection requirements pertaining to license termination or environmental remediation. RESRAD-OFFSITE has enhanced capabilities of modeling radionuclide transport to offsite locations and calculating potential radiation exposure to offsite receptors. Recently, a new source term model was incorporated into RESRAD-OFFSITE to enhance its capability further. This new source term model allows simulation of radionuclide releases from different waste forms, in addition to the soil sources originally considered in RESRAD (onsite) and RESRAD-OFFSITE codes. With this new source term model, a variety of applications can be achieved by using RESRAD-OFFSITE, including but not limited to, assessing the performance of radioactive waste disposal facilities. This paper presents the comparison of radionuclide release rates calculated by the new source term model of RESRAD-OFFSITE versus those calculated by DUST-MS and GoldSim, respectively. The focus of comparison is on the release rates of radionuclides from the bottom of the contaminated zone that was assumed to contain radioactive source materials buried in soil. The transport of released contaminants outside of the primary contaminated zone is beyond the scope of this paper. Overall, the agreement between the RESRAD-OFFSITE results and the DUST-MS and GoldSim results is fairly good, with all three codes predicting identical or similar radionuclide release profiles over time. Numerical dispersion in the DUST-MS and GoldSim results was identified as potentially contributing to the disagreement in the release rates. In general, greater discrepancy in the release rates was found for short

  3. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and in the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman to speak without bias and prejudice for the public good; technical jargon with unclear definitions within the radioactive nomenclature; and a scientific community that keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Finally, there are numerous sources of disinformation regarding low-level and high-level radiation; the elusive nature of the scientific community; the federal and state health agencies' resources to address comparative risk; and regulatory agencies speaking out without the support of the scientific community

  4. Toward a Mechanistic Source Term in Advanced Reactors: Characterization of Radionuclide Transport and Retention in a Sodium Cooled Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Brunett, Acacia J.; Bucknor, Matthew; Grabaskas, David

    2016-04-17

    A vital component of the U.S. reactor licensing process is an integrated safety analysis in which a source term representing the release of radionuclides during normal operation and accident sequences is analyzed. Historically, source term analyses have utilized bounding, deterministic assumptions regarding radionuclide release. However, advancements in technical capabilities and the knowledge state have enabled the development of more realistic and best-estimate retention and release models such that a mechanistic source term assessment can be expected to be a required component of future licensing of advanced reactors. Recently, as part of a Regulatory Technology Development Plan effort for sodium cooled fast reactors (SFRs), Argonne National Laboratory has investigated the current state of knowledge of potential source terms in an SFR via an extensive review of previous domestic experiments, accidents, and operation. As part of this work, the significant sources and transport processes of radionuclides in an SFR have been identified and characterized. This effort examines all stages of release and source term evolution, beginning with release from the fuel pin and ending with retention in containment. Radionuclide sources considered in this effort include releases originating both in-vessel (e.g. in-core fuel, primary sodium, cover gas cleanup system, etc.) and ex-vessel (e.g. spent fuel storage, handling, and movement). Releases resulting from a primary sodium fire are also considered as a potential source. For each release group, dominant transport phenomena are identified and qualitatively discussed. The key product of this effort was the development of concise, inclusive diagrams that illustrate the release and retention mechanisms at a high level, where unique schematics have been developed for in-vessel, ex-vessel and sodium fire releases. This review effort has also found that despite the substantial range of phenomena affecting radionuclide release, the

  5. Extending the Generalised Pareto Distribution for Novelty Detection in High-Dimensional Spaces.

    Science.gov (United States)

    Clifton, David A; Clifton, Lei; Hugueny, Samuel; Tarassenko, Lionel

    2014-01-01

    Novelty detection involves the construction of a "model of normality", followed by classification of test data as either "normal" or "abnormal" with respect to that model. For this reason, it is often termed one-class classification. The approach is suitable for cases in which examples of "normal" behaviour are commonly available, but in which cases of "abnormal" data are comparatively rare. When performing novelty detection, we are typically most interested in the tails of the normal model, because it is in these tails that a decision boundary between "normal" and "abnormal" areas of data space usually lies. Extreme value statistics provides an appropriate theoretical framework for modelling the tails of univariate (or low-dimensional) distributions, using the generalised Pareto distribution (GPD), which can be demonstrated to be the limiting distribution for data occurring within the tails of most practically-encountered probability distributions. This paper provides an extension of the GPD, allowing the modelling of probability distributions of arbitrarily high dimension, such as occurs when using complex, multimodal, multivariate distributions for performing novelty detection in most real-life cases. We demonstrate our extension to the GPD using examples from patient physiological monitoring, in which we have acquired data from hospital patients in large clinical studies of high-acuity wards, and in which we wish to determine "abnormal" patient data, such that early warning of patient physiological deterioration may be provided.
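
    The peaks-over-threshold idea underlying the GPD tail model can be sketched in a few lines. The following is a univariate illustration only (the paper's contribution is precisely the extension beyond this setting), using simple method-of-moments parameter estimates; all names, thresholds, and numbers are illustrative assumptions, not taken from the paper.

```python
import math
import random
import statistics

# Univariate peaks-over-threshold sketch: fit a GPD to the upper tail of
# "normal" training data and score test points by their tail probability.
random.seed(0)
train = sorted(random.gauss(0.0, 1.0) for _ in range(5000))
u = train[int(0.95 * len(train))]            # 95th-percentile tail threshold
excess = [x - u for x in train if x > u]     # exceedances over the threshold

# Method-of-moments estimates of the GPD shape (xi) and scale (sigma)
m = statistics.mean(excess)
v = statistics.variance(excess)
xi = 0.5 * (1.0 - m * m / v)
sigma = 0.5 * m * (1.0 + m * m / v)

def tail_probability(x):
    """Approximate P(X > x); small values flag x as a novelty."""
    if x <= u:
        return 1.0                           # below the modelled tail
    y = x - u
    if abs(xi) < 1e-12:                      # exponential limit of the GPD
        return 0.05 * math.exp(-y / sigma)
    base = 1.0 + xi * y / sigma
    if base <= 0.0:
        return 0.0                           # beyond the finite upper endpoint
    return 0.05 * base ** (-1.0 / xi)
```

    The factor 0.05 is the empirical probability of exceeding the threshold u; in the paper's setting, this univariate score is exactly what the proposed high-dimensional extension generalises.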

  6. Generalised joint hypermobility and neurodevelopmental traits in a non-clinical adult population.

    Science.gov (United States)

    Glans, Martin; Bejerot, Susanne; Humble, Mats B

    2017-09-01

    Generalised joint hypermobility (GJH) is reportedly overrepresented among clinical cases of attention deficit/hyperactivity disorder (ADHD), autism spectrum disorder (ASD) and developmental coordination disorder (DCD). It is unknown if these associations are dimensional and, therefore, also relevant among non-clinical populations. To investigate if GJH correlates with sub-syndromal neurodevelopmental symptoms in a normal population. Hakim-Grahame's 5-part questionnaire (5PQ) on GJH, neuropsychiatric screening scales measuring ADHD and ASD traits, and a DCD-related question concerning clumsiness were distributed to a non-clinical, adult, Swedish population (n = 1039). In total, 887 individuals met our entry criteria. We found no associations between GJH and sub-syndromal symptoms of ADHD, ASD or DCD. Although GJH is overrepresented in clinical cases with neurodevelopmental disorders, such an association seems absent in a normal population. Thus, if GJH serves as a biomarker cutting across diagnostic boundaries, this association is presumably limited to clinical populations. None. © The Royal College of Psychiatrists 2017. This is an open access article distributed under the terms of the Creative Commons Non-Commercial, No Derivatives (CC BY-NC-ND) license.

  7. Spud and FLML: generalising and automating the user interfaces of scientific computer models

    Science.gov (United States)

    Ham, D. A.; Farrell, P. E.; Maddison, J. R.; Gorman, G. J.; Wilson, C. R.; Kramer, S. C.; Shipton, J.; Collins, G. S.; Cotter, C. J.; Piggott, M. D.

    2009-04-01

    The interfaces by which users specify the scenarios to be simulated by scientific computer models are frequently primitive, under-documented and ad-hoc text files which make using the model in question difficult and error-prone and significantly increase the development cost of the model. We present a model-independent system, Spud [1], which formalises the specification of model input formats in terms of formal grammars. This is combined with an automatically generated graphical user interface which guides users to create valid model inputs based on the grammar provided, and a generic options reading module which minimises the development cost of adding model options. We further present FLML, the Fluidity Markup Language. FLML applies Spud to the Imperial College Ocean Model (ICOM) resulting in a graphically driven system which radically improves the usability of ICOM. As well as a step forward for ICOM, FLML illustrates how the Spud system can be applied to an existing complex ocean model, highlighting the potential of Spud as a user interface for other codes in the ocean modelling community. [1] Ham, D. A., et al., Spud 1.0: generalising and automating the user interfaces of scientific computer models, Geosci. Model Dev. Discuss., 1, 125-146, 2008.

  8. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

    In this paper, we study simplified models of an ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed under different output service schemes.

  9. DNA evolutionary algorithm (DNAEA) for source term identification in convection-diffusion equation

    International Nuclear Information System (INIS)

    Yang, X-H; Hu, X-X; Shen, Z-Y

    2008-01-01

    In this paper, the source identification problem is recast as an optimization problem. This is a complicated nonlinear optimization problem that is intractable with traditional optimization methods, so a DNA evolutionary algorithm (DNAEA) is presented to solve it. In this algorithm, an initial population is generated by a chaos algorithm. As the search range shrinks, DNAEA gradually converges to an optimal result through the excellent individuals it obtains. The position and intensity of the pollution source are accurately identified by DNAEA. Compared with a Gray-coded genetic algorithm and a pure random search algorithm, DNAEA has faster convergence and higher calculation precision
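
    The shrinking-range search described above can be sketched generically. The following is a minimal illustration, not the DNAEA itself: an elitist random search over a toy 1-D exponential "plume", seeded by a logistic-map (chaotic) initial population, echoing the abstract's two ingredients. The plume model, sensor positions, bounds, and all identifiers are illustrative assumptions, not taken from the paper.

```python
import math
import random

def plume(pos, strength, x):
    """Toy 1-D steady-state concentration from a point source (illustrative)."""
    return strength * math.exp(-abs(x - pos))

# Synthetic sensor readings from a "true" source at x = 3.0, intensity 2.0
sensors = [0.0, 1.0, 2.0, 4.0, 5.0]
observed = [plume(3.0, 2.0, x) for x in sensors]

def misfit(cand):
    """Sum-of-squares mismatch between modelled and observed concentrations."""
    pos, s = cand
    return sum((plume(pos, s, x) - c) ** 2 for x, c in zip(sensors, observed))

def chaotic_population(n, lo, hi, z=0.37):
    """Initial candidates from the logistic map, standing in for the
    chaos-based initialisation mentioned in the abstract."""
    pop = []
    for _ in range(n):
        point = []
        for d in range(2):
            z = 4.0 * z * (1.0 - z)          # logistic map, chaotic at r = 4
            point.append(lo[d] + z * (hi[d] - lo[d]))
        pop.append(tuple(point))
    return pop

random.seed(1)
lo, hi = (0.0, 0.1), (6.0, 5.0)              # bounds on (position, intensity)
best = min(chaotic_population(40, lo, hi), key=misfit)
width = [hi[d] - lo[d] for d in range(2)]
for _ in range(100):
    width = [w * 0.95 for w in width]        # shrink the search range
    cands = [tuple(best[d] + random.uniform(-0.5, 0.5) * width[d]
                   for d in range(2)) for _ in range(40)]
    best = min(cands + [best], key=misfit)   # elitist selection

pos_est, strength_est = best                 # recovered source parameters
```

    Keeping the incumbent best while the sampling window contracts is what makes the shrinking range safe: the search can only improve, and late generations refine the estimate locally.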

  10. Short-term power sources for tokamaks and other physical experiments

    Czech Academy of Sciences Publication Activity Database

    Zajac, Jaromír; Žáček, František; Brettschneider, Zbyněk; Lejsek, V.

    2007-01-01

    Roč. 82, č. 4 (2007), s. 369-379 ISSN 0920-3796 Institutional research plan: CEZ:AV0Z20430508 Keywords : Tokamak * Impulse power sources * Energy accumulation Subject RIV: JA - Electronics ; Optoelectronics, Electrical Engineering Impact factor: 1.058, year: 2007 http://www.sciencedirect.com/science/journal/09203796

  11. Measurement and apportionment of radon source terms for modeling indoor environments

    International Nuclear Information System (INIS)

    Harley, N.H.

    1990-01-01

    This research has two main goals: (1) to quantify mechanisms for radon entry into homes of different types and to determine the fraction of indoor radon attributable to each source, and (2) to model and calculate the dose (and therefore alpha particle fluence) to cells in the human and animal tracheobronchial tree that is pertinent to the induction of bronchogenic carcinoma from inhaled radon daughters

  12. private placements as sources of long term funds for publicly quoted

    African Journals Online (AJOL)

    USER

    (proportion of profit that is not spent). External sources of funds include debt or equity financing or both. For instance, when firms need to raise capital they may choose to sell (or float) new securities. These new issues of stocks, bonds, or other hybrid securities typically, are marketed to the public by investment bankers on.

  13. PARTITION: A program for defining the source term/consequence analysis interface in the NUREG--1150 probabilistic risk assessments

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.; Johnson, J.D.

    1990-05-01

    The individual plant analyses in the US Nuclear Regulatory Commission's reassessment of the risk from commercial nuclear power plants (NUREG-1150) consist of four parts: systems analysis, accident progression analysis, source term analysis, and consequence analysis. Careful definition of the interfaces between these parts is necessary for both information flow and computational efficiency. This document has been designed for users of the PARTITION computer program developed by the authors at Sandia National Laboratories for defining the interface between the source term analysis (performed with the XXSOR programs) and the consequence analysis (performed with the MACCS program). This report provides a tutorial that details how the interactive partitioning is performed, along with detailed information on the partitioning process. The PARTITION program was written in ANSI standard FORTRAN 77 to make the code as machine-independent (i.e., portable) as possible. 9 refs., 4 figs

  14. Geometric discretization of the multidimensional Dirac delta distribution - Application to the Poisson equation with singular source terms

    Science.gov (United States)

    Egan, Raphael; Gibou, Frédéric

    2017-10-01

    We present a discretization method for the multidimensional Dirac distribution. We show its applicability in the context of integration problems, and for discretizing Dirac-distributed source terms in Poisson equations with constant or variable diffusion coefficients. The discretization is cell-based and can thus be applied in a straightforward fashion to Quadtree/Octree grids. The method produces second-order accurate results for integration. Superlinear convergence is observed when it is used to model Dirac-distributed source terms in Poisson equations: the observed order of convergence is 2 or slightly smaller. The method is consistent with the discretization of Dirac delta distribution for codimension one surfaces presented in [1,2]. We present Quadtree/Octree construction procedures to preserve convergence and present various numerical examples, including multi-scale problems that are intractable with uniform grids.
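
    A much simpler 1-D cousin of such a discretization illustrates the moment-preservation idea: spread the delta's unit weight over the two neighbouring grid nodes so that the zeroth and first moments are reproduced exactly. This linear "hat" regularisation is only an illustrative sketch, not the paper's cell-based Quadtree/Octree scheme; the grid size, spacing, and source location below are arbitrary.

```python
# Discretise delta(x - x0) on a uniform 1-D grid of spacing h by splitting
# its unit weight between the two neighbouring nodes. The resulting nodal
# values can serve as a right-hand side for a discrete Poisson solve.
n, h, x0 = 11, 0.1, 0.537
weights = [0.0] * n
i = int(x0 / h)                   # index of the left neighbouring node
t = (x0 - i * h) / h              # fractional position within the cell
weights[i] = (1.0 - t) / h        # nodal values approximating the delta
weights[i + 1] = t / h

# Discrete moments: the construction preserves both exactly.
total = sum(w * h for w in weights)                              # = 1
centroid = sum(w * h * (k * h) for k, w in enumerate(weights))   # = x0
```

    Preserving the zeroth moment keeps the total source strength correct; preserving the first moment keeps its location correct, which is the property higher-order schemes like the paper's build on.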

  15. Development of Coupled Interface System between the FADAS Code and a Source-term Evaluation Code XSOR for CANDU Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Son, Han Seong; Song, Deok Yong [ENESYS, Taejon (Korea, Republic of); Kim, Ma Woong; Shin, Hyeong Ki; Lee, Sang Kyu; Kim, Hyun Koon [Korea Institute of Nuclear Safety, Taejon (Korea, Republic of)

    2006-07-01

    An accident prevention system is essential to the industrial security of the nuclear industry; a more effective accident prevention system will help promote a safety culture as well as win public acceptance for the nuclear power industry. The FADAS (Following Accident Dose Assessment System), part of the Computerized Advisory System for a Radiological Emergency (CARE) at KINS, is used for protection against nuclear accidents. To make the FADAS system more effective for CANDU reactors, it is necessary to develop various accident scenarios and a reliable database of source terms. This study introduces the construction of a coupled interface system between FADAS and a source-term evaluation code, aimed at improving the applicability of the CANDU Integrated Safety Analysis System (CISAS) for CANDU reactors.

  16. Development of Coupled Interface System between the FADAS Code and a Source-term Evaluation Code XSOR for CANDU Reactors

    International Nuclear Information System (INIS)

    Son, Han Seong; Song, Deok Yong; Kim, Ma Woong; Shin, Hyeong Ki; Lee, Sang Kyu; Kim, Hyun Koon

    2006-01-01

    An accident prevention system is essential to the industrial security of the nuclear industry; a more effective accident prevention system will help promote a safety culture as well as win public acceptance for the nuclear power industry. The FADAS (Following Accident Dose Assessment System), part of the Computerized Advisory System for a Radiological Emergency (CARE) at KINS, is used for protection against nuclear accidents. To make the FADAS system more effective for CANDU reactors, it is necessary to develop various accident scenarios and a reliable database of source terms. This study introduces the construction of a coupled interface system between FADAS and a source-term evaluation code, aimed at improving the applicability of the CANDU Integrated Safety Analysis System (CISAS) for CANDU reactors

  17. Development and application of knowledge-based source-term models for radionuclide mobilisation from contaminated concrete

    International Nuclear Information System (INIS)

    Deissmann, G.; Thierfeldt, S.; Woerlen, S.; Bath, A.; Jefferis, S.

    2006-01-01

    Concrete materials in nuclear facilities may become activated or contaminated by various radionuclides through different mechanisms. Consequently, decommissioning and dismantling of these facilities produce considerable quantities of these materials (e.g. concrete structures, rubble), which are at least potentially contaminated with radionuclides and which must be managed safely and cost-effectively. In this paper, we present results from a research project that aims at the development of source-term models for the mobilization of radionuclides from contaminated concrete. The objective of this task was to clarify whether a more realistic source-term description could be beneficial for optimization of the management of decommissioning wastes by reducing the amount of material for disposal as radioactive waste as well as by saving natural resources due to the recycling of building materials. To identify important parameters and processes that affect the release rates of radionuclides, we evaluated the chemical behavior and the solid speciation of radionuclides in concrete materials and the influence of factors like concrete properties, source/pathway of contamination, and the scenario-specific chemical environment and hydraulic regime. Furthermore, concrete degradation processes and their influence on contaminant mobilization were addressed. On this basis, source-term models were developed to describe the radionuclide release by (i) the dissolution of radionuclide containing solid phases, (ii) the desorption of radionuclides from surfaces, and/or (iii) the leaching of radionuclides from a solid matrix without disrupting its structure. These source-term models were parameterized for probabilistic simulations of various release options, including the reuse of recycled building materials, the disposal of rubble in inert and municipal landfills as well as the on-site disposal of concrete materials (e.g. foundations remaining in the ground, in situ burial of rubble). For

  18. Long-term variability in sugarcane bagasse feedstock compositional methods: sources and magnitude of analytical variability.

    Science.gov (United States)

    Templeton, David W; Sluiter, Justin B; Sluiter, Amie; Payne, Courtney; Crocker, David P; Tao, Ling; Wolfrum, Ed

    2016-01-01

    In an effort to find economical, carbon-neutral transportation fuels, biomass feedstock compositional analysis methods are used to monitor, compare, and improve biofuel conversion processes. These methods are empirical, and the analytical variability seen in the feedstock compositional data propagates into variability in the conversion yields, component balances, mass balances, and ultimately the minimum ethanol selling price (MESP). We report the average composition and standard deviations of 119 individually extracted National Institute of Standards and Technology (NIST) bagasse [Reference Material (RM) 8491] run by seven analysts over 7 years. Two additional datasets, using bulk-extracted bagasse (containing 58 and 291 replicates each), were examined to separate out the effects of batch, analyst, sugar recovery standard calculation method, and extractions from the total analytical variability seen in the individually extracted dataset. We believe this is the world's largest NIST bagasse compositional analysis dataset and it provides unique insight into the long-term analytical variability. Understanding the long-term variability of the feedstock analysis will help determine the minimum difference that can be detected in yield, mass balance, and efficiency calculations. The long-term data show consistent bagasse component values through time and by different analysts. This suggests that the standard compositional analysis methods were performed consistently and that the bagasse RM itself remained unchanged during this time period. The long-term variability seen here is generally higher than short-term variabilities. It is worth noting that the effect of short-term or long-term feedstock compositional variability on MESP is small, about $0.03 per gallon. The long-term analysis variabilities reported here are plausible minimum values for these methods, though not necessarily average or expected variabilities. We must emphasize the importance of training and good

  19. Hydrologic Source Term Processes and Models for the Clearwater and Wineskin Tests, Rainier Mesa, Nevada National Security Site

    Energy Technology Data Exchange (ETDEWEB)

    Carle, Steven F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-05-04

    This report describes the development, processes, and results of a hydrologic source term (HST) model for the CLEARWATER (U12q) and WINESKIN (U12r) tests located on Rainier Mesa, Nevada National Security Site, Nevada (Figure 1.1). Of the 61 underground tests (involving 62 unique detonations) conducted on Rainier Mesa (Area 12) between 1957 and 1992 (USDOE, 2015), the CLEARWATER and WINESKIN tests present many unique features that warrant a separate HST modeling effort from other Rainier Mesa tests.

  20. Study of the source term of radiation of the CDTN GE-PET trace 8 cyclotron with the MCNPX code

    Energy Technology Data Exchange (ETDEWEB)

    Benavente C, J. A.; Lacerda, M. A. S.; Fonseca, T. C. F.; Da Silva, T. A. [Centro de Desenvolvimento da Tecnologia Nuclear / CNEN, Av. Pte. Antonio Carlos 6627, 31270-901 Belo Horizonte, Minas Gerais (Brazil); Vega C, H. R., E-mail: jhonnybenavente@gmail.com [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98068 Zacatecas, Zac. (Mexico)

    2015-10-15

    The knowledge of the neutron spectra in a PET cyclotron is important for the optimization of radiation protection of workers and members of the public. The main objective of this work is to study the radiation source term of the GE-PET trace 8 cyclotron of the Development Center of Nuclear Technology (CDTN/CNEN) using computer simulation by the Monte Carlo method. The MCNPX version 2.7 code was used to calculate the flux of neutrons produced from the interaction of the primary proton beam with the target body and other cyclotron components during ¹⁸F production. The estimate of the source term and the corresponding radiation field was performed for the bombardment of a H₂¹⁸O target with protons at a current of 75 μA and an energy of 16.5 MeV. The values of the simulated fluxes were compared with those reported by the accelerator manufacturer (GE Healthcare). Results showed that the fluxes estimated with the MCNPX code were about 70% lower than those reported by the manufacturer. The mean energies of the neutrons were also different from those reported by GE Healthcare. It is recommended to investigate other cross-section data, and the use of the physical models of the code itself, for a complete characterization of the radiation source term. (Author)

  1. Demographic and psychosocial predictors of major depression and generalised anxiety disorder in Australian university students.

    Science.gov (United States)

    Farrer, Louise M; Gulliver, Amelia; Bennett, Kylie; Fassnacht, Daniel B; Griffiths, Kathleen M

    2016-07-15

    Few studies have examined modifiable psychosocial risk factors for mental disorders among university students, and of these, none have employed measures that correspond to clinical diagnostic criteria. The aim of this study was to examine psychosocial and demographic risk factors for major depression and generalised anxiety disorder (GAD) in a sample of Australian university students. An anonymous web-based survey was distributed to undergraduate and postgraduate students at a mid-sized Australian university. A range of psychosocial and demographic risk factors were measured, and logistic regression models were used to examine significant predictors of major depression and GAD. A total of 611 students completed the survey. The prevalence of major depression and GAD in the sample was 7.9% and 17.5%, respectively. In terms of demographic factors, the risk of depression was higher for students in their first year of undergraduate study, and the risk of GAD was higher for female students, those who moved to attend university, and students experiencing financial stress. In terms of psychosocial factors, students with experience of body image issues and lack of confidence were at significantly greater risk of major depression, and feeling too much pressure to succeed, lack of confidence, and difficulty coping with study were significantly associated with risk of GAD. University students experience a range of unique psychosocial stressors that increase their risk of major depression and GAD, in addition to sociodemographic risk factors. It is important to examine psychosocial factors, as these are potentially modifiable and could be the focus of university-specific mental health interventions.

  2. Real-time software for multi-isotopic source term estimation

    International Nuclear Information System (INIS)

    Goloubenkov, A.; Borodin, R.; Sohier, A.

    1996-01-01

    Consideration is given to the development of software for one of the crucial components of RODOS: assessment of the source rate (SR) from indirect measurements. Four components of the software are described in the paper. The first component is a GRID system, which allows stochastic meteorological and radioactivity fields to be prepared from measured data. The second is a model of atmospheric transport that can be adapted to emulate practically any gamma dose/spectrum detector. The third is a method that allows space-time and quantitative discrepancies between measured and modelled data to be taken into account simultaneously; it is based on a preference scheme selected by an expert. The last component is a special optimization method for calculating the multi-isotopic SR and its uncertainties. Results of a validation of the software using tracer experiment data, and a Chernobyl source estimation for the main dose-forming isotopes, are included in the paper

  3. The potential of renewable energy sources in Slovakia in terms of electricity production

    Directory of Open Access Journals (Sweden)

    Štefan Kuzevič

    2005-11-01

    The electricity sector of the Slovak Republic is currently undergoing restructuring, as a consequence of the obligations the SR has taken on with its integration into the EU and, on the other hand, of the actual state of the production capacities of fossil-fuel-fired thermal power and heating plants, together with the utilization of nuclear energy at the Jaslovské Bohunice and Mochovce nuclear power stations. Paradoxically, renewable energy sources have only a slim share of production capacity; the only relevant one is the utilization of water in small hydro power stations. Given that by the year 2010 the share of renewable energy sources used in electricity production has to reach 21.7% (EU Directive 77/2001), it is necessary to evaluate the possibilities of their utilization and to specify their potential from technical and economic aspects.

  4. Estimates of Source Spectra of Ships from Long Term Recordings in the Baltic Sea

    Directory of Open Access Journals (Sweden)

    Ilkka Karasalo

    2017-06-01

    bathymetry data from the Baltic Sea Bathymetry Database (BSBD), sound speed profiles from the HIROMB oceanographic model, seabed parameters obtained by acoustic inversion of data from a calibrated source, and AIS data providing information on each ship's position. These TL spectra were then subtracted from the received noise spectra to estimate the free-field source level (SL) spectra for each passage. The SL were compared to predictions by some existing models of noise emission from ships. Input parameters to the models, including e.g. ship length, width, speed, displacement, and engine mass, were obtained from AIS (Automatic Identification System) data and the STEAM database of the Finnish Meteorological Institute (FMI).
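
    The back-propagation step in the abstract amounts to simple level arithmetic per frequency band. As an illustrative sketch only (all numbers hypothetical, and assuming the common sonar-equation sign convention RL = SL − TL with TL a positive loss in dB, so the modelled loss is added back to the received level):

```python
# Per-band source-level estimate for one ship passage (all values below are
# hypothetical). With RL = SL - TL (levels in dB, TL a positive loss), the
# free-field source level is recovered as SL = RL + TL per band.
bands_hz = [63, 125, 250, 500]
received_db = [92.0, 95.5, 93.0, 88.0]   # measured band levels at the hydrophone
tl_db = [58.0, 61.0, 64.5, 67.0]         # modelled one-way transmission loss

source_level_db = [rl + tl for rl, tl in zip(received_db, tl_db)]
# e.g. the 63 Hz band gives 92.0 + 58.0 = 150.0 dB re 1 uPa at 1 m
```

    In practice the TL term comes from the propagation model driven by the bathymetry, sound-speed, and seabed inputs listed above, evaluated along the ship-to-hydrophone path for each passage.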

  5. Long-term X-ray Observations of Galactic Superluminal Sources with GRANAT/WATCH

    DEFF Research Database (Denmark)

    Sazonov, S.Y.; Sunyaev, R.; Lund, Niels

    1996-01-01

    The authors present X-ray time histories for the radio-jet sources GRS 1915+105 and GRO J1655-40 observed by the GRANAT/WATCH all-sky monitor at 8-20 keV. GRS 1915+105 is extremely variable on the time scales of months to years. The analysis of a 3-year data set gives no evidence for periodicity...

  6. The exceptional generalised geometry of supersymmetric AdS flux backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Ashmore, Anthony [Merton College, University of Oxford, Merton Street, Oxford, OX1 4JD (United Kingdom); Mathematical Institute, University of Oxford, Andrew Wiles Building, Woodstock Road, Oxford, OX2 6GG (United Kingdom); Petrini, Michela [Sorbonne Université, UPMC Paris 06, UMR 7589, LPTHE, 75005 Paris (France); Waldram, Daniel [Department of Physics, Imperial College London, Prince Consort Road, London, SW7 2AZ (United Kingdom)

    2016-12-29

    We analyse generic AdS flux backgrounds preserving eight supercharges in D=4 and D=5 dimensions using exceptional generalised geometry. We show that they are described by a pair of globally defined, generalised structures, identical to those that appear for flat flux backgrounds but with different integrability conditions. We give a number of explicit examples of such “exceptional Sasaki-Einstein” backgrounds in type IIB supergravity and M-theory. In particular, we give the complete analysis of the generic AdS{sub 5} M-theory backgrounds. We also briefly discuss the structure of the moduli space of solutions. In all cases, one structure defines a “generalised Reeb vector” that generates a Killing symmetry of the background corresponding to the R-symmetry of the dual field theory, and in addition encodes the generic contact structures that appear in the D=4 M-theory and D=5 type IIB cases. Finally, we investigate the relation between generalised structures and quantities in the dual field theory, showing that the central charge and R-charge of BPS wrapped-brane states are both encoded by the generalised Reeb vector, as well as discussing how volume minimisation (the dual of a- and F-maximisation) is encoded.

  7. The generalised anxiety stigma scale (GASS): psychometric properties in a community sample

    Science.gov (United States)

    2011-01-01

    Background Although there is substantial concern about negative attitudes to mental illness, little is known about the stigma associated with Generalised Anxiety Disorder (GAD) or its measurement. The aim of this study was to develop a multi-item measure of Generalised Anxiety Disorder stigma (the GASS). Methods Stigma items were developed from a thematic analysis of web-based text about the stigma associated with GAD. Six hundred and seventeen members of the public completed a survey comprising the resulting 20 stigma items and measures designed to evaluate construct validity. Follow-up data were collected for a subset of the participants (n = 212). Results The factor structure comprised two components: Personal Stigma (views about Generalised Anxiety Disorder); and Perceived Stigma (views about the beliefs of most others in the community). There was evidence of good construct validity and reliability for each of the Generalised Anxiety Stigma Scale (GASS) subscales. Conclusions The GASS is a promising brief measure of the stigma associated with Generalised Anxiety Disorder. PMID:22108099

  8. The generalised anxiety stigma scale (GASS): psychometric properties in a community sample

    Directory of Open Access Journals (Sweden)

    Griffiths Kathleen M

    2011-11-01

    Full Text Available Abstract Background Although there is substantial concern about negative attitudes to mental illness, little is known about the stigma associated with Generalised Anxiety Disorder (GAD) or its measurement. The aim of this study was to develop a multi-item measure of Generalised Anxiety Disorder stigma (the GASS). Methods Stigma items were developed from a thematic analysis of web-based text about the stigma associated with GAD. Six hundred and seventeen members of the public completed a survey comprising the resulting 20 stigma items and measures designed to evaluate construct validity. Follow-up data were collected for a subset of the participants (n = 212). Results The factor structure comprised two components: Personal Stigma (views about Generalised Anxiety Disorder); and Perceived Stigma (views about the beliefs of most others in the community). There was evidence of good construct validity and reliability for each of the Generalised Anxiety Stigma Scale (GASS) subscales. Conclusions The GASS is a promising brief measure of the stigma associated with Generalised Anxiety Disorder.

  9. Aquifers survey in the context of source rocks exploitation: from baseline acquisition to long term monitoring

    Science.gov (United States)

    Garcia, Bruno; Rouchon, Virgile; Deflandre, Jean-Pierre

    2017-04-01

    Producing hydrocarbons from source rocks (such as shales: a mix of clays, silts, carbonate and sandstone minerals containing matured organic matter, i.e. kerogen, oil and gas, but also various non-hydrocarbon chemical species, sometimes including radioactive elements) requires creating permeability within the rock matrix, at a minimum by hydraulically fracturing the source rock. It corresponds to the production of hydrocarbon fuels that were not naturally expelled from the pressurized matured source rock and that remain trapped in the porosity of the impermeable matrix or/and of the kerogen. The azimuth and extent of the developed fractures can be determined and mapped by monitoring the associated induced microseismicity, which gives an idea of where and how far the injected fluids penetrated the rock formation. In a geological context, aquifers are always present in the vicinity of such shale formations, or on fluid migration paths from them: deep aquifers (near the shale formation) up to sub-surface and potable (surface) aquifers. Our purpose is to track any unwanted invasion or migration into aquifers of chemical species coming from the matured shales or from production fluids, including both drilling and fracturing fluids. Our objective is to detect any anomaly early and raise the alarm, so as to avoid any serious environmental issue. The approach consists in deploying a specific sampling tool within a well to recover formation fluids and running a panoply of appropriate laboratory tests to characterise them. For deep aquifers, of course, such a characterisation process may consider aquifer properties prior to producing shale oil and gas, as they may naturally contain some of the chemical species present in the source rocks. A baseline acquisition could also be justified where previous invasion of the formation under survey by non-natural fluids is possible (due to any anthropogenic action at the surface or in the underground). 
The paper aims

  10. Identification of sources and long term trends for pollutants in the arctic using isentropic trajectory analysis

    International Nuclear Information System (INIS)

    Mahura, A.; Jaffe, D.; Harris, J.

    2003-01-01

    The understanding of factors driving climate and ecosystem changes in the Arctic requires careful consideration of the sources, correlation and trends for anthropogenic pollutants. The database from the NOAA-CMDL Barrow Observatory (71deg.17'N, 156deg.47'W) is the longest and most complete record of pollutant measurements in the Arctic. It includes observations of carbon dioxide (CO 2 ), methane (CH 4 ), carbon monoxide (CO), ozone (O 3 ), aerosol scattering coefficient (σ sp ), aerosol number concentration (NC asl ), etc. The objectives of this study are to understand the role of long-range transport to Barrow in explaining: (1) the year-to-year variations, and (2) the trends in the atmospheric chemistry record at the NOAA-CMDL Barrow observatory. The key questions we try to answer are: 1. What is the relationship between various chemical species measured at Barrow Observatory, Alaska and transport pathways at various altitudes? 2. What are the trends of species and their relation to transport patterns from the source regions? 3. What is the impact of the Prudhoe Bay emissions on Barrow's records? To answer these questions we apply the following main research tools. First, an isentropic trajectory model used to calculate the trajectories arriving at Barrow at three altitudes of 0.5, 1.5 and 3 km above sea level. Second, a clustering procedure used to divide the trajectories into groups based on source regions. Third, various statistical analysis tools such as exploratory data analysis, two-component correlation analysis, trend analysis, principal components and factor analysis used to identify the relationship between various chemical species vs. source regions as a function of time. In this study, we used the chemical data from the NOAA-CMDL Barrow observatory in combination with isentropic backward trajectories from gridded ECMWF data to understand the importance of various pollutant source regions on atmospheric composition in the Arctic. We

  11. Identification of sources and long term trends for pollutants in the arctic using isentropic trajectory analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mahura, A.; Jaffe, D.; Harris, J.

    2003-07-01

    The understanding of factors driving climate and ecosystem changes in the Arctic requires careful consideration of the sources, correlation and trends for anthropogenic pollutants. The database from the NOAA-CMDL Barrow Observatory (71deg.17'N, 156deg.47'W) is the longest and most complete record of pollutant measurements in the Arctic. It includes observations of carbon dioxide (CO{sub 2}), methane (CH{sub 4}), carbon monoxide (CO), ozone (O{sub 3}), aerosol scattering coefficient ({sigma}{sub sp}), aerosol number concentration (NC{sub asl}), etc. The objectives of this study are to understand the role of long-range transport to Barrow in explaining: (1) the year-to-year variations, and (2) the trends in the atmospheric chemistry record at the NOAA-CMDL Barrow observatory. The key questions we try to answer are: 1. What is the relationship between various chemical species measured at Barrow Observatory, Alaska and transport pathways at various altitudes? 2. What are the trends of species and their relation to transport patterns from the source regions? 3. What is the impact of the Prudhoe Bay emissions on Barrow's records? To answer these questions we apply the following main research tools. First, an isentropic trajectory model used to calculate the trajectories arriving at Barrow at three altitudes of 0.5, 1.5 and 3 km above sea level. Second, a clustering procedure used to divide the trajectories into groups based on source regions. Third, various statistical analysis tools such as exploratory data analysis, two-component correlation analysis, trend analysis, principal components and factor analysis used to identify the relationship between various chemical species vs. source regions as a function of time. In this study, we used the chemical data from the NOAA-CMDL Barrow observatory in combination with isentropic backward trajectories from gridded ECMWF data to understand the importance of various pollutant source regions on

  12. A generalised solution for step-drawdown tests including flow ...

    African Journals Online (AJOL)

    drinie

    2001-07-03

    , South Africa. Abstract ..... factors may also contribute to turbulence in a producing borehole. - a high discharge rate and a restrictive ... discharge rate used in the test represents an acceptable measure for the long-term yield of ...

  13. Refined Source Terms in Wave Watch 3 with Wave Breaking and Sea Spray Forecasts

    Science.gov (United States)

    2016-08-05

    Hsiao and Shemdin, 1983; Plant , 1990), for reference. Note that the form of the Janssen (1991) growth rate parameterization has been largely followed... spectrum against the observed terms during the young wind sea growth episode in the Strait of Juan de Fuca reported by Schwendeman et al. (2014). The...suite of forecast sea state variables, for sea state conditions ranging from light to very severe. Our approach required a combination of

  14. LEAK: A source term generator for evaluating release rates from leaking vessels

    International Nuclear Information System (INIS)

    Clinton, J.H.

    1994-01-01

    An interactive computer code for estimating the rate of release of any one of several materials from a leaking tank or broken pipe leading from a tank is presented. It is generally assumed that the material in the tank is liquid. Materials included in the data base are acetonitrile, ammonia, carbon tetrachloride, chlorine, chlorine trifluoride, fluorine, hydrogen fluoride, nitric acid, nitrogen tetroxide, sodium hydroxide, sulfur hexafluoride, sulfuric acid, and uranium hexafluoride. Materials that exist only as liquid and/or vapor over expected ranges of temperature and pressure can easily be added to the data base file. The Fortran source code for LEAK and the data file are included with this report

  15. An Iterative Regularization Method for Identifying the Source Term in a Second Order Differential Equation

    Directory of Open Access Journals (Sweden)

    Fairouz Zouyed

    2015-01-01

    Full Text Available This paper discusses the inverse problem of determining an unknown source in a second order differential equation from measured final data. This problem is ill-posed; that is, the solution (if it exists) does not depend continuously on the data. In order to solve the considered problem, an iterative method is proposed. Using this method a regularized solution is constructed and an a priori error estimate between the exact solution and its regularized approximation is obtained. Moreover, numerical results are presented to illustrate the accuracy and efficiency of this method.
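The abstract does not spell out the iteration, but a classic example of an iterative regularisation method for an ill-posed problem is the Landweber iteration, where the number of iterations plays the role of the regularisation parameter (early stopping). A hypothetical sketch on a toy linear operator; the operator, data, and stopping rule are invented and are not the paper's:

```python
import numpy as np

def landweber(A, b, n_iter, relaxation):
    """Landweber iteration x_{k+1} = x_k + w * A^T (b - A x_k).

    Early stopping (a small n_iter) regularises the ill-posed problem.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + relaxation * A.T @ (b - A @ x)
    return x

# Toy problem: recover polynomial coefficients ("the source") from noisy
# evaluations ("the measured final data").
rng = np.random.default_rng(0)
A = np.vander(np.linspace(0.0, 1.0, 20), 5, increasing=True)
x_true = np.array([1.0, -2.0, 0.5, 0.0, 1.5])
b = A @ x_true + 1e-4 * rng.standard_normal(20)

# Relaxation w < 2/||A||^2 guarantees convergence of the iteration.
omega = 1.0 / np.linalg.norm(A, 2) ** 2
x_reg = landweber(A, b, n_iter=5000, relaxation=omega)
print(np.round(x_reg, 2))
```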

  16. Long term leaching of chlorinated solvents from source zones in low permeability settings with fractures

    DEFF Research Database (Denmark)

    Bjerg, Poul Løgstrup; Chambon, Julie Claire Claudia; Troldborg, Mads

    2008-01-01

    under natural and enhanced conditions. That understanding is applied to risk assessment, and to determine the relationship and time frames of source clean up and plume response. To meet that aim, field and laboratory observations are coupled to state of the art models incorporating new insights......, but the interaction with and processes in the matrix need to be further explored. Development of new methods for field site characterisation and integrated field and model expertise are crucial for the design of remedial actions and for risk assessment of contaminated sites in low permeability settings....

  17. Long-term population effect of male circumcision in generalised HIV ...

    African Journals Online (AJOL)

    A meta-analysis of that data, contrasting male HIV seroprevalence according to circumcision status, showed no difference between the two groups (combined risk ... In most countries with a complex ethnic fabric, the relationship between men's circumcision status and HIV seroprevalence was not straightforward, with the ...

  18. The one-dimensional normalised generalised equivalence theory (NGET) for generating equivalent diffusion theory group constants for PWR reflector regions

    International Nuclear Information System (INIS)

    Mueller, E.Z.

    1991-01-01

    An equivalent diffusion theory PWR reflector model is presented, which has as its basis Smith's generalisation of Koebke's Equivalence Theory. This method is an adaptation, in one-dimensional slab geometry, of the Generalised Equivalence Theory (GET). Since the method involves the renormalisation of the GET discontinuity factors at nodal interfaces, it is called the Normalised Generalised Equivalence Theory (NGET) method. The advantages of the NGET method for modelling the ex-core nodes of a PWR are summarized. 23 refs

  19. Meeting Czechoslovak demands for heat in long-term prospective, especially with regard to nuclear sources

    International Nuclear Information System (INIS)

    Klail, M.

    1988-01-01

    The development was studied of heat demand in the CSSR till the year 2030. The ratio of centralized and decentralized heat supply is currently 60 to 40; in the future a slight increase is expected in the decentralized type of heat supply, mainly as a result of more intensive use of natural gas. In 2030, 710 PU of centralized heat should be produced. A decisive element in meeting the demand will be a growing proportion of combined production of electric power and heat by nuclear power plants. The installed capacity of the nuclear power plants in 2030 should range between 23 and 41 thousand MW, the production of electric power in these plants should be 193 to 238 TWh/y. 109 territorial areas potentially suitable for use of heat from nuclear sources were selected. They were included in 19 regions of which 9 should in the year 2010 be linked to heat supply from nuclear power plants that will be in operation. It is expected that in the year 2030, nuclear sources will supply 250 PU of centralized heat. (Z.M.). 2 tabs., 14 refs

  20. Limited acquisition and generalisation of rhotics with ultrasound visual feedback in childhood apraxia.

    Science.gov (United States)

    Preston, Jonathan L; Maas, Edwin; Whittle, Jessica; Leece, Megan C; McCabe, Patricia

    2016-01-01

    Ultrasound visual feedback of the tongue is one treatment option for individuals with persisting speech sound errors. This study evaluated children's performance during acquisition and generalisation of American English rhotics using ultrasound feedback. Three children aged 10-13 with persisting speech sound errors associated with childhood apraxia of speech (CAS) were treated for 14 one-hour sessions. Two of the participants increased the accuracy of their rhotic production during practice trials within treatment sessions, but none demonstrated generalisation to untreated words. Lack of generalisation may be due to a failure to acquire the target with sufficient accuracy during treatment, or to co-existing linguistic weaknesses that are not addressed in a motor-based treatment. Results suggest a need to refine the intervention procedures for CAS and/or a need to identify appropriate candidates for intervention to optimise learning.

  1. Evaluating the effects of generalisation approaches and DEM ...

    African Journals Online (AJOL)

    Digital elevation model (DEM) data are elemental in deriving primary topographic attributes which serve as input variables to a variety of hydrologic and geomorphologic studies. There is however still varied consensus on the effect of DEM source and resolution on the application of these topographic attributes to landscape ...

  2. Generalised model-independent characterisation of strong gravitational lenses. I. Theoretical foundations

    Science.gov (United States)

    Wagner, J.

    2017-05-01

    We extend our model-independent approach for characterising strong gravitational lenses to its most general form to leading order and use the orientation angles of a set of multiple images with respect to their connection line(s) in addition to the relative distances between the images, their ellipticities, and time-delays. For two symmetric images that straddle the critical curve, the orientation angle additionally allows us to determine the slope of the critical curve and a second (reduced) flexion coefficient at the critical point on the connection line between the images. It also allows us to drop the symmetry assumption that the axis of largest image extension is orthogonal to the critical curve. For three images almost forming a giant arc, the degree of assumed image symmetry is also reduced to the most general case, describing image configurations for which the source need not be placed on the symmetry axis of the two folds that unite at the cusp. For a given set of multiple images, we set limits on the applicability of our approach, show which information can be obtained in cases of merging images, and analyse the accuracy achievable due to the Taylor expansion of the lensing potential for the fold case on a galaxy cluster scale Navarro-Frenk-White-profile, a fold and cusp case on a galaxy cluster scale singular isothermal ellipse, and compare the generalised approach with our previously published one. The position of the critical points is reconstructed with less than 5'' deviation for multiple images closer to the critical points than 30% of the (effective) Einstein radius. The slope of the critical curve at a fold and its shape in the vicinity of a cusp deviate less than 20% from the true values for distances of the images to the critical points less than 15% of the (effective) Einstein radius.

  3. The M/V Cosco Busan spill: source identification and short-term fate.

    Science.gov (United States)

    Lemkau, Karin L; Peacock, Emily E; Nelson, Robert K; Ventura, G Todd; Kovecses, Jennifer L; Reddy, Christopher M

    2010-11-01

    Understanding the fate of heavy fuel oils (HFOs) in the environment is critical for sound decisions regarding its usage and spill cleanup. To study weathering of HFOs, we examined the M/V Cosco Busan spill (November 2007; San Francisco Bay, CA, USA). In this baseline report, we identified which ruptured tank (port tank 3 or 4) was the source of the spilled oil and characterized changes in the oil composition across location and time. Samples from three impacted shorelines, collected within 80 days of the spill, were analyzed using one- and two-dimensional gas chromatography (GC and GC × GC, respectively). Weathering varied across sites, but compounds with GC retention times less than n-C(16) were generally lost by evaporation and dissolution. Changes in n-C(18)/phytane and benz[a]anthracene/chrysene ratios indicated some biodegradation and photodegradation, respectively. Copyright © 2010 Elsevier Ltd. All rights reserved.

  4. Analysis of the different source terms of natural radionuclides in a river affected by NORM (Naturally Occurring Radioactive Materials) activities.

    Science.gov (United States)

    Baeza, A; Corbacho, J A; Guillén, J; Salas, A; Mora, J C

    2011-05-01

    The present work studied the radioactivity impact of a coal-fired power plant (CFPP), a NORM industry, on the water of the Regallo river, which the plant uses for cooling. Downstream, this river passes through an important irrigated farming area, and it is a tributary of the Ebro, one of Spain's largest rivers. Although no alteration of the (210)Po or (232)Th content was detected, the (234,238)U and (226)Ra contents of the water were significantly greater immediately below the CFPP's discharge point. The (226)Ra concentration decreased progressively downstream from the discharge point, but the uranium content increased significantly again at two sampling points 8 km downstream from the CFPP's effluent. This suggested the presence of another, unexpected uranium source term different from the CFPP. The input from this second uranium source term was even greater than that from the CFPP. Different hypotheses were tested (a reservoir used for irrigation, remobilization from sediments, and the effect of fertilizers used in the area), and it was finally demonstrated that the source was the fertilizers used in the adjacent farming areas. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Status report on the long-term stability of the Advanced Photon Source

    International Nuclear Information System (INIS)

    Friedsam, H.

    1998-01-01

    Table 1 summarizes the average elevation changes and standard deviations as well as the points with the largest changes for each year. On average, hardly any settlements can be detected; however, local changes of +2.90 mm to -2.31 mm have been measured. Looking at the low and high points, the settlement process is slowing down over time. Overall, the settlements observed match the expectations for this type of construction. To date no major realignment of the Advanced Photon Source (APS) storage ring has been necessary. The particle beam tracks with the settlements of the floor as long as these changes occur in a smooth fashion and not as sudden discontinuities [5]. From Figures 6 through 8 it is also apparent that settlements affect larger areas in the storage ring and experiment hall, which impacts the location of the source point as well as the location of the beamline user equipment. The limiting apertures of the insertion device chambers will make realignment of the APS storage ring a necessity at some point in the future. Currently, simulations and machine studies are underway to provide an estimate of tolerable settlement limits before a realignment of certain sections of the storage ring would be required. In conclusion, the APS has been constructed on solid ground with an excellent foundation. Only small settlement changes are being observed; so far they are not impacting the operation of the accelerator. We are continuing to monitor deformations of the APS floor in anticipation of a future realignment of the accelerator components

  6. Calculation of nuclear reactivity using the generalised Adams-Bashforth-Moulton predictor corrector method

    Energy Technology Data Exchange (ETDEWEB)

    Suescun-Diaz, Daniel [Surcolombiana Univ., Neiva (Colombia). Groupo de Fisica Teorica; Narvaez-Paredes, Mauricio [Javeriana Univ., Cali (Colombia). Groupo de Matematica y Estadistica Aplicada Pontificia; Lozano-Parada, Jamie H. [Univ. del Valle, Cali (Colombia). Dept. de Ingenieria

    2016-03-15

    In this paper, the generalisation of the 4th-order Adams-Bashforth-Moulton predictor-corrector method is proposed to numerically solve the point kinetic equations of nuclear reactivity calculations without using the nuclear power history. Due to the nature of the point kinetic equations, different predictor modifiers are used in order to improve the precision of the approximations obtained. The results obtained with the prediction formulas and generalised corrections improve the precision when compared with previous methods and are valid for various forms of nuclear power and different time steps.
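For reference, the classical (non-generalised) 4th-order Adams-Bashforth-Moulton predictor-corrector that the paper builds on can be sketched as follows, applied to a generic ODE y' = f(t, y) rather than the point kinetic equations; the RK4 bootstrap and all test values are illustrative assumptions, not the paper's scheme:

```python
import numpy as np

def abm4(f, t0, y0, h, n_steps):
    """Classical 4th-order Adams-Bashforth-Moulton predictor-corrector (PECE)."""
    t = [t0]
    y = [np.asarray(y0, dtype=float)]
    # Bootstrap the first three steps with classical RK4.
    for _ in range(3):
        ti, yi = t[-1], y[-1]
        k1 = f(ti, yi)
        k2 = f(ti + h / 2, yi + h / 2 * k1)
        k3 = f(ti + h / 2, yi + h / 2 * k2)
        k4 = f(ti + h, yi + h * k3)
        y.append(yi + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4))
        t.append(ti + h)
    for _ in range(n_steps - 3):
        f3, f2, f1, f0 = (f(t[-4], y[-4]), f(t[-3], y[-3]),
                          f(t[-2], y[-2]), f(t[-1], y[-1]))
        # Predictor: 4-step Adams-Bashforth.
        yp = y[-1] + h / 24 * (55 * f0 - 59 * f1 + 37 * f2 - 9 * f3)
        # Corrector: Adams-Moulton, evaluated at the predicted point.
        fp = f(t[-1] + h, yp)
        y.append(y[-1] + h / 24 * (9 * fp + 19 * f0 - 5 * f1 + f2))
        t.append(t[-1] + h)
    return np.array(t), np.array(y)

# Sanity check on y' = -y, y(0) = 1, whose exact solution is exp(-t).
ts, ys = abm4(lambda t, y: -y, 0.0, 1.0, 0.01, 100)
print(abs(float(ys[-1]) - np.exp(-1.0)))
```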

  7. The use of nonlinear regression analysis for integrating pollutant concentration measurements with atmospheric dispersion modeling for source term estimation

    International Nuclear Information System (INIS)

    Edwards, L.L.; Freis, R.P.; Peters, L.G.; Gudiksen, P.H.; Pitovranov, S.E.

    1993-01-01

    The accuracy associated with assessing the environmental consequences of an accidental release of radioactivity is highly dependent on the knowledge of the source term characteristics, which are generally poorly known. The development of an automated numerical technique that integrates the radiological measurements with atmospheric dispersion modeling for more accurate source term estimation is reported. Often, this process of parameter estimation is performed by an emergency response assessor, who takes an intelligent first guess at the model parameters, then, comparing the model results with whatever measurements are available, makes an intuitive, informed next guess of the model parameters. This process may be repeated any number of times until the assessor feels that the model results are reasonable in terms of the measured observations. A new approach, based on a nonlinear least-squares regression scheme coupled with the existing Atmospheric Release Advisory Capability three-dimensional atmospheric dispersion models, is to supplement the assessor's intuition with automated mathematical methods that do not significantly increase the response time of the existing predictive models. The viability of the approach is evaluated by estimation of the known SF 6 tracer release rates associated with the Mesoscale Atmospheric Transport Studies tracer experiments conducted at the Savannah River Laboratory during 1983. These 19 experiments resulted in 14 successful, separate tracer releases with sampling of the tracer plumes along the cross-plume arc situated ∼30 km from the release site
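The general idea described here, regressing unknown release parameters against downwind concentration measurements through a dispersion model, can be sketched with a plain Gauss-Newton loop and a textbook Gaussian plume standing in for the three-dimensional ARAC models; every function, number, and parameter below is an invented illustration, not the paper's scheme:

```python
import numpy as np

def plume(params, y):
    """Cross-plume ground-level concentration for release rate q and spread sigma_y.

    A textbook Gaussian plume with fixed wind speed, vertical spread, and
    release height; all values are hypothetical.
    """
    q, sigma_y = params
    u, sigma_z, h = 3.0, 25.0, 10.0
    return (q / (np.pi * u * sigma_y * sigma_z)
            * np.exp(-y**2 / (2 * sigma_y**2))
            * np.exp(-h**2 / (2 * sigma_z**2)))

def gauss_newton(residual, x0, n_iter=20):
    """Plain Gauss-Newton with a forward-difference Jacobian."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        r = residual(x)
        cols = []
        for i in range(len(x)):
            step = 1e-6 * max(abs(x[i]), 1.0)
            xp = x.copy()
            xp[i] += step
            cols.append((residual(xp) - r) / step)
        J = np.column_stack(cols)
        x = x - np.linalg.lstsq(J, r, rcond=None)[0]
    return x

# Synthetic sampling arc: observations generated from a known release rate.
rng = np.random.default_rng(1)
y_obs = np.linspace(-120.0, 120.0, 9)   # sampler positions across the plume, m
true = np.array([50.0, 40.0])           # q = 50 (rate units), sigma_y = 40 m
c_obs = plume(true, y_obs) * (1 + 0.03 * rng.standard_normal(9))

# Regress the unknown source parameters against the observations.
est = gauss_newton(lambda p: plume(p, y_obs) - c_obs, x0=[30.0, 50.0])
print(np.round(est, 1))
```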

  8. Generalised bottom-up holography and walking technicolour

    DEFF Research Database (Denmark)

    D. Dietrich, Dennis; Kouvaris, Christoforos

    2009-01-01

    In extradimensional holographic approaches the flavour symmetry is gauged in the bulk, that is, treated as a local symmetry. Imposing such a local symmetry admits fewer terms coupling the (axial) vectors and (pseudo)scalars than if a global symmetry is imposed. The latter is the case in standard ...

  9. A generalised groundwater flow equation using the concept of non ...

    African Journals Online (AJOL)

    2006-01-01

    Jan 1, 2006 ... K — the hydraulic conductivity tensor of the aquifer; Φ(x,t) — the piezometric head; f(x,t) — the strength of any sources or sinks, with x and t the usual spatial and time coordinates; ∇ — the gradient operator; ∂t — the time derivative. This model showed that the dominant flow field in these aquifers is vertical and linear and ...

  10. Identifying Patterns in the Weather of Europe for Source Term Estimation

    Science.gov (United States)

    Klampanos, Iraklis; Pappas, Charalambos; Andronopoulos, Spyros; Davvetas, Athanasios; Ikonomopoulos, Andreas; Karkaletsis, Vangelis

    2017-04-01

    During emergencies that involve the release of hazardous substances into the atmosphere, the potential health effects on the human population and the environment are of primary concern. Such events have occurred in the past, most notably involving radioactive and toxic substances. Examples of radioactive release events include the Chernobyl accident in 1986, as well as the more recent Fukushima Daiichi accident in 2011. Often, the release of dangerous substances in the atmosphere is detected at locations different from the release origin. The objective of this work is the rapid estimation of such unknown sources shortly after the detection of dangerous substances in the atmosphere, with an initial focus on nuclear or radiological releases. Typically, after the detection of a radioactive substance in the atmosphere indicating the occurrence of an unknown release, the source location is estimated via inverse modelling. However, depending on factors such as the desired spatial resolution, traditional inverse modelling can be computationally time-consuming. This is especially true for cases where complex topography and weather conditions are involved, which can be problematic when timing is critical. Making use of machine learning techniques and the Big Data Europe platform, our approach moves the bulk of the computation before any such event takes place, therefore allowing rapid initial, albeit rougher, estimations of the source location. Our proposed approach is based on the automatic identification of weather patterns within the European continent. Identifying weather patterns has long been an active research field. Our case is differentiated by the fact that it focuses on plume dispersion patterns and the meteorological variables that affect dispersion the most. 
For a small set of recurrent weather patterns, we simulate hypothetical radioactive releases from a pre-known set of nuclear reactor locations and for different substance and temporal
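The clustering step can be illustrated with a minimal k-means on synthetic "weather fields"; the feature choice, k-means itself, and all numbers are assumptions for illustration, since the abstract does not name the algorithm:

```python
import numpy as np

def kmeans(X, k, n_iter=50):
    """Minimal k-means, standing in for the unspecified clustering method."""
    # Deterministic initialisation for the sketch: spread the initial
    # centres across the data rather than sampling them at random.
    centres = X[np.linspace(0, len(X) - 1, k).astype(int)]
    for _ in range(n_iter):
        labels = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1).argmin(1)
        centres = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centres, labels

# Toy "weather fields": each row holds a few dispersion-relevant variables
# (say, two wind components and a boundary-layer height), all invented.
rng = np.random.default_rng(1)
pattern_a = np.array([5.0, 0.0, 800.0])
pattern_b = np.array([-3.0, 4.0, 300.0])
X = np.vstack([pattern_a + rng.standard_normal((30, 3)),
               pattern_b + rng.standard_normal((30, 3))])

centres, labels = kmeans(X, k=2)

# At event time, a new day's field is assigned to its nearest recurrent
# pattern, whose precomputed dispersion runs give the rapid first estimate.
today = np.array([4.5, -0.5, 795.0])
nearest = ((centres - today) ** 2).sum(axis=1).argmin()
print(nearest, centres[nearest].round(0))
```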

  11. Advanced Monte Carlo procedure for the IFMIF d-Li neutron source term based on evaluated cross section data

    International Nuclear Information System (INIS)

    Simakov, S.P.; Fischer, U.; Moellendorff, U. von; Schmuck, I.; Konobeev, A.Yu.; Korovin, Yu.A.; Pereslavtsev, P.

    2002-01-01

    A newly developed computational procedure is presented for the generation of d-Li source neutrons in Monte Carlo transport calculations based on the use of evaluated double-differential d+ 6,7 Li cross section data. A new code McDeLicious was developed as an extension to MCNP4C to enable neutronics design calculations for the d-Li based IFMIF neutron source making use of the evaluated deuteron data files. The McDeLicious code was checked against available experimental data and calculation results of McDeLi and MCNPX, both of which use built-in analytical models for the Li(d, xn) reaction. It is shown that McDeLicious along with newly evaluated d+ 6,7 Li data is superior in predicting the characteristics of the d-Li neutron source. As this approach makes use of tabulated Li(d, xn) cross sections, the accuracy of the IFMIF d-Li neutron source term can be steadily improved with more advanced and validated data

  12. Advanced Monte Carlo procedure for the IFMIF d-Li neutron source term based on evaluated cross section data

    CERN Document Server

    Simakov, S P; Moellendorff, U V; Schmuck, I; Konobeev, A Y; Korovin, Y A; Pereslavtsev, P

    2002-01-01

    A newly developed computational procedure is presented for the generation of d-Li source neutrons in Monte Carlo transport calculations based on the use of evaluated double-differential d+6,7Li cross section data. A new code, McDeLicious, was developed as an extension to MCNP4C to enable neutronics design calculations for the d-Li based IFMIF neutron source making use of the evaluated deuteron data files. The McDeLicious code was checked against available experimental data and calculation results of McDeLi and MCNPX, both of which use built-in analytical models for the Li(d, xn) reaction. It is shown that McDeLicious along with the newly evaluated d+6,7Li data is superior in predicting the characteristics of the d-Li neutron source. As this approach makes use of tabulated Li(d, xn) cross sections, the accuracy of the IFMIF d-Li neutron source term can be steadily improved with more advanced and validated data.

  13. Long-term monitoring of airborne nickel (Ni) pollution in association with some potential source processes in the urban environment.

    Science.gov (United States)

    Kim, Ki-Hyun; Shon, Zang-Ho; Mauulida, Puteri T; Song, Sang-Keun

    2014-09-01

    The environmental behavior and pollution status of nickel (Ni) were investigated in seven major cities in Korea over a 13-year time span (1998-2010). The mean concentrations of Ni measured during the whole study period fell within the range of 3.71 (Gwangju: GJ) to 12.6 ng m(-3) (Incheon: IC). Although Ni values showed good comparability on a relatively large spatial scale, the values in most cities (6 out of 7) were subject to moderate reductions over the study period. To assess the effect of major sources on the long-term distribution of Ni, the relationship between Ni concentrations and potential source processes such as non-road transportation sources (e.g., ship and aircraft emissions) was examined for the cities with port and airport facilities. The potential impact of long-range transport of Asian dust particles in controlling Ni levels was also evaluated. The overall results suggest that Ni levels were subject to gradual reductions over the study period irrespective of changes in such localized non-road source activities. Ni pollution at all the study sites remained well below the international threshold value (Directive 2004/107/EC) of 20 ng m(-3). Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Methodology for and uses of a radiological source term assessment for potential impacts to stormwater and groundwater

    International Nuclear Information System (INIS)

    Teare, A.; Hansen, K.; DeWilde, J.; Yu, L.; Killey, D.

    2001-01-01

    A Radiological Source Term Assessment (RSTA) was conducted by Ontario Power Generation Inc. (OPG) at the Pickering Nuclear Generating Station (PNGS). Tritium had been identified in the groundwater at several locations under the station, and OPG initiated the RSTA as part of its ongoing efforts to improve operations and to identify potential sources of radionuclide impact to groundwater and stormwater at the station. The RSTA provides a systematic approach to collecting information and assessing environmental risk for radioactive contaminants based on a ranking system developed for the purpose. This paper provides an overview of the RSTA focusing on the investigative approach and how it was applied. This approach can find application at other generating stations. (author)

  15. Insulin sources and types: a review of insulin in terms of its mode on diabetes mellitus.

    Science.gov (United States)

    Ahmad, Kafeel

    2014-04-01

    Insulin is involved in the regulation of glucose utilization in the body. Inability of the body to synthesize insulin, or resistance of human cells to insulin, leads to a condition called diabetes mellitus, which is characterized by chronic hyperglycaemia. There are two types of diabetes: type 1 and type 2. An exogenous supply of insulin is needed consistently for type 1 diabetes treatment, and type 2 diabetes also needs to be treated with exogenous insulin in advanced stages of the disease. These sources have proved very useful in meeting the needs of patients. However, these insulin types are expensive for the large population of patients in developing countries. Furthermore, the incidence of diabetes is advancing at an alarming rate; hence, production systems with even higher capacities are desired, and plants are currently being investigated as alternative production systems. Based on the mode of action of insulin, various formulations have been developed that differ in onset of action, peak effect and duration of action according to the needs of patients.

  16. Application of multisorbent traps to characterization and quantification of workplace exposure source terms

    International Nuclear Information System (INIS)

    Dindal, A.B.; Ma, Cheng-Yu; Jenkins, R.A.; Higgins, C.E.; Skeen, J.T.; Bayne, C.K.

    1995-01-01

    Multisorbent traps have been used for several years to characterize complex atmospheres. Only more recently have multisorbent traps been used for quantitative analysis. The traps provide an effective method for retaining a wide range of airborne organic contaminants, since these carbonaceous sorbents are relatively hydrophobic, have large surface areas, do not have active functional groups, and have fewer chemical artifacts than other sorbents. Multisorbent traps, which are 76 mm in length and have a 6 mm outside diameter, contain sequentially loaded beds of Carbotrap C, Carbotrap, and Carbosieve SIII, similar to a commercially available trap. The injection port of a gas chromatograph is configured for thermal desorption analysis of the traps via an in-house modification. Currently, multisorbent traps are being used to sample the headspace of underground storage tanks at the Department of Energy's Hanford site in Richland, Washington. The analyses are performed by flame ionization or mass spectrometric detection. Target organic analytes include C6 to C13 alkanes, nitriles, alkyl ketones, dibutyl butyl phosphonate and tributyl phosphate. Pre-analytical holding times, or practical reporting times, for many target analytes are at least 84 days under either refrigerated or ambient conditions. Traps are fabricated, conditioned, and spiked with three surrogate standards in the vapor phase prior to shipment to the site. Recovery of the surrogates from the multisorbent traps serves as a statistical process control. Source concentrations of Hanford underground storage tank headspaces range from 0.96 mg/m3 to 1200 mg/m3.

  17. Long-term effects of lead poisoning on bone mineralization in vultures exposed to ammunition sources.

    Science.gov (United States)

    Gangoso, Laura; Alvarez-Lloret, Pedro; Rodríguez-Navarro, Alejandro A B; Mateo, Rafael; Hiraldo, Fernando; Donázar, José Antonio

    2009-02-01

    Long-lived species are particularly susceptible to bioaccumulation of lead in bone tissues. In this paper we gain insights into the sublethal effects of lead contamination on Egyptian vultures (Neophron percnopterus). Our approach was based on the comparison of two populations (Canary Islands and Iberian Peninsula) differing in exposure to ingested lead ammunition. Blood lead levels were higher in the island population (Canary Islands range: 5.10-1780 microg L(-1), n=137; Iberian Peninsula range: 5.60-217.30 microg L(-1), n=32), showing clear seasonal trends and peaking during the hunting season. Moreover, males were more susceptible to lead accumulation than females. Bone lead concentration increased with age, reflecting a bioaccumulation effect. Bone composition was significantly altered by this contaminant: the degree of mineralization decreased as lead concentration increased. These results demonstrate the existence of long-term effects of lead poisoning, which may be of importance in the declines of threatened populations of long-lived species exposed to this contaminant.

  18. Toronto area ozone: Long-term measurements and modeled sources of poor air quality events

    Science.gov (United States)

    Whaley, C. H.; Strong, K.; Jones, D. B. A.; Walker, T. W.; Jiang, Z.; Henze, D. K.; Cooke, M. A.; McLinden, C. A.; Mittermeier, R. L.; Pommier, M.; Fogal, P. F.

    2015-11-01

    The University of Toronto Atmospheric Observatory and Environment Canada's Centre for Atmospheric Research Experiments each has over a decade of ground-based Fourier transform infrared (FTIR) spectroscopy measurements in southern Ontario. We present the Toronto area FTIR time series from 2002 to 2013 of two tropospheric trace gases—ozone and carbon monoxide—along with surface in situ measurements taken by government monitoring programs. We interpret their variability with the GEOS-Chem chemical transport model and determine the atmospheric conditions that cause pollution events in the time series. Our analysis includes a regionally tagged O3 model of the 2004-2007 time period, which quantifies the geographical contributions to Toronto area O3. The important emission types for 15 pollution events are then determined with a high-resolution adjoint model. Toronto O3, during pollution events, is most sensitive to southern Ontario and U.S. fossil fuel NOx emissions and natural isoprene emissions. The sources of Toronto pollution events are found to be highly variable, and this is demonstrated in four case studies representing local, short-, middle-, and long-range transport scenarios. This suggests that continental-scale emission reductions could improve air quality in the Toronto region. We also find that abnormally high temperatures and high-pressure systems are common to all pollution events studied, suggesting that climate change may impact Toronto O3. Finally, we quantitatively compare the sensitivity of the surface and column measurements to anthropogenic NOx emissions and show that they are remarkably similar. This work thus demonstrates the usefulness of FTIR measurements in an urban area to assess air quality.

  19. Size distribution, directional source contributions and pollution status of PM from Chengdu, China during a long-term sampling campaign.

    Science.gov (United States)

    Shi, Guo-Liang; Tian, Ying-Ze; Ma, Tong; Song, Dan-Lin; Zhou, Lai-Dong; Han, Bo; Feng, Yin-Chang; Russell, Armistead G

    2017-06-01

    Long-term and synchronous monitoring of PM10 and PM2.5 was conducted in Chengdu in China from 2007 to 2013. The levels, variations, compositions and size distributions were investigated. The sources were quantified by two-way and three-way receptor models (PMF2, ME2-2way and ME2-3way). Consistent results were found: the primary source categories contributed 63.4% (PMF2), 64.8% (ME2-2way) and 66.8% (ME2-3way) to PM10, and contributed 60.9% (PMF2), 65.5% (ME2-2way) and 61.0% (ME2-3way) to PM2.5. Secondary sources contributed 31.8% (PMF2), 32.9% (ME2-2way) and 31.7% (ME2-3way) to PM10, and 35.0% (PMF2), 33.8% (ME2-2way) and 36.0% (ME2-3way) to PM2.5. The size distribution of source categories was estimated better by the ME2-3way method: the three-way model can simultaneously consider chemical species, temporal variability and PM sizes, while a two-way model independently computes datasets of different sizes. A method called source directional apportionment (SDA) was employed to quantify the contributions from various directions for each source category. Crustal dust from east-north-east (ENE) contributed the most to both PM10 (12.7%) and PM2.5 (9.7%) in Chengdu, followed by crustal dust from south-east (SE) for PM10 (9.8%) and secondary nitrate & secondary organic carbon from ENE for PM2.5 (9.6%). Source contributions from different directions are associated with meteorological conditions, source locations and emission patterns during the sampling period. These findings and methods provide useful tools to better understand PM pollution status and to develop effective pollution control strategies. Copyright © 2016. Published by Elsevier B.V.
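Receptor models such as PMF resolve a measured species-by-sample concentration matrix into non-negative source profiles and contributions. As a rough, hedged illustration of that factorisation idea only (plain Lee-Seung multiplicative updates, without PMF's per-measurement uncertainty weighting; all data below are synthetic):

```python
import numpy as np

def nmf(X, k, iters=2000, seed=0):
    """Non-negative factorisation X ~= G @ F via multiplicative updates:
    G holds per-sample source contributions, F holds source profiles.
    An unweighted stand-in for PMF, which also weights residuals by
    measurement uncertainty."""
    rng = np.random.default_rng(seed)
    G = rng.random((X.shape[0], k))
    F = rng.random((k, X.shape[1]))
    for _ in range(iters):
        F *= (G.T @ X) / (G.T @ G @ F + 1e-12)
        G *= (X @ F.T) / (G @ F @ F.T + 1e-12)
    return G, F

# synthetic data: 30 samples mixing two hypothetical source profiles
rng = np.random.default_rng(1)
profiles = np.array([[1.0, 0.0, 0.5, 0.2],
                     [0.0, 1.0, 0.3, 0.7]])   # 2 sources x 4 species
contrib = rng.random((30, 2))                 # per-sample contributions
X = contrib @ profiles
G, F = nmf(X, k=2)
err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
```

The three-way (ME2-3way) variant extends this to a third mode (particle size), fitting all size fractions jointly instead of factorising each dataset independently.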

  20. Shingle 2.0: generalising self-consistent and automated domain discretisation for multi-scale geophysical models

    Directory of Open Access Journals (Sweden)

    A. S. Candy

    2018-01-01

    The approaches taken to describe and develop spatial discretisations of the domains required for geophysical simulation models are commonly ad hoc, model- or application-specific, and under-documented. This is particularly acute for simulation models that are flexible in their use of multi-scale, anisotropic, fully unstructured meshes where a relatively large number of heterogeneous parameters are required to constrain their full description. As a consequence, it can be difficult to reproduce simulations, to ensure a provenance in model data handling and initialisation, and a challenge to conduct model intercomparisons rigorously. This paper takes a novel approach to spatial discretisation, considering it much like a numerical simulation model problem of its own. It introduces a generalised, extensible, self-documenting approach to carefully describe, and necessarily fully, the constraints over the heterogeneous parameter space that determine how a domain is spatially discretised. This additionally provides a method to accurately record these constraints, using high-level natural language based abstractions that enable full accounts of provenance, sharing, and distribution. Together with this description, a generalised consistent approach to unstructured mesh generation for geophysical models is developed that is automated, robust and repeatable, quick-to-draft, rigorously verified, and consistent with the source data throughout. This interprets the description above to execute a self-consistent spatial discretisation process, which is automatically validated to expected discrete characteristics and metrics. Library code, verification tests, and examples available in the repository at https://github.com/shingleproject/Shingle. Further details of the project presented at http://shingleproject.org.

  1. Shingle 2.0: generalising self-consistent and automated domain discretisation for multi-scale geophysical models

    Science.gov (United States)

    Candy, Adam S.; Pietrzak, Julie D.

    2018-01-01

    The approaches taken to describe and develop spatial discretisations of the domains required for geophysical simulation models are commonly ad hoc, model- or application-specific, and under-documented. This is particularly acute for simulation models that are flexible in their use of multi-scale, anisotropic, fully unstructured meshes where a relatively large number of heterogeneous parameters are required to constrain their full description. As a consequence, it can be difficult to reproduce simulations, to ensure a provenance in model data handling and initialisation, and a challenge to conduct model intercomparisons rigorously. This paper takes a novel approach to spatial discretisation, considering it much like a numerical simulation model problem of its own. It introduces a generalised, extensible, self-documenting approach to carefully describe, and necessarily fully, the constraints over the heterogeneous parameter space that determine how a domain is spatially discretised. This additionally provides a method to accurately record these constraints, using high-level natural language based abstractions that enable full accounts of provenance, sharing, and distribution. Together with this description, a generalised consistent approach to unstructured mesh generation for geophysical models is developed that is automated, robust and repeatable, quick-to-draft, rigorously verified, and consistent with the source data throughout. This interprets the description above to execute a self-consistent spatial discretisation process, which is automatically validated to expected discrete characteristics and metrics. Library code, verification tests, and examples available in the repository at https://github.com/shingleproject/Shingle. Further details of the project presented at http://shingleproject.org.

  2. Sources and contents of air pollution affecting term low birth weight in Los Angeles County, California, 2001-2008.

    Science.gov (United States)

    Laurent, Olivier; Hu, Jianlin; Li, Lianfa; Cockburn, Myles; Escobedo, Loraine; Kleeman, Michael J; Wu, Jun

    2014-10-01

    Low birth weight (LBW, <2500 g) has been associated with exposure to air pollution, but it is still unclear which sources or components of air pollution might be in play. The association between ultrafine particles and LBW has never been studied. The objective was to study the relationships between LBW in term-born infants and exposure to particles by size fraction, source and chemical composition, and complementary components of air pollution in Los Angeles County (California, USA) over the period 2001-2008. Birth certificates (n=960,945) were geocoded to maternal residence. Primary particulate matter (PM) concentrations by source and composition were modeled. Measured fine PM, nitrogen dioxide and ozone concentrations were interpolated using empirical Bayesian kriging. Traffic indices were estimated. Associations between LBW and air pollution metrics were examined using generalized additive models, adjusting for maternal age, parity, race/ethnicity, education, neighborhood income, gestational age and infant sex. Increased LBW risks were associated with the mass of primary fine and ultrafine PM, with several major sources (especially gasoline, wood burning and commercial meat cooking) of primary PM, and with chemical species in primary PM (elemental and organic carbon, potassium, iron, chromium, nickel, and titanium, but not lead or arsenic). Increased LBW risks were also associated with total fine PM mass, nitrogen dioxide and local traffic indices (especially within 50 m from home), but not with ozone. Stronger associations were observed in infants born to women with low socioeconomic status, chronic hypertension, diabetes and a high body mass index. This study supports previously reported associations between traffic-related pollutants and LBW and suggests other pollution sources and components, including ultrafine particles, as possible risk factors. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Neural system for updating object working memory from different sources: sensory stimuli or long-term memory.

    Science.gov (United States)

    Roth, Jennifer K; Courtney, Susan M

    2007-11-15

    Working memory (WM) is the active maintenance of currently relevant information so that it is available for use. A crucial component of WM is the ability to update the contents when new information becomes more relevant than previously maintained information. New information can come from different sources, including from sensory stimuli (SS) or from long-term memory (LTM). Updating WM may involve a single neural system regardless of source, distinct systems for each source, or a common network with additional regions involved specifically in sensory or LTM processes. The current series of experiments indicates that a single fronto-parietal network (including supplementary motor area, parietal, left inferior frontal junction, middle frontal gyrus) is active in updating WM regardless of the source of information. Bilateral cuneus was more active during updating WM from LTM than updating from SS, but the activity in this region was attributable to recalling information from LTM regardless of whether that information was to be entered into WM for future use or not. No regions were found to be more active during updating from SS than updating from LTM. Functional connectivity analysis revealed that different regions within this common update network were differentially more correlated with visual processing regions when participants updated from SS, and more correlated with LTM processing regions when participants updated from the contents of LTM. These results suggest that a single neural mechanism is responsible for controlling the contents of WM regardless of whether that information originates from a sensory stimulus or from LTM. This network of regions involved in updating WM interacts with the rest of the brain differently depending on the source of newly relevant information.

  4. Processing bias in children with separation anxiety disorder, social phobia and generalised anxiety disorder

    NARCIS (Netherlands)

    Kindt, M.; Bögels, S.M.; Morren, M.

    2003-01-01

    The present study examined processing bias in children suffering from anxiety disorders. Processing bias was assessed using of the emotional Stroop task in clinically referred children with separation anxiety disorder (SAD), social phobia (SP), and/or generalised anxiety disorder (GAD) and normal

  5. Specificity of dysfunctional thinking in children with symptoms of social anxiety, separation anxiety and generalised anxiety

    NARCIS (Netherlands)

    Bogels, S.M.; Snieder, N.; Kindt, M.

    2003-01-01

    The present study investigated whether children with high symptom levels of either social phobia (SP), separation anxiety disorder (SAD), or generalised anxiety disorder (GAD) are characterised by a specific set of dysfunctional interpretations that are consistent with the cognitive model of their

  6. [Epileptic seizures during childbirth in a patient with idiopathic generalised epilepsy

    NARCIS (Netherlands)

    Voermans, N.C.; Zwarts, M.J.; Renier, W.O.; Bloem, B.R.

    2005-01-01

    During her first pregnancy, a 37-year-old woman with idiopathic generalised epilepsy that was adequately controlled with lamotrigine experienced a series of epileptic seizures following an elective caesarean section. The attacks were terminated with diazepam. The following day, she developed

  7. Generalised Partially Linear Regression with Misclassified Data and an Application to Labour Market Transitions

    DEFF Research Database (Denmark)

    Dlugosz, Stephan; Mammen, Enno; Wilke, Ralf

    We consider the semiparametric generalised linear regression model which has mainstream empirical models such as the (partially) linear mean regression, logistic and multinomial regression as special cases. As an extension to related literature we allow a misclassified covariate to be interacted...

  8. Modelling Problem-Solving Situations into Number Theory Tasks: The Route towards Generalisation

    Science.gov (United States)

    Papadopoulos, Ioannis; Iatridou, Maria

    2010-01-01

    This paper examines the way two 10th graders cope with a non-standard generalisation problem that involves elementary concepts of number theory (more specifically linear Diophantine equations) in the geometrical context of a rectangle's area. Emphasis is given on how the students' past experience of problem solving (expressed through interplay…

  9. Multi-Trial Guruswami–Sudan Decoding for Generalised Reed–Solomon Codes

    DEFF Research Database (Denmark)

    Nielsen, Johan Sebastian Rosenkilde; Zeh, Alexander

    2013-01-01

    An iterated refinement procedure for the Guruswami–Sudan list decoding algorithm for Generalised Reed–Solomon codes based on Alekhnovich’s module minimisation is proposed. The method is parametrisable and allows variants of the usual list decoding approach. In particular, finding the list...

  10. Total and Differential Leukocyte Counts in the Peripheral Blood of Patients with Generalised Aggressive Periodontitis.

    Science.gov (United States)

    Anand, Pradeep S; Sagar, Deepak Kumar; Mishra, Supriya; Narang, Sumit; Kamath, Kavitha P; Anil, Sukumaran

    To compare the total and differential leukocyte counts in the peripheral blood of generalised aggressive periodontitis patients with those of periodontally healthy subjects in a central Indian population. Seventy-five patients with generalised aggressive periodontitis and 63 periodontally healthy subjects were enrolled for the purpose of the study. All participants received a full-mouth periodontal examination in which probing depth and clinical attachment level were recorded. The haematological variables analysed included total leukocyte count, neutrophil count, lymphocyte count, monocyte count, neutrophil percentage, lymphocyte percentage, monocyte percentage and platelet count. The patient group showed a significantly higher total leukocyte count (7.62 ± 1.70 x 10(9) cells/l, p = 0.008) and neutrophil count (5.06 ± 1.47 x 10(9) cells/l) than the control group, and a significant association was observed between generalised aggressive periodontitis and elevated total leukocyte (p = 0.012) and neutrophil counts (p = 0.001). The findings of the present study suggest that patients with generalised aggressive periodontitis might also demonstrate a systemic inflammatory response, as evidenced by increased leukocyte counts. This systemic inflammatory response observed in patients with generalised aggressive periodontitis may be associated with an increased risk for cardiovascular diseases.

  11. Acute generalised exanthematous pustulosis induced by Pneumocystis jirovecii pneumonia prophylaxis with dapsone.

    Science.gov (United States)

    Vas, A; Laws, P; Marsland, Am; McQuillan, O

    2013-09-01

    We describe the case of an HIV-1-infected patient presenting to hospital with a severe cutaneous adverse drug reaction shortly after commencing dapsone therapy as Pneumocystis jirovecii pneumonia prophylaxis. To the best of our knowledge, acute generalised exanthematous pustulosis has not previously been reported as a reaction to dapsone in the setting of HIV.

  12. Issues in the Analysis of Focus Groups: Generalisability, Quantifiability, Treatment of Context and Quotations

    Science.gov (United States)

    Vicsek, Lilla

    2010-01-01

    In this paper I discuss some concerns related to the analysis of focus groups: (a) the issue of generalisation; (b) the problems of using numbers and quantifying in the analysis; (c) how the concrete situation of the focus groups could be included in the analysis, and (d) what formats can be used when quoting from focus groups. Problems with…

  13. Generalised Multi-sequence Shift-Register Synthesis using Module Minimisation

    DEFF Research Database (Denmark)

    Nielsen, Johan Sebastian Rosenkilde

    2013-01-01

    We show how to solve a generalised version of the Multi-sequence Linear Feedback Shift-Register (MLFSR) problem using minimisation of free modules over F[x]. We show how two existing algorithms for minimising such modules run particularly fast on these instances. Furthermore, we show how one...

  14. A retrospective study of carbamazepine therapy in the treatment of idiopathic generalised epilepsy

    LENUS (Irish Health Repository)

    O'Connor, G

    2011-05-01

    Objective: The exacerbation of idiopathic generalised epilepsy (IGE) by some anti-epileptic drugs (AEDs) such as carbamazepine (CBZ) has been well documented. However, it is unclear whether IGE is always worsened by the use of CBZ, or whether some patients with IGE benefit from its use.

  15. An Early Algebra Approach to Pattern Generalisation: Actualising the Virtual through Words, Gestures and Toilet Paper

    Science.gov (United States)

    Ferrara, Francesca; Sinclair, Nathalie

    2016-01-01

    This paper focuses on pattern generalisation as a way to introduce young students to early algebra. We build on research on patterning activities that feature, in their work with algebraic thinking, both looking for sameness recursively in a pattern (especially figural patterns, but also numerical ones) and conjecturing about function-based…

  16. Toward a Mechanistic Source Term in Advanced Reactors: A Review of Past U.S. SFR Incidents, Experiments, and Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Brunett, Acacia J.; Grabaskas, David

    2016-04-17

    In 2015, as part of a Regulatory Technology Development Plan (RTDP) effort for sodium-cooled fast reactors (SFRs), Argonne National Laboratory investigated the current state of knowledge of source term development for a metal-fueled, pool-type SFR. This paper provides a summary of past domestic metal-fueled SFR incidents and experiments and highlights information relevant to source term estimations that were gathered as part of the RTDP effort. The incidents described in this paper include fuel pin failures at the Sodium Reactor Experiment (SRE) facility in July of 1959, the Fermi I meltdown that occurred in October of 1966, and the repeated melting of a fuel element within an experimental capsule at the Experimental Breeder Reactor II (EBR-II) from November 1967 to May 1968. The experiments described in this paper include the Run-Beyond-Cladding-Breach tests that were performed at EBR-II in 1985 and a series of severe transient overpower tests conducted at the Transient Reactor Test Facility (TREAT) in the mid-1980s.

  17. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses. Volume 1, Revision 1

    International Nuclear Information System (INIS)

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T.; Helton, J.C.; Murfin, W.B.; Hora, S.C.

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment; the uncertainties in the consequence analysis were not. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinion in the reactor safety community.
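The "efficient sampling procedure" paired with Monte Carlo techniques in assessments of this kind is typically Latin hypercube sampling. A minimal sketch of the stratification idea follows; the toy response model and all variable names are illustrative assumptions, not the NUREG-1150 codes:

```python
import numpy as np

def latin_hypercube(n, d, seed=0):
    """n-by-d Latin hypercube sample on [0, 1): each parameter's range is
    cut into n equal strata and every stratum is sampled exactly once,
    giving better coverage than plain Monte Carlo for the same n."""
    rng = np.random.default_rng(seed)
    # one point per stratum, per parameter
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n
    for j in range(d):
        rng.shuffle(u[:, j])   # decouple the strata across parameters
    return u

# toy propagation of two uncertain inputs through a simple response y = a * b
u = latin_hypercube(200, 2)
a = 1.0 + u[:, 0]        # a ~ uniform on [1, 2)
b = 10.0 ** u[:, 1]      # b log-uniform over [1, 10)
y = a * b                # sampled distribution of the output quantity
```

In the full methodology each column would map to one uncertain parameter (often via a distribution elicited from an expert panel), and the resulting output sample characterises the uncertainty in the risk measure.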

  18. Source Term Analysis for the Nuclear Power Station Goesgen-Daeniken; Quelltermanalysen fuer das Kernkraftwerk Goesgen-Daeniken

    Energy Technology Data Exchange (ETDEWEB)

    Hosemann, J.P.; Megaritis, G.; Guentay, S.; Hirschmann, H.; Luebbesmeyer, D.; Lieber, K.; Jaeckel, B.; Birchley, J.; Duijvestijn, G

    2001-08-01

    Analyses are performed for three accident scenarios postulated to occur in the Goesgen Nuclear Power Plant, a 900 MWe Pressurised Water Reactor of Siemens design. The scenarios investigated comprise a Station Blackout and two separate cases of small-break loss-of-coolant accident, which lead, respectively, to high, intermediate and low pressure conditions in the reactor system. In each case the accident assumptions are highly pessimistic, so that the sequences span a wide range of plant states and damage phenomena. Thus the plant is evaluated for a diversity of potential safety challenges. A suite of analysis tools is used to examine the reactor coolant system response, the core heat-up and melting, fission product release from the reactor system, the transport and chemical behaviour of those fission products in the containment building, and the release of radioactivity (source term) to the environment. Comparison with reference values used by the licensing authority shows that the use of modern analysis tools and current knowledge can provide a substantial reduction in the estimated source term. Of particular interest are insights gained from the analyses which indicate opportunities for operators to reduce or forestall the release. (author)

  19. A risk-based evaluation of the impact of key uncertainties on the prediction of severe accident source terms - STU

    International Nuclear Information System (INIS)

    Ang, M.L.; Grindon, E.; Dutton, L.M.C.; Garcia-Sedano, P.; Santamaria, C.S.; Centner, B.; Auglaire, M.; Routamo, T.; Outa, S.; Jokiniemi, J.; Gustavsson, V.; Wennerstrom, H.; Spanier, L.; Gren, M.; Boschiero, M-H; Droulas, J-L; Friederichs, H-G; Sonnenkalb, M.

    2001-01-01

    The purpose of this project is to address the key uncertainties associated with a number of fission product release and transport phenomena in a wider context and to assess their relevance to key severe accident sequences. This project is a wide-based analysis involving eight reactor designs that are representative of the reactors currently operating in the European Union (EU). In total, 20 accident sequences covering a wide range of conditions have been chosen to provide the basis for sensitivity studies. The appraisal is achieved through a systematic risk-based framework developed within this project. Specifically, this is a quantitative interpretation of the sensitivity calculations on the basis of 'significance indicators', applied above defined threshold values. These threshold values represent a good surrogate for the 'large release' defined in a number of EU countries. In addition, the results are placed in the context of in-containment source term limits, for advanced light water reactor designs, as defined by international guidelines. Overall, despite the phenomenological uncertainties, the predicted source terms (both into the containment and, subsequently, into the environment) do not display a high degree of sensitivity to the individual fission product issues addressed in this project. This is due, mainly, to the substantial capacity for the attenuation of airborne fission products by the designed safety provisions and the natural fission product retention mechanisms within the containment.

  20. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses; Volume 1, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Murfin, W.B. [Technadyne Engineering Consultants, Inc., Albuquerque, NM (United States); Hora, S.C. [Hawaii Univ., Hilo, HI (United States)

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community.
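    The uncertainty-propagation idea in point (3), stratified Monte Carlo sampling pushed through fast-running component models, can be sketched in miniature. Everything below is a hypothetical toy, not any actual NUREG-1150 model: a Latin hypercube sampler feeds two uncertain inputs through an invented surrogate release-fraction function.

    ```python
    import random
    import statistics

    def latin_hypercube(n_samples, n_vars, rng):
        """Stratified samples on [0, 1)^n_vars: one draw per stratum, per variable."""
        cols = []
        for _ in range(n_vars):
            col = [(i + rng.random()) / n_samples for i in range(n_samples)]
            rng.shuffle(col)  # decorrelate the strata across variables
            cols.append(col)
        return list(zip(*cols))  # n_samples rows, each with n_vars values

    def propagate(model, n_vars, n_samples=200, seed=1):
        """Push stratified input samples through a model; summarize the output."""
        rng = random.Random(seed)
        out = [model(u) for u in latin_hypercube(n_samples, n_vars, rng)]
        return statistics.mean(out), statistics.stdev(out)

    # Hypothetical surrogate for one analysis component: a release fraction
    # driven by two uncertain inputs on [0, 1).
    def toy_release_fraction(u):
        return 0.10 * u[0] + 0.05 * u[0] * u[1]

    mean, sd = propagate(toy_release_fraction, n_vars=2)
    ```

    Stratification is what makes the sampling "efficient" in the sense of the abstract: each input's range is covered evenly even with a modest sample count, so output statistics stabilize with far fewer model runs than plain random sampling.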

  1. Updating source term and atmospheric dispersion simulations for the dose reconstruction in Fukushima Daiichi Nuclear Power Station Accident

    Science.gov (United States)

    Nagai, Haruyasu; Terada, Hiroaki; Tsuduki, Katsunori; Katata, Genki; Ota, Masakazu; Furuno, Akiko; Akari, Shusaku

    2017-09-01

    In order to assess the radiological dose to the public resulting from the Fukushima Daiichi Nuclear Power Station (FDNPS) accident in Japan, especially for the early phase of the accident when no measured data are available for that purpose, the spatial and temporal distribution of radioactive materials in the environment is reconstructed by computer simulations. In this study, the atmospheric dispersion simulation of radioactive materials is improved by refining the source term of radioactive materials discharged into the atmosphere and by modifying the atmospheric transport, dispersion and deposition model (ATDM). A database of the spatiotemporal distribution of radioactive materials in the air and on the ground surface is then developed from the output of the simulation. This database is used in other studies for dose assessment by coupling it with the behavioral patterns of evacuees from the FDNPS accident. By improving the ATDM simulation to use a new meteorological model and a sophisticated deposition scheme, the simulations reproduced the 137Cs and 131I deposition patterns well. To further improve the reproduction of dispersion processes, the source term was refined by optimizing it against the improved ATDM simulation using new monitoring data.

  2. Application of the conditional source-term estimation model for turbulence-chemistry interactions in a premixed flame

    Science.gov (United States)

    Salehi, M. M.; Bushe, W. K.; Daun, K. J.

    2012-04-01

    Conditional Source-term Estimation (CSE) is a closure model for turbulence-chemistry interactions. This model uses the first-order CMC hypothesis to close the chemical reaction source terms. The conditional scalar field is estimated by solving an integral equation using inverse methods. CSE was originally developed for, and has been used extensively in, non-premixed combustion; this work is the first application of the model to a premixed flame. CSE is coupled with a Trajectory Generated Low-Dimensional Manifold (TGLDM) model for chemistry. The CSE-TGLDM combustion model is used in a RANS code to simulate a turbulent premixed Bunsen burner. Along with this combustion model, a similar model which relies on the flamelet assumption is also used for comparison. The predictions of the two approaches for the velocity field, temperature and species mass fractions are compared. Although the flamelet model is less computationally expensive, the CSE combustion model is more general and does not have the limiting assumption underlying the flamelet model.

  3. How evolution learns to generalise: Using the principles of learning theory to understand the evolution of developmental organisation.

    Directory of Open Access Journals (Sweden)

    Kostas Kouvaris

    2017-04-01

    Full Text Available One of the most intriguing questions in evolution is how organisms exhibit suitable phenotypic variation to rapidly adapt in novel selective environments. Such variability is crucial for evolvability, but poorly understood. In particular, how can natural selection favour developmental organisations that facilitate adaptive evolution in previously unseen environments? Such a capacity suggests foresight that is incompatible with the short-sighted concept of natural selection. A potential resolution is provided by the idea that evolution may discover and exploit information not only about the particular phenotypes selected in the past, but their underlying structural regularities: new phenotypes, with the same underlying regularities, but novel particulars, may then be useful in new environments. If true, we still need to understand the conditions in which natural selection will discover such deep regularities rather than exploiting 'quick fixes' (i.e., fixes that provide adaptive phenotypes in the short term, but limit future evolvability). Here we argue that the ability of evolution to discover such regularities is formally analogous to learning principles, familiar in humans and machines, that enable generalisation from past experience. Conversely, natural selection that fails to enhance evolvability is directly analogous to the learning problem of over-fitting and the subsequent failure to generalise. We support the conclusion that evolving systems and learning systems are different instantiations of the same algorithmic principles by showing that existing results from the learning domain can be transferred to the evolution domain. Specifically, we show that conditions that alleviate over-fitting in learning systems successfully predict which biological conditions (e.g., environmental variation, regularity, noise or a pressure for developmental simplicity) enhance evolvability. This equivalence provides access to a well-developed theoretical

  4. A bootstrap method to avoid the effect of concurvity in generalised additive models in time series studies of air pollution.

    Science.gov (United States)

    Figueiras, Adolfo; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen

    2005-10-01

    In recent years a great number of studies have applied generalised additive models (GAMs) to time series data to estimate the short term health effects of air pollution. Lately, however, it has been found that concurvity--the non-parametric analogue of multicollinearity--might lead to underestimation of standard errors of the effects of independent variables. Underestimation of standard errors means that for concurvity levels commonly present in the data, the risk of committing type I error rises by over threefold. This study developed a conditional bootstrap methodology that consists of assuming that the outcome in any observation is conditional upon the values of the set of independent variables used. It then tested this procedure by means of a simulation study using a Poisson additive model. The response variable of this model is a function of an unobserved confounding variable (that introduces trend and seasonality), real black smoke data, and temperature. Scenarios were created with different coefficients and degrees of concurvity. Conditional bootstrap provides confidence intervals with coverages close to nominal (95%), irrespective of the degree of concurvity, number of variables in the model or magnitude of the coefficient to be estimated (for example, for a concurvity of 0.85, bootstrap confidence interval coverage is 95% compared with 71% in the case of the asymptotic interval obtained directly with S-plus gam function). The bootstrap method avoids the problem of concurvity in time series studies of air pollution, and is easily generalised to non-linear dose-risk effects. All bootstrap calculations described in this paper can be performed using S-Plus gam.boot software.
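    The conditional bootstrap idea, resampling the outcome conditional on the fitted means and then refitting, can be sketched without the GAM machinery. The toy below uses a plain one-covariate Poisson regression (no smooth terms, so it cannot exhibit concurvity, and it is not the authors' S-Plus procedure); all function names and the synthetic data are hypothetical.

    ```python
    import math
    import random

    def rpois(mu, rng):
        """Poisson draw via Knuth's method (adequate for modest mu)."""
        limit, k, p = math.exp(-mu), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1

    def fit_poisson(x, y, iters=50):
        """One-covariate Poisson regression, log mu = b0 + b1*x, damped Newton."""
        b0 = b1 = 0.0
        for _ in range(iters):
            mu = [math.exp(b0 + b1 * xi) for xi in x]
            g0 = sum(yi - mi for yi, mi in zip(y, mu))
            g1 = sum((yi - mi) * xi for yi, mi, xi in zip(y, mu, x))
            h00 = sum(mu)
            h01 = sum(mi * xi for mi, xi in zip(mu, x))
            h11 = sum(mi * xi * xi for mi, xi in zip(mu, x))
            det = h00 * h11 - h01 * h01
            s0 = (h11 * g0 - h01 * g1) / det
            s1 = (h00 * g1 - h01 * g0) / det
            damp = max(1.0, abs(s0), abs(s1))  # bound the step to keep Newton stable
            b0 += s0 / damp
            b1 += s1 / damp
        return b0, b1

    def conditional_bootstrap_ci(x, y, n_boot=200, alpha=0.05, seed=2):
        """CI for b1: resample y* conditional on the fitted means, then refit."""
        rng = random.Random(seed)
        b0, b1 = fit_poisson(x, y)
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        draws = sorted(fit_poisson(x, [rpois(mi, rng) for mi in mu])[1]
                       for _ in range(n_boot))
        lo = draws[int(alpha / 2 * n_boot)]
        hi = draws[int((1 - alpha / 2) * n_boot) - 1]
        return b1, (lo, hi)

    # Synthetic series with true log-rate 1.0 + 0.5 * x.
    rng = random.Random(7)
    x = [i / 10 for i in range(30)]
    y = [rpois(math.exp(1.0 + 0.5 * xi), rng) for xi in x]
    b1_hat, (lo, hi) = conditional_bootstrap_ci(x, y)
    ```

    The key move is that each bootstrap outcome is drawn conditional on the covariates (through the fitted means), rather than resampling (x, y) pairs, which is what lets the interval remain valid when the covariates are strongly interrelated.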

  5. How evolution learns to generalise: Using the principles of learning theory to understand the evolution of developmental organisation.

    Science.gov (United States)

    Kouvaris, Kostas; Clune, Jeff; Kounios, Loizos; Brede, Markus; Watson, Richard A

    2017-04-01

    One of the most intriguing questions in evolution is how organisms exhibit suitable phenotypic variation to rapidly adapt in novel selective environments. Such variability is crucial for evolvability, but poorly understood. In particular, how can natural selection favour developmental organisations that facilitate adaptive evolution in previously unseen environments? Such a capacity suggests foresight that is incompatible with the short-sighted concept of natural selection. A potential resolution is provided by the idea that evolution may discover and exploit information not only about the particular phenotypes selected in the past, but their underlying structural regularities: new phenotypes, with the same underlying regularities, but novel particulars, may then be useful in new environments. If true, we still need to understand the conditions in which natural selection will discover such deep regularities rather than exploiting 'quick fixes' (i.e., fixes that provide adaptive phenotypes in the short term, but limit future evolvability). Here we argue that the ability of evolution to discover such regularities is formally analogous to learning principles, familiar in humans and machines, that enable generalisation from past experience. Conversely, natural selection that fails to enhance evolvability is directly analogous to the learning problem of over-fitting and the subsequent failure to generalise. We support the conclusion that evolving systems and learning systems are different instantiations of the same algorithmic principles by showing that existing results from the learning domain can be transferred to the evolution domain. Specifically, we show that conditions that alleviate over-fitting in learning systems successfully predict which biological conditions (e.g., environmental variation, regularity, noise or a pressure for developmental simplicity) enhance evolvability. This equivalence provides access to a well-developed theoretical framework from

  6. How evolution learns to generalise: Using the principles of learning theory to understand the evolution of developmental organisation

    Science.gov (United States)

    Kouvaris, Kostas; Clune, Jeff; Brede, Markus; Watson, Richard A.

    2017-01-01

    One of the most intriguing questions in evolution is how organisms exhibit suitable phenotypic variation to rapidly adapt in novel selective environments. Such variability is crucial for evolvability, but poorly understood. In particular, how can natural selection favour developmental organisations that facilitate adaptive evolution in previously unseen environments? Such a capacity suggests foresight that is incompatible with the short-sighted concept of natural selection. A potential resolution is provided by the idea that evolution may discover and exploit information not only about the particular phenotypes selected in the past, but their underlying structural regularities: new phenotypes, with the same underlying regularities, but novel particulars, may then be useful in new environments. If true, we still need to understand the conditions in which natural selection will discover such deep regularities rather than exploiting ‘quick fixes’ (i.e., fixes that provide adaptive phenotypes in the short term, but limit future evolvability). Here we argue that the ability of evolution to discover such regularities is formally analogous to learning principles, familiar in humans and machines, that enable generalisation from past experience. Conversely, natural selection that fails to enhance evolvability is directly analogous to the learning problem of over-fitting and the subsequent failure to generalise. We support the conclusion that evolving systems and learning systems are different instantiations of the same algorithmic principles by showing that existing results from the learning domain can be transferred to the evolution domain. Specifically, we show that conditions that alleviate over-fitting in learning systems successfully predict which biological conditions (e.g., environmental variation, regularity, noise or a pressure for developmental simplicity) enhance evolvability. This equivalence provides access to a well-developed theoretical framework from

  7. Unclassified Source Term and Radionuclide Data for Corrective Action Unit 98: Frenchman Flat Nevada Test Site, Nevada, Rev. No.: 0

    Energy Technology Data Exchange (ETDEWEB)

    Farnham, Irene

    2005-09-01

    Frenchman Flat is one of several areas of the Nevada Test Site (NTS) used for underground nuclear testing (Figure 1-1). These nuclear tests resulted in groundwater contamination in the vicinity of the underground test areas. As a result, the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO) is currently conducting a corrective action investigation (CAI) of the Frenchman Flat underground test areas. Since 1996, the Nevada Division of Environmental Protection (NDEP) has regulated NNSA/NSO corrective actions through the "Federal Facility Agreement and Consent Order" ([FFACO], 1996). Appendix VI of the FFACO agreement, "Corrective Action Strategy", was revised on December 7, 2000, and describes the processes that will be used to complete corrective actions, including those in the Underground Test Area (UGTA) Project. The individual locations covered by the agreement are known as corrective action sites (CASs), which are grouped into corrective action units (CAUs). The UGTA CASs are grouped geographically into five CAUs: Frenchman Flat, Central Pahute Mesa, Western Pahute Mesa, Yucca Flat/Climax Mine, and Rainier Mesa/Shoshone Mountain (Figure 1-1). These CAUs have distinctly different contaminant source, geologic, and hydrogeologic characteristics related to their location (FFACO, 1996). The Frenchman Flat CAU consists of 10 CASs located in the northern part of Area 5 and the southern part of Area 11 (Figure 1-1). This report documents the evaluation of the information and data available on the unclassified source term and radionuclide contamination for Frenchman Flat, CAU 98. The methodology used to estimate hydrologic source terms (HSTs) for the Frenchman Flat CAU is also documented. The HST of an underground nuclear test is the portion of the total inventory of radionuclides that is released over time into the groundwater following the test. The total residual inventory

  8. Unclassified Source Term and Radionuclide Data for Corrective Action Unit 98: Frenchman Flat Nevada Test Site, Nevada

    International Nuclear Information System (INIS)

    Farnham, Irene

    2005-01-01

    Frenchman Flat is one of several areas of the Nevada Test Site (NTS) used for underground nuclear testing (Figure 1-1). These nuclear tests resulted in groundwater contamination in the vicinity of the underground test areas. As a result, the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO) is currently conducting a corrective action investigation (CAI) of the Frenchman Flat underground test areas. Since 1996, the Nevada Division of Environmental Protection (NDEP) has regulated NNSA/NSO corrective actions through the "Federal Facility Agreement and Consent Order" ([FFACO], 1996). Appendix VI of the FFACO agreement, "Corrective Action Strategy", was revised on December 7, 2000, and describes the processes that will be used to complete corrective actions, including those in the Underground Test Area (UGTA) Project. The individual locations covered by the agreement are known as corrective action sites (CASs), which are grouped into corrective action units (CAUs). The UGTA CASs are grouped geographically into five CAUs: Frenchman Flat, Central Pahute Mesa, Western Pahute Mesa, Yucca Flat/Climax Mine, and Rainier Mesa/Shoshone Mountain (Figure 1-1). These CAUs have distinctly different contaminant source, geologic, and hydrogeologic characteristics related to their location (FFACO, 1996). The Frenchman Flat CAU consists of 10 CASs located in the northern part of Area 5 and the southern part of Area 11 (Figure 1-1). This report documents the evaluation of the information and data available on the unclassified source term and radionuclide contamination for Frenchman Flat, CAU 98. The methodology used to estimate hydrologic source terms (HSTs) for the Frenchman Flat CAU is also documented. The HST of an underground nuclear test is the portion of the total inventory of radionuclides that is released over time into the groundwater following the test. The total residual inventory of radionuclides associated with one or

  9. Development of computer-based function to estimate radioactive source term by coupling atmospheric model with monitoring data

    International Nuclear Information System (INIS)

    Akiko, Furuno; Hideyuki, Kitabata

    2003-01-01

    Full text: The importance of computer-based decision support systems for local and regional scale accidents has been recognized by many countries following the experience of the accidental atmospheric release of radionuclides at Chernobyl in 1986 in the former Soviet Union. The recent increase of nuclear power plants in the Asian region also necessitates an emergency response system for Japan to predict the long-range atmospheric dispersion of radionuclides due to an overseas accident. On the basis of these backgrounds, WSPEEDI (Worldwide version of System for Prediction of Environmental Emergency Dose Information) at Japan Atomic Energy Research Institute is developed to forecast long-range atmospheric dispersions of radionuclides during nuclear emergency. Although the source condition is a critical parameter for accurate prediction, it can rarely be acquired in the early stage of an overseas accident. Thus, we have been developing a computer-based function to estimate the radioactive source term, e.g. the release point, time and amount, as a part of WSPEEDI. This function consists of atmospheric transport simulations and statistical analysis for the prediction and monitoring of air dose rates. Atmospheric transport simulations are carried out for the matrix of possible release points in Eastern Asia and possible release times. The simulation results of air dose rates are compared with monitoring data and the best fitted release condition is defined as the source term. This paper describes the source term estimation method and the application to Eastern Asia. The latest version of WSPEEDI accommodates following two models: an atmospheric meteorological model MM5 and a particle random walk model GEARN. MM5 is a non-hydrostatic meteorological model developed by the Pennsylvania State University and the National Center for Atmospheric Research (NCAR). MM5 physically calculates more than 40 meteorological parameters with high resolution in time and space based an
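    The estimation step described above, running dispersion simulations for a matrix of candidate release points and times and picking the condition that best fits monitoring data, can be illustrated with a stand-in model. The 1-D Gaussian puff below is a hypothetical surrogate for an ATDM run (it is not WSPEEDI's MM5/GEARN chain, and all parameters are invented); note that for a fixed candidate location and time the best release amount has a closed-form linear least-squares solution, so only location and time need gridding.

    ```python
    import math

    def toy_dispersion(release_x, release_t, amount, station_x, obs_t):
        """Stand-in for an ATDM run: a 1-D Gaussian puff advected at speed u."""
        u, k = 1.0, 0.5  # assumed wind speed and diffusivity
        dt = obs_t - release_t
        if dt <= 0:
            return 0.0
        sigma2 = 2.0 * k * dt
        dx = station_x - release_x - u * dt
        return amount / math.sqrt(2 * math.pi * sigma2) * math.exp(-dx * dx / (2 * sigma2))

    def estimate_source(observations, candidates):
        """Grid-search (release point, release time); fit amount by least squares.

        observations: list of (station_x, obs_t, measured_value).
        candidates:   list of (release_x, release_t) to try.
        """
        best = None
        for x0, t0 in candidates:
            # unit-release predictions at every monitor
            f = [toy_dispersion(x0, t0, 1.0, sx, ot) for sx, ot, _ in observations]
            num = sum(fi * m for fi, (_, _, m) in zip(f, observations))
            den = sum(fi * fi for fi in f)
            amount = num / den if den > 0 else 0.0
            err = sum((amount * fi - m) ** 2 for fi, (_, _, m) in zip(f, observations))
            if best is None or err < best[0]:
                best = (err, x0, t0, amount)
        return best[1], best[2], best[3]

    # Synthetic monitoring data from a known source, then recover it.
    true_x, true_t, true_amount = 0.0, 0.0, 5.0
    stations = [(2.0, 3.0), (3.0, 3.0), (4.0, 3.0), (3.0, 5.0)]
    observations = [(sx, ot, toy_dispersion(true_x, true_t, true_amount, sx, ot))
                    for sx, ot in stations]
    candidates = [(x0, t0) for x0 in (-1.0, 0.0, 1.0) for t0 in (0.0, 1.0)]
    x0_hat, t0_hat, amount_hat = estimate_source(observations, candidates)
    ```

    With noise-free synthetic observations the true candidate yields zero residual, so the search recovers the source exactly; real applications compare against noisy dose-rate monitors and a far larger candidate matrix.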

  10. Accident source terms for pressurized water reactors with high-burnup cores calculated using MELCOR 1.8.5.

    Energy Technology Data Exchange (ETDEWEB)

    Gauntt, Randall O.; Powers, Dana Auburn; Ashbaugh, Scott G.; Leonard, Mark Thomas; Longmire, Pamela

    2010-04-01

    In this study, risk-significant pressurized-water reactor severe accident sequences are examined using MELCOR 1.8.5 to explore the range of fission product releases to the reactor containment building. Advances in the understanding of fission product release and transport behavior and severe accident progression are used to render best estimate analyses of selected accident sequences. Particular emphasis is placed on estimating the effects of high fuel burnup in contrast with low burnup on fission product releases to the containment. Supporting this emphasis, recent data available on fission product release from high-burnup (HBU) fuel from the French VERCOR project are used in this study. The results of these analyses are treated as samples from a population of accident sequences in order to employ approximate order statistics characterization of the results. These trends and tendencies are then compared to the NUREG-1465 alternative source term prescription used today for regulatory applications. In general, greater differences are observed between the state-of-the-art calculations for either HBU or low-burnup (LBU) fuel and the NUREG-1465 containment release fractions than exist between HBU and LBU release fractions. Current analyses suggest that retention of fission products within the vessel and the reactor coolant system (RCS) are greater than contemplated in the NUREG-1465 prescription, and that, overall, release fractions to the containment are therefore lower across the board in the present analyses than suggested in NUREG-1465. The decreased volatility of Cs2MoO4 compared to CsI or CsOH increases the predicted RCS retention of cesium, and as a result, cesium and iodine do not follow identical behaviors with respect to distribution among vessel, RCS, and containment. With respect to the regulatory alternative source term, greater differences are observed between the NUREG-1465 prescription and both HBU and LBU predictions than exist between HBU and LBU

  11. Evaluation of long-term community recovery from Hurricane Andrew: sources of assistance received by population sub-groups.

    Science.gov (United States)

    McDonnell, S; Troiano, R P; Barker, N; Noji, E; Hlady, W G; Hopkins, R

    1995-12-01

    Two three-stage cluster surveys were conducted in South Dade County, Florida, 14 months apart, to assess recovery following Hurricane Andrew. Response rates were 75 per cent and 84 per cent. Sources of assistance used in recovery from Hurricane Andrew differed according to race, per capita income, ethnicity, and education. Reports of an improved living situation post-hurricane were not associated with receiving relief assistance, but reports of a worse situation were associated with loss of income, being exploited, or job loss. The number of households reporting problems with crime and community violence doubled between the two surveys. Disaster relief efforts had less impact on subjective long-term recovery than did job or income loss or housing repair difficulties. Existing sources of assistance were used more often than specific post-hurricane relief resources. The demographic make-up of a community may determine the most effective means of informing residents after a disaster and which sources of assistance may be useful.

  12. 135Cs/137Cs isotopic composition of environmental samples across Europe: Environmental transport and source term emission applications

    International Nuclear Information System (INIS)

    Snow, Mathew S.; Snyder, Darin C.

    2016-01-01

    135Cs/137Cs isotopic analyses represent an important tool for studying the fate and transport of radiocesium in the environment; in this work the 135Cs/137Cs isotopic composition in environmental samples taken from across Europe is reported. Surface soil and vegetation samples from western Russia, Ukraine, Austria, and Hungary show consistent aged thermal fission product 135Cs/137Cs isotope ratios of 0.58 ± 0.01 (age corrected to 1/1/15), with the exception of one sample of soil-moss from Hungary which shows an elevated 135Cs/137Cs ratio of 1.78 ± 0.12. With the exception of the outlier sample from Hungary, surface soil/vegetation data are in quantitative agreement with values previously reported for soils within the Chernobyl exclusion zone, suggesting that radiocesium at these locations is primarily composed of homogenous airborne deposition from Chernobyl. Seawater samples taken from the Irish Sea show 135Cs/137Cs isotope ratios of 1.22 ± 0.11 (age corrected to 1/1/15), suggesting aged thermal fission product Cs discharged from Sellafield. The differences in 135Cs/137Cs isotope ratios between Sellafield, Chernobyl, and global nuclear weapons testing fallout indicate that 135Cs/137Cs isotope ratios can be utilized to discriminate between and track radiocesium transport from different nuclear production source terms, including major emission sources in Europe. - Highlights: • 135Cs/137Cs useful for tracking anthropogenic environmental radiocesium releases. • European surface soils/vegetation have uniform ratio consistent with Chernobyl. • 135Cs/137Cs in Irish Sea represents thermal fission ratio distinct from Chernobyl. • Can distinguish between major source terms in Europe based on 135Cs/137Cs.
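    The "age corrected to 1/1/15" qualifier matters because 137Cs (half-life about 30.08 years) decays far faster than the nearly stable 135Cs (half-life about 2.3 million years), so a measured ratio grows with time and must be corrected to a common reference date before signatures can be compared. A minimal sketch, using the signature values quoted in the abstract; the nearest-signature attribution is a deliberate simplification of real mixing analyses, and the function names are hypothetical.

    ```python
    import math
    from datetime import date

    T_HALF_CS137 = 30.08   # years
    T_HALF_CS135 = 2.3e6   # years; decay is negligible on these timescales

    def decay_correct_ratio(ratio, measured_on, reference=date(2015, 1, 1)):
        """Correct a measured 135Cs/137Cs atom ratio to a reference date.

        137Cs decays faster than 135Cs, so the ratio grows with time;
        correcting back to an earlier reference date shrinks it.
        """
        dt_years = (measured_on - reference).days / 365.25
        lam = math.log(2) * (1.0 / T_HALF_CS137 - 1.0 / T_HALF_CS135)
        return ratio * math.exp(-lam * dt_years)

    # Signature ratios quoted in the abstract, age-corrected to 1/1/15.
    SIGNATURES = {"Chernobyl": 0.58, "Sellafield": 1.22}

    def attribute(ratio_2015):
        """Nearest-signature attribution (illustrative only)."""
        return min(SIGNATURES, key=lambda k: abs(SIGNATURES[k] - ratio_2015))
    ```

    For example, a ratio measured five years after the reference date shrinks by roughly exp(-0.0230 x 5), about 11%, when corrected back to 1/1/15.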

  13. Assessing the effect of a partly unobserved, exogenous, binary time-dependent covariate on survival probabilities using generalised pseudo-values

    Directory of Open Access Journals (Sweden)

    Ulrike Pötschger

    2018-01-01

    Full Text Available Background: Investigating the impact of a time-dependent intervention on the probability of long-term survival is statistically challenging. A typical example is stem-cell transplantation performed after successful donor identification from registered donors. Here, a suggested simple analysis based on the exogenous donor availability status according to registered donors would allow the estimation and comparison of survival probabilities. As donor search is usually ceased after a patient’s event, donor availability status is incompletely observed, so that this simple comparison is not possible and the waiting time to donor identification needs to be addressed in the analysis to avoid bias. It is methodologically unclear how to directly address cumulative long-term treatment effects without relying on proportional hazards while avoiding waiting time bias. Methods: The pseudo-value regression technique is able to handle the first two issues; a novel generalisation of this technique also avoids waiting time bias. Inverse-probability-of-censoring weighting is used to account for the partly unobserved exogenous covariate donor availability. Results: Simulation studies demonstrate unbiasedness and satisfying coverage probabilities of the new method. A real data example demonstrates that study results based on generalised pseudo-values have a clear medical interpretation which supports the clinical decision making process. Conclusions: The proposed generalisation of the pseudo-value regression technique enables the comparison of survival probabilities between two independent groups where group membership becomes known over time and remains partly unknown. Hence, cumulative long-term treatment effects are directly addressed without relying on proportional hazards while avoiding waiting time bias.

  14. Streamlining cardiovascular clinical trials to improve efficiency and generalisability.

    Science.gov (United States)

    Zannad, Faiez; Pfeffer, Marc A; Bhatt, Deepak L; Bonds, Denise E; Borer, Jeffrey S; Calvo-Rojas, Gonzalo; Fiore, Louis; Lund, Lars H; Madigan, David; Maggioni, Aldo Pietro; Meyers, Catherine M; Rosenberg, Yves; Simon, Tabassome; Stough, Wendy Gattis; Zalewski, Andrew; Zariffa, Nevine; Temple, Robert

    2017-08-01

    Controlled trials provide the most valid determination of the efficacy and safety of an intervention, but large cardiovascular clinical trials have become extremely costly and complex, making it difficult to study many important clinical questions. A critical question, and the main objective of this review, is how trials might be simplified while maintaining randomisation to preserve scientific integrity and unbiased efficacy assessments. Experience with alternative approaches is accumulating, specifically with registry-based randomised controlled trials that make use of data already collected. This approach addresses bias concerns while still capitalising on the benefits and efficiencies of a registry. Several completed or ongoing trials illustrate the feasibility of using registry-based controlled trials to answer important questions relevant to daily clinical practice. Randomised trials within healthcare organisation databases may also represent streamlined solutions for some types of investigations, although data quality (endpoint assessment) is likely to be a greater concern in those settings. These approaches are not without challenges, and issues pertaining to informed consent, blinding, data quality and regulatory standards remain to be fully explored. Collaboration among stakeholders is necessary to achieve standards for data management and analysis, to validate large data sources for use in randomised trials, and to re-evaluate ethical standards to encourage research while also ensuring that patients are protected. The rapidly evolving efforts to streamline cardiovascular clinical trials have the potential to lead to major advances in promoting better care and outcomes for patients with cardiovascular disease. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  15. Generalised form factor dark matter in the Sun

    Energy Technology Data Exchange (ETDEWEB)

    Vincent, Aaron C. [Institute for Particle Physics Phenomenology (IPPP), Department of Physics,Durham University, Durham DH1 3LE (United Kingdom); Serenelli, Aldo [Institut de Ciències de l’Espai (ICE-CSIC/IEEC), Campus UAB,Carrer de Can Magrans s/n, 08193 Cerdanyola del Vallès (Spain); Scott, Pat [Department of Physics, Imperial College London, Blackett Laboratory,Prince Consort Road, London SW7 2AZ (United Kingdom)

    2015-08-19

    We study the effects of energy transport in the Sun by asymmetric dark matter with momentum and velocity-dependent interactions, with an eye to solving the decade-old Solar Abundance Problem. We study effective theories where the dark matter-nucleon scattering cross-section goes as v{sub rel}{sup 2n} and q{sup 2n} with n=−1,0,1 or 2, where v{sub rel} is the dark matter-nucleon relative velocity and q is the momentum exchanged in the collision. Such cross-sections can arise generically as leading terms from the most basic nonstandard DM-quark operators. We employ a high-precision solar simulation code to study the impact on solar neutrino rates, the sound speed profile, convective zone depth, surface helium abundance and small frequency separations. We find that the majority of models that improve agreement with the observed sound speed profile and depth of the convection zone also reduce neutrino fluxes beyond the level that can be reasonably accommodated by measurement and theory errors. However, a few specific points in parameter space yield a significant overall improvement. A 3–5 GeV DM particle with σ{sub SI}∝q{sup 2} is particularly appealing, yielding more than a 6σ improvement with respect to standard solar models, while being allowed by direct detection and collider limits. We provide full analytical capture expressions for q- and v{sub rel}-dependent scattering, as well as complete likelihood tables for all models.
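The v{sub rel}{sup 2n} and q{sup 2n} scalings described above can be illustrated with a trivial sketch (the reference values, units, and normalisation below are hypothetical and not taken from the paper):

```python
def effective_sigma(sigma0, x, x0, n):
    """Generalised cross-section sigma = sigma0 * (x / x0)**(2*n),
    where x is either the relative velocity v_rel or the momentum
    transfer q, and n is -1, 0, 1 or 2 (n = 0 recovers the standard
    constant cross-section)."""
    return sigma0 * (x / x0) ** (2 * n)

# q^2-dependent scattering (n = 1): doubling the momentum transfer
# quadruples the cross-section
s0 = effective_sigma(1.0e-37, 1.0, 1.0, 1)  # hypothetical cm^2
s1 = effective_sigma(1.0e-37, 2.0, 1.0, 1)
```

For n = −1 the dependence is inverted: halving x raises the cross-section fourfold, which is why the different operators lead to such different capture and transport behaviour in the Sun.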

  16. Generalised form factor dark matter in the Sun

    Energy Technology Data Exchange (ETDEWEB)

    Vincent, Aaron C. [Institute for Particle Physics Phenomenology (IPPP), Department of Physics, Durham University, Durham DH1 3LE (United Kingdom); Serenelli, Aldo [Institut de Ciències de l'Espai (ICE-CSIC/IEEC), Campus UAB, Carrer de Can Magrans s/n, 08193 Cerdanyola del Vallès (Spain); Scott, Pat, E-mail: aaron.vincent@durham.ac.uk, E-mail: aldos@ice.csic.es, E-mail: p.scott@imperial.ac.uk [Department of Physics, Imperial College London, Blackett Laboratory, Prince Consort Road, London SW7 2AZ (United Kingdom)

    2015-08-01

    We study the effects of energy transport in the Sun by asymmetric dark matter with momentum and velocity-dependent interactions, with an eye to solving the decade-old Solar Abundance Problem. We study effective theories where the dark matter-nucleon scattering cross-section goes as v{sub rel}{sup 2n} and q{sup 2n} with n = −1, 0, 1  or 2, where v{sub rel} is the dark matter-nucleon relative velocity and q is the momentum exchanged in the collision. Such cross-sections can arise generically as leading terms from the most basic nonstandard DM-quark operators. We employ a high-precision solar simulation code to study the impact on solar neutrino rates, the sound speed profile, convective zone depth, surface helium abundance and small frequency separations. We find that the majority of models that improve agreement with the observed sound speed profile and depth of the convection zone also reduce neutrino fluxes beyond the level that can be reasonably accommodated by measurement and theory errors. However, a few specific points in parameter space yield a significant overall improvement. A 3–5 GeV DM particle with σ{sub SI} ∝ q{sup 2} is particularly appealing, yielding more than a 6σ improvement with respect to standard solar models, while being allowed by direct detection and collider limits. We provide full analytical capture expressions for q- and v{sub rel}-dependent scattering, as well as complete likelihood tables for all models.

  17. Inverse modeling of the 137Cs source term of the Fukushima Dai-ichi Nuclear Power Plant accident constrained by a deposition map monitored by aircraft.

    Science.gov (United States)

    Yumimoto, Keiya; Morino, Yu; Ohara, Toshimasa; Oura, Yasuji; Ebihara, Mitsuru; Tsuruta, Haruo; Nakajima, Teruyuki

    2016-11-01

    The amount of 137Cs released by the Fukushima Dai-ichi Nuclear Power Plant accident of 11 March 2011 was inversely estimated by integrating an atmospheric dispersion model, an a priori source term, and a deposition map monitored by aircraft. The a posteriori source term resolved finer (hourly) variations than the a priori term, and estimated the 137Cs released from 11 March to 2 April at 8.12 PBq. Although the time series of the a posteriori source term was generally similar to that of the a priori source term, notable modifications were found in the periods when the a posteriori source term was well constrained by the observations. The spatial pattern of 137Cs deposition obtained with the a posteriori source term showed better agreement with the 137Cs deposition monitored by aircraft. The a posteriori source term increased 137Cs deposition in the Naka-dori region (the central part of Fukushima Prefecture) by 32.9% and considerably improved the underestimated a priori 137Cs deposition. Observed deposition measured at 16 stations and surface atmospheric concentrations collected on the filter tape of suspended particulate matter samplers were used for validation of the a posteriori results. A great improvement was found in surface atmospheric concentration on 15 March; the a posteriori source term reduced the root mean square error, normalized mean error, and normalized mean bias by 13.4, 22.3, and 92.0%, respectively, for the hourly values. However, limited improvements were observed in some periods and areas owing to the difficulty of simulating accurate wind fields and the lack of observational constraints. Copyright © 2016 Elsevier Ltd. All rights reserved.
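The skill scores quoted above (root mean square error, normalized mean error, normalized mean bias) are standard model-evaluation metrics; a quick illustration of how they are computed (not the authors' code; the sample values are invented):

```python
import numpy as np

def error_metrics(sim, obs):
    """RMSE, normalised mean error (NME), and normalised mean bias (NMB)
    between simulated and observed values.  NME and NMB are returned as
    fractions; multiply by 100 for percentages."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    nme = np.sum(np.abs(sim - obs)) / np.sum(obs)
    nmb = np.sum(sim - obs) / np.sum(obs)
    return rmse, nme, nmb

obs = np.array([10.0, 20.0, 30.0])   # hypothetical hourly observations
sim = np.array([12.0, 18.0, 33.0])   # hypothetical model values
rmse, nme, nmb = error_metrics(sim, obs)
```

A reduction in NMB toward zero, as reported for 15 March, indicates that the systematic over- or under-prediction of the simulation has shrunk, while RMSE and NME capture the remaining scatter.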

  18. Black Carbon and Sulfate Aerosols in the Arctic: Long-term Trends, Radiative Impacts, and Source Attributions

    Science.gov (United States)

    Wang, H.; Zhang, R.; Yang, Y.; Smith, S.; Rasch, P. J.

    2017-12-01

    The Arctic has warmed dramatically in recent decades. As important short-lived climate forcers, aerosols affect the Arctic radiative budget directly by scattering and absorbing radiation and indirectly by modifying clouds. Light-absorbing particles (e.g., black carbon) in snow/ice can reduce the surface albedo. The direct radiative impact of aerosols on the Arctic climate can be either warming or cooling, depending on their composition and location, which can further alter the poleward heat transport. Anthropogenic emissions, especially of BC and SO2, have changed drastically in low/mid-latitude source regions in the past few decades. Arctic surface observations at some locations show that BC and sulfate aerosols had a decreasing trend in recent decades. In order to understand the impact of long-term emission changes on aerosols and their radiative effects, we use the Community Earth System Model (CESM) equipped with an explicit BC and sulfur source-tagging technique to quantify the source-receptor relationships and decadal trends of Arctic sulfate and BC and to identify variations in their atmospheric transport pathways from lower latitudes. The simulation was conducted for 36 years (1979-2014) with prescribed sea surface temperatures and sea ice concentrations. To minimize potential biases in modeled large-scale circulations, wind fields in the simulation are nudged toward an atmospheric reanalysis dataset, while atmospheric constituents including water vapor, clouds, and aerosols are allowed to evolve according to the model physics. Both anthropogenic and open-fire emissions came from the newly released CMIP6 datasets, which show strong regional trends in BC and SO2 emissions during the simulation time period. Results show that emissions from East Asia and South Asia together have the largest contributions to Arctic sulfate and BC concentrations in the upper troposphere, which have an increasing trend. The strong decrease in emissions from Europe, Russia and

  19. Dynamical behaviors of the shock compacton in the nonlinear Schrödinger equation with a source term

    Science.gov (United States)

    Yin, Jiuli; Zhao, Liuwei

    2014-11-01

    In this paper, the dynamics from the shock compacton to chaos in the nonlinear Schrödinger equation with a source term is investigated in detail. The existence of unclosed homoclinic orbits which are not connected with the saddle point indicates that the system has a discontinuous fiber solution, which is a shock compacton. We prove that the shock compacton is a weak solution. The Melnikov technique is used to detect the conditions for the transition from the shock compacton to chaos, and the conditions for chaos suppression are analysed further. The results show that the system turns to chaos easily under external disturbances. The critical parameter values at which chaos appears are obtained analytically and numerically using Lyapunov exponents and bifurcation diagrams.
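The numerical chaos diagnostic mentioned above rests on Lyapunov exponents. As a minimal sketch of how a largest Lyapunov exponent is estimated (the perturbed Schrödinger system itself is not reproduced here; the fully chaotic logistic map is used as a stand-in):

```python
import math

def largest_lyapunov(f, dfdx, x0, n=100000, burn=1000):
    """Estimate the largest Lyapunov exponent of a 1-D map x -> f(x)
    as the trajectory average of log|f'(x)|.  A positive value
    signals exponential divergence of nearby orbits, i.e. chaos."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = f(x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(dfdx(x)))
        x = f(x)
    return total / n

r = 4.0  # fully chaotic logistic map x -> r*x*(1-x)
lam = largest_lyapunov(lambda x: r * x * (1 - x),
                       lambda x: r * (1 - 2 * x), x0=0.3)
```

For r = 4 the exact exponent is ln 2 ≈ 0.693; scanning such an estimate over a control parameter, together with a bifurcation diagram, is how critical parameter values for the onset of chaos are located numerically.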

  20. Methodology and tools for source term assessment in case of emergency - astrid project (EC 5th framework programme)

    International Nuclear Information System (INIS)

    Herviou, K.; Calmtorp, C.

    2003-01-01

    Full text: Following the reactor accident at Three Mile Island, most western European countries with nuclear power reactors started a huge safety upgrade programme covering not only reactor safety but also the introduction of mitigating systems that would minimize the consequences of a severe reactor accident. Emergency preparedness arrangements were also made: alarm criteria, on-site and off-site emergency structures, and plans to protect the population and the environment. Following the Chernobyl accident in 1986, many countries started the development of dispersion codes to calculate and better predict the consequences of a radioactive release. However, follow-ups of the Chernobyl accident in the mid-1990s revealed the need for an earlier start to assessing the actual severity of an accident in order to succeed in emergency response actions. By looking into the NPP, at the state of fission product barriers and critical safety systems, the magnitude of a potential radioactive release could be predicted in a timely manner, allowing emergency response to be executed even before the occurrence of a release. This is the perspective in which the development of the ASTRID methodology and tool should be regarded. By focussing more on similarities than on differences, the ASTRID methodology aims at a solution that could be acceptable for several reactor types as well as reactor containments, for different stages of technical design, and for use at on-site as well as off-site emergency centres. The scope of the ASTRID methodology thus outlines the common fission product barriers and the interrelated critical safety functions, that is, the plant process parameters of importance for such an assessment of the likely future state of the failed NPP and the resulting source term. The methodology maps out relevant process parameters and indicators, what and how to calculate, and a structured way to summarize and conclude on the potential source term and likely time projections. In a way

  1. Influence of iodine chemistry on source term assessment

    Energy Technology Data Exchange (ETDEWEB)

    Herranz Puebla, L. E.; Lopez Diez, I.; Rodriguez Maroto, J. J.; Martinez Lopez-Alcorocho, A.

    1991-07-01

    The major goal of a phenomenology analysis of containment during a severe accident can be split into the following: to know the containment response to the different loads, and to predict accurately the fission product and aerosol behavior. In this report, the main results from the study of a hypothetical accident scenario, based on the LA-4 experiment of the LACE project, are presented. To do this, several codes have been coupled: CONTEMPT4/MOD5 (thermohydraulics), NAUA/MOD5 (aerosol physics) and IODE (iodine chemistry). It has been demonstrated that the Source Term cannot be assessed with confidence if the chemical behaviour of some radionuclides is not taken into account. In particular, the influence of variables such as pH on the iodine retention efficiency of the sump has been proven. (Author) 12 refs.

  2. Review of the status of validation of the computer codes used in the severe accident source term reassessment study (BMI-2104)

    International Nuclear Information System (INIS)

    Kress, T.S.

    1985-04-01

    The determination of severe accident source terms must, by necessity it seems, rely heavily on the use of complex computer codes. Source term acceptability, therefore, rests on the assessed validity of such codes. Consequently, one element of NRC's recent efforts to reassess LWR severe accident source terms is to provide a review of the status of validation of the computer codes used in the reassessment. The results of this review are the subject of this document. The separate review documents compiled in this report were used as a resource along with the results of the BMI-2104 study by BCL and the QUEST study by SNL to arrive at a more-or-less independent appraisal of the status of source term modeling at this time

  3. Review of the status of validation of the computer codes used in the severe accident source term reassessment study (BMI-2104). [PWR; BWR]

    Energy Technology Data Exchange (ETDEWEB)

    Kress, T. S. [comp.]

    1985-04-01

    The determination of severe accident source terms must, by necessity it seems, rely heavily on the use of complex computer codes. Source term acceptability, therefore, rests on the assessed validity of such codes. Consequently, one element of NRC's recent efforts to reassess LWR severe accident source terms is to provide a review of the status of validation of the computer codes used in the reassessment. The results of this review are the subject of this document. The separate review documents compiled in this report were used as a resource along with the results of the BMI-2104 study by BCL and the QUEST study by SNL to arrive at a more-or-less independent appraisal of the status of source term modeling at this time.

  4. A study on source term assessment and waste disposal requirement of decontamination and decommissioning for the TRIGA research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Whang, Joo Ho; Lee, Kyung Jin; Lee, Jae Min; Choi, Gyu Seup; Shin, Byoung Sun [Kyunghee Univ., Seoul (Korea, Republic of)

    1999-08-15

    The objective and necessity of the project: TRIGA is the first nuclear facility in our nation for which decommissioning and decontamination has been decided. Estimating the expected life of nuclear power plants at 30 or 40 years, decommissioning work will have to be carried out around 2010, and the regulatory techniques supporting it should be developed beforehand. From the viewpoint of decommissioning and decontamination, the research reactor is small in scale but includes all decommissioning and decontamination conditions, so the rules established by the regulatory authority for its decommissioning will serve as a guide for nuclear power plants in the future. The basis of the regulatory technique required when decommissioning the research reactor is the assurance of radiological safety and the data supporting it. The source term is a very important condition, not only for the safety of workers but also for evaluating whether the planned disposal of the waste is appropriate for interim storage and for the steps leading to final disposal. The content and scope of this report cover the procedure for assessing the source term, which is the most important element in understanding the general concept of the decommissioning and decontamination of the TRIGA research reactor. That is, sampling and measurement methods are presented for determining the radioactivity inventory of the nuclear facilities. In addition, the criteria used in other countries for classifying the waste generated, and the site release criteria that constitute the final step of decommissioning and decontamination, are presented through MARSSIM. Finally, a programme applicable to the disposal of the waste arising from decommissioning is presented, based on a comparison of our nation's methods with those of other countries.

  5. Long-term follow-up after accidental gamma irradiation from a {sup 192}Ir source in Bangladesh

    Energy Technology Data Exchange (ETDEWEB)

    Mollah, A.S.; Begum, A.; Begum, R. [Bangladesh Atomic Energy Commission, Dhaka (Bangladesh)

    2006-07-01

    An industrial radiographer was accidentally over-exposed to a high dose of ionizing radiation from an {sup 192}Ir source pellet during radiography of weld-joints in gas pipelines on June 10, 1985 in Bangladesh. The source, housed in a portable exposure assembly, had an activity of about 1850 GBq. A guide-tube was used to control the transfer of the source from the safe storage position to the exposure position and vice versa. For radiography, the tip of the guide tube was to be fixed to the weld-joint while the source was cranked to the exposure position. Following the elapse of the preset exposure time, the source had to be cranked back to the safe storage position. This procedure was to be repeated for each radiographic exposure. Symptoms of high radiation exposure occurred immediately after the accident, and skin erythema developed, leading to progressive tissue deterioration. Biological effects such as mild vomiting, malaise, nausea and diarrhea occurred within a short period after the accident. Skin erythema, swelling and tenderness of the palmar surfaces and the tips of the thumbs, index fingers and middle fingers of both hands, accompanied by severe pain and inflammation, developed within 7 days of the mishap. The inflammatory changes, characterized by redness and bullae, spread over the affected fingers with severe pain and agony within a few days. The finger-tips developed abscesses with enormous pus formation, and the affected finger nails fell off. He also developed toothache. At this stage a medical practitioner made some surgical dressings and prescribed antibiotics. During the first six months the most serious health disorder was local necrosis of the skin and the deep layers of the palmar side of the affected fingers, with sharply delineated injuries. The clinical findings were consistent with those reported elsewhere under similar accident conditions. The consequences of this over-exposure are being followed up to assess the long-term effects of

  6. The Analytical Repository Source-Term (AREST) model: Analysis of spent fuel as a nuclear waste form

    International Nuclear Information System (INIS)

    Apted, M.J.; Liebetrau, A.M.; Engel, D.W.

    1989-02-01

    The purpose of this report is to assess the performance of spent fuel as a final waste form. The release of radionuclides from spent nuclear fuel has been simulated for the three repository sites that were nominated for site characterization in accordance with the Nuclear Waste Policy Act of 1982. The simulation is based on waste package designs that were presented in the environmental assessments prepared for each site. Five distinct distributions for containment failure have been considered, and the release of nuclides from the UO2 matrix, gap (including grain boundary), crud/surface layer, and cladding has been calculated with the Analytic Repository Source-Term (AREST) code. Separate scenarios involving incongruent and congruent release from the UO2 matrix have also been examined using the AREST code. Congruent release is defined here as the condition in which the relative mass release rates of a given nuclide and uranium from the UO2 matrix are equal to their mass ratios in the matrix. Incongruent release refers to release of a given nuclide from the UO2 matrix controlled by its own solubility-limiting solid phase. Release of nuclides from other sources within the spent fuel (e.g., cladding, fuel/cladding gap) is evaluated separately from either incongruent or congruent matrix release. 51 refs., 200 figs., 9 tabs
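The congruent/incongruent distinction defined above can be expressed as a small sketch (a hypothetical single-number parameterisation for illustration, not the AREST code itself; all numeric values are invented):

```python
def release_rate(nuclide_mass_fraction, uranium_mass_fraction,
                 uranium_release_rate, solubility_limited_rate=None):
    """Congruent release: the nuclide leaves the UO2 matrix at a rate
    proportional to its mass ratio in the matrix.  Incongruent release:
    the nuclide's own solubility-limiting solid phase caps the rate."""
    congruent = (uranium_release_rate
                 * nuclide_mass_fraction / uranium_mass_fraction)
    if solubility_limited_rate is None:
        return congruent
    return min(congruent, solubility_limited_rate)

# Congruent: a nuclide at 1% mass fraction in a matrix that is 88% U
r_cong = release_rate(0.01, 0.88, uranium_release_rate=1.0e-6)  # g/yr, invented
# Incongruent: the nuclide's own solubility limit controls the release
r_incong = release_rate(0.01, 0.88, 1.0e-6, solubility_limited_rate=5.0e-9)
```

The point of the two scenarios in the report is exactly this branch: whether the matrix dissolution rate or the nuclide's own solubility-limiting phase controls what reaches the groundwater.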

  7. Network of anatomical texts (NAnaTex), an open-source project for visualizing the interaction between anatomical terms.

    Science.gov (United States)

    Momota, Ryusuke; Ohtsuka, Aiji

    2018-01-01

    Anatomy is the science and art of understanding the structure of the body and its components in relation to the functions of the whole-body system. Medicine is based on a deep understanding of anatomy, but quite a few introductory-level learners are overwhelmed by the sheer amount of anatomical terminology that must be understood, so they regard anatomy as a dull and dense subject. To help them learn anatomical terms in a more contextual way, we started a new open-source project, the Network of Anatomical Texts (NAnaTex), which visualizes relationships of body components by integrating text-based anatomical information using Cytoscape, a network visualization software platform. Here, we present a network of bones and muscles produced from literature descriptions. As this network is primarily text-based and does not require any programming knowledge, it is easy to implement new functions or provide extra information by making changes to the original text files. To facilitate collaborations, we deposited the source code files for the network into the GitHub repository (https://github.com/ryusukemomota/nanatex) so that anybody can participate in the evolution of the network and use it for their own non-profit purposes. This project should help not only introductory-level learners but also professional medical practitioners, who could use it as a quick reference.
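The text-based network idea can be sketched in a few lines: parse tab-separated relation statements into an adjacency structure, which is essentially what a Cytoscape edge table encodes. The example lines below are hypothetical and not taken from the NAnaTex repository:

```python
# Hypothetical tab-separated lines in the style of a Cytoscape edge table:
# source_term <TAB> interaction <TAB> target_term
lines = [
    "biceps brachii\tORIGINATES_FROM\tscapula",
    "biceps brachii\tINSERTS_ON\tradius",
    "triceps brachii\tINSERTS_ON\tulna",
]

def build_network(lines):
    """Parse text-based anatomical relations into an adjacency mapping
    from each term to its (relation, target) pairs."""
    graph = {}
    for line in lines:
        src, rel, dst = line.split("\t")
        graph.setdefault(src, []).append((rel, dst))
    return graph

net = build_network(lines)
```

Because the underlying data are plain text files, adding a new relation is just adding a line, which is what makes such a project easy for non-programmers to extend.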

  8. MAXWELL EQUATIONS FOR A GENERALISED LAGRANGIAN FUNCTIONAL ECUACIONES DE MAXWELL PARA UNA FUNCIONAL DE LAGRANGE GENERALIZADA

    Directory of Open Access Journals (Sweden)

    Héctor Torres-Silva

    2008-11-01

    This work deals with the problem of the construction of the Lagrange functional for an electromagnetic field. The generalised Maxwell equations for an electromagnetic field in free space are introduced. The main idea relies on the change of the Lagrange function under the integral action. Usually, the Lagrange functional which describes the electromagnetic field is built with the square of the electromagnetic field tensor. Such a quadratic term is the reason, from a mathematical point of view, for the linear form of the Maxwell equations in free space. The author does not make this assumption, and nonlinear Maxwell equations are obtained. New material parameters of free space are established. The equations obtained are quite similar to the well-known Maxwell equations. The energy tensor of the electromagnetic field from a chiral approach to the Born-Infeld Lagrangian is discussed in connection with the cosmological constant.

  9. Long-term chemical analysis and organic aerosol source apportionment at nine sites in central Europe: source identification and uncertainty assessment

    Directory of Open Access Journals (Sweden)

    K. R. Daellenbach

    2017-11-01

    Long-term monitoring of organic aerosol is important for epidemiological studies, validation of atmospheric models, and air quality management. In this study, we apply a recently developed filter-based offline methodology using an aerosol mass spectrometer (AMS) to investigate the regional and seasonal differences of contributing organic aerosol sources. We present offline AMS measurements for particulate matter smaller than 10 µm at nine stations in central Europe with different exposure characteristics for the entire year of 2013 (819 samples). The focus of this study is a detailed source apportionment analysis (using positive matrix factorization, PMF), including an in-depth assessment of the related uncertainties. Primary organic aerosol (POA) is separated into three components: hydrocarbon-like OA related to traffic emissions (HOA), cooking OA (COA), and biomass burning OA (BBOA). We observe enhanced production of secondary organic aerosol (SOA) in summer, following the increase in biogenic emissions with temperature (summer oxygenated OA, SOOA). In addition, a SOA component was extracted that correlated with an anthropogenic secondary inorganic species that is dominant in winter (winter oxygenated OA, WOOA). A factor explaining sulfur-containing fragments (CH3SO2+), which has an event-driven temporal behaviour, was also identified (sulfur-containing organic, SC-OA). The relative yearly average factor contributions range from 4 to 14 % for HOA, from 3 to 11 % for COA, from 11 to 59 % for BBOA, from 5 to 23 % for SC-OA, from 14 to 27 % for WOOA, and from 15 to 38 % for SOOA. The uncertainty of the relative average factor contribution lies between 2 and 12 % of OA. At the sites north of the alpine crest, the sum of HOA, COA, and BBOA (POA) contributes less to OA (POA / OA = 0.3) than at the southern alpine valley sites (0.6). BBOA is the main contributor to POA, with 87 % in alpine valleys and 42 % north of the alpine crest
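At its core, PMF factorises the sample-by-species data matrix into non-negative factor contributions and factor profiles. A minimal sketch of that idea using Lee-Seung multiplicative updates on synthetic data is given below; this is a simplification of real PMF, which additionally weights residuals by measurement uncertainty, and none of the values relate to the study's dataset:

```python
import numpy as np

def nmf(X, k, n_iter=2000, seed=0):
    """Minimal non-negative matrix factorisation X ~ G @ F
    (Lee-Seung multiplicative updates).  Rows of F are factor
    (source) profiles; row i of G gives the contribution of each
    factor to sample i."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = rng.random((n, k)) + 0.1
    F = rng.random((k, m)) + 0.1
    for _ in range(n_iter):
        F *= (G.T @ X) / (G.T @ G @ F + 1e-12)  # update profiles
        G *= (X @ F.T) / (G @ F @ F.T + 1e-12)  # update contributions
    return G, F

# Synthetic data: two 'sources' with fixed profiles mixed over 50 samples
F_true = np.array([[1.0, 0.2, 0.0], [0.0, 0.3, 1.0]])
G_true = np.random.default_rng(1).random((50, 2))
X = G_true @ F_true
G, F = nmf(X, k=2)
rel_err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
```

The non-negativity constraint is what makes the recovered profiles interpretable as physical source fingerprints (HOA, COA, BBOA, and so on), which an unconstrained factorisation such as PCA would not guarantee.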

  10. Long-term natural attenuation of carbon and nitrogen within a groundwater plume after removal of the treated wastewater source.

    Science.gov (United States)

    Repert, Deborah A; Barber, Larry B; Hess, Kathryn M; Keefe, Steffanie H; Kent, Douglas B; LeBlanc, Denis R; Smith, Richard L

    2006-02-15

    Disposal of treated wastewater for more than 60 years onto infiltration beds on Cape Cod, Massachusetts produced a groundwater contaminant plume greater than 6 km long in a surficial sand and gravel aquifer. In December 1995 the wastewater disposal ceased. A long-term, continuous study was conducted to characterize the post-cessation attenuation of the plume from the source to 0.6 km downgradient. Concentrations and total pools of mobile constituents, such as boron and nitrate, steadily decreased within 1-4 years along the transect. Dissolved organic carbon loads also decreased, but to a lesser extent, particularly downgradient of the infiltration beds. After 4 years, concentrations and pools of carbon and nitrogen in groundwater were relatively constant with time and distance, but substantially elevated above background. The contaminant plume core remained anoxic for the entire 10-year study period; temporal patterns of integrated oxygen deficit decreased slowly at all sites. In 2004, substantial amounts of total dissolved carbon (7 mol C m(-2)) and fixed (dissolved plus sorbed) inorganic nitrogen (0.5 mol N m(-2)) were still present in a 28-m vertical interval at the disposal site. Sorbed constituents have contributed substantially to the dissolved carbon and nitrogen pools and are responsible for the long-term persistence of the contaminant plume. Natural aquifer restoration at the discharge location will take at least several decades, even though groundwater flow rates and the potential for contaminant flushing are relatively high.

  11. A long term source apportionment study of wood burning and traffic aerosols for three measurement sites in Switzerland

    Science.gov (United States)

    Herich, Hanna; Hüglin, Christoph; Buchmann, Brigitte

    2010-05-01

    Besides their effects on radiative forcing, soot aerosols have been found to cause health effects, as they are carcinogenic. Diesel engines and incomplete biomass burning are the major emission sources of soot particles. Especially during winter, wood burning (WB) emissions from residential heating have been found to contribute significantly to the total carbonaceous material (CM). To investigate the contributions of fossil fuel (FF) and WB emissions, seven-wavelength aethalometers have been deployed in previous studies (Sandradewi et al. 2008, Favez et al. 2009). These studies exploited the stronger light absorption of WB aerosols in the blue and ultraviolet compared with that of aerosols from FF combustion. Linear regression modelling of CM against the light absorption coefficient of FF combustion aerosols in the infrared (950 nm) and the light absorption coefficient of WB aerosols in the blue (470 nm) was proposed for source apportionment. In this study we present long-term aethalometer measurements at two rural and one urban background measurement stations in Switzerland from 2008 - 2010. At these stations, organic (OC) and elemental carbon (EC) were also measured by thermochemical analysis, providing estimates of total CM. The linear regression modelling described above was applied to determine the contributions of FF and WB emissions to total CM. Sensitivity tests for different regression models and for varying light absorption exponents were performed. It was found that the regression modelling approach is only of limited suitability for long-term datasets because significant fractions of CM result from sources and processes other than FF and WB. Thus, in a different approach, we focused on black carbon (BC). The contributions of WB and FF to BC were directly determined from the absorption coefficients of FF and WB aerosols, which were calculated using absorption exponents taken from the literature. First results show that in winter the
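The two-wavelength apportionment idea described above can be sketched as a 2x2 linear system: total absorption at each wavelength is the sum of an FF and a WB part, each scaling with wavelength according to its own absorption Ångström exponent. The exponents below are illustrative placeholders, not the values used in the cited studies:

```python
import numpy as np

# Hypothetical absorption Angstrom exponents (alpha) for the two sources
ALPHA_FF, ALPHA_WB = 1.0, 2.0

def apportion_bc(b_470, b_950):
    """Split the total absorption coefficient at 950 nm into fossil-fuel
    and wood-burning parts, given measurements at 470 and 950 nm
    (the two-component 'aethalometer model')."""
    r_ff = (470.0 / 950.0) ** (-ALPHA_FF)  # FF absorption scaled to 470 nm
    r_wb = (470.0 / 950.0) ** (-ALPHA_WB)  # WB absorption scaled to 470 nm
    # Solve:  b_950 = b_ff + b_wb
    #         b_470 = r_ff * b_ff + r_wb * b_wb
    A = np.array([[1.0, 1.0], [r_ff, r_wb]])
    b_ff, b_wb = np.linalg.solve(A, np.array([b_950, b_470]))
    return b_ff, b_wb
```

Because WB aerosol absorbs disproportionately in the blue, an enhanced 470 nm signal relative to 950 nm is attributed to wood burning; the sensitivity of the split to the assumed exponents is exactly what the study's sensitivity tests probe.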

  12. Accident Source Terms for Pressurized Water Reactors with High-Burnup Cores Calculated using MELCOR 1.8.5.

    Energy Technology Data Exchange (ETDEWEB)

    Gauntt, Randall O.; Goldmann, Andrew; Kalinich, Donald A.; Powers, Dana A.

    2016-12-01

    In this study, risk-significant pressurized-water reactor severe accident sequences are examined using MELCOR 1.8.5 to explore the range of fission product releases to the reactor containment building. Advances in the understanding of fission product release and transport behavior and severe accident progression are used to render best estimate analyses of selected accident sequences. Particular emphasis is placed on estimating the effects of high fuel burnup in contrast with low burnup on fission product releases to the containment. Supporting this emphasis, recent data available on fission product release from high-burnup (HBU) fuel from the French VERCOR project are used in this study. The results of these analyses are treated as samples from a population of accident sequences in order to employ approximate order statistics characterization of the results. These trends and tendencies are then compared to the NUREG-1465 alternative source term prescription used today for regulatory applications. In general, greater differences are observed between the state-of-the-art calculations for either HBU or low-burnup (LBU) fuel and the NUREG-1465 containment release fractions than exist between HBU and LBU release fractions. Current analyses suggest that retention of fission products within the vessel and the reactor coolant system (RCS) are greater than contemplated in the NUREG-1465 prescription, and that, overall, release fractions to the containment are therefore lower across the board in the present analyses than suggested in NUREG-1465. The decreased volatility of Cs{sub 2}MoO{sub 4} compared to CsI or CsOH increases the predicted RCS retention of cesium, and as a result, cesium and iodine do not follow identical behaviors with respect to distribution among vessel, RCS, and containment.
With respect to the regulatory alternative source term, greater differences are observed between the NUREG-1465 prescription and both HBU and LBU predictions than exist between HBU and LBU predictions.

  13. Generalisation benefits of output gating in a model of prefrontal cortex

    Science.gov (United States)

    Kriete, Trent; Noelle, David C.

    2011-06-01

    The prefrontal cortex (PFC) plays a central role in flexible cognitive control, including the suppression of habitual responding in favour of situation-appropriate behaviours that can be quite novel. PFC provides a kind of working memory, maintaining the rules, goals, and/or actions that are to control behaviour in the current context. For flexible control, these PFC representations must be sufficiently componential to support systematic generalisation to novel situations. The anatomical structure of PFC can be seen as implementing a componential 'slot-filler' structure, with different components encoded over isolated pools of neurons. Previous PFC models have highlighted the importance of a dynamic gating mechanism to selectively update individual 'slot' contents. In this article, we present simulation results that suggest that systematic generalisation also requires an 'output gating' mechanism that limits the influence of PFC on more posterior brain areas to reflect a small number of representational components at any one time.

  14. The use of oral fluralaner for the treatment of feline generalised demodicosis: a case report.

    Science.gov (United States)

    Matricoti, I; Maina, E

    2017-08-01

    There is little agreement on the most effective and safest treatment for feline demodicosis. Protocols generally consist of long-lasting therapy courses based on rinses, subcutaneous injections, oral drug administration or repeated spot-on formulation and the efficacy of most of these is poorly documented. Many of these products have also been associated with adverse effects and may be difficult to administer in cats, leading to poor owner compliance and treatment failure. This case report describes the successful use of fluralaner in treating a generalised form of demodicosis caused by Demodex cati in an adult cat that was probably triggered by chronic glucocorticoid administration. After a single oral dose of 28 mg/kg fluralaner, negative skin scrapings were obtained within one month and clinical cure within two months. No side effects were observed. Larger studies are needed to evaluate the efficacy of fluralaner in treating feline generalised demodicosis. © 2017 British Small Animal Veterinary Association.

  15. A Note on the Properties of Generalised Separable Spatial Autoregressive Process

    Directory of Open Access Journals (Sweden)

    Mahendran Shitan

    2009-01-01

    Spatial modelling has its applications in many fields such as geology, agriculture, meteorology, and geography. In time series, a class of models known as the Generalised Autoregressive (GAR) model, which includes an index parameter δ, was introduced by Peiris (2003). It has been shown that the inclusion of this additional parameter aids in modelling and forecasting many real data sets. This paper studies the properties of a new class of spatial autoregressive processes of order 1 with an index. We call this the Generalised Separable Spatial Autoregressive (GENSSAR) model. The spectral density function (SDF), the autocovariance function (ACVF), and the autocorrelation function (ACF) are derived. The theoretical ACF and SDF plots are presented as three-dimensional figures.
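In the standard separable case (index δ = 1), the ACF of such a process factorises across the two lattice directions. A minimal sketch of that textbook factorised form, assuming AR(1) coefficients alpha and beta in the two directions (all names ours; the paper's generalised δ ≠ 1 case is more involved):

```python
def separable_acf(h1: int, h2: int, alpha: float, beta: float) -> float:
    """ACF of a separable spatial AR(1) process on a lattice:
    rho(h1, h2) = alpha^|h1| * beta^|h2|, for |alpha|, |beta| < 1."""
    return (alpha ** abs(h1)) * (beta ** abs(h2))

# The correlation decays geometrically and independently in each
# lattice direction, which is what "separable" means here.
print(round(separable_acf(2, 3, 0.5, 0.4), 6))  # 0.016
```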

  16. Knee function in 10-year-old children and adults with Generalised Joint Hypermobility

    DEFF Research Database (Denmark)

    Juul-Kristensen, Birgit; Hansen, Henrik; Simonsen, Erik B

    2012-01-01

    PURPOSE: Knee function is reduced in patients with Benign Joint Hypermobility Syndrome. The aim was to study knee function in children and adults with Generalised Joint Hypermobility (GJH) and non-GJH (NGJH). MATERIALS AND METHODS: In a matched comparative study, 39 children and 36 adults (mean age: children 10.2 years; adults 40.3 years) were included, comprising 19 children and 18 adults with GJH (Beighton ≥5/9 in children; ≥4/9 in adults), minimum one hypermobile knee, no knee pain (children), and 20 children and 18 adults with NGJH (Beighton...

  17. A generalised Dynamic Overflow Risk Assessment (DORA) for Real Time Control of urban drainage systems

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Grum, Morten

    2014-01-01

    An innovative and generalised approach to the integrated Real Time Control of urban drainage systems is presented. The Dynamic Overflow Risk Assessment (DORA) strategy aims to minimise the expected Combined Sewer Overflow (CSO) risk by considering (i) the water volume presently stored in the drainage network, (ii) the expected runoff volume (calculated by radar-based nowcast models) and (iii) the estimated uncertainty of the runoff forecasts. The inclusion of the runoff forecasts and their uncertainty contributed to further improving the performance of drainage systems. The results of this paper will contribute to the wider usage of global RTC methods in the management of urban drainage networks.

  18. A One-Class Classification Approach to Generalised Speaker Verification Spoofing Countermeasures using Local Binary Patterns

    OpenAIRE

    Alegre, Federico; Amehraye, Asmaa; Evans, Nicholas

    2013-01-01

    The vulnerability of automatic speaker verification systems to spoofing is now well accepted. While recent work has shown the potential to develop countermeasures capable of detecting spoofed speech signals, existing solutions typically function well only for specific attacks on which they are optimised. Since the exact nature of spoofing attacks can never be known in practice, there is thus a need for generalised countermeasures which can detect previously unseen spoofing attacks...

  19. A Baecklund transformation and the inverse scattering transform method for the generalised Vakhnenko equation

    CERN Document Server

    Vakhnenko, V O; Morrison, A J

    2003-01-01

    A Baecklund transformation both in bilinear and in ordinary form for the transformed generalised Vakhnenko equation (GVE) is derived. It is shown that the equation has an infinite sequence of conservation laws. An inverse scattering problem is formulated; it has a third-order eigenvalue problem. A procedure for finding the exact N-soliton solution to the GVE via the inverse scattering method is described. The procedure is illustrated by considering the cases N=1 and 2.

  20. Aspects of string theory compactifications. D-brane statistics and generalised geometry

    International Nuclear Information System (INIS)

    Gmeiner, F.

    2006-01-01

    In this thesis we investigate two different aspects of string theory compactifications. The first part deals with the issue of the huge amount of possible string vacua, known as the landscape. Concretely we investigate a specific well defined subset of type II orientifold compactifications. We develop the necessary tools to construct a very large set of consistent models and investigate their gauge sector on a statistical basis. In particular we analyse the frequency distributions of gauge groups and the possible amount of chiral matter for compactifications to six and four dimensions. In the phenomenologically relevant case of four-dimensional compactifications, special attention is paid to solutions with gauge groups that include those of the standard model, as well as Pati-Salam, SU(5) and flipped SU(5) models. Additionally we investigate the frequency distribution of coupling constants and correlations between the observables in the gauge sector. These results are compared with a recent study of Gepner models. Moreover, we elaborate on questions concerning the finiteness of the number of solutions and the computational complexity of the algorithm. In the second part of this thesis we consider a new mathematical framework, called generalised geometry, to describe the six-manifolds used in string theory compactifications. In particular, the formulation of T-duality and mirror symmetry for nonlinear topological sigma models is investigated. Therefore we provide a reformulation and extension of the known topological A- and B-models to the generalised framework. The action of mirror symmetry on topological D-branes in this setup is presented and the transformation of the boundary conditions is analysed. To extend the considerations to D-branes in type II string theory, we introduce the notion of generalised calibrations. We show that the known calibration conditions of supersymmetric branes in type IIA and IIB can be obtained as special cases. 
Finally we investigate...

  1. A generalised Dynamic Overflow Risk Assessment (DORA) for Real Time Control of urban drainage systems

    OpenAIRE

    Vezzaro, Luca; Grum, Morten

    2014-01-01

    An innovative and generalised approach to the integrated Real Time Control of urban drainage systems is presented. The Dynamic Overflow Risk Assessment (DORA) strategy aims to minimise the expected Combined Sewer Overflow (CSO) risk by considering (i) the water volume presently stored in the drainage network, (ii) the expected runoff volume (calculated by radar-based nowcast models) and – most important – (iii) the estimated uncertainty of the runoff forecasts. The inclusion of uncertainty al...
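The overflow-risk term that DORA minimises can be illustrated for a single storage basin when the forecast runoff is treated as Gaussian. This is an illustrative sketch under our own assumptions (function names and the Gaussian forecast-error model are ours), not the authors' implementation:

```python
import math

def expected_overflow(stored: float, forecast_mean: float,
                      forecast_std: float, capacity: float) -> float:
    """E[max(V + R - C, 0)] for runoff R ~ N(mu, sigma^2): the expected
    overflow volume given current storage V and basin capacity C."""
    mu = stored + forecast_mean - capacity        # mean exceedance
    if forecast_std == 0:
        return max(mu, 0.0)
    z = mu / forecast_std
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return mu * cdf + forecast_std * pdf          # standard N(mu, sigma) result

# Larger forecast uncertainty raises the expected overflow even when
# the mean forecast stays below capacity, which is why point (iii)
# matters to the control decision.
print(expected_overflow(stored=600, forecast_mean=300,
                        forecast_std=100, capacity=1000))
```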

  2. Aspects of string theory compactifications. D-brane statistics and generalised geometry

    Energy Technology Data Exchange (ETDEWEB)

    Gmeiner, F.

    2006-05-26

    In this thesis we investigate two different aspects of string theory compactifications. The first part deals with the issue of the huge amount of possible string vacua, known as the landscape. Concretely we investigate a specific well defined subset of type II orientifold compactifications. We develop the necessary tools to construct a very large set of consistent models and investigate their gauge sector on a statistical basis. In particular we analyse the frequency distributions of gauge groups and the possible amount of chiral matter for compactifications to six and four dimensions. In the phenomenologically relevant case of four-dimensional compactifications, special attention is paid to solutions with gauge groups that include those of the standard model, as well as Pati-Salam, SU(5) and flipped SU(5) models. Additionally we investigate the frequency distribution of coupling constants and correlations between the observables in the gauge sector. These results are compared with a recent study of Gepner models. Moreover, we elaborate on questions concerning the finiteness of the number of solutions and the computational complexity of the algorithm. In the second part of this thesis we consider a new mathematical framework, called generalised geometry, to describe the six-manifolds used in string theory compactifications. In particular, the formulation of T-duality and mirror symmetry for nonlinear topological sigma models is investigated. Therefore we provide a reformulation and extension of the known topological A- and B-models to the generalised framework. The action of mirror symmetry on topological D-branes in this setup is presented and the transformation of the boundary conditions is analysed. To extend the considerations to D-branes in type II string theory, we introduce the notion of generalised calibrations. We show that the known calibration conditions of supersymmetric branes in type IIA and IIB can be obtained as special cases. 
Finally we investigate...

  3. The linear stability of the Schwarzschild solution to gravitational perturbations in the generalised wave gauge

    OpenAIRE

    Johnson, Thomas

    2018-01-01

    In a recent seminal paper of Dafermos, Holzegel and Rodnianski, the linear stability of the Schwarzschild family of black hole solutions to the Einstein vacuum equations was established by imposing a double null gauge. In this paper we shall prove that the Schwarzschild family is linearly stable as solutions to the Einstein vacuum equations by imposing instead a generalised wave gauge: all sufficiently regular solutions to the system of equations that result from linearising the...

  4. Effect of lamotrigine on cerebral blood flow in patients with idiopathic generalised epilepsy

    Energy Technology Data Exchange (ETDEWEB)

    Joo, Eun Yeon [Ewha Womans University, Department of Neurology, College of Medicine, Seoul (Korea); Hong, Seung Bong; Tae, Woo Suk; Han, Sun Jung; Seo, Dae Won [Sungkyunkwan University School of Medicine, Department of Neurology, Samsung Medical Center and Center for Clinical Medicine, SBRI, Gangnam-Gu, Seoul (Korea); Lee, Kyung-Han [Sungkyunkwan University School of Medicine, Department of Nuclear Medicine, Samsung Medical Center and Center for Clinical Medicine, SBRI, Gangnam-Gu, Seoul (Korea); Lee, Mann Hyung [Catholic University of Daegu, College of Pharmacy, Gyeongbuk (Korea)

    2006-06-15

    The purpose of this study was to investigate the effects of the new anti-epileptic drug, lamotrigine, on cerebral blood flow by performing 99mTc-ethylcysteinate dimer (ECD) single-photon emission computed tomography (SPECT) before and after medication in patients with drug-naive idiopathic generalised epilepsy. Interictal 99mTc-ECD brain SPECT was performed before drug treatment started and then repeated after lamotrigine medication for 4-5 months in 30 patients with generalised epilepsy (M/F = 14/16, 19.3 ± 3.4 years). Seizure types were generalised tonic-clonic seizure in 23 patients and myoclonic seizures in seven. The mean lamotrigine dose used was 214.1 ± 29.1 mg/day. For SPM analysis, all SPECT images were spatially normalised to the standard SPECT template and then smoothed using a 12-mm full-width at half-maximum Gaussian kernel. The paired t test was used to compare pre- and post-lamotrigine SPECT images. SPM analysis of pre- and post-lamotrigine brain SPECT images showed decreased perfusion in bilateral dorsomedial nuclei of thalami, bilateral uncus, right amygdala, left subcallosal gyrus, right superior and inferior frontal gyri, right precentral gyrus, bilateral superior and inferior temporal gyri and brainstem (pons, medulla) after lamotrigine medication at a false discovery rate-corrected p<0.05. No brain region showed increased perfusion after lamotrigine administration. (orig.)

  5. Recent Fuzzy Generalisations of Rough Sets Theory: A Systematic Review and Methodological Critique of the Literature

    Directory of Open Access Journals (Sweden)

    Abbas Mardani

    2017-01-01

    Rough set theory has been used extensively in complexity science, cognitive science, and artificial intelligence, notably in expert systems, knowledge discovery, information systems, inductive reasoning, intelligent systems, data mining, pattern recognition, decision-making, and machine learning. Recently proposed rough set models have been developed by applying different fuzzy generalisations. Currently, there is no systematic literature review and classification of these new generalisations of rough set models. Therefore, in this review study, the attempt is made to provide a comprehensive systematic review of the methodologies and applications of recent generalisations discussed in the area of fuzzy-rough set theory. For this purpose, the Web of Science database was chosen to select the relevant papers. Accordingly, the systematic review and meta-analysis approach known as "PRISMA" was applied, and the selected articles were classified based on author and year of publication, author nationalities, application field, type of study, study category, study contribution, and journal in which the articles appeared. Based on the results of this review, we found that there are many challenging issues related to the different application areas of fuzzy-rough set theory which can motivate future research studies.

  6. Bi-orthogonality relations for fluid-filled elastic cylindrical shells: Theory, generalisations and application to construct tailored Green's matrices

    Science.gov (United States)

    Ledet, Lasse S.; Sorokin, Sergey V.

    2018-03-01

    The paper addresses the classical problem of time-harmonic forced vibrations of a fluid-filled cylindrical shell considered as a multi-modal waveguide carrying infinitely many waves. The forced vibration problem is solved using tailored Green's matrices formulated in terms of eigenfunction expansions. The formulation of Green's matrix is based on special (bi-)orthogonality relations between the eigenfunctions, which are derived here for the fluid-filled shell. Further, the relations are generalised to any multi-modal symmetric waveguide. Using the orthogonality relations the transcendental equation system is converted into algebraic modal equations that can be solved analytically. Upon formulation of Green's matrices the solution space is studied in terms of completeness and convergence (uniformity and rate). Special features and findings exposed only through this modal decomposition method are elaborated and the physical interpretation of the bi-orthogonality relation is discussed in relation to the total energy flow which leads to derivation of simplified equations for the energy flow components.

  7. A Generalised Assessment of Working Fluids and Radial Turbines for Non-Recuperated Subcritical Organic Rankine Cycles

    Directory of Open Access Journals (Sweden)

    Martin T. White

    2018-03-01

    The aim of this paper is to conduct a generalised assessment of both optimal working fluids and radial turbine designs for small-scale organic Rankine cycle (ORC) systems across a range of heat-source temperatures. The former has been achieved by coupling a thermodynamic model of subcritical, non-recuperated cycles with the Peng–Robinson equation of state, and optimising the working-fluid and cycle parameters for heat-source temperatures ranging between 80 °C and 360 °C. The critical temperature of the working fluid is found to be an important parameter governing working-fluid selection. Moreover, a linear correlation between heat-source temperature and the optimal critical temperature that achieves maximum power output has been found for heat-source temperatures below 300 °C (T_cr = 0.830 T_hi + 41.27). This correlation has been validated against cycle calculations completed for nine predefined working fluids using both the Peng–Robinson equation of state and the REFPROP program. Ultimately, this simple correlation can be used to identify working-fluid candidates for a specific heat-source temperature. In the second half of this paper, the effect of the heat-source temperature on the optimal design of a radial-inflow turbine rotor for a 25 kW subcritical ORC system has been studied. As the heat-source temperature increases, the optimal blade-loading coefficient increases, whilst the optimal flow coefficient reduces. Furthermore, passage losses are dominant in turbines intended for low-temperature applications. However, at higher heat-source temperatures, clearance losses become more dominant owing to the reduced blade heights. This information can be used to identify the most direct route to efficiency improvements in these machines. Finally, it is observed that the transition from a conventional converging stator to a converging-diverging stator occurs at heat-source temperatures of approximately 165 °C, whilst radially...
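The reported linear fit can be applied directly to screen working-fluid candidates for a given heat source. A small sketch (function name and range guard are ours; temperatures in °C):

```python
def optimal_critical_temperature(t_heat_source: float) -> float:
    """Optimal working-fluid critical temperature (deg C) for maximum
    power output, using the paper's linear fit T_cr = 0.830*T_hi + 41.27,
    reported for heat-source temperatures below 300 deg C."""
    if not 80 <= t_heat_source < 300:
        raise ValueError("fit reported for 80-300 deg C heat sources")
    return 0.830 * t_heat_source + 41.27

# A 150 deg C heat source points to fluids with T_cr near 166 deg C.
print(round(optimal_critical_temperature(150), 2))  # 165.77
```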

  8. Long-term effects of total and source-specific particulate air pollution on incident cardiovascular disease in Gothenburg, Sweden.

    Science.gov (United States)

    Stockfelt, Leo; Andersson, Eva M; Molnár, Peter; Gidhagen, Lars; Segersson, David; Rosengren, Annika; Barregard, Lars; Sallsten, Gerd

    2017-10-01

    Long-term exposure to air pollution increases cardiopulmonary morbidity and mortality, but it is not clear which components of air pollution are the most harmful, nor which time window of exposure is most relevant. Further studies at low exposure levels have also been called for. We analyzed two Swedish cohorts to investigate the effects of total and source-specific particulate matter (PM) on incident cardiovascular disease for different time windows of exposure. Two cohorts initially recruited to study predictors of cardiovascular disease (the PPS cohort and the GOT-MONICA cohort) were followed from 1990 to 2011. We collected data on residential addresses and assigned each individual yearly total and source-specific PM and nitrogen oxides (NOx) exposures based on dispersion models. Using multivariable Cox regression models with time-dependent exposure, we studied the association between three different time windows (lag 0, lag 1-5, and exposure at study start) of residential PM and NOx exposure, and incidence of ischemic heart disease, stroke, heart failure and atrial fibrillation. During the study period, there were 2266 new-onset cases of ischemic heart disease, 1391 of stroke, 925 of heart failure and 1712 of atrial fibrillation. The majority of cases were in the PPS cohort, where participants were older. Exposure levels during the study period were moderate (median: 13 µg/m³ for PM10 and 9 µg/m³ for PM2.5), and similar in both cohorts. Road traffic and residential heating were the largest local sources of PM air pollution, and long distance transportation the largest PM source in total. In the PPS cohort, there were positive associations between PM in the last five years and both ischemic heart disease (HR: 1.24 [95% CI: 0.98-1.59] per 10 µg/m³ of PM10, and HR: 1.38 [95% CI: 1.08-1.77] per 5 µg/m³ of PM2.5) and heart failure. In the GOT-MONICA cohort, there were positive but generally non-significant associations between PM and stroke (HR: 1...
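Hazard ratios from a log-linear Cox model reported per one exposure increment can be rescaled to another increment by exponentiation, which puts the per-10 µg/m³ PM10 and per-5 µg/m³ PM2.5 figures above on a common footing. A sketch (function name ours):

```python
def rescale_hazard_ratio(hr: float, from_increment: float,
                         to_increment: float) -> float:
    """Rescale a hazard ratio from a log-linear Cox model to a new
    exposure increment: HR_new = HR ** (to_increment / from_increment),
    since log(HR) is proportional to the exposure increment."""
    return hr ** (to_increment / from_increment)

# HR 1.24 per 10 ug/m3 of PM10, expressed per 5 ug/m3:
print(round(rescale_hazard_ratio(1.24, 10, 5), 3))  # 1.114
```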

  9. Source term estimation and the isotopic ratio of radioactive material released from the WIPP repository in New Mexico, USA

    International Nuclear Information System (INIS)

    Thakur, P.

    2016-01-01

    After almost 15 years of operations, the Waste Isolation Pilot Plant (WIPP) had one of its waste drums breach underground as a result of a runaway chemical reaction in the waste it contained. This incident occurred on February 14, 2014. Moderate levels of radioactivity were released into the underground air. A small portion of the contaminated underground air also escaped to the surface through the ventilation system and was detected approximately 1 km away from the facility. According to the source term estimation, the actual amount of radioactivity released from the WIPP site was less than 1.5 mCi. The highest activity detected on the surface was 115.2 µBq/m³ of 241Am and 10.2 µBq/m³ of 239+240Pu at a sampling station located 91 m away from the underground air exhaust point, and 81.4 µBq/m³ of 241Am and 5.8 µBq/m³ of 239+240Pu at a monitoring station located approximately 1 km northwest of the WIPP facility. The dominant radionuclides released were americium and plutonium, in a ratio that matches the content of the breached drum. Air monitoring across the WIPP site intensified following the first reports of radiation detection underground to determine the extent of impact to WIPP personnel, the public, and the environment. In this paper, the early stage monitoring data collected by an independent monitoring program conducted by the Carlsbad Environmental Monitoring & Research Center (CEMRC) and an oversight monitoring program conducted by the WIPP's management and operating contractor, the Nuclear Waste Partnership (NWP) LLC, were utilized to estimate the actual amount of radioactivity released from the WIPP underground. The Am and Pu isotope ratios were measured and used to support the hypothesis that the release came from one drum identified as having breached that represents a specific waste stream with this radionuclide ratio in its inventory. This failed drum underwent a heat and gas producing reaction that overpowered its vent and...
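The attribution argument compares the measured 241Am to 239+240Pu activity ratio at each station with the breached drum's inventory ratio. A minimal sketch using the air concentrations quoted above (function name ours):

```python
def activity_ratio(am241: float, pu239_240: float) -> float:
    """241Am / 239+240Pu activity ratio from paired air concentrations
    (any consistent units, e.g. micro-Bq per cubic metre)."""
    return am241 / pu239_240

# Station 91 m from the exhaust point vs. station ~1 km northwest:
near = activity_ratio(115.2, 10.2)
far = activity_ratio(81.4, 5.8)
print(round(near, 1), round(far, 1))  # 11.3 14.0
```

A roughly consistent ratio at both stations is what one would expect if a single waste stream dominated the release.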

  10. Determination of the in-containment source term for a Large-Break Loss of Coolant Accident

    International Nuclear Information System (INIS)

    2001-04-01

    This is the report of a project that focused on one of the most important design basis accidents: the Large Break Loss Of Coolant Accident (LBLOCA) (for pressurised water reactors). The first step in the calculation of the radiological consequences of this accident is the determination of the source term inside the containment. This work deals with this part of the calculation of the LBLOCA radiological consequences, for which a previous benchmark (1988) showed wide variations in the licensing practices adopted by European countries. The calculation of this source term may naturally be split into several steps (see chapter II), corresponding to several physical stages in the release of fission products: fraction of core failure, release from the damaged fuel, airborne part of the release and the release into the reactor coolant system and the sumps, chemical behaviour of iodine in the aqueous and gas phases, and natural and spray removal in the containment atmosphere. A chapter is devoted to each of these topics. In addition, two other chapters deal with the basic assumptions to define the accidental sequence and the nuclides to be considered when computing doses associated with the LBLOCA. The report describes where there is agreement between the partner organisations and where there are still differences in approach. For example, there is agreement concerning the percentage of failed fuel which could be used in future licensing assessments (however, this subject is still under discussion in France, and a lower value is conceivable). For existing plants, AVN (Belgium) wishes to keep the initial licensing assumptions. For the release from damaged fuel, there is no complete agreement: AVN (Belgium) wishes to maintain its present approach. IPSN (France), GRS (Germany) and NNC (UK) prefer to use their own methodologies, which result in values slightly different from those proposed for a common position.
There are presently no recommendations on the release of fuel particulates...

  11. Assessment of nuclear energy cost competitiveness against alternative energy sources in Romania envisaging the long-term national energy sustainability

    International Nuclear Information System (INIS)

    Margeanu, C. A.

    2016-01-01

    The paper includes some of the results obtained by RATEN ICN Pitesti experts in the IAEA's Collaborative Project INPRO-SYNERGIES. The case study proposed to evaluate and analyse the development of nuclear capacity and the increase of its share in the national energy sector, envisaging long-term national and regional energy sustainability by keeping collaboration options open for the future while bringing solutions to short/medium-term challenges. The following technologies, considered as future competing technologies for electric energy generation in Romania, were selected: nuclear technology (represented by PHWR CANDU Units 3 and 4 - CANDU-new, an advanced HWR - Adv. HWR, and an advanced PWR - Adv. PWR) and, as alternative energy sources, classical technology (represented by a coal-fired power plant using lignite fuel with carbon capture - Coal-new, and a gas-fired power plant operating on combined cycle with carbon capture - Gas-new). The study included assessment of specific economic indicators, with sensitivity analyses performed on the Levelised Unit Energy Cost (LUEC) variation due to different perturbations (e.g. discount rate, overnight costs, etc.). Robustness indices (RI) of the LUEC were also calculated by considering simultaneous variation of input parameters for the considered power plants. The economic analyses were performed using the IAEA's NEST program. The study results confirmed that in Romania, under the specific national conditions defined, electricity produced by nuclear power plants is cost competitive against electricity from coal- and gas-fired power plants. The highest impact of the considered perturbations on LUEC was observed for capital-intensive technologies (nuclear) compared with the classic power plants, especially for discount rate changes. (authors)
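A levelised unit energy cost of the kind compared here is the ratio of discounted lifetime costs to discounted lifetime generation, which is also why capital-intensive plants react most strongly to the discount rate. An illustrative sketch, not the IAEA NEST implementation, with all plant figures hypothetical:

```python
def luec(capital: float, annual_om: float, annual_fuel: float,
         annual_energy_mwh: float, lifetime_years: int,
         discount_rate: float) -> float:
    """Levelised Unit Energy Cost ($/MWh) = discounted lifetime costs
    divided by discounted lifetime energy (capital spent at year 0)."""
    disc_costs = capital
    disc_energy = 0.0
    for year in range(1, lifetime_years + 1):
        df = (1 + discount_rate) ** -year
        disc_costs += (annual_om + annual_fuel) * df
        disc_energy += annual_energy_mwh * df
    return disc_costs / disc_energy

# Same hypothetical plant at two discount rates: the cost gap comes
# almost entirely from the up-front capital term.
low = luec(4000e6, 80e6, 10e6, 7e6, 40, 0.03)
high = luec(4000e6, 80e6, 10e6, 7e6, 40, 0.10)
print(round(low, 1), round(high, 1))
```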

  12. Cost of presumptive source term Remedial Actions Laboratory for energy-related health research, University of California, Davis

    International Nuclear Information System (INIS)

    Last, G.V.; Bagaasen, L.M.; Josephson, G.B.; Lanigan, D.C.; Liikala, T.L.; Newcomer, D.R.; Pearson, A.W.; Teel, S.S.

    1995-12-01

    A Remedial Investigation/Feasibility Study (RI/FS) is in progress at the Laboratory for Energy Related Health Research (LEHR) at the University of California, Davis. The purpose of the RI/FS is to gather sufficient information to support an informed risk management decision regarding the most appropriate remedial actions for impacted areas of the facility. In an effort to expedite remediation of the LEHR facility, the remedial project managers requested a more detailed evaluation of a selected set of remedial actions. In particular, they requested information on both characterization and remedial action costs. The US Department of Energy -- Oakland Office requested the assistance of the Pacific Northwest National Laboratory to prepare order-of-magnitude cost estimates for presumptive remedial actions being considered for the five source term operable units. The cost estimates presented in this report include characterization costs, capital costs, and annual operation and maintenance (O&M) costs. These cost estimates are intended to aid planning and direction of future environmental remediation efforts

  13. Development of a comprehensive source term model for the Subsurface Disposal Area at the Idaho National Engineering and Environmental Laboratory

    International Nuclear Information System (INIS)

    1997-01-01

    The first detailed comprehensive simulation study to evaluate fate and transport of wastes disposed in the Subsurface Disposal Area (SDA), at the Radioactive Waste Management Complex (RWMC), Idaho National Engineering and Environmental Laboratory (INEEL) has recently been conducted. One of the most crucial parts of this modeling was the source term or release model. The current study used information collected over the last five years defining contaminant specific information including: the amount disposed, the waste form (physical and chemical properties) and the type of container used for each contaminant disposed. This information was used to simulate the release of contaminants disposed in the shallow subsurface at the SDA. The DUST-MS model was used to simulate the release. Modifications were made to allow the yearly disposal information to be incorporated. The modeling includes unique container and release rate information for each of the 42 years of disposal. The results from this simulation effort are used for both a groundwater and a biotic uptake evaluation. As part of this modeling exercise, inadequacies in the available data relating to the release of contaminants have been identified. The results from this modeling study have been used to guide additional data collection activities at the SDA for purposes of increasing confidence in the appropriateness of model predictions

  14. Using the generalised invariant formalism: a class of conformally flat pure radiation metrics with a negative cosmological constant

    International Nuclear Information System (INIS)

    Edgar, S Brian; Ramos, M P Machado

    2007-01-01

    We demonstrate an integration procedure for the generalised invariant formalism by obtaining a subclass of conformally flat pure radiation spacetimes with a negative cosmological constant. The method used is a development of the methods used earlier for pure radiation spacetimes of Petrov types O and N, respectively. This subclass of spacetimes turns out to have one degree of isotropy freedom, so in this paper we have extended the integration procedure for the generalised invariant formalism to spacetimes with isotropy freedom.

  15. Using the generalised invariant formalism: a class of conformally flat pure radiation metrics with a negative cosmological constant

    Energy Technology Data Exchange (ETDEWEB)

    Edgar, S Brian [Department of Mathematics, Linköpings Universitet, Linköping, S-581 83 (Sweden)]; Ramos, M P Machado [Departamento de Matemática para a Ciência e Tecnologia, Universidade do Minho, Azurém, 4800-058 Guimarães (Portugal)]

    2007-05-15

    We demonstrate an integration procedure for the generalised invariant formalism by obtaining a subclass of conformally flat pure radiation spacetimes with a negative cosmological constant. The method used is a development of the methods used earlier for pure radiation spacetimes of Petrov types O and N, respectively. This subclass of spacetimes turns out to have one degree of isotropy freedom, so in this paper we have extended the integration procedure for the generalised invariant formalism to spacetimes with isotropy freedom.

  16. Evaluation of the Non-Transient Hydrologic Source Term from the CAMBRIC Underground Nuclear Test in Frenchman Flat, Nevada Test Site

    International Nuclear Information System (INIS)

    Tompson, A B; Maxwell, R M; Carle, S F; Zavarin, M; Pawloski, G A.; Shumaker, D E

    2005-01-01

    Hydrologic Source Term (HST) calculations completed in 1998 at the CAMBRIC underground nuclear test site were LLNL's first attempt to simulate a hydrologic source term at the NTS by linking groundwater flow and transport modeling with geochemical modeling (Tompson et al., 1999). Significant effort was applied to develop a framework that modeled the flow regime in detail and captured all appropriate chemical processes that occurred over time. However, portions of the calculations were simplified because of data limitations and a perceived need for generalization of the results. For example: (1) Transient effects arising from 16 years of pumping at the site for a radionuclide migration study were not incorporated. (2) Radionuclide fluxes across the water table, as derived from infiltration from a ditch to which pumping effluent was discharged, were not addressed. (3) Hydrothermal effects arising from residual heat of the test were not considered. (4) Background data on the ambient groundwater flow direction were uncertain and not represented. (5) Unclassified information on the Radiologic Source Term (RST) inventory, as tabulated recently by Bowen et al. (2001), was unavailable; instead, only a limited set of derived data were available (see Tompson et al., 1999). (6) Only a small number of radionuclides and geochemical reactions were incorporated in the work. (7) Data and interpretation of the RNM-2S multiple well aquifer test (MWAT) were not available. As a result, the current Transient CAMBRIC Hydrologic Source Term project was initiated as part of a broader Phase 2 Frenchman Flat CAU flow and transport modeling effort. The source term will be calculated under two scenarios: (1) A more specific representation of the transient flow and radionuclide release behavior at the site, reflecting the influence of the background hydraulic gradient, residual test heat, pumping experiment, and ditch recharge, and taking into account improved data sources and modeling

  17. Numerical simulation of flow induced by a pitched blade turbine. Comparison of the sliding mesh technique and an averaged source term method

    Energy Technology Data Exchange (ETDEWEB)

    Majander, E.O.J.; Manninen, M.T. [VTT Energy, Espoo (Finland)

    1996-12-31

    The flow induced by a pitched blade turbine was simulated using the sliding mesh technique. The detailed geometry of the turbine was modelled in a computational mesh rotating with the turbine and the geometry of the reactor including baffles was modelled in a stationary co-ordinate system. Effects of grid density were investigated. Turbulence was modelled by using the standard k-{epsilon} model. Results were compared to experimental observations. Velocity components were found to be in good agreement with the measured values throughout the tank. Averaged source terms were calculated from the sliding mesh simulations in order to investigate the reliability of the source term approach. The flow field in the tank was then simulated in a simple grid using these source terms. Agreement with the results of the sliding mesh simulations was good. Commercial CFD-code FLUENT was used in all simulations. (author)

  18. Source term assessment, containment atmosphere control systems, and accident consequences. Report to CSNI by an OECD/NEA Group of experts

    International Nuclear Information System (INIS)

    1987-04-01

    CSNI Report 135 summarizes the results of the work performed by CSNI's Principal Working Group No. 4 on the Source Term and Environmental Consequences (PWG4) during the period extending from 1983 to 1986. This document contains the latest information on some important topics relating to source terms, accident consequence assessment, and containment atmospheric control systems. It consists of five parts: (1) a Foreword and Executive Summary prepared by PWG4's Chairman; (2) a Report on the Technical Status of the Source Term; (3) a Report on the Technical Status of Filtration and Containment Atmosphere Control Systems for Nuclear Reactors in the Event of a Severe Accident; (4) a Report on the Technical Status of Reactor Accident Consequence Assessment; (5) a list of members of PWG4

  19. A study on Prediction of Radioactive Source-term from the Decommissioning of Domestic NPPs by using CRUDTRAN Code

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jong Soon; Lee, Sang Heon; Cho, Hoon Jo [Department of Nuclear Engineering Chosun University, Gwangju (Korea, Republic of)

    2016-10-15

    For the study, the behavior mechanism of corrosion products in the primary system of Kori No. 1 was analyzed, and the volume of activated corrosion products in the primary system was assessed from domestic plant data using the CRUDTRAN code. The resulting prediction of the radioactive nuclide inventory in the primary system is expected to serve as baseline data for estimating the volume of radioactive wastes when decommissioning nuclear power plants, and as an important criterion for classifying the level of those wastes when computing their quantity. It is also expected to help reduce the radiation exposure of workers performing maintenance and repairs in high-radiation areas and to inform the selection of decontamination and decommissioning processes for the primary system. In future research, it is planned to conduct the source term assessment for other NPP types, such as CANDU and OPR-1000, in addition to Westinghouse-type nuclear plants.

  20. Alternate source term models for Yucca Mountain performance assessment based on natural analog data and secondary mineral solubility

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, W.M.; Codell, R.B.

    1999-07-01

    Performance assessment calculations for the proposed high level radioactive waste repository at Yucca Mountain, Nevada, were conducted using the Nuclear Regulatory Commission Total-System Performance Assessment (TPA 3.2) code to test conceptual models and parameter values for the source term based on data from the Peña Blanca, Mexico, natural analog site and based on a model for coprecipitation and solubility of secondary schoepite. In previous studies the value for the maximum constant oxidative alteration rate of uraninite at the Nopal I uranium body at Peña Blanca was estimated. Scaling this rate to the mass of uranium for the proposed Yucca Mountain repository yields an oxidative alteration rate of 22 kg/y, which was assumed to be an upper limit on the release rate from the proposed repository. A second model was developed assuming releases of radionuclides are based on the solubility of secondary schoepite as a function of temperature and solution chemistry. Releases of uranium are given by the product of uranium concentrations at equilibrium with schoepite and the flow of water through the waste packages. For both models, radionuclides other than uranium and those in the cladding and gap fraction were modeled to be released at a rate proportional to the uranium release rate, with additional elemental solubility limits applied. Performance assessment results using the Peña Blanca oxidation rate and schoepite solubility models for Yucca Mountain were compared to the TPA 3.2 base case model, in which release was based on laboratory studies of spent fuel dissolution, cladding and gap release, and solubility limits. Doses calculated using the release rate based on natural analog data and the schoepite solubility models were smaller than doses generated using the base case model.
These results provide a degree of confidence in safety predictions using the base case model and an indication of how conservatism in the base case model may be reduced in future analyses.
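The schoepite-solubility model in this record gives the uranium release as the product of the equilibrium concentration and the water flow through the waste packages. A minimal sketch of that arithmetic; the concentration and flow values are assumed for illustration, not taken from the TPA 3.2 analysis:

```python
U_MOLAR_MASS_KG_PER_MOL = 0.238  # kg of uranium per mole

def uranium_release_rate(conc_mol_per_L, flow_L_per_y):
    """Solubility-limited release [kg/y]: equilibrium U concentration
    (set by schoepite solubility) times water flux through the packages."""
    return conc_mol_per_L * flow_L_per_y * U_MOLAR_MASS_KG_PER_MOL

# Hypothetical inputs: 1e-5 mol/L schoepite-limited U, 1e6 L/y total flow.
release = uranium_release_rate(1e-5, 1e6)  # ~2.38 kg/y
```

Under these assumed inputs the solubility-limited release is roughly an order of magnitude below the 22 kg/y oxidative-alteration bound quoted in the abstract.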

  1. Development of a generalised equivalent estimation approach for multi-axle vehicle handling dynamics

    Science.gov (United States)

    Ding, Jinquan; Guo, Konghui

    2016-01-01

    This paper develops the 2M equivalent approach to analyse the effects of both vehicle body roll and n-axle handling on vehicle dynamics. The 1M equivalent vehicle 2DOF equation including an equivalent roll effect was derived from the conventional two-axle 3DOF vehicle model, and the 1M equivalent dynamics concepts were used to evaluate the steady-state steering, frequency characteristics, and root locus of the two-axle vehicle with only the effect of body roll. This 1M equivalent approach is extended to a three-axle 3DOF model to derive similar 1M equivalent mathematical identities including an equivalent roll effect. The 1M equivalent wheelbases and stability factor with the effect of the third axle or body roll, and the 2M equivalent wheelbase and stability factor including both the effect of body roll and third-axle handling, were derived to evaluate the steady-state steering, frequency characteristics, and root locus of the three-axle vehicle. By using the recursive method, the generalised 1M equivalent wheelbase and stability factor with the effect of n-axle handling, and the 2M equivalent generalised wheelbase and stability factor including both the effect of body roll and n-axle handling, were derived to evaluate the steady-state steering, frequency characteristics, and root locus of the n-axle vehicle. The 2M equivalent approach and the developed generalised handling concepts were validated and could serve as an important tool for estimating the effects of both vehicle body roll and n-axle handling on multi-axle vehicle dynamics.
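The equivalent quantities in this record generalise the two-axle definitions of wheelbase and stability factor. For context, a minimal sketch of the conventional 2DOF bicycle-model stability factor and steady-state yaw-rate gain; these are textbook forms with illustrative parameters, not formulas reproduced from the paper itself:

```python
def stability_factor(m, L, a, b, Cf, Cr):
    """Two-axle 2DOF stability factor (positive = understeer).

    m    : vehicle mass [kg]
    a, b : CG-to-front / CG-to-rear axle distances [m], with L = a + b
    Cf,Cr: front/rear axle cornering stiffness [N/rad], taken positive
    """
    return (m / L**2) * (b / Cf - a / Cr)

def yaw_rate_gain(u, L, K):
    """Steady-state yaw-rate response r/delta at forward speed u [m/s]."""
    return (u / L) / (1.0 + K * u**2)

# Illustrative parameters chosen to give neutral steer (b/Cf == a/Cr):
K = stability_factor(m=1500, L=2.6, a=1.2, b=1.4, Cf=70000, Cr=60000)
gain = yaw_rate_gain(20.0, 2.6, K)   # neutral steer: gain == u / L
```

For a neutral-steer vehicle the stability factor vanishes and the yaw-rate gain grows linearly with speed as u/L, which is the baseline against which the paper's equivalent wheelbases are defined.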

  2. Efficacy of oral afoxolaner for the treatment of canine generalised demodicosis

    OpenAIRE

    Beugnet, Frédéric; Halos, Lénaïg; Larsen, Diane; de Vos, Christa

    2016-01-01

    The efficacy of oral treatment with a chewable tablet containing afoxolaner 2.27% w/w (NexGard®, Merial) administered orally was assessed in eight dogs diagnosed with generalised demodicosis and compared with efficacy in eight dogs under treatment with a topical combination of imidacloprid/moxidectin (Advocate®, Bayer). Afoxolaner was administered at the recommended dose (at least 2.5 mg/kg) on Days 0, 14, 28 and 56. The topical combination of imidacloprid/moxidectin was given at the same int...

  3. Generalised morphoea with lichen sclerosus et atrophicus and unusual bone changes

    Directory of Open Access Journals (Sweden)

    Prasad P

    1995-01-01

    Full Text Available A 26-year-old male patient presented with multiple plaques on the limbs and trunk suggestive of morphoea. He also exhibited multiple, small, atrophic, hypopigmented macules on the left side of the trunk, the histopathology of which was consistent with lichen sclerosus et atrophicus (LSA. The patient developed large ulcers on the left leg and foot, and contractures with flexion deformity of the left ring and little fingers. This combination of generalised morphoea with LSA and unusual osteolytic bone changes is uncommon.

  4. Yangian and SUSY symmetry of high spin parton splitting amplitudes in generalised Yang-Mills theory

    Science.gov (United States)

    Kirschner, Roland; Savvidy, George

    2017-07-01

    We have calculated the high spin parton splitting amplitudes postulating the Yangian symmetry of the scattering amplitudes for tensor gluons. The resulting splitting amplitudes coincide with the earlier calculations, which were based on the BCFW recursion relations. The resulting formula unifies all known splitting probabilities found earlier in gauge field theories and describes splitting probabilities for integer and half-integer spin particles. We also checked that the splitting probabilities fulfil the generalised Kounnas-Ross 𝒩 = 1 supersymmetry relations, hinting that the underlying theory can be formulated in an explicitly supersymmetric manner.

  5. Generalised universality of gauge thresholds in heterotic vacua with and without supersymmetry

    CERN Document Server

    Angelantonj, Carlo; Tsulaia, Mirian

    2015-01-01

    We study one-loop quantum corrections to gauge couplings in heterotic vacua with spontaneous supersymmetry breaking. Although in non-supersymmetric constructions these corrections are not protected and are typically model dependent, we show how a universal behaviour of threshold differences, typical of supersymmetric vacua, may still persist. We formulate specific conditions on the way supersymmetry should be broken for this to occur. Our analysis implies a generalised notion of threshold universality even in the case of unbroken supersymmetry, whenever extra charged massless states appear at enhancement points in the bulk of moduli space. Several examples with universality, including non-supersymmetric chiral models in four dimensions, are presented.

  6. Analysis of the diversity of substrate utilisation of soil bacteria exposed to Cd and earthworm activity using generalised additive models.

    Directory of Open Access Journals (Sweden)

    Selene Muñiz

    Full Text Available Biolog EcoPlates™ can be used to measure the carbon substrate utilisation patterns of microbial communities. This method results in a community-level physiological profile (CLPP), which yields a very large amount of data that may be difficult to interpret. In this work, we explore a combination of statistical techniques (particularly the use of generalised additive models [GAMs]) to improve the exploitation of CLPP data. The strength of GAMs lies in their ability to address highly non-linear relationships between the response and the set of explanatory variables. We studied the impact of earthworms (Aporrectodea caliginosa Savigny, 1826) and cadmium (Cd) on the CLPP of soil bacteria. The results indicated that both Cd and earthworms modified the CLPP. GAMs were used to assess time-course changes in the diversity of substrate utilisation (DSU) using the Shannon-Wiener index. GAMs revealed significant differences for all treatments compared to the control (S). The Cd-exposed microbial community presented very high metabolic capacities on a few substrata, resulting in an initial acute decrease of DSU (i.e. intense utilisation of a few carbon substrata). After 54 h, and over the next 43 h, the increase of the DSU suggests that other, less dominant taxa reached high numbers in the wells containing sources that are less suitable for the Cd-tolerant taxa. Earthworms were a much more determining factor than Cd in explaining time-course changes in DSU. Accordingly, Ew and EwCd soils presented similar trends, regardless of the presence of Cd. Moreover, both treatments presented similar numbers of bacteria, higher than in Cd-treated soils. This experimental approach, based on the use of DSU and GAMs, allowed for a global and statistically relevant interpretation of the changes in carbon source utilisation, highlighting the key role of earthworms in the protection of microbial communities against Cd.
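The diversity of substrate utilisation (DSU) in this record is a Shannon-Wiener index computed over the EcoPlate well responses. A minimal sketch of that index; the well absorbances below are hypothetical, and the GAM fitting step itself is not shown:

```python
import math

def shannon_wiener(absorbances):
    """Shannon-Wiener diversity of substrate utilisation: each well's
    colour development is treated as that substrate's share of activity."""
    total = sum(absorbances)
    shares = [a / total for a in absorbances if a > 0]
    return -sum(p * math.log(p) for p in shares)

# Even use of 4 substrates gives the maximum diversity, ln(4):
even = shannon_wiener([0.5, 0.5, 0.5, 0.5])
# Intense use of one substrate (as under Cd stress) drives DSU down:
skewed = shannon_wiener([2.0, 0.01, 0.01, 0.01])
```

The abstract's "acute decrease of DSU" under Cd corresponds to the skewed case: a few wells dominate the total response, so the index drops.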

  7. Efficacy of oral afoxolaner for the treatment of canine generalised demodicosis

    Directory of Open Access Journals (Sweden)

    Beugnet Frédéric

    2016-01-01

    Full Text Available The efficacy of oral treatment with a chewable tablet containing afoxolaner 2.27% w/w (NexGard®, Merial) administered orally was assessed in eight dogs diagnosed with generalised demodicosis and compared with efficacy in eight dogs under treatment with a topical combination of imidacloprid/moxidectin (Advocate®, Bayer). Afoxolaner was administered at the recommended dose (at least 2.5 mg/kg) on Days 0, 14, 28 and 56. The topical combination of imidacloprid/moxidectin was given at the same intervals at the recommended concentration. Clinical examinations and deep skin scrapings were performed every month in order to evaluate the effect on mite numbers and the resolution of clinical signs. The percentage reductions of mite counts were 99.2%, 99.9% and 100% on Days 28, 56 and 84, respectively, in the afoxolaner-treated group, compared to 89.8%, 85.2% and 86.6% on Days 28, 56 and 84 in the imidacloprid/moxidectin-treated group. Skin condition of the dogs also improved significantly from Day 28 to Day 84 in the afoxolaner-treated group. Mite reductions were significantly higher on Days 28, 56 and 84 in the afoxolaner-treated group compared to the imidacloprid/moxidectin-treated group. The results of this study demonstrated that afoxolaner, given orally, was effective in treating dogs with generalised demodicosis within a two-month period.

  8. Efficacy of oral afoxolaner for the treatment of canine generalised demodicosis.

    Science.gov (United States)

    Beugnet, Frédéric; Halos, Lénaïg; Larsen, Diane; de Vos, Christa

    2016-01-01

    The efficacy of oral treatment with a chewable tablet containing afoxolaner 2.27% w/w (NexGard®, Merial) administered orally was assessed in eight dogs diagnosed with generalised demodicosis and compared with efficacy in eight dogs under treatment with a topical combination of imidacloprid/moxidectin (Advocate®, Bayer). Afoxolaner was administered at the recommended dose (at least 2.5 mg/kg) on Days 0, 14, 28 and 56. The topical combination of imidacloprid/moxidectin was given at the same intervals at the recommended concentration. Clinical examinations and deep skin scrapings were performed every month in order to evaluate the effect on mite numbers and the resolution of clinical signs. The percentage reductions of mite counts were 99.2%, 99.9% and 100% on Days 28, 56 and 84, respectively, in the afoxolaner-treated group, compared to 89.8%, 85.2% and 86.6% on Days 28, 56 and 84 in the imidacloprid/moxidectin-treated group. Skin condition of the dogs also improved significantly from Day 28 to Day 84 in the afoxolaner-treated group. Mite reductions were significantly higher on Days 28, 56 and 84 in the afoxolaner-treated group compared to the imidacloprid/moxidectin-treated group. The results of this study demonstrated that afoxolaner, given orally, was effective in treating dogs with generalised demodicosis within a two-month period. © F. Beugnet et al., published by EDP Sciences, 2016.

  9. A Generalised Fault Protection Structure Proposed for Uni-grounded Low-Voltage AC Microgrids

    Science.gov (United States)

    Bui, Duong Minh; Chen, Shi-Lin; Lien, Keng-Yu; Jiang, Jheng-Lun

    2016-04-01

    This paper presents three main configurations of uni-grounded low-voltage AC microgrids. Transient situations of a uni-grounded low-voltage (LV) AC microgrid (MG) are simulated through various fault tests and operation transition tests between grid-connected and islanded modes. Based on the transient simulation results, available fault protection methods are proposed for main and back-up protection of a uni-grounded AC microgrid, and the concept of a generalised fault protection structure for uni-grounded LVAC MGs is presented. The main contributions of the paper are: (i) definition of different uni-grounded LVAC MG configurations; (ii) analysis of transient responses of a uni-grounded LVAC microgrid through line-to-line faults, line-to-ground faults, three-phase faults and a microgrid operation transition test; (iii) proposal of available fault protection methods for uni-grounded microgrids, such as non-directional or directional overcurrent protection, under/over voltage protection, differential current protection, voltage-restrained overcurrent protection, and other fault protection principles not based on phase currents and voltages (e.g. total harmonic distortion detection of currents and voltages, or use of sequence components of current and voltage such as the 3I0 or 3V0 components); and (iv) development of a generalised fault protection structure with six individual protection zones suitable for different uni-grounded AC MG configurations.

  10. Hybrid Generalised Additive Type-2 Fuzzy-Wavelet-Neural Network in Dynamic Data Mining

    Directory of Open Access Journals (Sweden)

    Bodyanskiy Yevgeniy

    2015-12-01

    Full Text Available In the paper, a new hybrid system of computational intelligence is proposed. This system combines the advantages of neuro-fuzzy system of Takagi-Sugeno-Kang, type-2 fuzzy logic, wavelet neural networks and generalised additive models of Hastie-Tibshirani. The proposed system has universal approximation properties and learning capability based on the experimental data sets which pertain to the neural networks and neuro-fuzzy systems; interpretability and transparency of the obtained results due to the soft computing systems and, first of all, due to type-2 fuzzy systems; possibility of effective description of local signal and process features due to the application of systems based on wavelet transform; simplicity and speed of learning process due to generalised additive models. The proposed system can be used for solving a wide class of dynamic data mining tasks, which are connected with non-stationary, nonlinear stochastic and chaotic signals. Such a system is sufficiently simple in numerical implementation and is characterised by a high speed of learning and information processing.

  11. Navigation towards a goal position: from reactive to generalised learned control

    Energy Technology Data Exchange (ETDEWEB)

    Freire da Silva, Valdinei [Laboratorio de Tecnicas Inteligentes - LTI, Escola Politecnica da Universidade de Sao Paulo, Av. Prof. Luciano Gualberto, trav.3, n.158, Cidade Universitaria Sao Paulo (Brazil); Selvatici, Antonio Henrique [Universidade Nove de Julho, Rua Vergueiro, 235, Sao Paulo (Brazil); Reali Costa, Anna Helena, E-mail: valdinei.freire@gmail.com, E-mail: antoniohps@uninove.br, E-mail: anna.reali@poli.usp.br [Laboratorio de Tecnicas Inteligentes - LTI, Escola Politecnica da Universidade de Sao Paulo, Av. Prof. Luciano Gualberto, trav.3, n.158, Cidade Universitaria Sao Paulo (Brazil)

    2011-03-01

    The task of navigating to a target position in space is a fairly common task for a mobile robot. It is desirable that this task is performed even in previously unknown environments. One reactive architecture explored before addresses this challenge by defining a hand-coded coordination of primitive behaviours, encoded by the Potential Fields method. Our first approach to improve the performance of this architecture adds a learning step to autonomously find the best way to coordinate primitive behaviours with respect to an arbitrary performance criterion. Because of the limitations presented by the Potential Fields method, especially in relation to non-convex obstacles, we are investigating the use of Relational Reinforcement Learning as a method to not only learn to act in the current environment, but also to generalise prior knowledge to the current environment in order to achieve the goal more quickly in a non-convex structured environment. We show the results of our previous efforts in reaching goal positions along with our current research on generalised approaches.

  12. Navigation towards a goal position: from reactive to generalised learned control

    International Nuclear Information System (INIS)

    Freire da Silva, Valdinei; Selvatici, Antonio Henrique; Reali Costa, Anna Helena

    2011-01-01

    The task of navigating to a target position in space is a fairly common task for a mobile robot. It is desirable that this task is performed even in previously unknown environments. One reactive architecture explored before addresses this challenge by defining a hand-coded coordination of primitive behaviours, encoded by the Potential Fields method. Our first approach to improve the performance of this architecture adds a learning step to autonomously find the best way to coordinate primitive behaviours with respect to an arbitrary performance criterion. Because of the limitations presented by the Potential Fields method, especially in relation to non-convex obstacles, we are investigating the use of Relational Reinforcement Learning as a method to not only learn to act in the current environment, but also to generalise prior knowledge to the current environment in order to achieve the goal more quickly in a non-convex structured environment. We show the results of our previous efforts in reaching goal positions along with our current research on generalised approaches.

  13. Source term estimation and the isotopic ratio of radioactive material released from the WIPP repository in New Mexico, USA.

    Science.gov (United States)

    Thakur, P

    2016-01-01

    After almost 15 years of operations, the Waste Isolation Pilot Plant (WIPP) had one of its waste drums breach underground as a result of a runaway chemical reaction in the waste it contained. This incident occurred on February 14, 2014. Moderate levels of radioactivity were released into the underground air. A small portion of the contaminated underground air also escaped to the surface through the ventilation system and was detected approximately 1 km away from the facility. According to the source term estimation, the actual amount of radioactivity released from the WIPP site was less than 1.5 mCi. The highest activity detected on the surface was 115.2 μBq/m(3) for (241)Am and 10.2 μBq/m(3) for (239+240)Pu at a sampling station located 91 m away from the underground air exhaust point and 81.4 μBq/m(3) of (241)Am and 5.8 μBq/m(3) of (239+240)Pu at a monitoring station located approximately 1 km northwest of the WIPP facility. The dominant radionuclides released were americium and plutonium, in a ratio that matches the content of the breached drum. Air monitoring across the WIPP site intensified following the first reports of radiation detection underground to determine the extent of impact to WIPP personnel, the public, and the environment. In this paper, the early stage monitoring data collected by an independent monitoring program conducted by the Carlsbad Environmental Monitoring & Research Center (CEMRC) and an oversight monitoring program conducted by the WIPP's management and operating contractor, the Nuclear Waste Partnership (NWP) LLC were utilized to estimate the actual amount of radioactivity released from the WIPP underground. The Am and Pu isotope ratios were measured and used to support the hypothesis that the release came from one drum identified as having breached that represents a specific waste stream with this radionuclide ratio in its inventory. This failed drum underwent a heat and gas producing reaction that overpowered its vent and
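The consistency check this record describes, matching the Am/Pu ratio in air samples to the breached drum's inventory, amounts to comparing activity ratios across stations. The sketch below only shows that arithmetic, using the concentrations quoted in the abstract; whether the ratios "match" the drum's waste stream is the study's conclusion, not something the calculation alone establishes:

```python
def activity_ratio(am241, pu239_240):
    """Ratio of (241)Am to (239+240)Pu activity concentrations
    (both in the same units, e.g. uBq/m3)."""
    return am241 / pu239_240

# Station 91 m from the underground air exhaust point:
near = activity_ratio(115.2, 10.2)   # ~11.3
# Station ~1 km northwest of the WIPP facility:
far = activity_ratio(81.4, 5.8)      # ~14.0
```

Both stations give Am/Pu ratios of the same order, which is the kind of agreement used to support the single-drum source hypothesis.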

  14. The use of generalised audit software by internal audit functions in a developing country: The purpose of the use of generalised audit software as a data analytics tool

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2017-11-01

    Full Text Available This article explores the purpose of the use of generalised audit software (GAS) as a data analytics tool by internal audit functions in the locally controlled banking industry of South Africa. The evolution of the traditional internal audit methodology of collecting audit evidence through the conduct of interviews, the completion of questionnaires, and the testing of controls on a sample basis is long overdue, and such practice in the present technological, data-driven era will soon render an internal audit function obsolete. The research results indicate that respondents are utilising GAS for a variety of purposes, but that its frequency of use is not yet optimal and that there is still much room for improvement for tests of controls purposes. The top five purposes for which the respondents make use of GAS often to always during separate internal audit engagements are: (1) to identify transactions with specific characteristics or control criteria for tests of control purposes; (2) to conduct full population analysis; (3) to identify account balances over a certain amount; (4) to identify and report on the frequency of occurrence of risks or of specific events; and (5) to obtain audit evidence about control effectiveness.

  15. Workshop on the source term for radionuclide migration from high-level waste or spent nuclear fuel under realistic repository conditions: proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Hunter, T.O.; Muller, A.B. (eds.)

    1985-07-01

    Sixteen papers were presented at the workshop. The fourteen full-length papers included in the proceedings were processed separately. Only abstracts were included for the following two papers: Data Requirements Based on Performance Assessment Analyses of Conceptual Waste Packages in Salt Repositories, and The Potential Effects of Radiation on the Source Term in a Salt Repository. (LM)

  16. Unclassified Source Term and Radionuclide Data for Corrective Action Unit 97: Yucca Flat/Climax Mine, Nevada Test Site, Nevada, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Peter Martian

    2009-05-01

    This report documents the evaluation of the information and data available on the unclassified source term and radionuclide contamination for CAU 97: Yucca Flat/Climax Mine. The total residual inventory of radionuclides associated with one or more tests is known as the radiologic source term (RST). The RST is comprised of radionuclides in water, glass, or other phases or mineralogic forms. The hydrologic source term (HST) of an underground nuclear test is the portion of the total RST that is released into the groundwater over time following the test. In this report, the HST represents radionuclide release some time after the explosion and does not include the rapidly evolving mechanical, thermal, and chemical processes during the explosion. CAU 97: Yucca Flat/Climax Mine has many more detonations and a wider variety of settings to consider compared to other CAUs. For instance, the source term analysis and evaluation performed for CAUs 101 and 102: Central and Western Pahute Mesa and CAU 98: Frenchman Flat did not consider vadose zone attenuation because many detonations were located near or below the water table. However, the large number of Yucca Flat/Climax Mine tests and the location of many tests above the water table warrant a more robust analysis of the unsaturated zone.

  17. Linear-scaling calculation of Hartree-Fock exchange energy with non-orthogonal generalised Wannier functions.

    Science.gov (United States)

    Dziedzic, J; Hill, Q; Skylaris, C-K

    2013-12-07

    We present a method for the calculation of four-centre two-electron repulsion integrals in terms of localised non-orthogonal generalised Wannier functions (NGWFs). Our method has been implemented in the ONETEP program and is used to compute the Hartree-Fock exchange energy component of Hartree-Fock and Density Functional Theory (DFT) calculations with hybrid exchange-correlation functionals. As the NGWFs are optimised in situ in terms of a systematically improvable basis set which is equivalent to plane waves, it is possible to achieve large basis set accuracy in routine calculations. The spatial localisation of the NGWFs allows us to exploit the exponential decay of the density matrix in systems with a band gap in order to compute the exchange energy with a computational effort that increases linearly with the number of atoms. We describe the implementation of this approach in the ONETEP program for linear-scaling first principles quantum mechanical calculations. We present extensive numerical validation of all the steps in our method. Furthermore, we find excellent agreement in energies and structures for a wide variety of molecules when comparing with other codes. We use our method to perform calculations with the B3LYP exchange-correlation functional for models of myoglobin systems bound with O2 and CO ligands and confirm that the same qualitative behaviour is obtained as when the same myoglobin models are studied with the DFT+U approach which is also available in ONETEP. Finally, we confirm the linear-scaling capability of our method by performing calculations on polyethylene and polyacetylene chains of increasing length.

  18. Linear-scaling calculation of Hartree-Fock exchange energy with non-orthogonal generalised Wannier functions

    International Nuclear Information System (INIS)

    Dziedzic, J.; Hill, Q.; Skylaris, C.-K.

    2013-01-01

    We present a method for the calculation of four-centre two-electron repulsion integrals in terms of localised non-orthogonal generalised Wannier functions (NGWFs). Our method has been implemented in the ONETEP program and is used to compute the Hartree-Fock exchange energy component of Hartree-Fock and Density Functional Theory (DFT) calculations with hybrid exchange-correlation functionals. As the NGWFs are optimised in situ in terms of a systematically improvable basis set which is equivalent to plane waves, it is possible to achieve large basis set accuracy in routine calculations. The spatial localisation of the NGWFs allows us to exploit the exponential decay of the density matrix in systems with a band gap in order to compute the exchange energy with a computational effort that increases linearly with the number of atoms. We describe the implementation of this approach in the ONETEP program for linear-scaling first principles quantum mechanical calculations. We present extensive numerical validation of all the steps in our method. Furthermore, we find excellent agreement in energies and structures for a wide variety of molecules when comparing with other codes. We use our method to perform calculations with the B3LYP exchange-correlation functional for models of myoglobin systems bound with O2 and CO ligands and confirm that the same qualitative behaviour is obtained as when the same myoglobin models are studied with the DFT+U approach which is also available in ONETEP. Finally, we confirm the linear-scaling capability of our method by performing calculations on polyethylene and polyacetylene chains of increasing length.
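    The locality argument in the abstracts above can be demonstrated with a toy model. This is not the ONETEP implementation: the 1D chain, the modelled density-matrix magnitude, and the screening threshold are all assumptions, used only to show why exponential decay of the density matrix turns an O(N²) pair sum into an O(N) one.

    ```python
    import numpy as np

    # Toy sketch: with localised orbitals on a 1D chain and density-matrix
    # elements |P_ij| that decay exponentially with inter-site distance,
    # screening the exchange pair sum against a threshold leaves each site
    # coupled only to a fixed neighbourhood, so the number of retained terms
    # grows linearly with system size.

    def retained_pairs(n_sites, decay_length, threshold, spacing=1.0):
        """Count (i, j) pairs whose modelled |P_ij| survives the screening test."""
        count = 0
        for i in range(n_sites):
            for j in range(n_sites):
                dij = abs(i - j) * spacing
                if np.exp(-dij / decay_length) > threshold:   # model of |P_ij|
                    count += 1
        return count

    # Retained pairs per site saturate as the chain grows -> linear total cost.
    for n in (100, 200, 400):
        print(n, retained_pairs(n, decay_length=2.0, threshold=1e-6) / n)
    ```

    For chains longer than the screening window, doubling the chain adds a constant number of retained pairs per added site, which is exactly the behaviour the papers verify on polyethylene and polyacetylene chains; an unscreened sum would instead grow quadratically.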

  19. INEEL Subregional Conceptual Model Report Volume 3: Summary of Existing Knowledge of Natural and Anthropogenic Influences on the Release of Contaminants to the Subsurface Environment from Waste Source Terms at the INEEL

    Energy Technology Data Exchange (ETDEWEB)

    Paul L. Wichlacz

    2003-09-01

    This source-term summary document is intended to describe the current understanding of contaminant source terms and the conceptual model for potential source-term release to the environment at the Idaho National Engineering and Environmental Laboratory (INEEL), as presented in published INEEL reports. The document presents a generalized conceptual model of the sources of contamination and describes the general categories of source terms, primary waste forms, and factors that affect the release of contaminants from the waste form into the vadose zone and Snake River Plain Aquifer. Where the information has previously been published and is readily available, summaries of the inventory of contaminants are also included. Uncertainties that affect the estimation of the source term release are also discussed where they have been identified by the Source Term Technical Advisory Group. Areas in which additional information is needed (i.e., research needs) are also identified.

  20. Analyzing the contribution of climate change to long-term variations in sediment nitrogen sources for reservoirs/lakes

    International Nuclear Information System (INIS)

    Xia, Xinghui; Wu, Qiong; Zhu, Baotong; Zhao, Pujun; Zhang, Shangwei; Yang, Lingyan

    2015-01-01

    We applied a mixing model based on stable isotopic δ13C, δ15N, and C:N ratios to estimate the contributions of multiple sources to sediment nitrogen. We also developed a conceptual model describing and analyzing the impacts of climate change on nitrogen enrichment. These two models were applied to Miyun Reservoir to analyze the contribution of climate change to the variations in sediment nitrogen sources based on two 210Pb- and 137Cs-dated sediment cores. The results showed that during the past 50 years, average contributions of soil and fertilizer, submerged macrophytes, N2-fixing phytoplankton, and non-N2-fixing phytoplankton were 40.7%, 40.3%, 11.8%, and 7.2%, respectively. In addition, total nitrogen (TN) contents in sediment showed significant increasing trends from 1960 to 2010, and sediment nitrogen of both submerged macrophytes and phytoplankton sources exhibited significant increasing trends during the past 50 years. In contrast, soil and fertilizer sources showed a significant decreasing trend from 1990 to 2010. According to the changing trend of N2-fixing phytoplankton, changes of temperature and sunshine duration accounted for at least 43% of the trend in the sediment nitrogen enrichment over the past 50 years. Regression analysis of the climatic factors on nitrogen sources showed that the contributions of precipitation, temperature, and sunshine duration to the variations in sediment nitrogen sources ranged from 18.5% to 60.3%. The study demonstrates that the mixing model provides a robust method for calculating the contribution of multiple nitrogen sources in sediment, and this study also suggests that N2-fixing phytoplankton could be regarded as an important response factor for assessing the impacts of climate change on nitrogen enrichment. - Highlights: • A mixing model was built to analyze sediment N sources of lakes/reservoirs. • Fertilizer/soil and macrophytes showed decreasing trends during the past two decades.
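    The kind of three-tracer mixing model described in this abstract can be sketched as a small linear system. All end-member signature values below are invented for illustration and are not from the study; the point is only the mechanics: each sample's δ13C, δ15N, and C:N is modelled as a weighted average of the four source signatures, and the weights (source fractions) are recovered by least squares with a sum-to-one constraint.

    ```python
    import numpy as np

    # Rows: δ13C (permil), δ15N (permil), C:N.
    # Columns: soil+fertilizer, macrophytes, N2-fixers, non-N2-fixers.
    # Signature values are hypothetical, for illustration only.
    sources = np.array([
        [-26.0, -20.0, -30.0, -22.0],   # δ13C end-members
        [  4.0,   8.0,   0.5,   6.0],   # δ15N end-members
        [ 12.0,  18.0,   6.5,   7.0],   # C:N end-members
    ])

    def source_fractions(mixture):
        """Solve sources @ f = mixture with sum(f) = 1 via augmented least squares."""
        A = np.vstack([sources, np.ones(sources.shape[1])])   # append sum-to-one row
        b = np.append(mixture, 1.0)
        f, *_ = np.linalg.lstsq(A, b, rcond=None)
        f = np.clip(f, 0.0, None)                             # crude non-negativity fix
        return f / f.sum()

    # Synthetic check: mix the sources with known fractions and recover them.
    mix = sources @ np.array([0.4, 0.4, 0.1, 0.1])
    print(source_fractions(mix))                              # recovers 0.4, 0.4, 0.1, 0.1
    ```

    In practice such models are solved with explicit non-negativity constraints and with uncertainty propagation on the end-member signatures; the clip-and-renormalise step here is a deliberate simplification.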