WorldWideScience

Sample records for homogeneity analysis verification

  1. At-tank Low-Activity Feed Homogeneity Analysis Verification

    International Nuclear Information System (INIS)

    DOUGLAS, J.G.

    2000-01-01

    This report evaluates the merit of selecting sodium, aluminum, and cesium-137 as analytes to indicate homogeneity of soluble species in low-activity waste (LAW) feed and recommends possible analytes and physical properties that could serve as rapid screening indicators for LAW feed homogeneity. The three analytes are adequate as screening indicators of soluble species homogeneity for tank waste when a mixing pump is used to thoroughly mix the waste in the waste feed staging tank and when all dissolved species are present at concentrations well below their solubility limits. If either of these conditions is violated, then the three indicators may not be sufficiently chemically representative of other waste constituents to reliably indicate homogeneity in the feed supernatant. Additional homogeneity indicators that should be considered are anions such as fluoride, sulfate, and phosphate, total organic carbon/total inorganic carbon, and total alpha to estimate the transuranic species. Physical property measurements such as gamma profiling, conductivity, specific gravity, and total suspended solids are recommended as possible at-tank methods for indicating homogeneity. Indicators of LAW feed homogeneity are needed to reduce the U.S. Department of Energy, Office of River Protection (ORP) Program's contractual risk by assuring that the waste feed is within the contractual composition and can be supplied to the waste treatment plant within the schedule requirements

  2. Verification of homogenization in fast critical assembly analyses

    International Nuclear Information System (INIS)

    Chiba, Go

    2006-01-01

    In the present paper, homogenization procedures for fast critical assembly analyses are investigated. Errors caused by homogenizations are evaluated by the exact perturbation theory. In order to obtain reference solutions, three-dimensional plate-wise transport calculations are performed. It is found that the angular neutron flux along plate boundaries has a significant peak in the fission source energy range. To treat this angular dependence accurately, the double-Gaussian Chebyshev angular quadrature set with S24 is applied. It is shown that the difference between the heterogeneous leakage theory and the homogeneous theory is negligible, and that transport cross sections homogenized with neutron flux significantly underestimate neutron leakage. The error in criticality caused by a homogenization is estimated at about 0.1%Δk/kk' in a small fast critical assembly. In addition, the neutron leakage is overestimated by both leakage theories when sodium plates in fuel lattices are voided. (author)

  3. Qualitative analysis of homogeneous universes

    International Nuclear Information System (INIS)

    Novello, M.; Araujo, R.A.

    1980-01-01

    The qualitative behaviour of cosmological models is investigated in two cases: homogeneous and isotropic universes containing viscous fluids in a Stokesian non-linear regime; and rotating expanding universes in a state in which matter is out of thermal equilibrium. (Author)

  4. A safeguards verification technique for solution homogeneity and volume measurements in process tanks

    International Nuclear Information System (INIS)

    Suda, S.; Franssen, F.

    1987-01-01

    A safeguards verification technique is being developed for determining whether process-liquid homogeneity has been achieved in process tanks and for authenticating volume-measurement algorithms involving temperature corrections. It is proposed that, in new designs for bulk-handling plants employing automated process lines, bubbler probes and thermocouples be installed at several heights in key accountability tanks. High-accuracy measurements of density using an electromanometer can now be made which match or even exceed analytical-laboratory accuracies. Together with regional determination of tank temperatures, these measurements provide density, liquid-column weight and temperature gradients over the fill range of the tank that can be used to ascertain when the tank solution has reached equilibrium. Temperature-correction algorithms can be authenticated by comparing the volumes obtained from the several bubbler-probe liquid-height measurements, each based on different amounts of liquid above and below the probe. The verification technique is based on the automated electromanometer system developed by Brookhaven National Laboratory (BNL). The IAEA has recently approved the purchase of a stainless-steel tank equipped with multiple bubbler and thermocouple probes for installation in its Bulk Calibration Laboratory at IAEA Headquarters, Vienna. The verification technique is scheduled for preliminary trials in late 1987
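
    As context for the density-gradient technique described above, the underlying relation is simple hydrostatics (standard physics, not a formula quoted from the report): two bubbler probes whose dip tubes sit a known vertical distance \Delta h apart read a differential pressure \Delta P, from which the solution density between them follows as

        \rho = \frac{\Delta P}{g\,\Delta h}

    Agreement of the densities inferred at several probe heights, after temperature correction, is then the operational signature that the tank contents have reached homogeneity.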

  5. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  6. Assembly homogenization techniques for light water reactor analysis

    International Nuclear Information System (INIS)

    Smith, K.S.

    1986-01-01

    Recent progress in development and application of advanced assembly homogenization methods for light water reactor analysis is reviewed. Practical difficulties arising from conventional flux-weighting approximations are discussed and numerical examples given. The mathematical foundations for homogenization methods are outlined. Two methods, Equivalence Theory and Generalized Equivalence Theory which are theoretically capable of eliminating homogenization error are reviewed. Practical means of obtaining approximate homogenized parameters are presented and numerical examples are used to contrast the two methods. Applications of these techniques to PWR baffle/reflector homogenization and BWR bundle homogenization are discussed. Nodal solutions to realistic reactor problems are compared to fine-mesh PDQ calculations, and the accuracy of the advanced homogenization methods is established. Remaining problem areas are investigated, and directions for future research are suggested. (author)
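
    For context, the "conventional flux-weighting approximation" referred to above is the standard definition (textbook notation, not reproduced from the paper): the homogenized group-g cross section over an assembly volume V is

        \bar{\Sigma}_g = \frac{\int_V \Sigma_g(\mathbf{r})\,\phi_g(\mathbf{r})\,dV}{\int_V \phi_g(\mathbf{r})\,dV}

    Equivalence Theory and Generalized Equivalence Theory retain this average but add equivalence parameters (in GET, assembly discontinuity factors: the ratio of the heterogeneous to the homogeneous flux on each node surface) so that the homogenized nodal problem can reproduce the reference reaction rates and leakages.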

  7. Time Optimal Reachability Analysis Using Swarm Verification

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis employs model-checking to compute goal states that can be reached from an initial state with a minimal accumulated time duration. The model-checker may produce a corresponding diagnostic trace which can be interpreted as a feasible schedule for many scheduling and planning problems, response time optimization etc. We propose swarm verification to accelerate time optimal reachability using the real-time model-checker Uppaal. In swarm verification, a large number of model checker instances execute in parallel on a computer cluster using different, typically randomized search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and time and memory consumption. Three of these cooperate by exchanging costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm...

  8. Tensor harmonic analysis on homogenous space

    International Nuclear Information System (INIS)

    Wrobel, G.

    1997-01-01

    The Hilbert space of tensor functions on a homogenous space with the compact stability group is considered. The functions are decomposed onto a sum of tensor plane waves (defined in the text), components of which are transformed by irreducible representations of the appropriate transformation group. The orthogonality relation and the completeness relation for tensor plane waves are found. The decomposition constitutes a unitary transformation, which allows to obtain the Parseval equality. The Fourier components can be calculated by means of the Fourier transformation, the form of which is given explicitly. (author)

  9. Finite Element Verification of Non-Homogeneous Strain and Stress Fields during Composite Material Testing

    DEFF Research Database (Denmark)

    Mikkelsen, Lars Pilgaard

    2015-01-01

    Uni-directional glass fiber reinforced polymers play a central role in the task of increasing the length of wind turbine blades and thereby lowering the cost of energy from wind turbine installations. In this context, optimizing the mechanical performance regarding material stiffness, compression strength and fatigue performance is essential. Nevertheless, testing composites includes some challenges regarding stiffness determination using conventional strain gauges and achieving correct material failure unaffected by the gripping region during fatigue testing. These challenges have, in the present study, been addressed using the finite element method. In doing so, a verification of experimental observations, a deeper understanding of the test coupon loading, and thereby improved test methods have been achieved.

  10. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
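
    As an illustration of the dose/distance-to-agreement criterion recommended above, here is a minimal sketch of a global 3%/3 mm gamma-index evaluation in one dimension (illustrative code, not the authors' analysis parameter; a real implementation would work on 2-D film planes with interpolation):

        import numpy as np

        def gamma_index(dose_ref, dose_eval, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
            # Brute-force 1-D global gamma: for each reference point, take the
            # minimum generalized (distance, dose-difference) norm over all
            # evaluated points.
            x = np.arange(len(dose_ref)) * spacing_mm
            d_norm = dose_tol * dose_ref.max()  # global dose criterion (3% of max)
            gamma = np.empty(len(dose_ref))
            for i in range(len(dose_ref)):
                gamma[i] = np.sqrt(((x - x[i]) / dist_tol_mm) ** 2
                                   + ((dose_eval - dose_ref[i]) / d_norm) ** 2).min()
            return gamma  # gamma <= 1 means the point passes

        ref = np.exp(-np.linspace(-3.0, 3.0, 121) ** 2)  # measured profile (a.u.)
        ev = np.roll(ref, 2)                             # planned profile shifted by 2 mm
        print((gamma_index(ref, ev, spacing_mm=1.0) <= 1.0).mean())  # pass rate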

  11. Trends in business process analysis: from verification to process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Business process analysis ranges from model verification at design-time to the monitoring of processes at runtime. Much progress has been achieved in process verification. Today we are able to verify the entire reference model of SAP without any problems. Moreover, more and more processes leave

  12. Homogeneous nucleation in 4He: A corresponding-states analysis

    International Nuclear Information System (INIS)

    Sinha, D.N.; Semura, J.S.; Brodie, L.C.

    1982-01-01

    We report homogeneous-nucleation-temperature measurements in liquid 4He over a bath-temperature range from 2.31 K upward, in a region far from the critical point. A simple empirical form is presented for estimating the homogeneous nucleation temperatures for any liquid with a spherically symmetric interatomic potential. The 4He data are compared with nucleation data for Ar, Kr, Xe, and H; theoretical predictions for 3He are given in terms of reduced quantities. It is shown that the nucleation data for both quantum and classical liquids obey a quantum law of corresponding states (QCS). On the basis of this QCS analysis, predictions of homogeneous nucleation temperatures are made for hydrogen isotopes such as HD, DT, HT, and T2.

  13. Systems analysis-independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)]

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  14. Systems analysis - independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)]

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.

  15. Homogeneity evaluation of mesenchymal stem cells based on electrotaxis analysis

    OpenAIRE

    Kim, Min Sung; Lee, Mi Hee; Kwon, Byeong-Ju; Kim, Dohyun; Koo, Min-Ah; Seon, Gyeung Mi; Park, Jong-Chul

    2017-01-01

    Stem cell therapy can restore function to damaged tissue, avoid host rejection, and reduce inflammation throughout the body without the use of immunosuppressive drugs. Established methods are used to identify and isolate specific stem cell markers by FACS or by immunomagnetic cell separation; the procedures for distinguishing populations of stem cells take time and require extensive preparation. Here we suggest an electrotaxis analysis as a new method to evaluate the homogeneity of mesenchyma...

  16. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  17. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis

    2011-04-01

    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.
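
    For reference, the space-homogeneous Boltzmann equation being discretized is (standard notation, not copied from the paper)

        \partial_t f(t,v) = Q(f,f)(v) = \int_{\mathbb{R}^3}\int_{\mathbb{S}^2} B(|v - v_*|, \cos\theta)\,\bigl(f(v')\,f(v_*') - f(v)\,f(v_*)\bigr)\,d\sigma\,dv_*

    where (v', v_*') are the post-collisional velocities. Fourier spectral methods truncate f to a periodic velocity box and expand it in trigonometric polynomials; the truncation is what breaks positivity of f, which is why a stability argument via the "spreading" property of the collision operator is needed.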

  18. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis; Mouhot, Clément

    2011-01-01

    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.

  19. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
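
    The control-flow information described above supports unreachable-code detection by simple graph reachability; a minimal sketch of the idea (toy data, not the tool set's actual representation):

        from collections import defaultdict

        def unreachable(entry, edges):
            # edges: (src, dst) pairs from fall-throughs, jumps and calls
            graph = defaultdict(list)
            nodes = set()
            for src, dst in edges:
                graph[src].append(dst)
                nodes.update((src, dst))
            seen, stack = {entry}, [entry]
            while stack:  # depth-first reachability from the entry block
                for nxt in graph[stack.pop()]:
                    if nxt not in seen:
                        seen.add(nxt)
                        stack.append(nxt)
            return nodes - seen

        # Basic blocks of a toy module: block D is never reached from entry A
        print(unreachable("A", [("A", "B"), ("B", "C"), ("D", "C")]))  # {'D'}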

  20. Geostatistical Analysis Methods for Estimation of Environmental Data Homogeneity

    Directory of Open Access Journals (Sweden)

    Aleksandr Danilov

    2018-01-01

    The methodology for assessing the spatial homogeneity of ecosystems, with the possibility of subsequent zoning of territories in terms of the degree of disturbance of the environment, is considered in this study. The degree of pollution of a water body was reconstructed on the basis of hydrochemical monitoring data and information on the level of technogenic load in a single year. As a result, the zones of greatest environmental stress were isolated, and the correctness of the zoning obtained with geostatistical analysis techniques was demonstrated. The mathematical algorithms were implemented as a computing system in the object-oriented programming language C#. The resulting software application allows a quick assessment of the scale and spatial localization of pollution during the initial analysis of an environmental situation.

  1. Homogeneous protein analysis by magnetic core-shell nanorod probes

    KAUST Repository

    Schrittwieser, Stefan

    2016-03-29

    Studying protein interactions is of vital importance both to fundamental biology research and to medical applications. Here, we report on the experimental proof of a universally applicable label-free homogeneous platform for rapid protein analysis. It is based on optically detecting changes in the rotational dynamics of magnetically agitated core-shell nanorods upon their specific interaction with proteins. By adjusting the excitation frequency, we are able to optimize the measurement signal for each analyte protein size. In addition, due to the locking of the optical signal to the magnetic excitation frequency, background signals are suppressed, thus allowing exclusive studies of processes at the nanoprobe surface only. We study target proteins (soluble domain of the human epidermal growth factor receptor 2 - sHER2) specifically binding to antibodies (trastuzumab) immobilized on the surface of our nanoprobes and demonstrate direct deduction of their respective sizes. Additionally, we examine the dependence of our measurement signal on the concentration of the analyte protein, and deduce a minimally detectable sHER2 concentration of 440 pM. For our homogeneous measurement platform, good dispersion stability of the applied nanoprobes under physiological conditions is of vital importance. To that end, we support our measurement data by theoretical modeling of the total particle-particle interaction energies. The successful implementation of our platform offers scope for applications in biomarker-based diagnostics as well as for answering basic biology questions.

  2. SiSn diodes: Theoretical analysis and experimental verification

    KAUST Repository

    Hussain, Aftab M.; Wehbe, Nimer; Hussain, Muhammad Mustafa

    2015-01-01

    We report a theoretical analysis and experimental verification of change in band gap of silicon lattice due to the incorporation of tin (Sn). We formed SiSn ultra-thin film on the top surface of a 4 in. silicon wafer using thermal diffusion of Sn

  3. Non-homogeneous harmonic analysis: 16 years of development

    Science.gov (United States)

    Volberg, A. L.; Èiderman, V. Ya

    2013-12-01

    This survey contains results and methods in the theory of singular integrals, a theory which has been developing dramatically in the last 15-20 years. The central (although not the only) topic of the paper is the connection between the analytic properties of integrals and operators with Calderón-Zygmund kernels and the geometric properties of the measures. The history is traced of the classical Painlevé problem of describing removable singularities of bounded analytic functions, which has provided a strong incentive for the development of this branch of harmonic analysis. The progress of recent decades has largely been based on the creation of an apparatus for dealing with non-homogeneous measures, and much attention is devoted to this apparatus here. Several open questions are stated, first and foremost in the multidimensional case, where the method of curvature of a measure is not available. Bibliography: 128 titles.
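
    The "curvature of a measure" mentioned at the end is Melnikov's notion (standard definitions): for distinct points x, y, z in the plane, c(x,y,z) = 1/R(x,y,z), the reciprocal of the radius of the circle through them, and

        c^2(\mu) = \iiint c(x,y,z)^2 \, d\mu(x)\, d\mu(y)\, d\mu(z)

    The finiteness of c^2(\mu) is closely tied to the L^2(\mu)-boundedness of the Cauchy transform, which underlies the solution of the Painlevé problem; the absence of a comparably strong curvature quantity in higher dimensions is precisely the multidimensional obstruction noted above.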

  4. Modeling the dynamics of internal flooding - verification analysis

    International Nuclear Information System (INIS)

    Filipov, K.

    2011-01-01

    The results of a verification analysis of the WATERFOW software, developed for the purposes of reactor building internal flooding analysis, are presented. The integrated code MELCOR was selected for benchmarking. Considering the complex structure of the reactor building, sample tests were chosen to cover the characteristic points of the internal flooding analysis. The inapplicability of MELCOR to the internal flooding study has been demonstrated.

  5. Cross section homogenization analysis for a simplified Candu reactor

    International Nuclear Information System (INIS)

    Pounders, Justin; Rahnema, Farzad; Mosher, Scott; Serghiuta, Dumitru; Turinsky, Paul; Sarsour, Hisham

    2008-01-01

    The effect of using zero-current (infinite medium) boundary conditions to generate bundle-homogenized cross sections for a stylized half-core Candu reactor problem is examined. Homogenized cross sections from infinite-medium lattice calculations are compared with cross sections homogenized using the exact flux from the reference core environment. The impact of these cross section differences is quantified by generating nodal diffusion theory solutions with both sets of cross sections. It is shown that the infinite-medium spatial approximation is not negligible, and that ignoring the impact of the heterogeneous core environment on cross section homogenization leads to increased errors, particularly near control elements and the core periphery. (authors)

  6. The verification of neutron activation analysis support system (cooperative research)

    Energy Technology Data Exchange (ETDEWEB)

    Sasajima, Fumio; Ichimura, Shigeju; Ohtomo, Akitoshi; Takayanagi, Masaji [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment]; Sawahata, Hiroyuki; Ito, Yasuo [Tokyo Univ. (Japan). Research Center for Nuclear Science and Technology]; Onizawa, Kouji [Radiation Application Development Association, Tokai, Ibaraki (Japan)]

    2000-12-01

    The neutron activation analysis support system is a system with which even a user who does not have much experience in neutron activation analysis can conveniently and accurately carry out multi-element analysis of a sample. In this verification test, the functions, usability, precision, accuracy, and other aspects of the neutron activation analysis support system were confirmed. The test was carried out using the irradiation device, measuring device, automatic sample changer and analyzer installed in the JRR-3M PN-3 facility, together with the analysis software KAYZERO/SOLCOI based on the k0 method. With this equipment, calibration of the germanium detector, measurement of the parameters of the irradiation field, and analysis of three kinds of environmental standard samples were carried out. The k0 method adopted in this system has recently been used primarily in Europe; it is an analysis method which can conveniently and accurately carry out multi-element analysis of a sample without requiring individual comparison standard samples. With this system, a total of 28 elements were determined quantitatively, and 16 elements with values certified as analytical data for the NIST (National Institute of Standards and Technology) environmental standard samples were analyzed with an accuracy within 15%. This report describes the content and verification results of the neutron activation analysis support system. (author)

  7. Homogeneous Photodynamical Analysis of Kepler's Multiply-Transiting Systems

    Science.gov (United States)

    Ragozzine, Darin

    To search for planets more like our own, NASA's Kepler Space Telescope (Kepler) discovered thousands of exoplanet candidates that cross in front of (transit) their parent stars (e.g., Twicken et al. 2016). The Kepler exoplanet data represent an incredible observational leap forward, as evidenced by hundreds of papers with thousands of citations. In particular, systems with multiple transiting planets combine the determination of physical properties of exoplanets (e.g., radii), the context provided by the system architecture, and insights from orbital dynamics. Such systems are the most information-rich exoplanetary systems (Ragozzine & Holman 2010). Thanks to Kepler's revolutionary dataset, understanding these Multi-Transiting Systems (MTSs) enables a wide variety of major science questions. Existing analyses of MTSs are incomplete and suboptimal, and our efficient and timely proposal will provide significant scientific gains (100 new mass measurements and 100 updated mass measurements). Furthermore, our homogeneous analysis enables future statistical analyses, including those necessary to characterize the small-planet mass-radius relation, with implications for understanding the formation, evolution, and habitability of planets. The overarching goal of this proposal is a complete homogeneous investigation of Kepler MTSs to provide detailed measurements (or constraints) on exoplanetary physical and orbital properties. Current investigations do not exploit the full power of the Kepler data; here we propose to use better data (Short Cadence observations), better methods (photodynamical modeling), and a better statistical method (Bayesian Differential Evolution Markov Chain Monte Carlo) in a homogeneous analysis of all 700 Kepler MTSs. These techniques are particularly valuable for understanding small terrestrial planets. We propose to extract the near-maximum amount of information from these systems through a series of three research objectives
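
    As a sketch of the sampler named in the proposal, one Differential Evolution Markov chain update has the following generic form (a ter Braak-style step; the function names and tuning constants here are illustrative, not the proposal's code):

        import numpy as np

        def de_mc_step(chains, log_post, eps=1e-6, rng=np.random.default_rng()):
            # chains: (n_chains, n_dim) current states of the parallel chains
            n, d = chains.shape
            gamma = 2.38 / np.sqrt(2 * d)  # standard DE-MC jump scale
            new = chains.copy()
            for i in range(n):
                r1, r2 = rng.choice([j for j in range(n) if j != i], size=2, replace=False)
                prop = chains[i] + gamma * (chains[r1] - chains[r2]) + eps * rng.standard_normal(d)
                if np.log(rng.random()) < log_post(prop) - log_post(chains[i]):
                    new[i] = prop  # Metropolis accept/reject
            return new

        # Toy target: standard 2-D Gaussian posterior
        chains = np.random.default_rng(1).standard_normal((10, 2)) * 5
        for _ in range(2000):
            chains = de_mc_step(chains, lambda th: -0.5 * th @ th)
        print(chains.mean(axis=0))  # drifts toward ~[0, 0]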

  8. Arms control verification costs: the need for a comparative analysis

    International Nuclear Information System (INIS)

    MacLean, G.; Fergusson, J.

    1998-01-01

    The end of the Cold War era has presented practitioners and analysts of international non-proliferation, arms control and disarmament (NACD) with the opportunity to focus more intently on the range and scope of NACD treaties and their verification. Aside from obvious favorable and well-publicized developments in the field of nuclear non-proliferation, progress also has been made in a wide variety of arenas, ranging from chemical and biological weapons, fissile material, conventional forces and ballistic missiles to anti-personnel landmines. Indeed, breaking from the constraints imposed by the Cold War United States-Soviet adversarial zero-sum relationship that impeded the progress of arms control, particularly on a multilateral level, the post-Cold War period has witnessed significant developments in NACD commitments, initiatives, and implementation. The goals of this project - in its final iteration - will be fourfold. First, it will lead to the creation of a costing analysis model adjustable for use in several current and future arms control verification tasks. Second, the project will identify data accumulated in the cost categories outlined in Table 1 in each of the five cases. By comparing costs to overall effectiveness, the application of the model will demonstrate desirability in each of the cases (see Chart 1). Third, the project will identify and scrutinize 'political costs' as well as real expenditures and investment in the verification regimes (see Chart 2). And, finally, the project will offer some analysis on the relationship between national and multilateral forms of arms control verification, as well as the applicability of multilateralism as an effective tool in the verification of international non-proliferation, arms control, and disarmament agreements. (author)

  9. Assessment of homogeneity of regions for regional flood frequency analysis

    Science.gov (United States)

    Lee, Jeong Eun; Kim, Nam Won

    2016-04-01

    This paper analyzed the effect of rainfall on hydrological similarity, which is an important step for regional flood frequency analysis (RFFA). For the RFFA, the storage function method (SFM) with a spatial extension technique was applied to the 22 sub-catchments partitioned from the Chungju dam watershed in the Republic of Korea. We used the SFM to generate the annual maximum floods for the 22 sub-catchments using annual maximum storm events (1986~2010) as input data. The quantiles of rainfall and flood were then estimated using the annual maximum series for the 22 sub-catchments. Finally, spatial variations of the two quantiles were analyzed. As a result, there was a significant correlation between the spatial variations of the two quantiles. This result demonstrates that the spatial variation of rainfall is an important factor in explaining the homogeneity of regions when applying RFFA. Acknowledgements: This research was supported by a grant (11-TI-C06) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.

  10. Fire-accident analysis code (FIRAC) verification

    International Nuclear Information System (INIS)

    Nichols, B.D.; Gregory, W.S.; Fenton, D.L.; Smith, P.R.

    1986-01-01

    The FIRAC computer code predicts fire-induced transients in nuclear fuel cycle facility ventilation systems. FIRAC calculates simultaneously the gas-dynamic, material transport, and heat transport transients that occur in any arbitrarily connected network system subjected to a fire. The network system may include ventilation components such as filters, dampers, ducts, and blowers. These components are connected to rooms and corridors to complete the network for moving air through the facility. An experimental ventilation system has been constructed to verify FIRAC and other accident analysis codes. The design emphasizes network system characteristics and includes multiple chambers, ducts, blowers, dampers, and filters. A large industrial heater and a commercial dust feeder are used to inject thermal energy and aerosol mass. The facility is instrumented to measure volumetric flow rate, temperature, pressure, and aerosol concentration throughout the system. Aerosol release rates and mass accumulation on filters also are measured. We have performed a series of experiments in which a known rate of thermal energy is injected into the system. We then simulated this experiment with the FIRAC code. This paper compares and discusses the gas-dynamic and heat transport data obtained from the ventilation system experiments with those predicted by the FIRAC code. The numerically predicted data generally are within 10% of the experimental data

  11. Verification and validation of COBRA-SFS transient analysis capability

    International Nuclear Information System (INIS)

    Rector, D.R.; Michener, T.E.; Cuta, J.M.

    1998-05-01

    This report provides documentation of the verification and validation testing of the transient capability in the COBRA-SFS code, and is organized into three main sections. The primary documentation of the code was published in September 1995, with the release of COBRA-SFS, Cycle 2. The validation and verification supporting the release and licensing of COBRA-SFS was based solely on steady-state applications, even though the appropriate transient terms have been included in the conservation equations from the first cycle. Section 2.0, COBRA-SFS Code Description, presents a capsule description of the code, and a summary of the conservation equations solved to obtain the flow and temperature fields within a cask or assembly model. This section repeats in abbreviated form the code description presented in the primary documentation (Michener et al. 1995), and is meant to serve as a quick reference, rather than independent documentation of all code features and capabilities. Section 3.0, Transient Capability Verification, presents a set of comparisons between code calculations and analytical solutions for selected heat transfer and fluid flow problems. Section 4.0, Transient Capability Validation, presents comparisons between code calculations and experimental data obtained in spent fuel storage cask tests. Based on the comparisons presented in Sections 3.0 and 4.0, conclusions and recommendations for application of COBRA-SFS to transient analysis are presented in Section 5.0

  12. MESA: Message-Based System Analysis Using Runtime Verification

    Science.gov (United States)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for run-time verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a non-intrusive approach, by connecting to a variety of network interfaces. Due to the large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and to interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
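
    The duplicate and out-of-order errors reported at the end are typical of the sequencing properties such a monitor checks; a minimal self-contained sketch of the idea (illustrative Python, not the MESA or TraceContract DSL):

        def check_stream(messages):
            # messages: (key, seq) pairs; seq is expected to increase per key
            last, errors = {}, []
            for key, seq in messages:
                prev = last.get(key)
                if prev is not None and seq == prev:
                    errors.append(("duplicate", key, seq))
                elif prev is not None and seq < prev:
                    errors.append(("out-of-order", key, seq))
                last[key] = seq if prev is None else max(seq, prev)
            return errors

        print(check_stream([("flight42", 1), ("flight42", 2),
                            ("flight42", 2), ("flight42", 1)]))
        # [('duplicate', 'flight42', 2), ('out-of-order', 'flight42', 1)]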

  13. Analysis of an homogeneous solution reactor for 99 Mo production

    International Nuclear Information System (INIS)

    Weir, A.; Lopasso, E.; Gho, C.

    2007-01-01

    99mTc is the most widely used radioisotope in nuclear medicine, used in 80% of nuclear medicine procedures in the world owing to its practically ideal characteristics for diagnostics. 99mTc is obtained by decay of 99Mo, which can be produced by irradiating targets enriched in 98Mo, or as a fission product, either by irradiating uranium targets or by means of homogeneous solution reactors. The reactor model used in the neutronic analysis has a liquid fuel composed of uranyl nitrate dissolved in water with the addition of nitric acid. This solution is contained in a cylindrical stainless steel vessel reflected with light water. The reactor is cooled by means of a helicoidal heat exchanger immersed in the fuel solution. The heat of the fuel is removed by natural convection, while the circulation of the water inside the exchanger is forced. The control system of the reactor consists of 6 independent cadmium rods with water followers. An auxiliary control system could be the level of the fuel solution inside the container tank, but it was not included in the model under study. Variations of the reactivity of the system due to different phenomena are studied. An important factor during normal operation of the reactor is the variation of temperature, leading to a volumetric expansion of the fuel and spectral effects in it. Another phenomenon causing changes in the reactivity is the variation of the concentration of uranium in the fuel solution. An important phenomenon in this type of reactor is the void fraction in the liquid core due to radiolysis and the possible boiling of the water in the fuel solution. Some of the possible cases of abnormal operation were studied, such as the loss of coolant in the secondary circuit of the heat exchanger and the introduction and evaporation of water in the core. The reactivity variations were studied using the calculation codes MCNP, WIMS and TORT. All the

  14. Asymptotic Expansion Homogenization for Multiscale Nuclear Fuel Analysis

    International Nuclear Information System (INIS)

    2015-01-01

    Engineering scale nuclear fuel performance simulations can benefit by utilizing high-fidelity models running at a lower length scale. Lower length-scale models provide a detailed view of the material behavior that is used to determine the average material response at the macroscale. These lower length-scale calculations may provide insight into material behavior where experimental data is sparse or nonexistent. This multiscale approach is especially useful in the nuclear field, since irradiation experiments are difficult and expensive to conduct. The lower length-scale models complement the experiments by influencing the types of experiments required and by reducing the total number of experiments needed. This multiscale modeling approach is a central motivation in the development of the BISON-MARMOT fuel performance codes at Idaho National Laboratory. These codes seek to provide more accurate and predictive solutions for nuclear fuel behavior. One critical aspect of multiscale modeling is the ability to extract the relevant information from the lower length-scale simulations. One approach, the asymptotic expansion homogenization (AEH) technique, has proven to be an effective method for determining homogenized material parameters. The AEH technique prescribes a system of equations to solve at the microscale that are used to compute homogenized material constants for use at the engineering scale. In this work, we employ AEH to explore the effect of evolving microstructural thermal conductivity and elastic constants on nuclear fuel performance. We show that the AEH approach fits cleanly into the BISON and MARMOT codes and provides a natural, multidimensional homogenization capability.
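
    For context, the AEH technique named above rests on the standard two-scale ansatz (textbook form, not reproduced from the report): the fine-scale field is expanded in the fast variable y = x/\epsilon as

        u^{\epsilon}(x) = u_0(x) + \epsilon\, u_1(x, y) + \epsilon^2 u_2(x, y) + \cdots

    and, for a diffusion-type property such as thermal conductivity, solving periodic corrector problems for \chi_j on the unit cell Y yields the homogenized tensor

        k^H_{ij} = \frac{1}{|Y|} \int_Y k_{il}(y) \left( \delta_{lj} + \frac{\partial \chi_j}{\partial y_l} \right) dY

    which is the sense in which microstructure-scale results can be condensed into engineering-scale material constants.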

  15. Homogenizing bacterial cell factories: Analysis and engineering of phenotypic heterogeneity.

    Science.gov (United States)

    Binder, Dennis; Drepper, Thomas; Jaeger, Karl-Erich; Delvigne, Frank; Wiechert, Wolfgang; Kohlheyer, Dietrich; Grünberger, Alexander

    2017-07-01

    In natural habitats, microbes form multispecies communities that commonly face rapidly changing and highly competitive environments. Thus, phenotypic heterogeneity has evolved as an innate and important survival strategy to gain an overall fitness advantage over cohabiting competitors. However, in defined artificial environments such as monocultures in small- to large-scale bioreactors, cell-to-cell variations are presumed to cause reduced production yields as well as process instability. Hence, engineering microbial production toward phenotypic homogeneity is a highly promising approach for synthetic biology and bioprocess optimization. In this review, we discuss recent studies that have unraveled the cell-to-cell heterogeneity observed during bacterial gene expression and metabolite production as well as the molecular mechanisms involved. In addition, current single-cell technologies are briefly reviewed with respect to their applicability in exploring cell-to-cell variations. We highlight emerging strategies and tools to reduce phenotypic heterogeneity in biotechnological expression setups. Here, strain or inducer modifications are combined with cell physiology manipulations to achieve the ultimate goal of equalizing bacterial populations. In this way, the majority of cells can be forced into high productivity, thus reducing less productive subpopulations that tend to consume valuable resources during production. Modifications in uptake systems, inducer molecules or nutrients represent valuable tools for diminishing heterogeneity. Finally, we address the challenge of transferring homogeneously responding cells into large-scale bioprocesses. Environmental heterogeneity originating from extrinsic factors such as stirring speed and pH, oxygen, temperature or nutrient distribution can significantly influence cellular physiology. We conclude that engineering microbial populations toward phenotypic homogeneity is an increasingly important task to take biotechnological

  16. Imaging for dismantlement verification: Information management and analysis algorithms

    International Nuclear Information System (INIS)

    Robinson, S.M.; Jarman, K.D.; Pitts, W.K.; Seifert, A.; Misner, A.C.; Woodring, M.L.; Myjak, M.J.

    2012-01-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.

  17. Analysis of three idealized reactor configurations: plate, pin, and homogeneous

    International Nuclear Information System (INIS)

    McKnight, R.D.

    1983-01-01

    Detailed Monte Carlo calculations have been performed for three distinct configurations of an idealized fast critical assembly. This idealized assembly was based on the LMFBR benchmark critical assembly ZPR-6/7. In the first configuration, the entire core was loaded with the plate unit cell of ZPR-6/7. In the second configuration, the entire core was loaded with the ZPR sodium-filled pin calandria. The actual ZPR pin calandria are loaded with mixed (U,Pu) oxide pins which closely match the composition of the ZPR-6/7 plate unit cell. For the present study, slight adjustments were made in the atom concentrations and the length of the pin calandria in order to make the core boundaries and average composition for the pin-cell configuration identical to those of the plate-cell configuration. In the third configuration, the core was homogeneous, again with core boundaries and average composition identical to those of the plate and pin configurations

  18. SiSn diodes: Theoretical analysis and experimental verification

    KAUST Repository

    Hussain, Aftab M.

    2015-08-24

    We report a theoretical analysis and experimental verification of change in band gap of silicon lattice due to the incorporation of tin (Sn). We formed SiSn ultra-thin film on the top surface of a 4 in. silicon wafer using thermal diffusion of Sn. We report a reduction of 0.1 V in the average built-in potential, and a reduction of 0.2 V in the average reverse bias breakdown voltage, as measured across the substrate. These reductions indicate that the band gap of the silicon lattice has been reduced due to the incorporation of Sn, as expected from the theoretical analysis. We report the experimentally calculated band gap of SiSn to be 1.11 ± 0.09 eV. This low-cost, CMOS compatible, and scalable process offers a unique opportunity to tune the band gap of silicon for specific applications.

  19. The design and verification of probabilistic safety analysis platform NFRisk

    International Nuclear Information System (INIS)

    Hu Wenjun; Song Wei; Ren Lixia; Qian Hongtao

    2010-01-01

    To increase the technical capability in the Probabilistic Safety Analysis (PSA) field in China, it is necessary and important to study and develop an indigenous professional PSA platform. Following the principle of 'from structure simplification, to modularization, to production of cut sets, to minimization of cut sets', the algorithms, including the simplification algorithm, the modularization algorithm, the algorithm for conversion from a fault tree to a binary decision diagram (BDD), the cut set solving algorithm, and the cut set minimization algorithm, were designed and developed independently; the design of the data management and operation platform was completed in-house; and the verification and validation of the NFRisk platform based on 3 typical fault trees was likewise carried out independently. (authors)
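
    As a sketch of the 'production of cut sets to minimization of cut sets' stage (naive recursive expansion on a toy fault tree; production tools such as NFRisk use BDD-based algorithms, which are far more efficient):

        from itertools import product

        def cut_sets(node, tree):
            # tree: {gate: (op, children)}; anything not in tree is a basic event
            if node not in tree:
                return [frozenset([node])]
            op, kids = tree[node]
            kid_sets = [cut_sets(k, tree) for k in kids]
            if op == "OR":   # union of the children's cut set families
                return [cs for sets in kid_sets for cs in sets]
            # AND: one cut set from each child, merged
            return [frozenset().union(*combo) for combo in product(*kid_sets)]

        def minimize(sets):
            # keep only minimal cut sets (drop any proper superset)
            return [s for s in sets if not any(o < s for o in sets)]

        tree = {"TOP": ("AND", ["G1", "G2"]),
                "G1": ("OR", ["a", "b"]),
                "G2": ("OR", ["a", "c"])}
        print(minimize(cut_sets("TOP", tree)))  # minimal cut sets: {a} and {b, c}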

  20. Regional homogeneity of electoral space: comparative analysis (on the material of 100 national cases

    Directory of Open Access Journals (Sweden)

    A. O. Avksentiev

    2015-12-01

    In this article the author examines the dependence of electoral behavior on territorial belonging. The categories of «regional homogeneity» and «electoral space» are conceptualized. It is argued that such regional homogeneity is a characteristic of electoral space and can be quantified. Quantitative measurement of a government's regional homogeneity has a direct connection with the risk of separatism, civil conflicts, or legitimacy crises in deviant territories. A formula is proposed for a quantitative method of evaluating regional homogeneity, based on a standard instrument of statistical analysis, namely the coefficient of variation. Possible directions of study using this index, for individual political subjects and for a whole political space (state, region, electoral district), are defined. Appropriate indexes are calculated for the Ukrainian electoral space (returns of the 1991-2015 elections) and for 100 other national cases. The dynamics of the regional homogeneity of Ukraine, based on 1991-2015 electoral statistics, are analyzed.
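
    A minimal sketch of the coefficient-of-variation index described above (hypothetical vote-share data; the article's exact formula and normalization may differ):

        import statistics

        def homogeneity_index(shares):
            # Coefficient of variation of a party's vote share across regions;
            # lower values indicate more regionally homogeneous support.
            mean = statistics.mean(shares)
            return statistics.stdev(shares) / mean if mean else float("inf")

        # Hypothetical regional vote shares (percent) for two parties
        print(homogeneity_index([48, 51, 49, 52]))  # ~0.04: homogeneous support
        print(homogeneity_index([15, 70, 10, 65]))  # ~0.8: strongly regionalized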

  1. Structural analysis of a homogeneous polysaccharide from Achatina fulica.

    Science.gov (United States)

    Liu, Jie; Shang, Feineng; Yang, Zengming; Wu, Mingyi; Zhao, Jinhua

    2017-05-01

    Edible snails have been widely used as a health food and medicine in many countries. In our study, a water-soluble polysaccharide (AF-1) was isolated and purified from Achatina fulica by papain enzymolysis, alcohol precipitation and strong anion exchange chromatography. The structure of the polysaccharide was analyzed and characterized by chemical and instrumental methods, such as Fourier transform infrared spectroscopy, high performance liquid chromatography, analysis of monosaccharide composition, methylation analysis, and nuclear magnetic resonance (NMR) spectroscopy (1H, 13C, COSY, TOCSY, NOESY, HSQC and HMBC). Chemical composition analysis indicated that AF-1 is composed of glucose (Glc) and that its average molecular weight is 1710 kDa. Structural analysis suggested that AF-1 mainly consists of a linear repeating backbone of (1→4)-linked α-d-Glcp residues with one branch, α-d-Glcp, attached to the main chain by (1→6) glycosidic bonds at every five main-chain units. Further studies on the biological activities of the polysaccharide are currently in progress. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of)]; Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2015-05-15

    A scanning electron microscope (SEM) is an instrument for inspecting the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification; therefore, various chemical analyses can be performed from the SEM images, and it is widely used for material inspection, chemical characteristic analysis, and biological analysis. In the nuclear criticality analysis field, the homogeneity of a compound material is an important parameter for its use in a nuclear system. In our previous study, SEM was applied to the homogeneity analysis of such materials. In this study, a quantitative homogeneity analysis method based on SEM images is proposed for material inspections. The method is based on a stochastic analysis of the grayscale information of the SEM images.
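
    A minimal sketch of a grayscale-statistics homogeneity measure in the spirit described (tile-wise coefficient of variation on synthetic data; illustrative only, not the authors' stochastic method):

        import numpy as np

        def grayscale_homogeneity(img, tile=32):
            # Coefficient of variation of per-tile mean gray levels of a 2-D
            # SEM image; small values suggest a well-mixed, homogeneous sample.
            h = (img.shape[0] // tile) * tile
            w = (img.shape[1] // tile) * tile
            tiles = img[:h, :w].reshape(h // tile, tile, w // tile, tile)
            means = tiles.mean(axis=(1, 3))
            return means.std() / means.mean()

        rng = np.random.default_rng(0)
        uniform = rng.normal(128, 5, (256, 256))  # well-mixed compound
        clumpy = uniform.copy()
        clumpy[:128, :128] += 60                  # a segregated phase
        print(grayscale_homogeneity(uniform), grayscale_homogeneity(clumpy))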

  3. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho; Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo

    2015-01-01

    A scanning electron microscope (SEM) is an instrument for inspecting the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification; therefore, various chemical analyses can be performed from the SEM images, and it is widely used for material inspection, chemical characteristic analysis, and biological analysis. In the nuclear criticality analysis field, the homogeneity of a compound material is an important parameter for its use in a nuclear system. In our previous study, SEM was applied to the homogeneity analysis of such materials. In this study, a quantitative homogeneity analysis method based on SEM images is proposed for material inspections. The method is based on a stochastic analysis of the grayscale information of the SEM images.

  4. The colour analysis method applied to homogeneous rocks

    Directory of Open Access Journals (Sweden)

    Halász Amadé

    2015-12-01

    Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation, which is the most suitable formation in Hungary for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here can be used to differentiate similar colours and to identify gradual transitions between them; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.

  5. Analysis of Homogeneous BFS-73-1 MA Benchmark Core

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yeong Il; Yoo, Jae Woon; Song, Hoon; Jang, Jin Wook

    2007-06-15

    Analysis of the BFS-73-1 critical assembly for MA transmutation has been carried out using the K-CORE system, mainly the DIF3D code. All measured data are compared with the results of the analysis, and the sensitivity to calculation conditions, such as the number of neutron energy groups, the mesh size, and the analysis method, is assessed. The effective multiplication factor agreed with experiment within the experimental uncertainty in both transport and diffusion calculations. The fission rate distributions of U-235 and U-238 also agree fairly well with the experimental results, within a maximum of 5% in the core region. However, a large discrepancy was seen in the blanket region, and it tends to increase as the location approaches the core boundary. The largest errors in the relative reaction rate ratios were seen in Am-243 fission and U-238 capture. For Am-243, the error lay within an acceptable range considering its measurement uncertainty of 4.6%. Sample reactivity worths for scattering-dominant isotopes differed greatly from the experimental results, which can be explained in terms of the sample heterogeneity effect, sample self-shielding, and the resonance bilinear correction effect; these effects will be evaluated in a future study. The C/E of the effective delayed neutron fraction is within 4%, which is within the measurement uncertainty.

  6. Analysis of Homogeneous BFS-73-1 MA Benchmark Core

    International Nuclear Information System (INIS)

    Kim, Yeong Il; Yoo, Jae Woon; Song, Hoon; Jang, Jin Wook

    2007-06-01

    Analysis of the BFS-73-1 critical assembly for MA transmutation has been carried out using the K-CORE system, mainly the DIF3D code. All measured data are compared with the results of the analysis, and the sensitivity to calculation conditions, such as the number of neutron energy groups, the mesh size, and the analysis method, is assessed. The effective multiplication factor agreed with experiment within the experimental uncertainty in both transport and diffusion calculations. The fission rate distributions of U-235 and U-238 also agree fairly well with the experimental results, within a maximum of 5% in the core region. However, a large discrepancy was seen in the blanket region, and it tends to increase as the location approaches the core boundary. The largest errors in the relative reaction rate ratios were seen in Am-243 fission and U-238 capture. For Am-243, the error lay within an acceptable range considering its measurement uncertainty of 4.6%. Sample reactivity worths for scattering-dominant isotopes differed greatly from the experimental results, which can be explained in terms of the sample heterogeneity effect, sample self-shielding, and the resonance bilinear correction effect; these effects will be evaluated in a future study. The C/E of the effective delayed neutron fraction is within 4%, which is within the measurement uncertainty.

  7. Homogeneity and heterogeneousness in European food cultures: An exploratory analysis

    DEFF Research Database (Denmark)

    Askegaard, Søren; Madsen, Tage Koed

    One type of boundary rarely explored in international marketing, but of potentially vital importance to it, is the cultural boundary dividing Europe into regions with individual cultural backgrounds and different consumption patterns. This paper explores information about...... such cultural patterns of food consumption based on an existing database originating from a 1989 pan-European life style survey questioning around 20,000 people in 16 European countries divided into 79 regions. A factor analysis reduced the number of variables from 138 to 41, discovering......

  8. Analysis of intra-genomic GC content homogeneity within prokaryotes

    DEFF Research Database (Denmark)

    Bohlin, J; Snipen, L; Hardy, S.P.

    2010-01-01

    Bacterial genomes possess varying GC content (total guanines (Gs) and cytosines (Cs) per total of the four bases within the genome), but within a given genome, GC content can vary locally along the chromosome, with some regions significantly more or less GC rich than on average. We have examined how the GC content varies within microbial genomes to assess whether this property can be associated with certain biological functions related to the organism's environment and phylogeny. We utilize a new quantity, GCVAR, the intra-genomic GC content variability with respect to the average GC content...... both aerobic and facultative microbes. Although an association has previously been found between mean genomic GC content and oxygen requirement, our analysis suggests that no such association exists when phylogenetic bias is accounted for. A significant association between GCVAR and mean GC content...

  9. Triple Modular Redundancy verification via heuristic netlist analysis

    Directory of Open Access Journals (Sweden)

    Giovanni Beltrame

    2015-08-01

    Triple Modular Redundancy (TMR) is a common technique to protect memory elements in digital processing systems subject to radiation effects (such as in space, at high altitude, or near nuclear sources). This paper presents an approach to verify the correct implementation of TMR for the memory elements of a given netlist (i.e., a digital circuit specification) using heuristic analysis. The purpose is to detect any issues that might arise during the use of automatic tools for TMR insertion, optimization, place and route, etc. Our analysis does not require a testbench and can perform full, exhaustive coverage within less than an hour even for large designs. This is achieved by applying a divide et impera approach, splitting the circuit into smaller submodules without loss of generality, instead of applying formal verification to the whole netlist at once. The methodology has been applied to a production netlist of the LEON2-FT processor that had reported errors during radiation testing, successfully revealing a number of unprotected memory elements, namely 351 flip-flops.
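
    As a rough illustration of what such a netlist check involves (the suffix-based grouping heuristic and all identifiers below are invented for the example, not the paper's algorithm), one can group candidate replicas of each flip-flop and flag incomplete triples:

```python
# Toy TMR triplication check on a netlist-like flip-flop list.
# Grouping by _a/_b/_c suffix is an illustrative assumption.
from collections import defaultdict

flip_flops = ["pc_reg_a", "pc_reg_b", "pc_reg_c",
              "ir_reg_a", "ir_reg_b",   # missing third replica
              "sp_reg"]                 # not triplicated at all

def find_unprotected(ffs, copies=("_a", "_b", "_c")):
    groups = defaultdict(set)
    for ff in ffs:
        for suffix in copies:
            if ff.endswith(suffix):
                groups[ff[:-len(suffix)]].add(suffix)
                break
        else:
            groups[ff].add("")  # no replica suffix found
    return [base for base, found in groups.items() if found != set(copies)]

print(find_unprotected(flip_flops))  # ['ir_reg', 'sp_reg']
```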

  10. INF and IAEA: A comparative analysis of verification strategy

    International Nuclear Information System (INIS)

    Scheinman, L.; Kratzer, M.

    1992-07-01

    This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities

  11. Multipoint linkage analysis and homogeneity tests in 15 Dutch X-linked retinitis pigmentosa families

    NARCIS (Netherlands)

    Bergen, A. A.; van den Born, L. I.; Schuurman, E. J.; Pinckers, A. J.; van Ommen, G. J.; Bleekers-Wagemakers, E. M.; Sandkuijl, L. A.

    1995-01-01

    Linkage analysis and homogeneity tests were carried out in 15 Dutch families segregating X-linked retinitis pigmentosa (XLRP). The study included segregation data for eight polymorphic DNA markers from the short arm of the human X chromosome. The results of both multipoint linkage analysis in

  12. Homogeneity analysis with k sets of variables: An alternating least squares method with optimal scaling features

    NARCIS (Netherlands)

    van der Burg, Eeke; de Leeuw, Jan; Verdegaal, Renée

    1988-01-01

    Homogeneity analysis, or multiple correspondence analysis, is usually applied to k separate variables. In this paper we apply it to sets of variables by using sums within sets. The resulting technique is called OVERALS. It uses the notion of optimal scaling, with transformations that can be multiple

  13. Verification and validation of the decision analysis model for assessment of TWRS waste treatment strategies

    International Nuclear Information System (INIS)

    Awadalla, N.G.; Eaton, S.C.F.

    1996-01-01

    This document is the verification and validation final report for the Decision Analysis Model for Assessment of Tank Waste Remediation System Waste Treatment Strategies. This model is also known as the INSIGHT Model

  14. Analysis of the permeability characteristics along rough-walled fractures using a homogenization method

    International Nuclear Information System (INIS)

    Chae, Byung Gon; Choi, Jung Hae; Ichikawa, Yasuaki; Seo, Yong Seok

    2012-01-01

    To compute a permeability coefficient along a rough fracture that takes into account the fracture geometry, this study performed detailed measurements of fracture roughness using a confocal laser scanning microscope, a quantitative analysis of roughness using spectral analysis, and a homogenization analysis to calculate the permeability coefficient on the micro- and macro-scale. The homogenization analysis is a type of perturbation theory that characterizes the behavior of a microscopically inhomogeneous material with a periodic boundary condition in the microstructure. Therefore, it is possible to analyze accurate permeability characteristics that reflect the local effect of the fracture geometry. The permeability coefficients calculated using the homogenization analysis for each rough fracture model exhibit an irregular distribution and do not follow the relationship of the cubic law. This distribution suggests that the permeability characteristics strongly depend on the geometric conditions of the fractures, such as the roughness and the aperture variation.
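
    For reference, the cubic law mentioned above follows from the parallel-plate idealization of a fracture, under which the equivalent permeability is k = b²/12 for aperture b, so flow per unit head gradient scales with b³. A minimal sketch of what that baseline predicts:

```python
# Cubic-law baseline for an idealized smooth parallel-plate fracture.
# Rough fractures deviate from this, as the homogenization results indicate.
def cubic_law_permeability(aperture_m: float) -> float:
    """Equivalent permeability (m^2) of a smooth parallel-plate fracture."""
    return aperture_m ** 2 / 12.0

for b in (1e-4, 2e-4, 4e-4):  # apertures in metres
    print(f"b = {b:.0e} m -> k = {cubic_law_permeability(b):.3e} m^2")
```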

  15. Verification of structural analysis computer codes in nuclear engineering

    International Nuclear Information System (INIS)

    Zebeljan, Dj.; Cizelj, L.

    1990-01-01

    Sources of potential errors which can arise during the use of finite-element-method-based computer programs are described in the paper. The magnitudes of these errors were defined as acceptance criteria for those programs. Error sources are described as they are treated by the National Agency for Finite Element Methods and Standards (NAFEMS). Specific verification examples are taken from the literature of the Nuclear Regulatory Commission (NRC). A verification example is given for the PAFEC-FE computer code, for seismic response analysis of piping systems by the response spectrum method. (author)

  16. Homogeneity Analysis of a MEMS-based PZT Thick Film Vibration Energy Harvester Manufacturing Process

    DEFF Research Database (Denmark)

    Lei, Anders; Xu, Ruichao; Borregaard, Louise M.

    2012-01-01

    This paper presents a homogeneity analysis of a high-yield wafer-scale fabrication of MEMS-based unimorph silicon/PZT thick film vibration energy harvesters aimed towards vibration sources with peak vibrations in the range of around 300 Hz. A wafer with a yield of 91% (41/45 devices) has been...

  17. Nuclear-Thermal Analysis of Fully Ceramic Microencapsulated Fuel via Two-Temperature Homogenized Model

    International Nuclear Information System (INIS)

    Lee, Yoon Hee; Cho, Nam Zin

    2013-01-01

    The FCM fuel is based on a proven safety philosophy that has been utilized operationally in very high temperature reactors (VHTRs). However, the FCM fuel consists of TRISO particles randomly dispersed in a SiC matrix. This high heterogeneity in composition makes explicit thermal calculation of such a fuel difficult, so an appropriate homogenization model becomes essential. In this paper, we apply the two-temperature homogenized model to the thermal analysis of an FCM fuel. The model was recently proposed in order to provide more realistic temperature profiles in the fuel elements of VHTRs. The two-temperature homogenized parameters were obtained by a particle transport Monte Carlo calculation applied to the pellet region, which consists of many coated particles uniformly dispersed in the SiC matrix. Since this model gives realistic temperature profiles in the pellet (providing the fuel-kernel temperature and the SiC matrix temperature distinctly), it can be used for more accurate neutronics evaluation, such as of the Doppler temperature feedback. Transient thermal calculations may also be performed more realistically with temperature-dependent homogenized parameters in various scenarios.

  18. Modeling and Verification of Insider Threats Using Logical Analysis

    DEFF Research Database (Denmark)

    Kammuller, Florian; Probst, Christian W.

    2017-01-01

    and use a common trick from the formal verification of security protocols, showing that it is applicable to insider threats. We introduce briefly a three-step process of social explanation, illustrating that it can be applied fruitfully to the characterization of insider threats. We introduce the insider...

  19. Frequency-dependent homogenized properties of composite using spectral analysis method

    International Nuclear Information System (INIS)

    Ben Amor, M; Ben Ghozlen, M H; Lanceleur, P

    2010-01-01

    An inverse procedure is proposed to determine the material constants of multilayered composites using a spectral analysis homogenization method. A recursive process gives the interfacial displacement perpendicular to the layers as a function of depth. A fast Fourier transform (FFT) procedure has been used to extract the wave numbers propagating in the multilayer. The upper frequency bound of this homogenization domain is estimated. Inside the homogenization domain, we find at most three plane waves that can propagate in the medium. A consistent algorithm is adopted to develop an inverse procedure for the determination of the material constants of the multidirectional composite. The extracted wave numbers are used as the inputs for the procedure; the outputs are the elastic constants of the multidirectional composite. Using this method, the frequency-dependent effective elastic constants are obtained, and an example for [0/90] composites is given.

  20. Development of dynamic explicit crystallographic homogenization finite element analysis code to assess sheet metal formability

    International Nuclear Information System (INIS)

    Nakamura, Yasunori; Tam, Nguyen Ngoc; Ohata, Tomiso; Morita, Kiminori; Nakamachi, Eiji

    2004-01-01

    The crystallographic texture evolution induced by plastic deformation in the sheet metal forming process has a great influence on formability. In the present study, a dynamic explicit finite element (FE) analysis code is newly developed by introducing a crystallographic homogenization method to estimate polycrystalline sheet metal formability, such as extreme thinning and 'earing'. This code can predict the plastic-deformation-induced texture evolution at the micro scale and the plastic anisotropy at the macro scale simultaneously. This multi-scale analysis couples the microscopic inhomogeneous crystal plasticity deformation with the macroscopic continuum deformation. In this homogenization process, the stress at the macro scale is defined by the volume average of the stresses of the corresponding microscopic crystal aggregations, satisfying the equation of motion and the compatibility condition in the micro-scale 'unit cell', where periodicity of deformation is assumed. This homogenization algorithm is implemented in a conventional dynamic explicit finite element code by employing the updated Lagrangian formulation and a rate-type elastic/viscoplastic constitutive equation. First, texture evolution analyses for typical deformation modes confirmed that Taylor's 'constant strain homogenization algorithm' yields extreme concentration toward the preferred crystal orientations compared with our homogenization algorithm. Second, we study the effects of plastic anisotropy on 'earing' in the hemispherical-cup deep drawing of pure ferrite phase sheet metal. By comparing the analytical results with those obtained under Taylor's assumption, we conclude that the newly developed dynamic explicit crystallographic homogenization FEM gives a more reasonable prediction of the plastic-deformation-induced texture evolution and of the plastic anisotropy at the macro scale.

  1. Infinite dimensional spherical analysis and harmonic analysis for groups acting on homogeneous trees

    DEFF Research Database (Denmark)

    Axelgaard, Emil

    In this thesis, we study groups of automorphisms for homogeneous trees of countable degree by using an inductive limit approach. The main focus is the thorough discussion of two Olshanski spherical pairs consisting of automorphism groups for a homogeneous tree and a homogeneous rooted tree, resp...... finite. Finally, we discuss conditionally positive definite functions on the groups and use the generalized Bochner-Godement theorem for Olshanski spherical pairs to prove Levy-Khinchine formulas for both of the considered pairs....

  2. Development of Computer Program for Analysis of Irregular Non Homogenous Radiation Shielding

    International Nuclear Information System (INIS)

    Bang Rozali; Nina Kusumah; Hendro Tjahjono; Darlis

    2003-01-01

    A computer program for radiation shielding analysis has been developed to calculate radiation attenuation in non-homogeneous shielding with irregular geometry. By specifying the radiation source strength, the geometrical shape of the source, and the location, dimensions and geometrical shape of the shielding, the radiation level at a point at any given position from the source can be calculated. Using the program, radiation distribution results can be obtained for several analytical points simultaneously. (author)
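
    The abstract does not state the attenuation model the program uses; a minimal point-kernel sketch of the same kind of calculation (illustrative attenuation coefficients, no buildup factor, all names assumed) might look like this:

```python
# Minimal point-kernel sketch for gamma attenuation through layered shields.
# Buildup and scattering are ignored; coefficients are rough ~1 MeV values
# for illustration only -- use reference data in real work.
import math

MU = {"concrete": 0.15, "lead": 0.80, "water": 0.071}  # linear atten., 1/cm

def flux_estimate(source_strength, distance_cm, layers):
    """Uncollided estimate: S * exp(-sum(mu_i * t_i)) / (4*pi*r^2)."""
    attenuation = math.exp(-sum(MU[m] * t for m, t in layers))
    return source_strength * attenuation / (4 * math.pi * distance_cm ** 2)

# 1e9 photons/s point source, detector 200 cm away,
# behind 30 cm of concrete and 5 cm of lead:
print(flux_estimate(1e9, 200.0, [("concrete", 30.0), ("lead", 5.0)]))
```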

  3. Homogeneity study on biological candidate reference materials: the role of neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Daniel P.; Moreira, Edson G., E-mail: dsilva.pereira@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    Instrumental Neutron Activation Analysis (INAA) is a mature nuclear analytical technique able to accurately determine chemical elements without the need for sample digestion and, hence, without the associated problems of analyte loss or contamination. This feature, along with its potential use as a primary method of analysis, makes it an important tool for the characterization of new reference materials and for the assessment of their homogeneity status. In this study, the ability of the comparative INAA method to assess the within-bottle homogeneity of K, Mg, Mn and V in a mussel reference material was investigated. Method parameters, such as irradiation time, sample decay time and sample-to-detector distance, were varied in order to allow element determination in duplicate subsamples of different masses. Sample masses were in the range of 1 to 250 mg, and the limitations imposed by the detection limit for small sample masses and by dead-time distortions for large sample masses were investigated. (author)

  4. Automatic analysis of intrinsic positional verification films brachytherapy using MATLAB

    International Nuclear Information System (INIS)

    Quiros Higueras, J. D.; Marco Blancas, N. de; Ruiz Rodriguez, J. C.

    2011-01-01

    One of the essential tests in the quality control of brachytherapy equipment is the verification of the intrinsic positioning of the autoloaded radioactive source. A classic evaluation method is to use X-ray film and measure the distance between the marks left by autoradiography of the source and a reference. Our center has developed an automated measurement method based on scanning the radiochromic film and running a macro developed in Matlab, in order to save time and reduce the measurement uncertainty. The purpose of this paper is to describe the method developed, assess its uncertainty, and quantify its advantages over the manual method. (Author)
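
    A minimal sketch of the same measurement idea (the original macro is in Matlab and is not published here; the centroid-based peak location and all numbers below are assumptions made for illustration):

```python
# Locate dark autoradiograph marks along a scanned film profile and
# measure their offset from reference positions. Illustrative only.
import numpy as np

def mark_positions(profile: np.ndarray, threshold: float) -> np.ndarray:
    """Sub-pixel centroids of dark marks (samples below threshold)."""
    idx = np.flatnonzero(profile < threshold)
    # split the dark samples into contiguous runs (one run per mark)
    runs = np.split(idx, np.flatnonzero(np.diff(idx) > 1) + 1)
    weights = threshold - profile  # darker pixels weigh more
    return np.array([np.average(r, weights=weights[r]) for r in runs if r.size])

# Synthetic scanned profile: bright film (~1.0) with two dark marks.
x = np.arange(500)
profile = (1.0 - 0.8 * np.exp(-((x - 150) ** 2) / 18)
               - 0.8 * np.exp(-((x - 352) ** 2) / 18))

found = mark_positions(profile, threshold=0.6)
reference = np.array([150.0, 350.0])
print(found - reference)  # positional deviations in pixels
```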

  5. Fuel loading and homogeneity analysis of HFIR design fuel plates loaded with uranium silicide fuel

    International Nuclear Information System (INIS)

    Blumenfeld, P.E.

    1995-08-01

    Twelve nuclear reactor fuel plates were analyzed for fuel loading and fuel loading homogeneity by measuring the attenuation of a collimated X-ray beam as it passed through the plates. The plates were identical to those used by the High Flux Isotope Reactor (HFIR) but were loaded with uranium silicide rather than with HFIR's uranium oxide fuel. Systematic deviations from nominal fuel loading were observed: overloading near the center of the plates and underloading near the radial edges. These deviations were within those allowed by the HFIR specifications. The report begins with a brief background on the thermal-hydraulic uncertainty analysis for the Advanced Neutron Source (ANS) Reactor that motivated a statistical description of fuel loading and homogeneity. The body of the report addresses the homogeneity measurement techniques employed, the numerical correction required to account for the difference in fuel types, and the statistical analysis of the resulting data. This statistical analysis pertains to local variation in fuel loading, as well as to ''hot segment'' analysis of narrow axial regions along the plate and ''hot streak'' analysis, the cumulative effect of hot-segment loading variation. The data for all twelve plates were compiled and divided into 20 regions for analysis, with each region represented by a mean and a standard deviation to report percent deviation from nominal fuel loading. The central regions of the plates showed mean values of about +3% deviation, while the edge regions showed mean values of about -7% deviation. The data within these regions roughly approximated random samplings from normal distributions, although the chi-square (χ²) test for goodness of fit to normal distributions was not satisfied.
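
    A sketch of the per-region statistics described above, run on synthetic stand-in data (the equiprobable binning and degrees-of-freedom choices are assumptions, not necessarily the report's exact procedure):

```python
# Region mean/std of percent deviation from nominal loading, plus a
# chi-square goodness-of-fit test against a fitted normal distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
region = rng.normal(loc=3.0, scale=1.5, size=400)  # % deviation, central region

mean, std = region.mean(), region.std(ddof=1)
print(f"mean = {mean:+.2f}%, std = {std:.2f}%")

# Bin the data and compare observed counts with normal-model expectations.
edges = np.quantile(region, np.linspace(0, 1, 11))   # 10 equiprobable bins
observed, _ = np.histogram(region, bins=edges)
cdf = stats.norm.cdf(edges, loc=mean, scale=std)
expected = len(region) * np.diff(cdf)
expected *= observed.sum() / expected.sum()          # match observed total
chi2, p = stats.chisquare(observed, expected, ddof=2)  # 2 fitted parameters
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```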

  6. Numerical analysis of MHD Carreau fluid flow over a stretching cylinder with homogenous-heterogeneous reactions

    Science.gov (United States)

    Khan, Imad; Ullah, Shafquat; Malik, M. Y.; Hussain, Arif

    2018-06-01

    The current analysis concentrates on the numerical solution of MHD Carreau fluid flow over a stretching cylinder under the influence of homogeneous-heterogeneous reactions. The modelled non-linear partial differential equations are converted into ordinary differential equations by using suitable transformations. The resulting system of equations is solved with the aid of a shooting algorithm supported by a fifth-order Runge-Kutta integration scheme. The impact of the non-dimensional governing parameters on the velocity, temperature, skin friction coefficient and local Nusselt number is comprehensively delineated with the help of graphs and tables.
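
    The shooting procedure can be sketched on a simpler, classical stand-in problem: the Blasius boundary-layer equation rather than the paper's MHD Carreau system, with scipy's RK45 integrator standing in for the fifth-order Runge-Kutta scheme. The same structure (guess the missing wall condition, integrate, match the far-field condition) applies to this record and to similar boundary-layer studies below.

```python
# Shooting method demo on the Blasius equation f''' + 0.5*f*f'' = 0,
# f(0) = f'(0) = 0, f'(inf) = 1 -- a stand-in for the paper's system.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

ETA_MAX = 10.0  # truncation of the semi-infinite domain

def rhs(eta, y):
    f, fp, fpp = y
    return [fp, fpp, -0.5 * f * fpp]

def residual(guess):
    """Far-field mismatch f'(ETA_MAX) - 1 for the wall guess f''(0)."""
    sol = solve_ivp(rhs, (0.0, ETA_MAX), [0.0, 0.0, guess],
                    method="RK45", rtol=1e-8, atol=1e-10)
    return sol.y[1, -1] - 1.0

fpp0 = brentq(residual, 0.1, 1.0)  # root of the shooting residual
print(f"f''(0) = {fpp0:.6f}")      # ~0.332057, the Blasius wall-shear value
```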

  7. Plasmon analysis and homogenization in plane layered photonic crystals and hyperbolic metamaterials

    Energy Technology Data Exchange (ETDEWEB)

    Davidovich, M. V., E-mail: davidovichmv@info.sgu.ru [Saratov State University (Russian Federation)

    2016-12-15

    Dispersion equations are obtained, and analysis and homogenization are carried out, for periodic and quasiperiodic plane layered structures consisting of alternating dielectric layers, metal and dielectric layers, and graphene sheets separated by dielectric (SiO₂) layers. Situations are considered in which these structures acquire the properties of hyperbolic metamaterials (HMMs), i.e., materials whose effective permittivity tensor has real parts of opposite signs. It is shown that the use of dielectric layers alone is more promising in the context of reducing losses.

  8. Verification of spectrophotometric method for nitrate analysis in water samples

    Science.gov (United States)

    Kurniawati, Puji; Gusrianti, Reny; Dwisiwi, Bledug Bernanti; Purbaningtias, Tri Esti; Wiyantoko, Bayu

    2017-12-01

    The aim of this research was to verify the spectrophotometric method for analyzing nitrate in water samples using the APHA 2012 Section 4500 NO3-B method. The verification parameters used were linearity, method detection limit, limit of quantitation, level of linearity, accuracy and precision. Linearity was assessed using 0 to 50 mg/L nitrate standard solutions, and the correlation coefficient of the standard calibration linear regression was 0.9981. The method detection limit (MDL) was 0.1294 mg/L and the limit of quantitation (LOQ) was 0.4117 mg/L. The level of linearity (LOL) was 50 mg/L, and nitrate concentrations from 10 to 50 mg/L were linear at a 99% confidence level. Accuracy, determined through the recovery value, was 109.1907%. Precision was assessed as the percent relative standard deviation (%RSD) of repeatability, with a result of 1.0886%. The tested performance criteria showed that the methodology was verified under the laboratory conditions.
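
    The named verification statistics can be reproduced on synthetic data roughly as follows (replicate values are invented; the MDL uses the EPA-style Student-t convention and LOQ = 10s is one common convention, not necessarily the exact ones used in the paper):

```python
# Method-verification statistics: linearity, MDL, LOQ, %RSD, recovery.
import numpy as np
from scipy import stats

# Calibration: standards (mg/L) vs. absorbance (synthetic, near-linear).
conc = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
absorb = np.array([0.002, 0.121, 0.238, 0.362, 0.479, 0.601])
slope, intercept, r_value, _, _ = stats.linregress(conc, absorb)
print(f"r = {r_value:.4f}")                       # linearity check

# Seven replicates of a low-level spike (mg/L) for MDL / LOQ / precision.
reps = np.array([1.02, 0.98, 1.05, 0.97, 1.01, 1.03, 0.99])
s = reps.std(ddof=1)
mdl = stats.t.ppf(0.99, df=len(reps) - 1) * s     # method detection limit
loq = 10 * s                                      # a common LOQ convention
rsd = 100 * s / reps.mean()                       # precision as %RSD
recovery = 100 * reps.mean() / 1.00               # known spike = 1.00 mg/L
print(f"MDL = {mdl:.3f}, LOQ = {loq:.3f}, "
      f"%RSD = {rsd:.2f}, recovery = {recovery:.1f}%")
```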

  9. Automated data acquisition and analysis system for inventory verification

    International Nuclear Information System (INIS)

    Sorenson, R.J.; Kaye, J.H.

    1974-03-01

    A real-time system is proposed which would allow CLO Safeguards Branch to conduct a meaningful inventory verification using a variety of NDA instruments. The overall system would include the NDA instruments, automated data handling equipment, and a vehicle to house and transport the instruments and equipment. For the purpose of the preliminary cost estimate a specific data handling system and vehicle were required. A Tracor Northern TN-11 data handling system including a PDP-11 minicomputer and a measurement vehicle similar to the Commission's Regulatory Region I van were used. The basic system is currently estimated to cost about $100,000, and future add-ons which would expand the systems' capabilities are estimated to cost about $40,000. The concept of using a vehicle in order to permanently rack mount the data handling equipment offers a number of benefits such as control of equipment environment and allowance for improvements, expansion, and flexibility in the system. Justification is also presented for local design and assembly of the overall system. A summary of the demonstration system which illustrates the advantages and feasibility of the overall system is included in this discussion. Two ideas are discussed which are not considered to be viable alternatives to the proposed system: addition of the data handling capabilities to the semiportable ''cart'' and use of a telephone link to a large computer center

  10. A virtual-accelerator-based verification of a Monte Carlo dose calculation algorithm for electron beam treatment planning in homogeneous phantoms

    International Nuclear Information System (INIS)

    Wieslander, Elinore; Knoeoes, Tommy

    2006-01-01

    By introducing Monte Carlo (MC) techniques to the verification procedure of dose calculation algorithms in treatment planning systems (TPSs), problems associated with conventional measurements can be avoided and properties that are considered unmeasurable can be studied. The aim of the study is to implement a virtual accelerator, based on MC simulations, to evaluate the performance of a dose calculation algorithm for electron beams in a commercial TPS. The TPS algorithm is MC based and the virtual accelerator is used to study the accuracy of the algorithm in water phantoms. The basic test of the implementation of the virtual accelerator is successful for 6 and 12 MeV (γ < 1.0, 0.02 Gy/2 mm). For 18 MeV, there are problems in the profile data for some of the applicators, where the TPS underestimates the dose. For fields equipped with patient-specific inserts, the agreement is generally good. The exception is 6 MeV where there are slightly larger deviations. The concept of the virtual accelerator is shown to be feasible and has the potential to be a powerful tool for vendors and users
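
    The acceptance criterion quoted above (γ < 1.0 at 0.02 Gy / 2 mm) can be evaluated with a simple one-dimensional gamma computation; the depth-dose curves below are synthetic stand-ins, not the study's data:

```python
# Minimal 1-D gamma evaluation with a 0.02 Gy / 2 mm criterion.
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_crit=0.02, dist_crit=2.0):
    """Gamma value at each reference point (a point passes if gamma <= 1)."""
    gammas = []
    for xr, dr in zip(x_ref, d_ref):
        term = ((x_eval - xr) / dist_crit) ** 2 \
             + ((d_eval - dr) / dose_crit) ** 2
        gammas.append(np.sqrt(term.min()))
    return np.array(gammas)

x = np.linspace(0.0, 50.0, 201)            # depth in mm
ref = np.exp(-((x - 15.0) / 12.0) ** 2)    # synthetic "measured" dose (Gy)
evl = np.exp(-((x - 15.5) / 12.0) ** 2)    # slightly shifted "TPS" curve

g = gamma_1d(x, ref, x, evl)
print(f"gamma pass rate: {100 * (g <= 1.0).mean():.1f}%")
```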

  11. Solution XAS Analysis for Exploring the Active Species in Homogeneous Vanadium Complex Catalysis

    Science.gov (United States)

    Nomura, Kotohiro; Mitsudome, Takato; Tsutsumi, Ken; Yamazoe, Seiji

    2018-06-01

    Selected examples of V K-edge X-ray absorption near edge structure (XANES) analysis of a series of vanadium complexes containing imido ligands (possessing a metal-nitrogen double bond) in toluene solution are introduced; their pre-edge and edge features were affected by the structures of the complexes and the nature of the ligands. Selected results on exploring the oxidation states of the active species in ethylene dimerization/polymerization using homogeneous vanadium catalysts [consisting of (imido)vanadium(V) complexes and Al cocatalysts] by X-ray absorption spectroscopy (XAS) analyses are also introduced. It is demonstrated that the method provides clearer information concerning the active species in situ, especially in combination with other methods (NMR and ESR spectra, X-ray crystallographic analysis, and reaction chemistry), and should be a powerful tool for the study of catalytic mechanisms as well as for structural analysis in solution.

  12. International Conference on Geometric and Harmonic Analysis on Homogeneous Spaces and Applications

    CERN Document Server

    Nomura, Takaaki

    2017-01-01

    This book provides the latest research results on non-commutative harmonic analysis on homogeneous spaces, with many applications. It also includes the most recent developments in other areas of mathematics, including algebra and geometry. Lie group representation theory and harmonic analysis on Lie groups and on their homogeneous spaces form a significant and important area of mathematical research. These areas are interrelated with various other mathematical fields, such as number theory, algebraic geometry, differential geometry, operator algebra, partial differential equations and mathematical physics. Keeping up with the fast development of this exciting area of research, Ali Baklouti (University of Sfax) and Takaaki Nomura (Kyushu University) launched a series of seminars on the topic, the first of which took place in November 2009 in Kerkennah Islands, the second in Sousse in December 2011, and the third in Hammamet in December 2013. The last seminar, which took place on Dece...

  13. Uncertainty analysis of infinite homogeneous lead and sodium cooled fast reactors at beginning of life

    International Nuclear Information System (INIS)

    Vanhanen, R.

    2015-01-01

    The objective of the present work is to estimate the breeding ratio, radiation damage rate and minor actinide transmutation rate of infinite homogeneous lead- and sodium-cooled fast reactors. Uncertainty analysis is performed taking into account uncertainty in the nuclear data and in the composition of the reactors. We use the recently released ENDF/B-VII.1 nuclear data library and restrict the work to the beginning of reactor life. We work under the multigroup approximation. The Bondarenko method is used to acquire effective cross sections for the homogeneous reactor. Modeling error and numerical error are estimated. Adjoint sensitivity analysis is performed to calculate generalized adjoint fluxes for the responses. The generalized adjoint fluxes are used to calculate first-order sensitivities of the responses to the model parameters. The acquired sensitivities are used to propagate uncertainties in the input data to uncertainties in the responses. We show that the uncertainty in model parameters is the dominant source of uncertainty, followed by modeling error, input data precision and numerical error. The uncertainty due to the composition of the reactor is low. We identify the main sources of uncertainty and note that the low-fidelity evaluation of 16O is problematic due to the lack of correlation between the total and elastic reactions.
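
    The sensitivity-based propagation described here is the first-order "sandwich rule": for a relative sensitivity vector S and a relative covariance matrix C of the parameters, the relative variance of the response is SᵀCS. A minimal sketch with invented numbers:

```python
# First-order ("sandwich rule") uncertainty propagation: var(R) = S^T C S.
# Sensitivities and covariances below are illustrative, not evaluated data.
import numpy as np

S = np.array([0.8, -0.3, 0.1])        # relative sensitivities to 3 parameters
C = np.array([[0.0025, 0.0005, 0.0],  # relative covariance matrix
              [0.0005, 0.0100, 0.0],
              [0.0,    0.0,    0.0004]])

var_R = S @ C @ S
print(f"relative std. dev. of response: {np.sqrt(var_R):.4f}")
```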

  14. Uncertainty analysis of infinite homogeneous lead and sodium cooled fast reactors at beginning of life

    Energy Technology Data Exchange (ETDEWEB)

    Vanhanen, R., E-mail: risto.vanhanen@aalto.fi

    2015-03-15

    The objective of the present work is to estimate the breeding ratio, radiation damage rate and minor actinide transmutation rate of infinite homogeneous lead- and sodium-cooled fast reactors. Uncertainty analysis is performed taking into account uncertainty in the nuclear data and in the composition of the reactors. We use the recently released ENDF/B-VII.1 nuclear data library and restrict the work to the beginning of reactor life. We work under the multigroup approximation. The Bondarenko method is used to acquire effective cross sections for the homogeneous reactor. Modeling error and numerical error are estimated. Adjoint sensitivity analysis is performed to calculate generalized adjoint fluxes for the responses. The generalized adjoint fluxes are used to calculate first-order sensitivities of the responses to the model parameters. The acquired sensitivities are used to propagate uncertainties in the input data to uncertainties in the responses. We show that the uncertainty in model parameters is the dominant source of uncertainty, followed by modeling error, input data precision and numerical error. The uncertainty due to the composition of the reactor is low. We identify the main sources of uncertainty and note that the low-fidelity evaluation of 16O is problematic due to the lack of correlation between the total and elastic reactions.

  15. Verification of temporal-causal network models by mathematical analysis

    Directory of Open Access Journals (Sweden)

    Jan Treur

    2016-04-01

    Usually, dynamic properties of models can be analysed by conducting simulation experiments. But sometimes, as a kind of prediction, properties can also be found by calculations in a mathematical manner, without performing simulations. Examples of properties that can be explored in such a manner are: whether some values for the variables exist for which no change occurs (stationary points or equilibria), and how such values may depend on the values of the parameters of the model and/or the initial values for the variables; whether certain variables in the model converge to some limit value (equilibria), and how this may depend on the values of the parameters of the model and/or the initial values for the variables; whether or not certain variables will show monotonically increasing or decreasing values over time (monotonicity); how fast a convergence to a limit value takes place (convergence speed); and whether situations occur in which no convergence takes place but in the end a specific sequence of values is repeated all the time (limit cycle). Such properties, found in an analytic mathematical manner, can be used for verification of the model by checking them against the values observed in simulation experiments. If one of these properties is not fulfilled, then there will be some error in the implementation of the model. In this paper some methods to analyse such properties of dynamical models are described and illustrated for the Hebbian learning model, and for dynamic connection strengths in social networks. The properties analysed by the methods discussed cover equilibria, increasing or decreasing trends, recurring patterns (limit cycles), and speed of convergence to equilibria.
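
    A tiny illustration of this verification idea, using a generic Hebbian-style update (not necessarily the paper's exact model): derive the stationary point analytically, then check it against simulation.

```python
# Model: dw/dt = eta*x*y*(1 - w) - zeta*w (a generic Hebbian-style rule).
# Setting dw/dt = 0 gives the equilibrium w* = eta*x*y / (eta*x*y + zeta);
# the simulation must converge to it, otherwise the implementation is wrong.
eta, zeta, x, y = 0.4, 0.05, 0.9, 0.8   # learning rate, extinction, activations
w, dt = 0.0, 0.01

for _ in range(20000):                   # simple Euler simulation
    w += dt * (eta * x * y * (1.0 - w) - zeta * w)

w_analytic = eta * x * y / (eta * x * y + zeta)
print(f"simulated w = {w:.6f}, analytic equilibrium = {w_analytic:.6f}")
assert abs(w - w_analytic) < 1e-4        # verification check
```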

  16. An analysis of the Rose's shim method for improvement of magnetic field homogeneity

    International Nuclear Information System (INIS)

    Ban, Etsuo

    1981-01-01

    Rose's well-known method has been applied to magnets requiring high homogeneity (e.g. for magnetic resonance). The analysis of Rose's shim is based on conformal representation, and it is applicable to poles of any form obtained by the combination of polygons. It provides rims with 90° edges for the magnetic poles. In this paper, the solution is determined in terms of elliptic functions, giving the magnetic field at any point in space by direct integration of the Schwarz-Christoffel transformation instead of the approximate numerical integration employed by Rose, and it is compared with an example application to a cylindrical pole. For the conditions of Rose's optimum correction, the exact solution is given as the case in which the parameter of Jacobi's elliptic function of the third kind equals half the complete elliptic integral of the first kind. Since Rose depended on approximate numerical integration, Rose's diagram showed slightly insufficient correction. It was found that a pole shape giving an excess correction of about 10⁻⁴ produced a good result for a cylindrical magnetic pole having a ratio of pole diameter to gap length of 2.5. In order to obtain a correction for which the change in homogeneity remains small up to considerably intense fields, the pole edges must be curved surfaces. (Wakatsuki, Y.)

  17. Statistical analysis of solid lipid nanoparticles produced by high-pressure homogenization: a practical prediction approach

    Energy Technology Data Exchange (ETDEWEB)

    Duran-Lobato, Matilde, E-mail: mduran@us.es [Universidad de Sevilla, Dpto. Farmacia y Tecnologia Farmaceutica, Facultad de Farmacia (Spain); Enguix-Gonzalez, Alicia [Universidad de Sevilla, Dpto. Estadistica e Investigacion Operativa, Facultad de Matematicas (Spain); Fernandez-Arevalo, Mercedes; Martin-Banderas, Lucia [Universidad de Sevilla, Dpto. Farmacia y Tecnologia Farmaceutica, Facultad de Farmacia (Spain)

    2013-02-15

    Lipid nanoparticles (LNPs) are a promising carrier for all administration routes due to their safety, small size, and high loading of lipophilic compounds. Among the LNP production techniques, the easy scale-up, lack of organic solvents, and short production times of the high-pressure homogenization technique (HPH) make this method stand out. In this study, a statistical analysis was applied to the production of LNP by HPH. Spherical LNPs with mean size ranging from 65 nm to 11.623 μm, negative zeta potential under -30 mV, and smooth surface were produced. Manageable equations based on commonly used parameters in the pharmaceutical field were obtained. The lipid to emulsifier ratio (R_L/S) was proved to statistically explain the influence of oil phase and surfactant concentration on final nanoparticle size. Besides, the homogenization pressure was found to ultimately determine LNP size for a given R_L/S, while the number of passes applied mainly determined polydispersion. α-Tocopherol was used as a model drug to illustrate release properties of LNP as a function of particle size, which was optimized by the regression models. This study is intended as a first step to optimize production conditions prior to LNP production at both laboratory and industrial scale from an eminently practical approach, based on parameters extensively used in formulation.

  18. An homogeneization method applied to the seismic analysis of LMFBR cores

    International Nuclear Information System (INIS)

    Brochard, D.; Hammami, L.

    1991-01-01

    Important structures such as nuclear reactor cores and steam generator bundles are schematically composed of a great number of beams immersed in a fluid. The fluid-structure interaction is an important phenomenon influencing the dynamical response of the bundle. Studying this interaction through classical methods would require refined modelling at the scale of the beams and lead to problems of considerable size. The homogenization method constitutes an alternative approach if we are mainly interested in the global behaviour of the bundle. Similar approaches have already been used for other types of industrial structures (Sanchez-Palencia 1980, Bergman and al. 1985, Theodory 1984, Benner and al. 1981). This method consists of replacing the physically heterogeneous medium with a homogeneous medium whose characteristics are determined from the solution of a set of problems on the elementary cell. In the first part of this paper the main assumptions of the method are summarized. Moreover, other important phenomena may contribute to the dynamical behaviour of the industrial structures mentioned above: impacts between the beams. These impacts could be due to supports limiting the displacements of the beams, or to differences in the vibratory characteristics of the various beams. The second part of the paper concerns the way impacts are taken into account in the linear homogeneous formalism. Finally, an application to the seismic analysis of the FBR core mock-up RAPSODIE is presented.

  19. Statistical analysis of solid lipid nanoparticles produced by high-pressure homogenization: a practical prediction approach

    International Nuclear Information System (INIS)

    Durán-Lobato, Matilde; Enguix-González, Alicia; Fernández-Arévalo, Mercedes; Martín-Banderas, Lucía

    2013-01-01

    Lipid nanoparticles (LNPs) are a promising carrier for all administration routes due to their safety, small size, and high loading of lipophilic compounds. Among the LNP production techniques, the easy scale-up, lack of organic solvents, and short production times of the high-pressure homogenization technique (HPH) make this method stand out. In this study, a statistical analysis was applied to the production of LNP by HPH. Spherical LNPs with mean size ranging from 65 nm to 11.623 μm, negative zeta potential under -30 mV, and smooth surface were produced. Manageable equations based on commonly used parameters in the pharmaceutical field were obtained. The lipid to emulsifier ratio (R_L/S) was proved to statistically explain the influence of oil phase and surfactant concentration on final nanoparticle size. Besides, the homogenization pressure was found to ultimately determine LNP size for a given R_L/S, while the number of passes applied mainly determined polydispersion. α-Tocopherol was used as a model drug to illustrate release properties of LNP as a function of particle size, which was optimized by the regression models. This study is intended as a first step to optimize production conditions prior to LNP production at both laboratory and industrial scale from an eminently practical approach, based on parameters extensively used in formulation.

  20. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper will provide a description of the analysis framework and evaluation metrics, example performance-prediction results, and describe the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.

  1. Rate transient analysis for homogeneous and heterogeneous gas reservoirs using the TDS technique

    International Nuclear Information System (INIS)

    Escobar, Freddy Humberto; Sanchez, Jairo Andres; Cantillo, Jose Humberto

    2008-01-01

    In this study, pressure test analysis of wells flowing under constant wellbore pressure in homogeneous and naturally fractured gas reservoirs using the TDS technique is introduced. Although constant-rate production is assumed in the development of the conventional well test analysis methods, constant-pressure production conditions are sometimes used in the oil and gas industry. The constant-pressure technique, or rate transient analysis, is more popularly known as decline curve analysis, under which the rate is allowed to decline instead of the wellbore pressure. The TDS technique, increasingly used even in the most recognized software packages (although without its trade name), uses the log-log plot of pressure and pressure derivative test data to identify unique features from which exact analytical expressions are derived to easily estimate reservoir and well parameters. For this case, the fingerprint characteristics from the log-log plot of the reciprocal rate and reciprocal rate derivative were employed to obtain the analytical expressions used for the interpretation analysis. Many simulation experiments demonstrate the accuracy of the new method. Synthetic examples are shown to verify the effectiveness of the proposed methodology.
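
    The diagnostic at the heart of the TDS technique is the logarithmic derivative of the reciprocal rate; a sketch on synthetic data follows (the response function is invented for illustration, chosen so the derivative flattens to the radial-flow plateau):

```python
# Log-log TDS-style diagnostic: reciprocal rate 1/q and its logarithmic
# derivative t * d(1/q)/dt = d(1/q)/d(ln t). Data are synthetic.
import numpy as np

t = np.logspace(-1, 3, 200)              # time, hours
recip_q = 0.5 + 0.25 * np.log(t + 1.0)   # synthetic 1/q response

# logarithmic derivative via gradient on the ln(t) axis
log_deriv = np.gradient(recip_q, np.log(t))

plateau = log_deriv[t > 100].mean()
print(f"late-time derivative plateau ~ {plateau:.3f} (radial-flow signature)")
```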

  2. Research on reactor physics analysis method based on Monte Carlo homogenization

    International Nuclear Information System (INIS)

    Ye Zhimin; Zhang Peng

    2014-01-01

    In order to meet the demands of the nuclear energy market in the future, many new concepts of nuclear energy systems have been put forward. The traditional deterministic neutronics analysis method has been challenged in two aspects: one is the ability to process generic geometry; the other is the multi-spectrum applicability of the multigroup cross section libraries. Due to its strong geometry modeling capability and its use of continuous-energy cross section libraries, the Monte Carlo method has been widely used in reactor physics calculations, and more and more research on the Monte Carlo method has been carried out. Neutronics-thermal hydraulics coupling analysis based on the Monte Carlo method has been realized. However, it still faces the problems of long computation time and slow convergence, which make it inapplicable to reactor core fuel management simulations. Drawing from the deterministic core analysis method, a new two-step core analysis scheme is proposed in this work. First, Monte Carlo simulations are performed for each assembly, and the assembly-homogenized multigroup cross sections are tallied at the same time. Second, the core diffusion calculations are done with these multigroup cross sections. The new scheme can achieve high efficiency while maintaining acceptable precision, so it can be used as an effective tool for the design and analysis of innovative nuclear energy systems. Numerical tests have been done in this work to verify the new scheme. (authors)
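
    Step one of the two-step scheme can be illustrated in miniature by the flux-weighted collapse of fine-group cross sections to few-group constants, σ_G = Σ_g σ_g φ_g / Σ_g φ_g (all numbers below are invented; in practice the flux would come from the assembly Monte Carlo tally):

```python
# Flux-weighted collapse of fine-group cross sections to few-group constants.
import numpy as np

fine_sigma = np.array([1.2, 1.5, 2.0, 8.0, 30.0, 45.0])  # barns, 6 fine groups
fine_flux  = np.array([0.9, 1.0, 0.8, 0.3, 0.15, 0.05])  # tallied spectrum

few_group_bounds = [(0, 3), (3, 6)]  # fast = groups 0-2, thermal = groups 3-5
for lo, hi in few_group_bounds:
    sigma = np.average(fine_sigma[lo:hi], weights=fine_flux[lo:hi])
    print(f"groups {lo}-{hi - 1}: collapsed sigma = {sigma:.3f} b")
```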

  3. Fast Transient And Spatially Non-Homogenous Accident Analysis Of Two-Dimensional Cylindrical Nuclear Reactor

    International Nuclear Information System (INIS)

    Yulianti, Yanti; Su'ud, Zaki; Waris, Abdul; Khotimah, S. N.; Shafii, M. Ali

    2010-01-01

    Research on fast transient and spatially non-homogeneous accident analysis of a two-dimensional nuclear reactor has been carried out. This research concerns the prediction of reactor behavior during an accident. In the present study, the space-time diffusion equation is solved using direct methods which consider the spatial factor in detail during the accident simulation. The set of equations obtained from a fully implicit finite-difference discretization is solved using the ADI (Alternating Direction Implicit) iterative method. The accident is modelled as a decrease in the macroscopic absorption cross-section that results in a large external reactivity. The reactor power reaches a peak value before the reactor attains a new equilibrium condition. The change in reactor temperature produces a negative Doppler feedback reactivity, which reduces the excess positive reactivity. The reactor temperature during the accident remains below the fuel melting point, which is a safe condition.
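
    A compact sketch of a Peaceman-Rachford ADI step on a scalar 2-D diffusion problem, as a stand-in for the paper's multigroup space-time kinetics (grid size, coefficients, and the zero-Dirichlet boundaries are arbitrary choices for the example):

```python
# Peaceman-Rachford ADI for u_t = D*(u_xx + u_yy), zero Dirichlet boundaries.
# Each full step does one tridiagonal (implicit) solve per direction.
import numpy as np
from scipy.linalg import solve_banded

n, D, h, dt = 64, 1.0, 1.0 / 65, 1e-4
r = D * dt / (2 * h * h)

# banded form of (I - r*T), T = tridiag(1, -2, 1), for solve_banded
ab = np.zeros((3, n))
ab[0, 1:] = -r           # superdiagonal
ab[1, :] = 1 + 2 * r     # diagonal
ab[2, :-1] = -r          # subdiagonal

def explicit(u):
    """(I + r*T) applied along axis 0, zero values outside the grid."""
    out = (1 - 2 * r) * u
    out[1:, :] += r * u[:-1, :]
    out[:-1, :] += r * u[1:, :]
    return out

u = np.zeros((n, n))
u[n // 2 - 4:n // 2 + 4, n // 2 - 4:n // 2 + 4] = 1.0  # initial hot spot

for _ in range(200):
    # half-step 1: implicit in x (axis 0), explicit in y (axis 1)
    u = solve_banded((1, 1), ab, explicit(u.T).T)
    # half-step 2: implicit in y, explicit in x
    u = solve_banded((1, 1), ab, explicit(u).T).T

print(f"peak after diffusion: {u.max():.4f}")
```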

  4. Non-homogeneous harmonic analysis: 16 years of development

    International Nuclear Information System (INIS)

    Volberg, A L; Èiderman, V Ya

    2013-01-01

    This survey contains results and methods in the theory of singular integrals, a theory which has been developing dramatically in the last 15-20 years. The central (although not the only) topic of the paper is the connection between the analytic properties of integrals and operators with Calderón-Zygmund kernels and the geometric properties of the measures. The history is traced of the classical Painlevé problem of describing removable singularities of bounded analytic functions, which has provided a strong incentive for the development of this branch of harmonic analysis. The progress of recent decades has largely been based on the creation of an apparatus for dealing with non-homogeneous measures, and much attention is devoted to this apparatus here. Several open questions are stated, first and foremost in the multidimensional case, where the method of curvature of a measure is not available. Bibliography: 128 titles

  5. Analysis of forced convective transient boiling by homogeneous model of two-phase flow

    International Nuclear Information System (INIS)

    Kataoka, Isao

    1985-01-01

    Transient forced convective boiling is of practical importance in relation to the accident analysis of nuclear reactors. For large length-to-diameter ratios, the transient boiling characteristics are predicted by transient two-phase flow calculations. Based on the homogeneous model of two-phase flow, transient forced convective boiling for power and flow transients is analysed. Analytical expressions for various parameters of transient two-phase flow have been obtained for several simple cases of power and flow transients. Based on these results, the heat flux, velocity and time at the transient CHF condition are predicted analytically for step and exponential power increases, and for step, exponential and linear velocity decreases. The effects of various parameters on the heat flux, velocity and time at the transient CHF condition have been clarified. A numerical approach combined with the analytical method is proposed for more complicated cases. The solution method for pressure transients is also described. (author)

  6. Tempered Water Lower Port Connector Structural Analysis Verification

    International Nuclear Information System (INIS)

    CREA, B.A.

    2000-01-01

    Structural analysis of the lower port connection of the Tempered Water System of the Cold Vacuum Drying Facility was performed. Subsequent detailed design changes to enhance operability resulted in the need to re-evaluate the bases of the original analysis to verify its continued validity. This evaluation is contained in Appendix A of this report. The original evaluation is contained in Appendix B

  7. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and via core-follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. Benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. It is also verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic (Monte Carlo) nuclear calculation code.

  8. Taiwan Power Company's power distribution analysis and fuel thermal margin verification methods for pressurized water reactors

    International Nuclear Information System (INIS)

    Huang, P.H.

    1995-01-01

    Taiwan Power Company's (TPC's) power distribution analysis and fuel thermal margin verification methods for pressurized water reactors (PWRs) are examined. The TPC and the Institute of Nuclear Energy Research started a joint 5-yr project in 1989 to establish independent capabilities to perform reload design and transient analysis utilizing state-of-the-art computer programs. As part of the effort, these methods were developed to allow TPC to independently perform verifications of the local power density and departure from nucleate boiling design bases, which are required by the reload safety evaluation for the Maanshan PWR plant. The computer codes utilized were extensively validated for the intended applications. Sample calculations were performed for up to six reload cycles of the Maanshan plant, and the results were found to be quite consistent with the vendor's calculational results

  9. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    International Nuclear Information System (INIS)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and via core-follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. Benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. It is also verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic (Monte Carlo) nuclear calculation code.

  10. Numerical analysis for Darcy-Forchheimer flow in presence of homogeneous-heterogeneous reactions

    Directory of Open Access Journals (Sweden)

    Muhammad Ijaz Khan

    A mathematical study is presented to investigate the influence of homogeneous and heterogeneous reactions on locally similar flow caused by a stretching sheet with non-linear velocity and variable thickness. Porous-medium effects are characterized using the Darcy-Forchheimer porous-medium model. A simple isothermal model of homogeneous-heterogeneous reactions is used. The multiphysical boundary value problem is governed by ten thermophysical parameters: the ratio of mass diffusion coefficients, Prandtl number, local inertia coefficient, inverse Darcy number, shape parameter, surface thickness parameter, Hartmann number, homogeneous heat reaction parameter, strength of the homogeneous-heterogeneous reactions, and Schmidt number. The resulting systems are computed by the Runge-Kutta-Fehlberg method, following the same shooting structure sketched earlier for boundary-layer problems. Different shapes of the velocity profile are noticed for n > 1 and n < 1. Keywords: Homogeneous-heterogeneous reactions, Non-Darcy porous medium, Variable sheet thickness, Homogeneous heat reaction with stoichiometric coefficient, Runge-Kutta-Fehlberg method

  11. Cumulative BRCA mutation analysis in the Greek population confirms that homogenous ethnic background facilitates genetic testing.

    Science.gov (United States)

    Tsigginou, Alexandra; Vlachopoulos, Fotios; Arzimanoglou, Iordanis; Zagouri, Flora; Dimitrakakis, Constantine

    2015-01-01

    Screening for BRCA1 and BRCA2 mutations has long moved from the research lab to the clinic as routine clinical genetic testing. The BRCA molecular alteration pattern varies among ethnic groups, which makes it a less straightforward process to select the appropriate mutations for routine genetic testing on the basis of known clinical significance. The present report comprises an in-depth literature review of the BRCA1 and BRCA2 molecular alterations reported so far in Greek families. Our analysis of cumulative Greek BRCA1 and BRCA2 molecular data, produced by several independent groups, confirmed that six recurrent deleterious mutations account for almost 60% of all BRCA1 and BRCA2 mutations and 70% of BRCA1 mutations, respectively. As a result, it makes more sense to perform BRCA mutation analysis in the clinic in two sequential steps: first, conventional analysis for the six most prevalent pathogenic mutations, and if none is identified, a second step of next-generation-sequencing-based whole genome or whole exome sequencing. Our suggested approach would enable more clinically meaningful, considerably easier and less expensive BRCA analysis in the Greek population, which is considered homogeneous.

  12. Verification of the CONPAS (CONtainment Performance Analysis System) code package

    International Nuclear Information System (INIS)

    Kim, See Darl; Ahn, Kwang Il; Song, Yong Man; Choi, Young; Park, Soo Yong; Kim, Dong Ha; Jin, Young Ho.

    1997-09-01

    CONPAS is a computer code package that automatically integrates the numerical, graphical, and results-oriented aspects of Level 2 probabilistic safety assessment (PSA) for nuclear power plants under a PC Windows environment. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor, (2) Computer, (3) Text Editor, and (4) Mechanistic Code Plotter. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational aspects including systematic uncertainty analysis, importance analysis, sensitivity analysis and data interpretation; reporting aspects including tabulation and graphics; and a user-friendly interface. The computational performance of CONPAS has been verified through a Level 2 PSA for a reference plant. The results of the CONPAS code were compared with an existing Level 2 PSA code (NUCAP+), and the comparison proves that CONPAS is appropriate for Level 2 PSA. (author). 9 refs., 8 tabs., 14 figs

  13. Characteristics of a micro-fin evaporator: Theoretical analysis and experimental verification

    OpenAIRE

    Zheng Hui-Fan; Fan Xiao-Wei; Wang Fang; Liang Yao-Hua

    2013-01-01

    A theoretical analysis and experimental verification of the characteristics of a micro-fin evaporator using R290 and R717 as refrigerants were carried out. The heat capacity and heat transfer coefficient of the micro-fin evaporator were investigated under different water mass flow rates, refrigerant mass flow rates, and inner tube diameters of the micro-fin evaporator. The simulation results of the heat transfer coefficient are fairly in good a...

  14. Analysis of Nerve Agent Metabolites from Hair for Long-Term Verification of Nerve Agent Exposure

    Science.gov (United States)

    2016-05-09

    rats. The exposed hair samples were received from USAMRICD early in method development and required storage until the method was developed and validated... Because the storage of hair samples after an exposure has not been studied, it was unclear whether the analyte would be stable in the stored... biological matrices typically used for analysis (i.e., blood, urine, and tissues), limiting the amount of time after an exposure that verification is

  15. Flammable Gas Refined Safety Analysis Tool Software Verification and Validation Report for Resolve Version 2.5

    International Nuclear Information System (INIS)

    BRATZEL, D.R.

    2000-01-01

    The purpose of this report is to document all software verification and validation activities, results, and findings related to the development of Resolve Version 2.5 for the analysis of flammable gas accidents in Hanford Site waste tanks

  16. Long term spatial and temporal rainfall trends and homogeneity analysis in Wainganga basin, Central India

    Directory of Open Access Journals (Sweden)

    Arun Kumar Taxak

    2014-08-01

    Full Text Available Gridded rainfall data of 0.5×0.5° resolution (CRU TS 3.21) were analysed to study long-term spatial and temporal trends on annual and seasonal scales in the Wainganga river basin, located in Central India, during 1901–2012. After testing for the presence of autocorrelation, the Mann–Kendall test (or the Modified Mann–Kendall test) was applied to non-autocorrelated (or autocorrelated) series to detect trends in the rainfall data. Theil and Sen's slope estimator test was used to find the magnitude of change over a time period. For detecting the most probable change year, the Pettitt–Mann–Whitney test was applied. The rainfall series was then divided into two partial-duration series to find changes in trends before and after the change year. ArcGIS was used to explore spatial patterns of the trends over the entire basin. Though most of the grid points show a decreasing trend in annual rainfall, only seven grids have a significant decreasing trend during 1901–2012. On the basis of seasonal trend analysis, a non-significant increasing trend is observed only in the post-monsoon season, while seven grid points show a significant decreasing trend in monsoon rainfall and non-significant trends in pre-monsoon and winter rainfall over the last 112 years. During the study period, an overall 8.45% decrease in annual rainfall is estimated. The most probable year of change was found to be 1948 in annual and monsoonal rainfall. There is an increasing rainfall trend in the basin during the period 1901–1948, which is reversed during the period 1949–2012, resulting in a decreasing rainfall trend in the basin. Homogeneity of trends in annual and seasonal rainfall over the grid points is exhibited in the basin by the van Belle and Hughes homogeneity trend test.
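
    Both the Mann-Kendall statistic and Sen's slope are simple rank-based quantities. A minimal sketch for a single tie-free, non-autocorrelated series; the study's gridded workflow, modified test, and change-point detection are not reproduced:

        import numpy as np
        from scipy.stats import norm

        def mann_kendall(x):
            """Mann-Kendall trend test and Sen's slope (no tie correction)."""
            x = np.asarray(x, dtype=float)
            n = len(x)
            pairs = [(i, j) for i in range(n - 1) for j in range(i + 1, n)]
            s = sum(np.sign(x[j] - x[i]) for i, j in pairs)   # S statistic
            var_s = n * (n - 1) * (2 * n + 5) / 18.0          # variance of S
            z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
            p = 2.0 * (1.0 - norm.cdf(abs(z)))                # two-sided p-value
            slope = np.median([(x[j] - x[i]) / (j - i) for i, j in pairs])
            return s, z, p, slope

        rng = np.random.default_rng(0)
        series = 0.02 * np.arange(100) + rng.normal(size=100)  # weak upward trend
        print(mann_kendall(series))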

  17. STAMPS: development and verification of swallowing kinematic analysis software.

    Science.gov (United States)

    Lee, Woo Hyung; Chun, Changmook; Seo, Han Gil; Lee, Seung Hak; Oh, Byung-Mo

    2017-10-17

    Swallowing impairment is a common complication in various geriatric and neurodegenerative diseases. Swallowing kinematic analysis is essential to quantitatively evaluate the swallowing motion of the oropharyngeal structures. This study aims to develop a novel swallowing kinematic analysis software package, called the spatio-temporal analyzer for motion and physiologic study (STAMPS), and to verify its validity and reliability. STAMPS was developed in MATLAB, which is one of the most popular platforms for biomedical analysis. This software was constructed to acquire, process, and analyze data on swallowing motion. The target swallowing structures include bony structures (hyoid bone, mandible, maxilla, and cervical vertebral bodies), cartilages (epiglottis and arytenoid), soft tissues (larynx and upper esophageal sphincter), and the food bolus. Numerous functions are available for the spatiotemporal parameters of the swallowing structures. Testing for validity and reliability was performed in 10 dysphagia patients with diverse etiologies and using an instrumental swallowing model designed to mimic the motion of the hyoid bone and the epiglottis. The intra- and inter-rater reliability tests showed excellent agreement for displacement and moderate to excellent agreement for velocity. The Pearson correlation coefficients between the measured and instrumental reference values were nearly 1.00. The software is expected to be useful for researchers who are interested in swallowing motion analysis.

  18. Stability analysis of nonlinear Roesser-type two-dimensional systems via a homogenous polynomial technique

    Science.gov (United States)

    Zhang, Tie-Yan; Zhao, Yan; Xie, Xiang-Peng

    2012-12-01

    This paper is concerned with the problem of stability analysis of nonlinear Roesser-type two-dimensional (2D) systems. Firstly, the fuzzy modeling method for the usual one-dimensional (1D) systems is extended to the 2D case so that the underlying nonlinear 2D system can be represented by the 2D Takagi-Sugeno (TS) fuzzy model, which is convenient for implementing the stability analysis. Secondly, a new kind of fuzzy Lyapunov function, which is homogeneous polynomially parameter-dependent on the fuzzy membership functions, is developed to conceive less conservative stability conditions for the TS Roesser-type 2D system. In the process of stability analysis, the obtained stability conditions approach exactness in the sense of convergence by applying some novel relaxed techniques. Moreover, the obtained result is formulated in the form of linear matrix inequalities, which can be easily solved via standard numerical software. Finally, a numerical example is also given to demonstrate the effectiveness of the proposed approach.

  19. Stability analysis of nonlinear Roesser-type two-dimensional systems via a homogenous polynomial technique

    International Nuclear Information System (INIS)

    Zhang Tie-Yan; Zhao Yan; Xie Xiang-Peng

    2012-01-01

    This paper is concerned with the problem of stability analysis of nonlinear Roesser-type two-dimensional (2D) systems. Firstly, the fuzzy modeling method for the usual one-dimensional (1D) systems is extended to the 2D case so that the underlying nonlinear 2D system can be represented by the 2D Takagi-Sugeno (TS) fuzzy model, which is convenient for implementing the stability analysis. Secondly, a new kind of fuzzy Lyapunov function, which is homogeneous polynomially parameter-dependent on the fuzzy membership functions, is developed to conceive less conservative stability conditions for the TS Roesser-type 2D system. In the process of stability analysis, the obtained stability conditions approach exactness in the sense of convergence by applying some novel relaxed techniques. Moreover, the obtained result is formulated in the form of linear matrix inequalities, which can be easily solved via standard numerical software. Finally, a numerical example is also given to demonstrate the effectiveness of the proposed approach. (general)

  20. Numerical verification of composite rods theory on multi-story buildings analysis

    Science.gov (United States)

    El-Din Mansour, Alaa; Filatov, Vladimir; Gandzhuntsev, Michael; Ryasny, Nikita

    2018-03-01

    In this article, a verification of the composite rods theory for the structural analysis of skeletons of high-rise buildings is proposed. A test design model was formed in which the horizontal elements are represented by a multilayer cantilever beam operating in transverse bending, whose slabs are connected by moment-non-transferring connections, and in which multilayer columns represent the vertical elements. These connections develop a shearing action that can be approximated by a shear-force function, which significantly reduces the overall degree of static indeterminacy of the structural model. A system of differential equations describes the operation of the multilayer rods and is solved numerically by the method of successive approximations. The proposed methodology is intended for preliminary calculations to determine the rigidity characteristics of the structure, and for a qualitative assessment of the results obtained by other methods when calculations are performed for verification purposes.
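
    The method of successive approximations replaces the differential system by repeated integration of the previous iterate. A toy Picard iteration illustrating the idea on y' = y, y(0) = 1; the article's multilayer-rod equations are not reproduced:

        import numpy as np

        # Picard (successive-approximation) iteration for y' = y, y(0) = 1
        # on [0, 1]; the iterates converge to exp(t). Illustrative only.
        t = np.linspace(0.0, 1.0, 201)
        y = np.ones_like(t)                      # initial guess y_0(t) = 1
        for _ in range(25):
            integrand = y                        # f(t, y) = y for this toy case
            integral = np.concatenate(([0.0], np.cumsum(
                0.5 * (integrand[1:] + integrand[:-1]) * np.diff(t))))
            y = 1.0 + integral                   # y_{k+1}(t) = y0 + int_0^t f
        print(abs(y[-1] - np.e))                 # small residual after iterating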

  1. Development and Verification of Smoothed Particle Hydrodynamics Code for Analysis of Tsunami near NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Young Beom; Kim, Eung Soo [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

    It becomes more complicated when considering the shape and phase of the ground below the seawater. Therefore, different approaches are required to precisely analyze the behavior of tsunami. This paper introduces on-going code development activities at SNU based on an unconventional mesh-free fluid analysis method called Smoothed Particle Hydrodynamics (SPH) and its verification with some practice simulations. This paper summarizes the on-going development and verification activities on the Lagrangian mesh-free SPH code at SNU. The newly developed code so far covers the equations of motion and the heat conduction equation, and verification of each model is complete. In addition, parallel computation using GPUs is now possible, and a GUI is also prepared. If users change the input geometry or input values, they can simulate various conditions and geometries. The SPH method has large advantages and potential for modeling free surfaces, highly deformable geometries and multi-phase problems that traditional grid-based codes have difficulty analyzing. Therefore, by incorporating more complex physical models such as turbulent flow, phase change, two-phase flow, and even solid mechanics, the application of the current SPH code is expected to be much more extended, including molten fuel behavior in severe accidents.
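
    In SPH, every field is reconstructed as a kernel-weighted sum over neighbouring particles; the density estimate is the simplest example. A minimal 2D sketch with the standard cubic-spline kernel, by direct summation (the data structures and kernels of the SNU code itself are not reproduced):

        import numpy as np

        def cubic_spline_w(r, h):
            """Standard 2D cubic-spline SPH kernel."""
            sigma = 10.0 / (7.0 * np.pi * h**2)   # 2D normalization constant
            q = r / h
            return sigma * np.where(q <= 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                                    np.where(q <= 2.0, 0.25 * (2.0 - q)**3, 0.0))

        def density(positions, masses, h):
            """rho_i = sum_j m_j W(|r_i - r_j|, h) by direct summation."""
            diff = positions[:, None, :] - positions[None, :, :]
            r = np.linalg.norm(diff, axis=-1)
            return (masses[None, :] * cubic_spline_w(r, h)).sum(axis=1)

        pos = np.random.default_rng(1).random((100, 2))   # particles in unit box
        print(density(pos, np.full(100, 1.0 / 100), h=0.1)[:5])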

  2. Slideline verification for multilayer pressure vessel and piping analysis

    International Nuclear Information System (INIS)

    Van Gulick, L.A.

    1983-01-01

    Nonlinear finite element method (FEM) computer codes with slideline algorithm implementations should be useful for the analysis of prestressed multilayer pressure vessels and piping. This paper presents closed form solutions useful for validating slideline implementations for this purpose. The solutions describe stresses and displacements of an internally pressurized elastic-plastic sphere initially separated from an elastic outer sphere by a uniform gap. Comparison of closed form and FEM results evaluates the usefulness of the closed form solution and the validity of the slideline implementation used
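
    Before the gap closes and below yield, the inner sphere of such a benchmark behaves as a thick-walled elastic shell, for which the classic Lame solution gives the stresses in closed form. A sketch of that elastic limit only; the paper's full elastic-plastic, gap-closing solution is not reproduced:

        import numpy as np

        def lame_sphere_stresses(r, a, b, p):
            """Radial and hoop stresses in a thick-walled elastic sphere with
            internal pressure p, inner radius a, outer radius b (Lame solution)."""
            k = p * a**3 / (b**3 - a**3)
            sigma_r = k * (1.0 - b**3 / r**3)         # equals -p at r=a, 0 at r=b
            sigma_t = k * (1.0 + b**3 / (2.0 * r**3)) # hoop stress
            return sigma_r, sigma_t

        r = np.linspace(1.0, 2.0, 5)
        print(lame_sphere_stresses(r, a=1.0, b=2.0, p=100.0))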

  3. Using harmonical analysis for experimental verification of reactor dynamics

    International Nuclear Information System (INIS)

    Hrstka, V.

    1974-01-01

    The accuracy of the static-programming method applied to digital harmonic analysis is discussed, with regard to variation of the mean value of the analyzed signals and to the use of symmetrical trapezoidal periodic signals. The suitability of the above-mentioned method for determining the frequency characteristic of the SR-OA reactor is evaluated. The results obtained were applied to planning the start-up experiments of the KS-150 reactor at the A-1 nuclear power station. (author)

  4. Dynamics of railway bridges, analysis and verification by field tests

    Directory of Open Access Journals (Sweden)

    Andersson Andreas

    2015-01-01

    Full Text Available The following paper discusses different aspects of railway bridge dynamics, comprising analysis, modelling procedures and experimental testing. The importance of realistic models is discussed, especially regarding boundary conditions, load distribution and soil-structure interaction. Two theoretical case studies are presented, involving both deterministic and probabilistic assessment of a large number of railway bridges using simplified and computationally efficient models. A total of four experimental case studies are also introduced, illustrating different aspects and phenomena of bridge dynamics. The excitation consists of ambient vibrations, train-induced vibrations, free vibrations after train passages, and controlled forced excitation.

  5. Improvement and verification of fast reactor safety analysis techniques

    International Nuclear Information System (INIS)

    Jackson, J.F.

    1975-01-01

    An initial analysis of the KIWI-TNT experiment using the VENUS-II disassembly code has been completed. The calculated fission energy release agreed with the experimental value to within about 3 percent. An initial model for analyzing the SNAPTRAN-2 core disassembly experiment was also developed along with an appropriate equation-of-state. The first phase of the VENUS-II/PAD comparison study was completed through the issuing of a preliminary report describing the results. A new technique to calculate a P-V-work curve as a function of the degree of core expansion following a disassembly excursion has been developed. The technique provides results that are consistent with the ANL oxide-fuel equation-of-state in VENUS-II. Evaluation and check-out of this new model are currently in progress

  6. Spectral signature verification using statistical analysis and text mining

    Science.gov (United States)

    DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.

    2016-05-01

    In the spectral science community, numerous spectral signatures are stored in databases representative of many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures. This has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature to arrive at a final qualitative assessment: the textual meta-data and the numerical spectral data. Results associated with the spectral data stored in the Signature Database1 (SigDB) are proposed. The numerical data comprising a sample material's spectrum are validated based on statistical properties derived from an ideal population set. The quality of the test spectrum is ranked based on a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum are qualitatively analyzed using lexical-analysis text mining. This technique analyzes the syntax of the meta-data to provide local learning patterns and trends within the spectral data, indicative of the test spectrum's quality. Text mining applications have been successfully implemented for security2 (text encryption/decryption), biomedical3, and marketing4 applications. The text-mining lexical-analysis algorithm is trained on the meta-data patterns of a subset of high- and low-quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum existing in a database without the need for an expert user. This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is
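
    The spectral angle mapper reduces each comparison to the angle between two spectra viewed as vectors, which makes it insensitive to overall illumination scaling. A minimal sketch of the SAM comparison against a population mean, with synthetic spectra standing in for SigDB's actual data:

        import numpy as np

        def spectral_angle(test, reference):
            """Spectral angle (radians) between two spectra; smaller = closer."""
            t = np.asarray(test, dtype=float)
            r = np.asarray(reference, dtype=float)
            cos_theta = t @ r / (np.linalg.norm(t) * np.linalg.norm(r))
            return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

        population = np.random.default_rng(2).random((20, 256)) + 1.0  # 20 spectra
        mean_spectrum = population.mean(axis=0)
        test_spectrum = 1.1 * mean_spectrum       # pure scaling: angle is ~0
        print(spectral_angle(test_spectrum, mean_spectrum))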

  7. Acquisition System Verification for Energy Efficiency Analysis of Building Materials

    Directory of Open Access Journals (Sweden)

    Natalia Cid

    2017-08-01

    Full Text Available Climate change and fossil fuel depletion foster interest in improving energy efficiency in buildings. There are different methods to achieve improved efficiency; one of them is the use of additives, such as phase change materials (PCMs). To prove this method's effectiveness, a building's behaviour should be monitored and analysed. This paper describes an acquisition system developed for monitoring buildings, based on Supervisory Control and Data Acquisition (SCADA) and a 1-wire bus network as the communication system. The system is empirically tested to prove that it works properly. For this purpose, two experimental cubicles were made of self-compacting concrete panels, one of which has a PCM as an additive to improve its energy storage properties. Both cubicles have the same dimensions and orientation, and they are separated by six feet to avoid shadows. The behaviour of the PCM was observed with the acquisition system, achieving results that illustrate the differences between the cubicles directly related to the PCM's characteristics. Data collection devices included in the system were temperature sensors, some of which were embedded in the walls, as well as humidity sensors, heat flux density sensors, a weather station and energy counters. The analysis of the results shows agreement with previous studies of PCM addition; therefore, the acquisition system is suitable for this application.

  8. Homogenization on Multi-Materials’ Elements: Application to Printed Circuit Boards and Warpage Analysis

    Directory of Open Access Journals (Sweden)

    Araújo Manuel

    2016-01-01

    Full Text Available Multi-material domains are often found in industrial applications. Modelling them can be computationally very expensive due to meshing requirements. The properties of finite elements that comprise different materials are hardly accurate. In this work, a new homogenization method that simplifies the computation of the homogenized Young's modulus, Poisson's ratio and thermal expansion coefficient is proposed and applied to a composite-like material on a printed circuit board. The results show a good correspondence of properties between the homogenized domain and the real-geometry simulation.
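
    The paper's homogenization scheme itself is not reproduced here, but the flavour of the computation can be seen in the classical Voigt (uniform-strain) and Reuss (uniform-stress) rules of mixtures, which bracket the homogenized Young's modulus of a multi-material mixture. The resin/copper moduli below are rough illustrative values, not data from the paper:

        import numpy as np

        def voigt_reuss_bounds(e_moduli, volume_fractions):
            """Upper (Voigt) and lower (Reuss) bounds on the homogenized
            Young's modulus of a multi-material mixture."""
            e = np.asarray(e_moduli, dtype=float)
            v = np.asarray(volume_fractions, dtype=float)
            assert np.isclose(v.sum(), 1.0)
            e_voigt = np.sum(v * e)            # arithmetic mean (uniform strain)
            e_reuss = 1.0 / np.sum(v / e)      # harmonic mean (uniform stress)
            return e_voigt, e_reuss

        # e.g. epoxy resin (~3.5 GPa) with copper layers (~110 GPa), 10% copper
        print(voigt_reuss_bounds([3.5e9, 110e9], [0.9, 0.1]))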

  9. Testing, verification and application of CONTAIN for severe accident analysis of LMFBR-containments

    International Nuclear Information System (INIS)

    Langhans, J.

    1991-01-01

    Severe accident analysis for LMFBR containments has to consider various phenomena influencing the development of containment loads, such as pressures and temperatures, as well as the generation, transport, depletion and release of aerosols and radioactive materials. As most of these phenomena are linked together, their feedback has to be taken into account within the calculation of severe accident consequences; otherwise no best-estimate results can be assured. Under the sponsorship of the German BMFT, the US code CONTAIN is being developed, verified and applied at GRS for future fast breeder reactor concepts. In the first step of verification, the basic calculation models of a containment code have been proven: (i) flow calculation for different flow situations, (ii) heat transfer from and to structures, (iii) coolant evaporation, boiling and condensation, (iv) material properties. In the second step, the interaction of coupled phenomena has been checked. Calculations of integrated containment experiments relating natural convection flow, structure heating and coolant condensation, as well as parallel calculation of results obtained with another code, give detailed information on the applicability of CONTAIN. The actual verification status allows the following conclusion: a cautious analyst experienced in containment accident modelling, using the proven parts of CONTAIN, will obtain results of the same accuracy as other well-optimized and detailed lumped-parameter containment codes can achieve. Further code development, additional verification and international exchange of experience and results will assure an adequate code for application in safety analyses for LMFBRs. (orig.)

  10. Complex-Wide Waste Flow Analysis V1.0 verification and validation report

    International Nuclear Information System (INIS)

    Hsu, K.M.; Lundeen, A.S.; Oswald, K.B.; Shropshire, D.E.; Robinson, J.M.; West, W.H.

    1997-01-01

    The complex-wide waste flow analysis model (CWWFA) was developed to assist the Department of Energy (DOE) Environmental Management (EM) Office of Science and Technology (EM-50) in evaluating waste management scenarios, with emphasis on identifying and prioritizing technology development opportunities to reduce waste flows and public risk. In addition, the model was intended to support the needs of the Complex-Wide Environmental Integration (EMI) team supporting the DOE's Accelerating Cleanup: 2006 Plan. CWWFA represents an integrated environmental modeling system that covers the life cycle of waste management activities, including waste generation, interim process storage, retrieval, characterization and sorting, waste preparation and processing, packaging, final interim storage, transport, and disposal at a final repository. The CWWFA shows waste flows through actual site-specific and facility-specific conditions. The system requirements for CWWFA are documented in the Technical Requirements Document (TRD). The TRD is intended to be a living document that will be modified over the course of the execution of CWWFA development. Thus, it is anticipated that CWWFA will continue to evolve as new requirements are identified (e.g., transportation, small sites, new streams, etc.). This report provides a documented basis for system verification of CWWFA requirements. System verification is accomplished through formal testing and evaluation to ensure that all performance requirements as specified in the TRD have been satisfied. A Requirement Verification Matrix (RVM) was used to map the technical requirements to the test procedures. The RVM is attached as Appendix A. Since February of 1997, substantial progress has been made toward development of the CWWFA to meet the system requirements. This system verification activity provides a baseline on system compliance to requirements and also an opportunity to reevaluate what requirements need to be satisfied in FY-98

  11. Poster - 43: Analysis of SBRT and SRS dose verification results using the Octavius 1000SRS detector

    Energy Technology Data Exchange (ETDEWEB)

    Cherpak, Amanda [Nova Scotia Cancer Centre, Nova Scotia Health Authority, Halifax, NS, Department of Radiation Oncology, Dalhousie University, Halifax, NS, Department of Physics and Atmospheric Sciences, Dalhousie University, Halifax, NS (Canada)

    2016-08-15

    Purpose: The Octavius 1000SRS detector was commissioned in December 2014 and is used routinely for verification of all SRS and SBRT plans. Results of verifications were analyzed to assess trends and limitations of the device and planning methods. Methods: Plans were delivered using a True Beam STx and results were evaluated using gamma analysis (95%, 3%/3mm) and absolute dose difference (5%). Verification results were analyzed based on several plan parameters, including tumour volume, degree of modulation and prescribed dose. Results: During a 12-month period, a total of 124 patient plans were verified using the Octavius detector. Thirteen plans failed the gamma criteria, while seven plans failed based on the absolute dose difference. When binned according to degree of modulation, a significant correlation was found between MU/cGy and both mean dose difference (r=0.78, p<0.05) and gamma (r=−0.60, p<0.05). When the data were binned according to tumour volume, the standard deviation of average gamma dropped from 2.2%–3.7% for volumes less than 30 cm³ to below 1% for volumes greater than 30 cm³. Conclusions: The majority of plans and verification failures involved tumour volumes smaller than 30 cm³. This was expected due to the nature of disease treated with SBRT and SRS techniques and did not increase the rate of failure. The correlations found with MU/cGy indicate that as modulation increased, results deteriorated, but not beyond the previously set thresholds.
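
    A gamma criterion such as 3%/3mm combines a dose tolerance and a distance-to-agreement into a single index, with gamma <= 1 counting as a pass. A minimal global 1D sketch on synthetic profiles; clinical tools operate on interpolated 2D/3D dose grids, which this omits:

        import numpy as np

        def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_tol=0.03, dta_mm=3.0):
            """Global 1D gamma index: dose_tol as a fraction of the reference
            maximum, distance-to-agreement in mm."""
            dd = dose_tol * np.max(d_ref)
            gamma = np.empty(len(x_ref))
            for i in range(len(x_ref)):
                cap = np.sqrt(((d_eval - d_ref[i]) / dd) ** 2
                              + ((x_eval - x_ref[i]) / dta_mm) ** 2)
                gamma[i] = cap.min()   # best agreement over all evaluated points
            return gamma

        x = np.linspace(-20.0, 20.0, 401)              # positions in mm
        ref = np.exp(-x**2 / 50.0)                     # toy reference profile
        meas = 1.02 * np.exp(-(x - 0.5)**2 / 50.0)     # shifted, scaled copy
        g = gamma_1d(x, ref, x, meas)
        print(f"pass rate: {np.mean(g <= 1.0):.1%}")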

  12. Variability of apparently homogeneous soilscapes in São Paulo state, Brazil: I. spatial analysis

    Directory of Open Access Journals (Sweden)

    M. van Den Berg

    2000-06-01

    Full Text Available The spatial variability of strongly weathered soils under sugarcane and soybean/wheat rotation was quantitatively assessed on 33 fields in two regions in São Paulo State, Brazil: Araras (15 fields with sugarcane) and Assis (11 fields with sugarcane and seven fields with soybean/wheat rotation). Statistical methods used were: nested analysis of variance (for 11 fields), semivariance analysis, and analysis of variance within and between fields. Spatial scales from 50 m to several km were analyzed. Results are discussed with reference to a previously published study carried out in the surroundings of Passo Fundo (RS). Similar variability patterns were found for clay content, organic C content and cation exchange capacity. The fields studied are quite homogeneous with respect to these relatively stable soil characteristics. Spatial variability of other characteristics (resin-extractable P, pH, base and Al saturation, and also soil colour) varies with region and/or land use management. Soil management for sugarcane seems to have induced modifications to greater depths than for soybean/wheat rotation. Surface layers of soils under soybean/wheat present relatively little variation, apparently as a result of very intensive soil management. The major part of within-field variation occurs at short distances (< 50 m) in all study areas. Hence, little extra information would be gained by increasing sampling density from, say, 1/km² to 1/50 m². For many purposes, the soils in the study regions can be mapped with the same observation density, but residual variance will not be the same in all areas. Bulk sampling may help to reveal spatial patterns between 50 and 1,000 m.

  13. Autoregressive harmonic analysis of the earth's polar motion using homogeneous international latitude service data

    Science.gov (United States)

    Fong Chao, B.

    1983-12-01

    The homogeneous set of 80-year-long (1900-1979) International Latitude Service (ILS) polar motion data is analyzed using the autoregressive method (Chao and Gilbert, 1980), which resolves and produces estimates for the complex frequency (or frequency and Q) and complex amplitude (or amplitude and phase) of each harmonic component in the data. The principal conclusions of this analysis are that (1) the ILS data support the multiple-component hypothesis of the Chandler wobble: it is found that the Chandler wobble can be adequately modeled as a linear combination of four (coherent) harmonic components, each of which represents a steady, nearly circular, prograde motion, a behavior that is inconsistent with the hypothesis of a single Chandler period excited in a temporally and/or spatially random fashion; (2) the four-component Chandler wobble model "explains" the apparent phase reversal during 1920-1940 and the pre-1950 empirical period-amplitude relation; (3) the annual wobble is shown to be rather stationary over the years, both in amplitude and in phase, and no evidence is found to support the large variations reported by earlier investigations; (4) the Markowitz wobble is found to be marginally retrograde and appears to have a complicated behavior which cannot be resolved because of the shortness of the data set.

  14. Classical big-bounce cosmology: dynamical analysis of a homogeneous and irrotational Weyssenhoff fluid

    International Nuclear Information System (INIS)

    Brechet, S D; Hobson, M P; Lasenby, A N

    2008-01-01

    A dynamical analysis of an effective homogeneous and irrotational Weyssenhoff fluid in general relativity is performed using the 1 + 3 covariant approach that enables the dynamics of the fluid to be determined without assuming any particular form for the spacetime metric. The spin contributions to the field equations produce a bounce that averts an initial singularity, provided that the spin density exceeds the rate of shear. At later times, when the spin contribution can be neglected, a Weyssenhoff fluid reduces to a standard cosmological fluid in general relativity. Numerical solutions for the time evolution of the generalized scale factor R(t) in spatially curved models are presented, some of which exhibit eternal oscillatory behaviour without any singularities. In spatially flat models, analytical solutions for particular values of the equation-of-state parameter are derived. Although the scale factor of a Weyssenhoff fluid generically has a positive temporal curvature near a bounce, it requires unreasonable fine tuning of the equation-of-state parameter to produce a sufficiently extended period of inflation to fit the current observational data

  15. Modified Truncated Multiplicity Analysis to Improve Verification of Uranium Fuel Cycle Materials

    International Nuclear Information System (INIS)

    LaFleur, A.; Miller, K.; Swinhoe, M.; Belian, A.; Croft, S.

    2015-01-01

    Accurate verification of 235U enrichment and mass in UF6 storage cylinders, and of the UO2F2 holdup contained in the process equipment, is needed to improve international safeguards and nuclear material accountancy at uranium enrichment plants. Small UF6 cylinders (1.5'' and 5'' diameter) are used to store the full range of enrichments, from depleted to highly-enriched UF6. For independent verification of these materials, it is essential that the 235U mass and enrichment measurements do not rely on facility operator declarations. Furthermore, in order to be deployed by IAEA inspectors to detect undeclared activities (e.g., during complementary access), it is also imperative that the measurement technique is quick, portable, and sensitive to a broad range of 235U masses. Truncated multiplicity analysis is a technique that reduces the variance in the measured count rates by considering only moments 1, 2, and 3 of the multiplicity distribution. This is especially important for reducing the uncertainty in the measured doubles and triples rates in environments with a high cosmic-ray background relative to the uranium signal strength. However, we believe that the existing truncated multiplicity analysis throws away too much useful data by truncating the distribution after the third moment. This paper describes a modified truncated multiplicity analysis method that determines the optimal moment at which to truncate the multiplicity distribution based on the measured data. Experimental measurements of small UF6 cylinders and UO2F2 working reference materials were performed at Los Alamos National Laboratory (LANL). The data were analyzed using traditional and modified truncated multiplicity analysis to determine the truncation moment that minimizes the uncertainty in the measured count rates. The results from this analysis directly support nuclear safeguards at enrichment plants and provide a more accurate verification method for UF6
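
    Truncating after moment three means keeping only the first three reduced factorial moments of the measured multiplicity histogram, which are proportional to the familiar singles, doubles and triples rates. A minimal sketch of the moment computation on a toy histogram; the paper's modified truncation criterion is not reproduced:

        import math
        import numpy as np

        def reduced_factorial_moments(counts, kmax=3):
            """First kmax reduced factorial moments of a measured neutron
            multiplicity histogram; counts[n] = events with n detections."""
            counts = np.asarray(counts, dtype=float)
            p = counts / counts.sum()              # normalized distribution
            n = np.arange(len(counts), dtype=float)
            moments = []
            for k in range(1, kmax + 1):
                falling = np.ones_like(n)
                for j in range(k):                 # n (n-1) ... (n-k+1)
                    falling *= np.clip(n - j, 0.0, None)
                moments.append(float((p * falling).sum()) / math.factorial(k))
            return moments

        # toy histogram of detected multiplicities (0..5 neutrons per event)
        hist = [9000, 700, 250, 40, 8, 2]
        print(reduced_factorial_moments(hist))     # ~ singles, doubles, triples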

  16. Fast and Safe Concrete Code Execution for Reinforcing Static Analysis and Verification

    Directory of Open Access Journals (Sweden)

    M. Belyaev

    2015-01-01

    Full Text Available The problem of improving the precision of static analysis and verification techniques for C is hard due to the simplifying assumptions these techniques make about the code model. We present a novel approach to improving precision by executing the code model in a controlled environment that captures program errors and contract violations in a memory- and time-efficient way. We implemented this approach as an executor module, Tassadar, as part of the bounded model checker Borealis. We tested Tassadar on two test sets, showing that its impact on the performance of Borealis is minimal. The article is published in the authors' wording.

  17. PIPE STRESS and VERPIP codes for stress analysis and verifications of PEC reactor piping

    International Nuclear Information System (INIS)

    Cesari, F.; Ferranti, P.; Gasparrini, M.; Labanti, L.

    1975-01-01

    To design LMFBR piping systems according to ASME Section III requirements, special flexibility computer codes have to be adopted that consider the piping together with its guard tube. For this purpose the PIPE STRESS code, previously prepared by Southern-Service, has been modified. Subroutines for detailed stress analysis and principal-stress calculations on all sections of the piping have been written and fitted into the code. A plotter can also be used. The VERPIP code for automatic verification of piping against Class 1 Section III prescriptions has also been prepared. The results of applying the PIPE STRESS and VERPIP codes to PEC piping are given in Section III of this report

  18. Improvement and verification of fast-reactor safety-analysis techniques. Final report

    International Nuclear Information System (INIS)

    Barker, D.H.

    1981-12-01

    Work on this project took place between March 1, 1975 and December 31, 1981, and resulted in two PhD theses and one Master's thesis. Part I comprised the verification and applicability studies for the VENUS-II LMFBR disassembly code. These studies showed that the VENUS-II code closely predicted the energy release in all three tests chosen for analysis. Part II involved the chemical simulation of pool dispersion in the transition phase of an HCDA. Part III involved the reaction of an internally heated fluid and the vessel walls

  19. Analysis of zinc oxide nanoparticles binding proteins in rat blood and brain homogenate

    Directory of Open Access Journals (Sweden)

    Shim KH

    2014-12-01

    Full Text Available Kyu Hwan Shim,1 John Hulme,1 Eun Ho Maeng,2 Meyoung-Kon Kim,3 Seong Soo A An1 1Department of Bionano Technology, Gachon Medical Research Institute, Gachon University, Sungnam-si, Gyeonggi-do, South Korea; 2Department of Analysis, KTR, Kimpo, Gyeonggi-do, South Korea; 3Department of Biochemistry and Molecular Biology, Korea University Medical School and College, Seoul, South Korea Abstract: Nanoparticles (NPs) are currently used in chemical, cosmetic, pharmaceutical, and electronic products. Nevertheless, limited safety information is available for many NPs, especially in terms of their interactions with various binding proteins, leading to potential toxic effects. Zinc oxide (ZnO) NPs are included in the formulation of new products, such as adhesives, batteries, ceramics, cosmetics, cement, glass, ointments, paints, pigments, and supplementary foods, resulting in increased human exposure to ZnO. Hence, we investigated the potential ZnO nanotoxic pathways by analyzing the adsorbed proteins, called the protein corona, from blood and brain for four ZnO NPs, ZnOSM20(−), ZnOSM20(+), ZnOAE100(−), and ZnOAE100(+), in order to understand their potential mechanisms in vivo. In this study, liquid chromatography-mass spectrometry/mass spectrometry technology was employed to identify all bound proteins. Totals of 52 and 58 plasma proteins were identified as being bound to ZnOSM20(−) and ZnOSM20(+), respectively. For ZnOAE100(−) and ZnOAE100(+), 58 and 44 proteins were bound, respectively. Similar numbers of proteins were adsorbed onto ZnO irrespective of the size or surface charge of the nanoparticle. These proteins were further analyzed with ClueGO, a Cytoscape plugin, which provided gene ontology and the biological interaction processes of the identified proteins. Interactions between diverse proteins and ZnO nanoparticles could result in an alteration of their functions, conformation, and clearance, eventually affecting many biological processes. Keywords: brain

  20. Altered spontaneous brain activity in adolescent boys with pure conduct disorder revealed by regional homogeneity analysis.

    Science.gov (United States)

    Wu, Qiong; Zhang, Xiaocui; Dong, Daifeng; Wang, Xiang; Yao, Shuqiao

    2017-07-01

    Functional magnetic resonance imaging (fMRI) studies have revealed abnormal neural activity in several brain regions of adolescents with conduct disorder (CD) performing various tasks. However, little is known about spontaneous neural activity in people with CD in the resting state. The aims of this study were to investigate CD-associated regional activity abnormalities and to explore the relationship between behavioral impulsivity and these abnormalities. Resting-state fMRI (rs-fMRI) scans were administered to 28 adolescents with CD and 28 age-, gender-, and IQ-matched healthy controls (HCs). The rs-fMRI data were subjected to regional homogeneity (ReHo) analysis. ReHo measures the temporal synchrony of regional blood oxygen level-dependent signals and reflects the coordination of local neuronal activity facilitating similar goals or representations. Compared to HCs, the CD group showed increased ReHo bilaterally in the insula as well as decreased ReHo in the right inferior parietal lobule, right middle temporal gyrus and right fusiform gyrus, left anterior cerebellum, and right posterior cerebellum. In the CD group, mean ReHo values in the left and the right insula correlated positively with Barratt Impulsivity Scale (BIS) total scores. The results suggest that CD is associated with abnormal intrinsic brain activity, mainly in the cerebellum and temporal-parietal-limbic cortices, regions that are related to emotional and cognitive processing. BIS scores in adolescents with CD may reflect the severity of abnormal neuronal synchronization in the insula.
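
    ReHo quantifies local synchrony as Kendall's coefficient of concordance (KCC) over the time series of a voxel and its neighbours, typically a 27-voxel cluster. A minimal sketch without tie correction, on synthetic series sharing a common signal:

        import numpy as np
        from scipy.stats import rankdata

        def kendall_w(ts):
            """Kendall's coefficient of concordance (no tie correction) for an
            (m, T) array: m time series, T time points."""
            m, T = ts.shape
            ranks = np.apply_along_axis(rankdata, 1, ts)  # rank each series in time
            col_sums = ranks.sum(axis=0)                  # rank sums per time point
            s = ((col_sums - col_sums.mean()) ** 2).sum()
            return 12.0 * s / (m ** 2 * (T ** 3 - T))

        rng = np.random.default_rng(0)
        common = np.sin(np.linspace(0.0, 6.0, 120))          # shared signal
        cluster = common + 0.3 * rng.normal(size=(27, 120))  # noisy 3x3x3 cluster
        print(kendall_w(cluster))   # approaches 1 for highly synchronized series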

  1. Homogenization analysis of invasion dynamics in heterogeneous landscapes with differential bias and motility.

    Science.gov (United States)

    Yurk, Brian P

    2018-07-01

    Animal movement behaviors vary spatially in response to environmental heterogeneity. An important problem in spatial ecology is to determine how large-scale population growth and dispersal patterns emerge within highly variable landscapes. We apply the method of homogenization to study the large-scale behavior of a reaction-diffusion-advection model of population growth and dispersal. Our model includes small-scale variation in the directed and random components of movement and growth rates, as well as large-scale drift. Using the homogenized model we derive simple approximate formulas for persistence conditions and asymptotic invasion speeds, which are interpreted in terms of residence index. The homogenization results show good agreement with numerical solutions for environments with a high degree of fragmentation, both with and without periodicity at the fast scale. The simplicity of the formulas, and their connection to residence index make them appealing for studying the large-scale effects of a variety of small-scale movement behaviors.
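
    For pure diffusion in a one-dimensional periodic medium, homogenization yields a classical closed form: the effective coefficient is the harmonic mean of the local coefficients, weighted by volume fraction. A small sketch of that special case (the paper's full model adds growth, advection and bias, which this does not capture):

        import numpy as np

        def effective_diffusivity_1d(d_local, vol_frac):
            """Effective diffusion coefficient of a 1D periodic layered medium:
            classical homogenization gives the harmonic mean of local values."""
            d = np.asarray(d_local, dtype=float)
            v = np.asarray(vol_frac, dtype=float)
            assert np.isclose(v.sum(), 1.0)
            return 1.0 / np.sum(v / d)

        # habitat/matrix mosaic: fast movement in matrix, slow in habitat patches
        print(effective_diffusivity_1d([1.0, 0.1], [0.5, 0.5]))  # ~0.18, not 0.55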

  2. Verification of combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP

    International Nuclear Information System (INIS)

    Maruyama, Soh; Fujimoto, Nozomu; Sudo, Yukio; Kiso, Yoshihiro; Murakami, Tomoyuki.

    1988-09-01

    This report presents the verification results for the combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP, which has been utilized for the core thermal-hydraulic design of the High Temperature Engineering Test Reactor (HTTR), especially for the analysis of flow distribution among fuel block coolant channels, the determination of thermal boundary conditions for fuel block stress analysis, and the estimation of fuel temperature in the case of a fuel block coolant channel blockage accident. The Japan Atomic Energy Research Institute has been planning to construct the HTTR in order to establish basic technologies for future advanced very high temperature gas-cooled reactors and to serve as an irradiation test reactor for the promotion of innovative high-temperature new-frontier technologies. The verification of the code was done through comparison between the analytical results and the experimental results from the Helium Engineering Demonstration Loop multi-channel test section (HENDEL T1-M) with simulated fuel rods and fuel blocks. (author)

  3. Verification of combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP

    Science.gov (United States)

    Maruyama, Soh; Fujimoto, Nozomu; Kiso, Yoshihiro; Murakami, Tomoyuki; Sudo, Yukio

    1988-09-01

    This report presents the verification results for the combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP, which has been utilized for the core thermal-hydraulic design of the High Temperature Engineering Test Reactor (HTTR), especially for the analysis of flow distribution among fuel block coolant channels, the determination of thermal boundary conditions for fuel block stress analysis, and the estimation of fuel temperature in the case of a fuel block coolant channel blockage accident. The Japan Atomic Energy Research Institute has been planning to construct the HTTR in order to establish basic technologies for future advanced very high temperature gas-cooled reactors and to serve as an irradiation test reactor for the promotion of innovative high-temperature new-frontier technologies. The verification of the code was done through comparison between the analytical results and the experimental results from the Helium Engineering Demonstration Loop multi-channel test section (HENDEL T1-M) with simulated fuel rods and fuel blocks.

  4. A computational analysis on homogeneous-heterogeneous mechanism in Carreau fluid flow

    Science.gov (United States)

    Khan, Imad; Rehman, Khalil Ur; Malik, M. Y.; Shafquatullah

    2018-03-01

    In this article, magnetohydrodynamic Carreau fluid flow towards a stretching cylinder is considered in the presence of homogeneous-heterogeneous reaction effects. The flow model is formulated on theoretical grounds. For the numerical solution, a shooting method along with the Runge-Kutta algorithm is executed. The outcomes are provided through graphs. It is observed that the Carreau fluid concentration declines for increasing values of the homogeneous-heterogeneous reaction parameters in both the shear-thinning and shear-thickening cases. The present work is validated through comparison with already existing literature in a limiting sense.
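
    In a shooting method, the unknown wall gradient is guessed, the similarity equations are integrated as an initial value problem, and the guess is corrected until the far-field condition is met. A minimal sketch on the classic Blasius boundary-layer problem rather than the Carreau equations of the paper:

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import brentq

        # Blasius equation f''' + 0.5 f f'' = 0 with f(0) = f'(0) = 0 and
        # f'(inf) = 1; shoot on the unknown wall value s = f''(0).
        def rhs(eta, y):
            f, fp, fpp = y
            return [fp, fpp, -0.5 * f * fpp]

        def far_field_residual(s, eta_max=10.0):
            sol = solve_ivp(rhs, (0.0, eta_max), [0.0, 0.0, s],
                            rtol=1e-8, atol=1e-10)
            return sol.y[1, -1] - 1.0       # residual of f'(inf) = 1

        s_star = brentq(far_field_residual, 0.1, 1.0)
        print(f"f''(0) = {s_star:.5f}")     # ~0.33206 in the literature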

  5. Analysis of a homogenous and heterogeneous stylized half core of a CANDU reactor

    Energy Technology Data Exchange (ETDEWEB)

    EL-Khawlani, Afrah [Physics Department, Sana' a (Yemen); Aziz, Moustafa [Nuclear and radiological regulatory authority, Cairo (Egypt); Ismail, Mahmud Yehia; Ellithi, Ali Yehia [Cairo Univ. (Egypt). Faculty of Science

    2015-03-15

    The MCNPX (Monte Carlo N-Particle Transport Code System) code has been used for modeling and simulation of a half core of a CANDU (CANada Deuterium-Uranium) reactor; both homogeneous and heterogeneous models of the reactor core are designed. The fuel is burnt under normal operating conditions of CANDU reactors. Natural uranium fuel is used in the model. The multiplication factors of the homogeneous and heterogeneous reactor cores are calculated and compared during fuel burnup. The concentrations of both uranium and plutonium isotopes are analysed in the model. The flux and power distributions through the channels are calculated.

  6. A four-scale homogenization analysis of creep of a nuclear containment structure

    Energy Technology Data Exchange (ETDEWEB)

    Tran, A.B. [Université Paris-Est, Laboratoire Modélisation et Simulation Multi Échelle, MSME UMR 8208 CNRS, 5 bd Descartes, F-77454 Marne-la-Vallée (France); EDF R and D – Département MMC Site des Renardières – Avenue des Renardières - Ecuelles, 77818 Moret sur Loing Cedex (France); Department of Applied Informatics in Construction, National University of Civil Engineering, 55 Giai Phong Road, Hai Ba Trung District, Hanoi (Viet Nam); Yvonnet, J., E-mail: julien.yvonnet@univ-paris-est.fr [Université Paris-Est, Laboratoire Modélisation et Simulation Multi Échelle, MSME UMR 8208 CNRS, 5 bd Descartes, F-77454 Marne-la-Vallée (France); He, Q.-C. [Université Paris-Est, Laboratoire Modélisation et Simulation Multi Échelle, MSME UMR 8208 CNRS, 5 bd Descartes, F-77454 Marne-la-Vallée (France); Toulemonde, C.; Sanahuja, J. [EDF R and D – Département MMC Site des Renardières – Avenue des Renardières - Ecuelles, 77818 Moret sur Loing Cedex (France)

    2013-12-15

    A four-scale approach is proposed to predict the creep behavior of a concrete structure. The behavior of concrete is modeled through a numerical multiscale methodology, by successively homogenizing the viscoelastic behavior at different scales, starting from the cement paste. The homogenization is carried out by numerically constructing an effective relaxation tensor at each scale. In this framework, the impact of modifying the microstructural parameters can be directly observed on the structure response, like the interaction of the creep of concrete with the prestressing tendons network, and the effects of an internal pressure which might occur during a nuclear accident.

  7. A four-scale homogenization analysis of creep of a nuclear containment structure

    International Nuclear Information System (INIS)

    Tran, A.B.; Yvonnet, J.; He, Q.-C.; Toulemonde, C.; Sanahuja, J.

    2013-01-01

    A four-scale approach is proposed to predict the creep behavior of a concrete structure. The behavior of concrete is modeled through a numerical multiscale methodology, by successively homogenizing the viscoelastic behavior at different scales, starting from the cement paste. The homogenization is carried out by numerically constructing an effective relaxation tensor at each scale. In this framework, the impact of modifying the microstructural parameters can be directly observed on the structure response, like the interaction of the creep of concrete with the prestressing tendons network, and the effects of an internal pressure which might occur during a nuclear accident

  8. Analysis of dryout behaviour in laterally non-homogeneous debris beds using the MEWA-2D code

    International Nuclear Information System (INIS)

    Rahman, Saidur; Buerger, Manfred; Buck, Michael; Pohlner, Georg; Kulenovic, Rudi; Nayak, Arun Kumar; Sehgal, Bal Raj

    2009-01-01

    The present study analyses the impact of lateral non-homogeneities on the coolability of heated, initially water-filled debris beds. Debris beds which may be formed in a postulated severe accident in light water reactors cannot be expected to have a homogeneous structure. Lateral non-homogeneities arise, e.g., already from a variation in height, as in a heap of debris. Internally, less porous or more porous regions may occur; the latter, even as downcomer-like structures, are considered to favour the supply of water to the bed and thus coolability. Previous work has shown that such non-homogeneities often strongly enhance coolability, compared with earlier investigations on laterally homogeneous beds. The present contribution aims at extending this view by analysing further cases of non-homogeneities with the MEWA-2D code. In particular, effects of capillary forces are considered, in contrast to earlier analyses. Part of the paper deals with specific experiments performed in the POMECO facility at KTH, in which a laterally stratified debris bed was considered, with a strong jump of porosity, from 0.26 to 0.38, established between the layers. Astonishingly, under top as well as bottom flooding, dryout in these experiments occurred first in the lateral layer with higher porosity. Understanding is now provided by the effect of capillary forces: water is drawn from this layer to the less porous one. This effect improves the cooling in the less porous layer while it reduces the coolability of the more porous layer. Contrary to expectations, no real loop behaviour of inflow via the higher porosities with subsequent upflow in the less porous layer establishes itself here. Other cases (different lateral heating in an otherwise homogeneous bed, a closed downcomer in a homogeneous bed, and heap-like debris) show, on the other hand, strongly improved coolability due to such loops, which establish themselves because of the lateral differences in void and the corresponding pressure differences

  9. Radiochemical analysis of homogeneously solidified low level radioactive waste from nuclear power plants

    International Nuclear Information System (INIS)

    Sato, Kaneaki; Ikeuchi, Yoshihiro; Higuchi, Hideo

    1995-01-01

    As mentioned above, we have reliable radioanalytical methods for all kinds of homogeneously solidified wastes. We are now studying an analytical method for pellets made from evaporator concentrates or resin, and we are going to establish a new analytical method for rad-waste including metal, cloths and so on in the near future. (J.P.N.)

  10. Reflector homogenization

    International Nuclear Information System (INIS)

    Sanchez, R.; Ragusa, J.; Santandrea, S.

    2004-01-01

    The problem of determining a homogeneous reflector that preserves a set of prescribed albedos is considered. Duality is used for a direct estimation of the derivatives needed in the iterative calculation of the optimal homogeneous cross sections. The calculation is based on the preservation of collapsed multigroup albedos obtained from detailed reference calculations and depends on the low-order operator used for core calculations. In this work we analyze diffusion and transport as low-order operators and argue that the P0 transfers are the best choice for the unknown cross sections to be adjusted. Numerical results illustrate the new approach for SPN core calculations. (Author)

  11. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    Science.gov (United States)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  12. Analysis of an indirect neutron signature for enhanced UF_6 cylinder verification

    International Nuclear Information System (INIS)

    Kulisek, J.A.; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-01-01

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA_NT). HEVA_NT enables full-volume assay of UF6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA_NT in terms of the individual contributions to HEVA_NT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA_NT signature to manipulation by the nearby placement of neutron-conversion materials.

  13. Analysis of an indirect neutron signature for enhanced UF_6 cylinder verification

    Energy Technology Data Exchange (ETDEWEB)

    Kulisek, J.A., E-mail: Jonathan.Kulisek@pnnl.gov; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-02-21

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF_6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF_6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA_NT). HEVA_NT enables full-volume assay of UF_6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF_6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA_NT in terms of the individual contributions to HEVA_NT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA_NT signature to manipulation by the nearby placement of neutron-conversion materials.
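
    As a rough, hypothetical illustration of how a direct gamma-ray signature and an indirect neutron-induced signature can be fused into an enrichment estimate, the sketch below fits a linear calibration to synthetic reference-cylinder data and inverts it for a new measurement. The calibration numbers, the linear model, and the assay function are invented placeholders, not HEVA's algorithm or data.

        # Hedged sketch: calibrate a linear model mapping the two observables
        # (direct 186-keV count rate, neutron-induced count rate) to declared
        # enrichment, then invert it for a new cylinder. All values are synthetic.
        import numpy as np

        # Synthetic calibration set: columns = (gamma rate, neutron-induced rate),
        # one row per reference cylinder of known enrichment (wt% 235U).
        signals = np.array([[120.0, 15.0], [260.0, 33.0], [410.0, 52.0], [530.0, 70.0]])
        enrichment = np.array([0.7, 1.6, 2.6, 3.4])

        # Fit enrichment ~ a*gamma + b*neutron + c by linear least squares.
        A = np.column_stack([signals, np.ones(len(signals))])
        coef, *_ = np.linalg.lstsq(A, enrichment, rcond=None)

        def assay(gamma_rate, neutron_rate):
            return coef @ np.array([gamma_rate, neutron_rate, 1.0])

        print(f"estimated enrichment: {assay(300.0, 40.0):.2f} wt% 235U")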

  14. Verification and implications of the multiple pin treatment in the SASSYS-1 LMR systems analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.

    1994-01-01

    As part of a program to obtain realistic, as opposed to excessively conservative, analysis of reactor transients, a multiple-pin treatment for the analysis of intra-subassembly thermal hydraulics has been included in the SASSYS-1 liquid metal reactor systems analysis code. This new treatment has made possible a whole new level of verification for the code. The code can now predict the steady-state and transient responses of individual thermocouples within instrumented subassemblies in a reactor, rather than just predicting average temperatures for a subassembly. Very good agreement has been achieved between code predictions and the experimental measurements of steady-state and transient temperatures and flow rates in the Shutdown Heat Removal Tests in the EBR-II reactor. Detailed multiple-pin calculations for blanket subassemblies in EBR-II demonstrate that the actual steady-state and transient peak temperatures in these subassemblies are significantly lower than those that would be calculated by simpler models.

  15. Analysis of SiO2 nanoparticles binding proteins in rat blood and brain homogenate

    Directory of Open Access Journals (Sweden)

    Shim KH

    2014-12-01

    Full Text Available Kyu Hwan Shim,1 John Hulme,1 Eun Ho Maeng,2 Meyoung-Kon Kim,3 Seong Soo A An1 1Department of Bionano Technology, Gachon Medical Research Institute, Gachon University, Sungnam-si, 2Department of Analysis, KTR, Kimpo, Gyeonggi-do, 3Department of Biochemistry and Molecular Biology, Korea University Medical School and College, Seoul, South Korea Abstract: A multitude of nanoparticles, such as titanium oxide (TiO2), zinc oxide, aluminum oxide, gold oxide, silver oxide, iron oxide, and silica oxide, are found in many chemical, cosmetic, pharmaceutical, and electronic products. Recently, SiO2 nanoparticles were shown to have an inert toxicity profile and no association with irreversible toxicological changes in animal models. Hence, exposure to SiO2 nanoparticles is on the increase. SiO2 nanoparticles are routinely used in numerous materials, from strengthening filler for concrete and other construction composites to nontoxic platforms for biomedical applications, such as drug delivery and theragnostics. On the other hand, recent in vitro experiments indicated that SiO2 nanoparticles were cytotoxic. Therefore, we investigated these nanoparticles to identify potentially toxic pathways by analyzing the adsorbed protein corona on the surface of SiO2 nanoparticles in the blood and brain of the rat. Four types of SiO2 nanoparticles were chosen for investigation, and the protein corona of each type was analyzed using liquid chromatography-tandem mass spectrometry. In total, 115 and 48 rat plasma proteins were identified as being bound to negatively charged 20 nm and 100 nm SiO2 nanoparticles, respectively, and 50 and 36 proteins were found for 20 nm and 100 nm arginine-coated SiO2 nanoparticles, respectively. Higher numbers of proteins were adsorbed onto the 20 nm SiO2 nanoparticles than onto the 100 nm nanoparticles regardless of charge. When proteins were compared between the two charges, higher numbers of proteins were

  16. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
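
    A core step of the solution verification mentioned above is estimating the observed order of convergence from systematically refined grids and extrapolating to the zero-spacing limit. The sketch below shows the standard three-grid calculation; the three values and the refinement ratio are illustrative placeholders, not ASC results.

        # Minimal sketch of solution verification: estimate the observed order of
        # convergence from a quantity computed on three systematically refined grids
        # (refinement ratio r), then Richardson-extrapolate to zero grid spacing.
        import math

        f_coarse, f_medium, f_fine = 1.1000, 1.0250, 1.0062  # grid-converging results
        r = 2.0                                              # refinement ratio

        p_observed = math.log(abs(f_medium - f_coarse) / abs(f_fine - f_medium)) / math.log(r)
        # Extrapolation to the zero-spacing limit using the observed order.
        f_exact_est = f_fine + (f_fine - f_medium) / (r**p_observed - 1.0)

        print(f"observed order ~ {p_observed:.2f}, extrapolated value ~ {f_exact_est:.4f}")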

  17. Preliminary homogeneity study of in-house reference material using neutron activation analysis and X-ray fluorescence

    International Nuclear Information System (INIS)

    Gras, N.; Munoz, L.; Cassorla, V.; Castillo, P.

    1993-01-01

    Although many biological reference materials for quality control of trace-element analysis are commercially available, there is still a need for additional local materials for special matrices. In the Latin American region, a preliminary study involving analytical strategies for the characterization of in-house reference materials has commenced. A biological sample prepared in Brazil constitutes the first regional attempt to produce such a reference material. It was analyzed by neutron activation analysis (NAA) and X-ray fluorescence (XRF) to verify its homogeneity. The determination of the trace elements and certain major elements was carried out by instrumental NAA. Trace elements such as Cd, Mn, Mo and Cu were determined using NAA with radiochemical separations to improve sensitivity and precision. XRF was applied only to major constituents and to some trace elements with concentrations above 10 μg/g. Of a total of 18 elements analyzed, only Fe, Cr and Sc were not homogeneously distributed. (orig.)
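
    A common way to phrase such a homogeneity check numerically is a one-way ANOVA comparing between-subsample variance against analytical repeatability. The sketch below shows the idea on invented duplicate determinations; the data are placeholders, not results for the Brazilian candidate material.

        # Hedged sketch of a between-unit homogeneity test: if the variance between
        # subsamples significantly exceeds the within-subsample (analytical) variance,
        # the element is flagged as inhomogeneously distributed.
        from scipy import stats

        # Duplicate determinations (ug/g) of one element in five subsamples (invented).
        subsamples = [
            [10.2, 10.4], [10.1, 10.3], [10.6, 10.5], [10.2, 10.2], [10.4, 10.6],
        ]

        f_stat, p_value = stats.f_oneway(*subsamples)
        print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
        if p_value < 0.05:
            print("between-unit variance exceeds analytical noise: element inhomogeneous")
        else:
            print("no evidence of inhomogeneity at the 5% level")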

  18. Distributed source term analysis, a new approach to nuclear material inventory verification

    CERN Document Server

    Beddingfield, D H

    2002-01-01

    The Distributed Source-Term Analysis (DSTA) technique is a new approach to measuring in-process material holdup that is a significant departure from traditional holdup measurement methodology. The DSTA method is a means of determining the mass of nuclear material within a large, diffuse volume using passive neutron counting. The DSTA method is a more efficient approach than traditional methods of holdup measurement and inventory verification. The time spent in performing DSTA measurement and analysis is a fraction of that required by traditional techniques. The error ascribed to a DSTA survey result is generally less than that from traditional methods. Also, the negative bias ascribed to gamma-ray methods is greatly diminished because the DSTA method uses neutrons, which are more penetrating than gamma-rays.

  19. Distributed source term analysis, a new approach to nuclear material inventory verification

    International Nuclear Information System (INIS)

    Beddingfield, D.H.; Menlove, H.O.

    2002-01-01

    The Distributed Source-Term Analysis (DSTA) technique is a new approach to measuring in-process material holdup that is a significant departure from traditional holdup measurement methodology. The DSTA method is a means of determining the mass of nuclear material within a large, diffuse volume using passive neutron counting. The DSTA method is a more efficient approach than traditional methods of holdup measurement and inventory verification. The time spent in performing DSTA measurement and analysis is a fraction of that required by traditional techniques. The error ascribed to a DSTA survey result is generally less than that from traditional methods. Also, the negative bias ascribed to γ-ray methods is greatly diminished because the DSTA method uses neutrons, which are more penetrating than γ-rays.

  20. Experimental verification for standard analysis procedure of 241Am in food

    International Nuclear Information System (INIS)

    Liu Qingfen; Zhu Hongda; Liu Shutian; Pan Jingshun; Yang Dating

    2005-01-01

    Objective: A brief experimental verification of 'Determination of 241Am in food' is described. Methods: The overall recovery, the MDL of the method, and a decontamination experiment were determined following the standard analysis procedure. Results: The overall recovery is 76.26 ± 4.1%. The MDL is 3.4 x 10^-5 Bq/g ash; the decontamination factor is higher than 10^3 for Po, 10^2 for U, Th and Pu, and 60 for 237Np. Conclusion: The results show that the overall recovery is high and reliable, and that the MDL of the method is adequate for checking 241Am against the limit values for foods. The decontamination factors of the recommended procedure are sufficient for the analysis of 241Am in food examinations. Verification results for the procedure, obtained using a 243Am spike and a 241Am standard reference material, are satisfactory. (authors)
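
    For readers unfamiliar with how an MDL of this kind is obtained, the sketch below applies the standard Currie detection limit, L_D = 2.71 + 4.65*sqrt(B) counts, and converts it to Bq per gram of ash. The blank counts, efficiency, recovery, counting time, and ash mass are invented placeholders, not the values behind the figure quoted above.

        # Hedged sketch: minimum detectable level in Bq per g ash from blank counts,
        # using the standard Currie detection limit. All inputs are assumed values.
        import math

        blank_counts = 40.0        # B: counts in the alpha region for a blank
        efficiency = 0.25          # detector counting efficiency (assumed)
        recovery = 0.76            # chemical recovery (cf. ~76% overall recovery)
        count_time_s = 60_000.0    # counting time (assumed)
        ash_mass_g = 10.0          # ash mass analysed (assumed)

        L_D = 2.71 + 4.65 * math.sqrt(blank_counts)                   # counts
        mdl_bq_per_g = L_D / (efficiency * recovery * count_time_s * ash_mass_g)
        print(f"MDL ~ {mdl_bq_per_g:.2e} Bq/g ash")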

  1. Reflector homogenization

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, R.; Ragusa, J.; Santandrea, S. [Commissariat a l' Energie Atomique, Direction de l' Energie Nucleaire, Service d' Etudes de Reacteurs et de Modelisation Avancee, CEA de Saclay, DM2S/SERMA 91 191 Gif-sur-Yvette cedex (France)]. e-mail: richard.sanchez@cea.fr

    2004-07-01

    The problem of determining a homogeneous reflector that preserves a set of prescribed albedos is considered. Duality is used for a direct estimation of the derivatives needed in the iterative calculation of the optimal homogeneous cross sections. The calculation is based on the preservation of collapsed multigroup albedos obtained from detailed reference calculations and depends on the low-order operator used for core calculations. In this work we analyze diffusion and transport as low-order operators and argue that the P0 transfers are the best choice for the unknown cross sections to be adjusted. Numerical results illustrate the new approach for SPN core calculations. (Author)
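
    The sketch below illustrates, in a deliberately reduced setting, the idea of adjusting a homogeneous cross section until a low-order operator reproduces a reference albedo: one energy group, a semi-infinite diffusion reflector, and a bisection search instead of the paper's duality-based derivative estimates. The diffusion coefficient and target albedo are assumed values.

        # Minimal sketch: one-group diffusion theory gives a semi-infinite reflector
        # albedo beta = (1 - 2*D*kappa)/(1 + 2*D*kappa), kappa = sqrt(sigma_a/D).
        # We bisect on the absorption cross section to preserve a prescribed albedo.
        import math

        D = 1.2            # diffusion coefficient (cm), assumed
        beta_ref = 0.80    # reference albedo to preserve, assumed

        def albedo(sigma_a):
            kappa = math.sqrt(sigma_a / D)
            return (1.0 - 2.0 * D * kappa) / (1.0 + 2.0 * D * kappa)

        lo, hi = 1e-6, 1.0                 # bracket for sigma_a (1/cm)
        for _ in range(60):                # albedo decreases as absorption grows
            mid = 0.5 * (lo + hi)
            if albedo(mid) > beta_ref:
                lo = mid                   # albedo too high -> need more absorption
            else:
                hi = mid
        sigma_a = 0.5 * (lo + hi)
        print(f"adjusted sigma_a ~ {sigma_a:.4e} 1/cm, albedo = {albedo(sigma_a):.4f}")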

  2. Three-dimensional single-channel thermal analysis of fully ceramic microencapsulated fuel via two-temperature homogenized model

    International Nuclear Information System (INIS)

    Lee, Yoonhee; Cho, Nam Zin

    2014-01-01

    Highlights: • The two-temperature homogenized model is applied to thermal analysis of fully ceramic microencapsulated (FCM) fuel. • Based on the results of Monte Carlo calculations, homogenized parameters are obtained. • A 2-D FEM/1-D FDM hybrid method for the model is used to obtain 3-D temperature profiles. • The model provides the fuel-kernel and SiC matrix temperatures separately. • Compared to UO2 fuel, the FCM fuel shows ∼560 K lower maximum temperatures at steady and transient states. - Abstract: The fully ceramic microencapsulated (FCM) fuel, one of the accident tolerant fuel (ATF) concepts, consists of TRISO particles randomly dispersed in a SiC matrix. This high heterogeneity in composition makes explicit thermal calculation of such a fuel difficult. For thermal analysis of a fuel element of very high temperature reactors (VHTRs), which has a configuration similar to FCM fuel, a two-temperature homogenized model was recently proposed by the authors. The model was developed using a particle-transport Monte Carlo method for heat conduction problems. It gives more realistic temperature profiles, and provides the fuel-kernel and graphite temperatures separately. In this paper, we apply the two-temperature homogenized model to three-dimensional single-channel thermal analysis of the FCM fuel element for steady and transient states using a 2-D FEM/1-D FDM hybrid method. In the analyses, we assume for simplicity that the power distribution is uniform in the radial direction at steady state and follows a cosine function in the axial direction. As transient scenarios, we consider (i) a coolant inlet temperature transient, (ii) an inlet mass flow rate transient, and (iii) a power transient. The results of the analyses are compared to those of conventional UO2 fuel having the same geometric dimensions and operating conditions.
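
    To make the structure of a two-temperature homogenized model concrete, the sketch below solves a 1-D steady-state analogue in which a heat-generating kernel field T1 and a matrix field T2 are coupled through a volumetric exchange coefficient. All parameters are invented placeholders; the paper's homogenized parameters come from Monte Carlo calculations and its solver is a 2-D FEM/1-D FDM hybrid, not this reduced scheme.

        # Very reduced sketch of a two-temperature homogenized conduction model:
        #   k1*T1'' - h*(T1 - T2) + q = 0,    k2*T2'' + h*(T1 - T2) = 0,
        # on a 1-D slab, both fields held at the coolant temperature at the faces.
        import numpy as np

        N, L = 101, 0.02                    # nodes, slab thickness (m), assumed
        k1, k2 = 3.0, 30.0                  # conductivities (W/m-K), assumed
        h = 5.0e6                           # exchange coefficient (W/m^3-K), assumed
        q = 3.0e8                           # kernel-phase heat source (W/m^3), assumed
        T_cool = 600.0                      # boundary/coolant temperature (K)
        dx = L / (N - 1)

        A = np.zeros((2 * N, 2 * N))
        b = np.zeros(2 * N)
        for i in range(N):
            j = N + i                       # index of T2 at node i
            if i in (0, N - 1):             # Dirichlet boundaries for both fields
                A[i, i] = A[j, j] = 1.0
                b[i] = b[j] = T_cool
                continue
            A[i, i - 1] = A[i, i + 1] = k1 / dx**2
            A[i, i] = -2.0 * k1 / dx**2 - h
            A[i, j] = h
            b[i] = -q
            A[j, j - 1] = A[j, j + 1] = k2 / dx**2
            A[j, j] = -2.0 * k2 / dx**2 - h
            A[j, i] = h

        T = np.linalg.solve(A, b)
        print(f"peak kernel T1 = {T[:N].max():.1f} K, peak matrix T2 = {T[N:].max():.1f} K")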

  3. The use of an equivalent homogeneous half-space in soil-structure interaction analysis

    International Nuclear Information System (INIS)

    Holzloehner, U.

    1979-01-01

    In analyses of seismic soil-structure interaction, the soil is often assumed to be an elastic body. The solution procedure is lengthy if the heterogeneity of the soil is treated rigorously. If the soil is taken as a homogeneous elastic half-space, existing solutions can be used; there are solutions for some simple layered systems, too. However, it is often not easy to relate the measured variation of soil properties with depth to that of these idealized systems. The purpose of the paper is to show how to make use of the existing solutions. (orig.)

  4. Statistical analysis of non-homogeneous Poisson processes. Statistical processing of a particle multidetector

    International Nuclear Information System (INIS)

    Lacombe, J.P.

    1985-12-01

    The statistical study of non-homogeneous and spatial Poisson processes forms the first part of this thesis. A Neyman-Pearson-type test concerning the intensity measurement of these processes is defined. Conditions are given under which the consistency of the test is assured, and others under which the test statistics are asymptotically normal. Then some techniques for the statistical processing of Poisson fields, and their applications to the study of a particle multidetector, are given. Quality tests for the device are proposed together with signal extraction methods. [fr]
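
    A natural companion to intensity tests of this kind is the ability to simulate a non-homogeneous Poisson process. The sketch below uses Lewis-Shedler thinning; the intensity function is an arbitrary illustrative choice, not one from the thesis.

        # Sketch: simulate a non-homogeneous Poisson process by thinning. Draw
        # candidate events from a homogeneous process at rate lam_max and keep a
        # candidate at time t with probability intensity(t)/lam_max.
        import math
        import random

        def simulate_nhpp(intensity, t_end, lam_max, rng=random.Random(0)):
            events, t = [], 0.0
            while True:
                t += rng.expovariate(lam_max)
                if t > t_end:
                    return events
                if rng.random() < intensity(t) / lam_max:
                    events.append(t)

        intensity = lambda t: 5.0 * (1.0 + math.sin(t))      # lambda(t) <= 10
        events = simulate_nhpp(intensity, t_end=20.0, lam_max=10.0)
        print(len(events), "events; first five:", [round(t, 2) for t in events[:5]])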

  5. Homogeneity characterisation of (U,Gd)O2 sintered pellets by X-ray diffraction powder analysis applying Rietveld method

    International Nuclear Information System (INIS)

    Leyva, Ana G.; Vega, Daniel R.; Trimarco, Veronica G.; Marchi, Daniel E.

    1999-01-01

    (U,Gd)O2 sintered pellets are fabricated by different methods. Characterisation of the homogeneity of the Gd content is necessary as a production control to qualify the process and the final product. The micrographic technique is the most common method used to analyse the homogeneity of these samples; it requires time and expertise to obtain good results. In this paper, we propose an analysis of X-ray diffraction powder patterns through the Rietveld method, in which the differences between the experimental data and values calculated from a proposed crystalline-structure model are evaluated. This analysis allows the determination of the cell parameters, which can be correlated with the Gd concentration, and the detection of other phases with a different Gd ratio. (author)

  6. Verification and validation of the PLTEMP/ANL code for thermal hydraulic analysis of experimental and test reactors

    International Nuclear Information System (INIS)

    Kalimullah, M.; Olson, A.O.; Feldman, E.E.; Hanan, N.; Dionne, B.

    2012-01-01

    The document compiles in a single volume several verification and validation studies performed for the PLTEMP/ANL code during the years of its development and improvement. Some studies that are available in the open literature are simply referenced at the outset and are not included in the document. PLTEMP has been used in the conversion safety analysis reports of several US and foreign research reactors that have been licensed and converted; a list of such reactors is given. Each chapter of the document deals with the verification or validation of a specific model. Model verification is usually done by comparing the code with a hand calculation, a Microsoft spreadsheet calculation, or a Mathematica calculation. Model validation is done by comparing the code with experimental data or with a more extensively validated code such as RELAP5.

  7. Verification and Validation of the PLTEMP/ANL Code for Thermal-Hydraulic Analysis of Experimental and Test Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kalimullah, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Olson, Arne P. [Argonne National Lab. (ANL), Argonne, IL (United States); Feldman, E. E. [Argonne National Lab. (ANL), Argonne, IL (United States); Hanan, N. [Argonne National Lab. (ANL), Argonne, IL (United States); Dionne, B. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-04-07

    The document compiles in a single volume several verification and validation studies performed for the PLTEMP/ANL code during the years of its development and improvement. Some studies that are available in the open literature are simply referenced at the outset and are not included in the document. PLTEMP has been used in the conversion safety analysis reports of several US and foreign research reactors that have been licensed and converted; a list of such reactors is given. Each chapter of the document deals with the verification or validation of a specific model. Model verification is usually done by comparing the code with a hand calculation, a Microsoft spreadsheet calculation, or a Mathematica calculation. Model validation is done by comparing the code with experimental data or with a more extensively validated code such as RELAP5.

  8. Analysis of 99Mo Production Capacity in Uranyl Nitrate Aqueous Homogeneous Reactor using ORIGEN and MCNP

    Directory of Open Access Journals (Sweden)

    A. Isnaeni

    2014-04-01

    Full Text Available 99mTc is a very useful radioisotope in medical diagnostic procedures; it is produced by the decay of 99Mo. Currently, most 99Mo is produced by irradiating 235U in nuclear reactors, mainly from the fission of 235U targets, with a fission yield of about 6.1%; a small additional amount comes from 98Mo neutron activation. 99Mo is also created in conventional reactor fuel, but it is usually not extracted, and the fuel becomes spent fuel, a highly radioactive waste. A 99Mo production system based on an aqueous homogeneous reactor offers a better method, because all of the 99Mo can be extracted from the fuel solution. The fresh reactor fuel solution consists of uranyl nitrate dissolved in water; there is no separation of target and fuel, since target and fuel form one liquid solution, and no spent fuel is generated by this reactor. Simulation of the extraction process is performed while the reactor is in operation (without shutdown). With an extraction flow rate of 3.6 L/h, after 43 hours of reactor operation the production of 99Mo is relatively constant at about 98.6 curies per hour.
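
    The balance that drives these results can be written as dN/dt = P - lambda*N - (F/V)*N: fission production, decay, and continuous withdrawal at flow F from solution volume V. The sketch below integrates this equation; only the 3.6 L/h extraction flow is taken from the abstract, while the production rate and solution volume are invented placeholders, so the printed number is illustrative rather than the 98.6 Ci/h figure.

        # Sketch of the fuel-solution mass balance governing on-line 99Mo extraction.
        import math

        half_life_h = 65.94                       # 99Mo half-life (hours)
        lam = math.log(2) / half_life_h           # decay constant (1/h)
        F, V = 3.6, 200.0                         # extraction flow (L/h); volume (L), assumed
        P = 1.0e16                                # production rate (atoms/h), assumed

        N, dt = 0.0, 0.01                         # atoms in solution, time step (h)
        for _ in range(int(60.0 / dt)):           # integrate 60 h of operation
            N += dt * (P - lam * N - (F / V) * N)

        # Activity carried out by the extracted stream, in curies per hour.
        atoms_per_hour = N * F / V
        ci_per_hour = (lam / 3600.0) * atoms_per_hour / 3.7e10
        print(f"extraction stream after 60 h: {ci_per_hour:.2f} Ci/h")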

  9. Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy.

    Science.gov (United States)

    Frey, K; Unholtz, D; Bauer, J; Debus, J; Min, C H; Bortfeld, T; Paganetti, H; Parodi, K

    2014-10-07

    We introduce the automation of the range difference calculation deduced from particle-irradiation-induced β(+)-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability by monitoring algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute differences in the distal parts of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or only to measured data from different treatment sessions. In comparison to previous work, the proposed approach includes an automated identification of the distal region of interest for each pair of PET depth profiles, taking the planned dose distribution into consideration, resulting in the optimal shift distance. Moreover, it introduces an estimate of the uncertainty associated with the identified shift, which is then used as a weighting factor to 'red flag' problematic large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in the clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. The comparison between measured PET activity distributions and predictions obtained by Monte Carlo simulations or measurements from previous treatment fractions is performed. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. Calculated range differences between the compared activity distributions are reported in a
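
    The heart of a most-likely-shift calculation is a one-dimensional search: slide one activity depth profile against the other and keep the shift that minimizes the absolute differences over a distal window. The sketch below reproduces only that step on synthetic sigmoid profiles; the automated window selection from the planned dose and the uncertainty weighting described above are omitted, and all profile data are invented.

        # Sketch of the shift search behind a most-likely-shift range comparison.
        import numpy as np

        z = np.arange(0.0, 150.0, 1.0)                       # depth (mm)
        reference = 1.0 / (1.0 + np.exp((z - 100.0) / 3.0))  # e.g. simulated profile
        measured = 1.0 / (1.0 + np.exp((z - 104.0) / 3.0))   # e.g. measured profile

        distal = (z > 85.0) & (z < 120.0)                    # distal window (assumed)
        shifts = np.arange(-10.0, 10.1, 0.1)                 # candidate shifts (mm)

        def cost(shift):
            # measured profile evaluated at depth z + shift, i.e. pulled back by 'shift'
            shifted = np.interp(z, z - shift, measured)
            return np.mean(np.abs(shifted[distal] - reference[distal]))

        best = shifts[np.argmin([cost(s) for s in shifts])]
        print(f"most-likely shift ~ {best:.1f} mm (synthetic truth: 4.0 mm)")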

  10. Comparative analysis of storage conditions and homogenization methods for tick and flea species for identification by MALDI-TOF MS.

    Science.gov (United States)

    Nebbak, A; El Hamzaoui, B; Berenger, J-M; Bitam, I; Raoult, D; Almeras, L; Parola, P

    2017-12-01

    Ticks and fleas are vectors for numerous human and animal pathogens. Controlling them, which is important in combating such diseases, requires accurate identification, to distinguish between vector and non-vector species. Recently, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) was applied to the rapid identification of arthropods. The growth of this promising tool, however, requires guidelines to be established. To this end, standardization protocols were applied to species of Rhipicephalus sanguineus (Ixodida: Ixodidae) Latreille and Ctenocephalides felis felis (Siphonaptera: Pulicidae) Bouché, including the automation of sample homogenization using two homogenizer devices, and varied sample preservation modes for a period of 1-6 months. The MS spectra were then compared with those obtained from manual pestle grinding, the standard homogenization method. Both automated methods generated intense, reproducible MS spectra from fresh specimens. Frozen storage methods appeared to represent the best preservation mode, for up to 6 months, while storage in ethanol is also possible, with some caveats for tick specimens. Carnoy's buffer, however, was shown to be less compatible with MS analysis for the purpose of identifying ticks or fleas. These standard protocols for MALDI-TOF MS arthropod identification should be complemented by additional MS spectrum quality controls, to generalize their use in monitoring arthropods of medical interest. © 2017 The Royal Entomological Society.

  11. A novel asymptotic expansion homogenization analysis for 3-D composite with relieved periodicity in the thickness direction

    KAUST Repository

    Nasution, Muhammad Ridlo Erdata

    2014-06-01

    A new asymptotic expansion homogenization analysis is proposed to analyze 3-D composites in which thermomechanical and finite-thickness effects are considered. The finite-thickness effect is captured by relieving the periodic boundary condition at the top and bottom unit-cell surfaces. The mathematical treatment yields that only 2-D periodicity (i.e., in the in-plane directions) is taken into account. A unit cell representing the whole thickness of the 3-D composite is built to facilitate the present method. The equivalent in-plane thermomechanical properties of 3-D orthogonal interlock composites are calculated by the present method, and the results are compared with those obtained by the standard homogenization method (with 3-D periodicity). Young's modulus and Poisson's ratio obtained by the present method are also compared with experiments, whereby good agreement is found, particularly for the Young's modulus. Localization analysis is carried out to evaluate the stress responses within the unit cell of the 3-D composites for two cases: thermal and biaxial tensile loading. Standard finite element (FE) analysis is also performed to validate the stress responses obtained by the localization analysis. It is found that the present method's results are in good agreement with standard FE analysis. This fact emphasizes that relieving periodicity in the thickness direction is necessary to accurately simulate the real traction-free condition in 3-D composites. © 2014 Elsevier Ltd.

  12. A novel asymptotic expansion homogenization analysis for 3-D composite with relieved periodicity in the thickness direction

    KAUST Repository

    Nasution, Muhammad Ridlo Erdata; Watanabe, Naoyuki; Kondo, Atsushi; Yudhanto, Arief

    2014-01-01

    A new asymptotic expansion homogenization analysis is proposed to analyze 3-D composites in which thermomechanical and finite-thickness effects are considered. The finite-thickness effect is captured by relieving the periodic boundary condition at the top and bottom unit-cell surfaces. The mathematical treatment yields that only 2-D periodicity (i.e., in the in-plane directions) is taken into account. A unit cell representing the whole thickness of the 3-D composite is built to facilitate the present method. The equivalent in-plane thermomechanical properties of 3-D orthogonal interlock composites are calculated by the present method, and the results are compared with those obtained by the standard homogenization method (with 3-D periodicity). Young's modulus and Poisson's ratio obtained by the present method are also compared with experiments, whereby good agreement is found, particularly for the Young's modulus. Localization analysis is carried out to evaluate the stress responses within the unit cell of the 3-D composites for two cases: thermal and biaxial tensile loading. Standard finite element (FE) analysis is also performed to validate the stress responses obtained by the localization analysis. It is found that the present method's results are in good agreement with standard FE analysis. This fact emphasizes that relieving periodicity in the thickness direction is necessary to accurately simulate the real traction-free condition in 3-D composites. © 2014 Elsevier Ltd.

  13. The cumulative verification image analysis tool for offline evaluation of portal images

    International Nuclear Information System (INIS)

    Wong, John; Yan Di; Michalski, Jeff; Graham, Mary; Halverson, Karen; Harms, William; Purdy, James

    1995-01-01

    Purpose: Daily portal images acquired using electronic portal imaging devices contain important information about the setup variation of the individual patient. The data can be used to evaluate the treatment and to derive corrections for the individual patient. The large volume of images also requires software tools for efficient analysis. This article describes the approach of cumulative verification image analysis (CVIA), specifically designed as an offline tool to extract quantitative information from daily portal images. Methods and Materials: The user interface, image and graphics display, and algorithms of the CVIA tool have been implemented in ANSI C using the X Window graphics standards. The tool consists of three major components: (a) definition of treatment geometry and anatomical information; (b) registration of portal images with a reference image to determine setup variation; and (c) quantitative analysis of all setup variation measurements. The CVIA tool is not automated; user interaction is required and preferred. Successful alignment of anatomies on portal images at present remains mostly dependent on clinical judgment. Predefined templates of block shapes and anatomies are used for image registration to enhance efficiency, taking advantage of the fact that much of the tool's operation is repeated in the analysis of daily portal images. Results: The CVIA tool is portable and has been implemented on workstations with different operating systems. Analysis of 20 sequential daily portal images can be completed in less than 1 h. The temporal information is used to characterize setup variation in terms of its systematic, random and time-dependent components. The cumulative information is used to derive block overlap isofrequency distributions (BOIDs), which quantify the effective coverage of the prescribed treatment area throughout the course of treatment. Finally, a set of software utilities is available to facilitate feedback of the information for
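
    The decomposition of setup variation into systematic and random components mentioned above is usually computed as in the sketch below: per-patient means and standard deviations, then population statistics over patients. The displacement values are invented placeholders, not CVIA data.

        # Sketch of the standard systematic/random decomposition of setup errors.
        import numpy as np

        # Daily left-right displacements (mm) for three patients over their fractions.
        displacements = {
            "patient_A": np.array([2.1, 2.5, 1.8, 2.9, 2.2, 2.6]),
            "patient_B": np.array([-0.5, 0.3, -0.2, 0.1, -0.4, 0.0]),
            "patient_C": np.array([1.0, 0.4, 1.5, 0.8, 1.2, 0.6]),
        }

        means = np.array([d.mean() for d in displacements.values()])
        sds = np.array([d.std(ddof=1) for d in displacements.values()])

        M = means.mean()                       # overall systematic error
        Sigma = means.std(ddof=1)              # spread of systematic errors over patients
        sigma = np.sqrt((sds**2).mean())       # pooled random (day-to-day) error
        print(f"M = {M:.2f} mm, Sigma = {Sigma:.2f} mm, sigma = {sigma:.2f} mm")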

  14. Dependability analysis of systems modeled by non-homogeneous Markov chains

    Energy Technology Data Exchange (ETDEWEB)

    Platis, Agapios; Limnios, Nikolaos; Le Du, Marc

    1998-09-01

    The case of time non-homogeneous Markov systems in discrete time is studied in this article. In order to have measures adapted to this kind of system, several reliability and performability measures are formulated, such as reliability, availability, maintainability and different time variables, including new indicators more dedicated to electrical systems, like the instantaneous expected load curtailed and the expected energy not supplied on a time interval. These indicators are also formulated for cyclic chains, where asymptotic results can be obtained. The interest of taking hazard-rate time variation into account is to obtain more accurate and more instructive indicators, and also to gain access to new performability indicators that cannot be obtained by classical methods. To illustrate this, an example from an Electricite De France electrical substation is solved.
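
    For a discrete-time non-homogeneous Markov chain, point availability follows from a product of step-dependent transition matrices rather than a matrix power. The sketch below shows this on an invented two-state model with a cyclically varying failure probability; it is not the EDF substation model.

        # Sketch: availability of a two-state non-homogeneous Markov model.
        import numpy as np

        def P(k):
            """Transition matrix at step k: state 0 = up, state 1 = down."""
            failure = 0.01 * (1.0 + 0.5 * np.sin(2 * np.pi * k / 24.0))  # daily cycle
            repair = 0.2
            return np.array([[1.0 - failure, failure],
                             [repair, 1.0 - repair]])

        dist = np.array([1.0, 0.0])            # start in the up state
        availability = []
        for k in range(72):                    # three 24-step "days"
            dist = dist @ P(k)                 # product of differing matrices
            availability.append(dist[0])

        print(f"availability after 72 steps: {availability[-1]:.4f}")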

  15. Assessment of Theories for Free Vibration Analysis of Homogeneous and Multilayered Plates

    Directory of Open Access Journals (Sweden)

    Erasmo Carrera

    2004-01-01

    Full Text Available This paper assesses classical and advanced theories for the free vibrational response of homogeneous and multilayered simply supported plates. Closed-form solutions are given for thick and thin geometries. Single-layer and multilayered plates made of metallic, composite and piezoelectric materials are considered. Classical theories based on Kirchhoff and Reissner-Mindlin assumptions are compared with refined theories obtained by enhancing the order of the expansion of the displacement fields in the thickness direction z. The effects of the Zig-Zag form of the displacement distribution in z, as well as of the Interlaminar Continuity of transverse shear and normal stresses at the layer interfaces, were evaluated. A number of conclusions have been drawn; these could serve as a guide for choosing the most suitable theories for a given problem.

  16. Application of group analysis to the spatially homogeneous and isotropic Boltzmann equation with source using its Fourier image

    International Nuclear Information System (INIS)

    Grigoriev, Yurii N; Meleshko, Sergey V; Suriyawichitseranee, Amornrat

    2015-01-01

    Group analysis of the spatially homogeneous and molecular-energy-dependent Boltzmann equation with a source term is carried out. The Fourier transform of the Boltzmann equation with respect to the molecular velocity variable is considered. The corresponding determining equation of the admitted Lie group is reduced to a partial differential equation for the admitted source. The latter equation is analyzed by an algebraic method. A complete group classification of the Fourier transform of the Boltzmann equation with respect to the source function is given. Representations of the invariant solutions and the corresponding reduced equations for all obtained source functions are also presented. (paper)

  17. Validation and Verification of Operational Land Analysis Activities at the Air Force Weather Agency

    Science.gov (United States)

    Shaw, Michael; Kumar, Sujay V.; Peters-Lidard, Christa D.; Cetola, Jeffrey

    2012-01-01

    The NASA-developed Land Information System (LIS) is the Air Force Weather Agency's (AFWA) operational Land Data Assimilation System (LDAS), combining real-time precipitation observations and analyses, global forecast model data, vegetation, terrain, and soil parameters with the community Noah land surface model, along with other hydrology module options, to generate profile analyses of global soil moisture, soil temperature, and other important land surface characteristics: (1) a range of satellite data products and surface observations is used to generate the land analysis products; (2) the analyses are global at 1/4-degree spatial resolution; and (3) a model analysis is generated every 3 hours. AFWA recognizes the importance of operational benchmarking and uncertainty characterization for land surface modeling and is developing standard methods, software, and metrics to verify and/or validate LIS output products. To facilitate this and other needs for land analysis activities at AFWA, the Model Evaluation Toolkit (MET) -- a joint product of the National Center for Atmospheric Research Developmental Testbed Center (NCAR DTC), AFWA, and the user community -- and the Land surface Verification Toolkit (LVT), developed at the Goddard Space Flight Center (GSFC), have been adapted to the operational benchmarking needs of AFWA's land characterization activities.
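
    Benchmarking a land analysis against observations ultimately reduces to grid-point verification statistics. The sketch below computes three common ones (bias, RMSE, anomaly correlation) on synthetic fields; it is a schematic stand-in for what MET and LVT provide, not their implementation.

        # Sketch of grid-point verification metrics for a land analysis field.
        import numpy as np

        rng = np.random.default_rng(0)
        obs = rng.uniform(0.1, 0.4, size=(50, 50))             # "observed" soil moisture
        model = obs + rng.normal(0.02, 0.03, size=obs.shape)   # model with bias + noise

        bias = np.mean(model - obs)
        rmse = np.sqrt(np.mean((model - obs) ** 2))
        anom_corr = np.corrcoef((model - model.mean()).ravel(),
                                (obs - obs.mean()).ravel())[0, 1]
        print(f"bias = {bias:.3f}, RMSE = {rmse:.3f}, anomaly corr = {anom_corr:.3f}")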

  18. Optimization of instrumental neutron activation analysis for the within-bottle homogeneity study of reference materials of marine origin

    International Nuclear Information System (INIS)

    Silva, Daniel Pereira da

    2017-01-01

    The use of reference materials has been increasing in chemical analysis laboratories, as they are important for measurement validation in analytical chemistry. Such materials are generally imported and require a high financial investment to acquire, which makes it difficult for many national laboratories to use reference materials in their routine chemical analyses. Certification of reference materials is a complex process intended to provide the user with appropriate assigned values for the properties of interest in the material. In this process, the homogeneity of the material must be checked. In this study, the within-bottle homogeneity study for the elements K, Mg, Mn, Na and V was performed for two reference materials of marine origin: the mussel reference material produced at the Neutron Activation Laboratory (LAN) of IPEN - CNEN/SP and an oyster tissue reference material produced abroad. For this purpose, the elements were determined in subsamples with masses varying between 1 and 250 mg by Instrumental Neutron Activation Analysis (INAA), and minimum sample intakes were estimated, ranging from 0.015 g for Na in the mussel reference material to 0.100 g for V in the two reference materials. (author)

  19. A homogenization method for the analysis of a coupled fluid-structure interaction problem with inner solid structure

    International Nuclear Information System (INIS)

    Sigrist, Jean-Francois; Laine, Christian; Broc, Daniel

    2006-01-01

    The present paper presents a homogenization method developed in order to perform the seismic analysis of a nuclear reactor, with the internal structures modelled and fluid-structure interaction effects taken into account. The numerical resolution of fluid-structure interactions has made tremendous progress over the past decades, and some applications of the various developed techniques in the industrial field can be found in the literature. As a builder of nuclear naval propulsion reactors (ground prototype reactors or reactors embarked on submarines), DCN Propulsion has been working with the French nuclear committee CEA for several years in order to integrate fluid-structure analysis into the design stage of current projects. In previous papers, modal and seismic analyses of a nuclear reactor with fluid-structure interaction effects were presented. Those studies highlighted the importance of fluid-structure coupling phenomena in the industrial case and focussed on added-mass and added-stiffness effects. The numerical model used in the previous studies did not take into account the presence of internal structures within the pressure vessel. The present study aims at improving the numerical model of the nuclear reactor to take the internal structures into account. As the internal structures are periodic within the inner and outer structures of the pressure vessel, the proposed model is based on the development of a homogenization method: the presence of the internal structures and their effect on the physical fluid-structure interaction are taken into account, although they are not modelled geometrically. The basic theory of the proposed homogenization method is recalled, leading to the modification of the fluid-structure coupling operator in the finite element model. The physical consistency of the method is proved by an evaluation of the system mass with the various mass operators (structure, fluid and fluid-structure operators). The method is presented and validated in a 2-D case.

  20. Verification study of the FORE-2M nuclear/thermal-hydraulic analysis computer code

    International Nuclear Information System (INIS)

    Coffield, R.D.; Tang, Y.S.; Markley, R.A.

    1982-01-01

    The verification of the LMFBR core transient performance code FORE-2M was performed in two steps. Different components of the computation (individual models) were verified by comparison with analytical solutions and with results obtained from other conventionally accepted computer codes (e.g., TRUMP, LIFE, etc.). For verification of the integral computation method of the code, experimental data from TREAT and SEFOR and from natural-circulation experiments in EBR-II were compared with the code calculations. Good agreement was obtained for both of these steps. Confirmation of the code verification for undercooling transients is provided by comparisons with the recent FFTF natural-circulation experiments. (orig.)

  1. Objective Oriented Design of Architecture for TH System Safety Analysis Code and Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong

    2008-03-15

    In this work, an object-oriented design of a generic system analysis code has been attempted, based on previous work at KAERI on a two-phase, three-field pilot code. The design and implementation cover input and output, the TH solver, component models, special TH models, the heat structure solver, general tables, trip and control, and on-line graphics. All essential features for system analysis have been designed and implemented in the final product, the SYSTF code. The C language was used for the implementation in the Visual Studio 2008 IDE (Integrated Development Environment), since it is easier and lighter than C++. The code has simple and essential models and correlations, special components, special TH models and a heat structure model. The input features nevertheless allow the simulation of various scenarios, such as steady state, non-LOCA transients and LOCA accidents. The structural validity has been tested through various verification tests, and it has been shown that the developed code can treat non-LOCA and LOCA simulations. However, more detailed design and implementation of models are required to establish the physical validity of SYSTF code simulations.

  2. Objective Oriented Design of Architecture for TH System Safety Analysis Code and Verification

    International Nuclear Information System (INIS)

    Chung, Bub Dong

    2008-03-01

    In this work, an object-oriented design of a generic system analysis code has been attempted, based on previous work at KAERI on a two-phase, three-field pilot code. The design and implementation cover input and output, the TH solver, component models, special TH models, the heat structure solver, general tables, trip and control, and on-line graphics. All essential features for system analysis have been designed and implemented in the final product, the SYSTF code. The C language was used for the implementation in the Visual Studio 2008 IDE (Integrated Development Environment), since it is easier and lighter than C++. The code has simple and essential models and correlations, special components, special TH models and a heat structure model. The input features nevertheless allow the simulation of various scenarios, such as steady state, non-LOCA transients and LOCA accidents. The structural validity has been tested through various verification tests, and it has been shown that the developed code can treat non-LOCA and LOCA simulations. However, more detailed design and implementation of models are required to establish the physical validity of SYSTF code simulations.

  3. Objective Oriented Design of System Thermal Hydraulic Analysis Program and Verification of Feasibility

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    System safety analysis codes such as RELAP5, TRAC and CATHARE have been developed based on the Fortran language during the past few decades. Refactoring of these conventional codes has also been performed to improve code readability and maintenance; the TRACE, RELAP5-3D and MARS codes are examples of these activities, redesigned to have modular structures utilizing Fortran 90 features. However, the programming paradigm in software technology has since shifted to object-oriented programming (OOP), which is based on several techniques, including encapsulation, modularity, polymorphism, and inheritance, and which was not commonly used in mainstream software application development until the early 1990s. Many modern programming languages now support OOP. Although recent Fortran also supports OOP, it is considered to have limited functionality compared to modern software features. In this work, an object-oriented program for a system safety analysis code has been attempted utilizing modern C language features. The advantages of OOP are discussed after verification of the design feasibility.

  4. Determination of trace elements in dried sea-plant homogenate (SP-M-1) and in dried copepod homogenate (MA-A-1) by means of neutron activation analysis

    International Nuclear Information System (INIS)

    Tjioe, P.S.; Goeij, J.J.M. de; Bruin, M. de.

    1977-07-01

    Two marine environmental reference materials were investigated in an intercalibration study of trace elements. According to the specifications of the International Laboratory of Marine Radioactivity at Monaco, two samples, a sea-plant homogenate and a copepod homogenate, were analysed by neutron activation analysis. The results of the trace-element analyses are based on dry weight; the moisture content was measured on separate aliquots. For the intercalibration study the following elements were listed as elements of primary interest: mercury, cadmium, lead, manganese, zinc, copper, chromium, silver, iron and nickel. Of the 14 elements normally analysed with the routine destructive method, gold could not be measured in the two marine samples, owing to the high residual bromine-82 activity in fraction 13, which contains the gold that distills over. With the nondestructive method, nine elements could be assessed, of which only three coincide with the 14 elements of the destructive method. A survey of all measured (trace) elements is presented. The 20 (trace) elements measured in the sea-plant homogenate and in the copepod homogenate comprise 8 of the 10 trace elements of primary interest, all 5 trace elements of secondary interest (arsenic, cobalt, antimony, selenium and vanadium), and 5 additional (trace) elements. The trace-element determination in both marine biological materials via the destructive procedure was carried out twelvefold; in the nondestructive procedure, quadruplicate measurements were performed. For all trace-element levels analysed, an average value was calculated.

  5. Comparative Analysis of Speech Parameters for the Design of Speaker Verification Systems

    National Research Council Canada - National Science Library

    Souza, A

    2001-01-01

    Speaker verification systems are basically composed of three stages: feature extraction, feature processing, and comparison of the modified features from the speaker's voice and from the voice that should be verified.

  6. Halal assurance in food supply chains: Verification of halal certificates using audits and laboratory analysis

    NARCIS (Netherlands)

    Spiegel, van der M.; Fels-Klerx, van der H.J.; Sterrenburg, P.; Ruth, van S.M.; Scholtens-Toma, I.M.J.; Kok, E.J.

    2012-01-01

    The global halal market is increasing. Worldwide, a large number of standardization and certification organizations have been established. This article discusses halal requirements, summarizes applied standards and certification, and explores the current verification of halal certificates using audits and laboratory analysis.

  7. Early chemical enrichment of the Galactic dwarf satellites from a homogeneous and NLTE abundance analysis

    Science.gov (United States)

    Mashonkina, Lyudmila; Jablonka, Pascale; Sitnova, Tatyana; Pakhomov, Yuri; North, Pierre

    2018-06-01

    We review recent abundance results for very metal-poor (VMP, -4 ≤ [Fe/H] ≤ -2) stars in seven dwarf spheroidal galaxies (dSphs) and in the Milky Way (MW) halo comparison sample that were obtained based on high-resolution spectroscopic datasets, homogeneous and accurate atmospheric parameters, and the non-local thermodynamic equilibrium (NLTE) line formation for 10 chemical species. A remarkable gain of using such an approach is the reduction, compared to a simple compilation of the literature data, of the spread in abundance ratios at given metallicity within each galaxy and from one to the other. We show that all massive galaxies in our sample, that is, the MW halo and the classical dSphs Sculptor, Ursa Minor, Sextans, and Fornax, reveal a similar plateau at [α/Fe] ≃ 0.3 for each of the α-process elements: Mg, Ca, and Ti. We put on a firm ground the evidence for a decline in α/Fe with increasing metallicity in the Boötes I ultra-faint dwarf galaxy (UFD), that is most probably due to the ejecta of type Ia supernovae. In our classical dSphs, we observe the dichotomy in the [Sr/Ba] versus [Ba/H] diagram, similarly to the MW halo, calling for two different nucleosynthesis channels for Sr at the earliest evolution stages of these galaxies. Our three UFDs, that is Boötes I, UMa II, and Leo IV, are depleted in Sr and Ba relative to Fe and Mg, with very similar ratios of [Sr/Mg] ≈ -1.3 and [Ba/Mg] ≈ -1 on the entire range of their Mg abundances. The subsolar Sr/Ba ratios of Boötes I and UMa II indicate a common r-process origin of their neutron-capture elements. For Na/Fe, Na/Mg, and Al/Mg, the MW halo and all dSphs reveal indistinguishable trends with metallicity, suggesting that the processes of Na and Al synthesis are identical in all systems, independent of their mass. Sculptor remains the classical dSph, in which the evidence for inhomogeneous mixing in the early evolution stage, at [Fe/H] < -2, is the strongest.

  8. A spatial analysis of patterns of growth and concentration of population based on homogeneous population censuses: Spain (1877-2001

    Directory of Open Access Journals (Sweden)

    Xavier Franch Auladell

    2013-01-01

    Full Text Available This work contributes to the analysis of long-term patterns of population concentration, applied to the case of Spain. The proposed methodology is based on the homogenisation of both data and administrative units, taking the municipal structure of the 2001 census as its base reference. This work seeks to show how applying spatial analysis techniques to this type of homogeneous data series allows more detailed studies of population patterns within a given territory. The most important conclusion we reached is that, in Spain, sustained population growth has followed a spatial pattern that has become increasingly consolidated over time. The tendencies observed have produced an uneven distribution of population within the national territory, marked by the existence of a series of well-defined, and often very localised, areas that spread beyond the limits of the official administrative boundaries.

  9. Characteristics of a micro-fin evaporator: Theoretical analysis and experimental verification

    Directory of Open Access Journals (Sweden)

    Zheng Hui-Fan

    2013-01-01

    Full Text Available A theoretical analysis and experimental verification of the characteristics of a micro-fin evaporator using R290 and R717 as refrigerants were carried out. The heat capacity and heat transfer coefficient of the micro-fin evaporator were investigated under different water mass flow rates, refrigerant mass flow rates, and inner tube diameters. The simulated heat transfer coefficients agree fairly well with the experimental data. The results show that the heat capacity and the heat transfer coefficient of the micro-fin evaporator increase with increasing logarithmic mean temperature difference, water mass flow rate and refrigerant mass flow rate. The heat capacity of the micro-fin evaporator with a 9.52 mm diameter is higher than that with a 7.00 mm diameter when R290 is used as the refrigerant, and the heat capacity with R717 as the refrigerant is higher than with R290. The results of this study can provide useful guidelines for the optimal design and operation of micro-fin evaporators in present or future applications.

  10. Analysis of human scream and its impact on text-independent speaker verification.

    Science.gov (United States)

    Hansen, John H L; Nandwana, Mahesh Kumar; Shokouhi, Navid

    2017-04-01

    Scream is defined as sustained, high-energy vocalizations that lack phonological structure. Lack of phonological structure is how scream is identified from other forms of loud vocalization, such as "yell." This study investigates the acoustic aspects of screams and addresses those that are known to prevent standard speaker identification systems from recognizing the identity of screaming speakers. It is well established that speaker variability due to changes in vocal effort and Lombard effect contribute to degraded performance in automatic speech systems (i.e., speech recognition, speaker identification, diarization, etc.). However, previous research in the general area of speaker variability has concentrated on human speech production, whereas less is known about non-speech vocalizations. The UT-NonSpeech corpus is developed here to investigate speaker verification from scream samples. This study considers a detailed analysis in terms of fundamental frequency, spectral peak shift, frame energy distribution, and spectral tilt. It is shown that traditional speaker recognition based on the Gaussian mixture models-universal background model framework is unreliable when evaluated with screams.

  11. A combination of HPLC and automated data analysis for monitoring the efficiency of high-pressure homogenization.

    Science.gov (United States)

    Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver

    2017-08-01

    Cell disruption is a key unit operation for making valuable intracellular target products accessible for further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, like measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line is not available to date. In the current study, we implemented a methodology, originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate the disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization, and finally investigated this unit operation in more detail by a multivariate approach. The combination of HPLC and automated data analysis constitutes a valuable, novel tool for monitoring and evaluating cell disruption processes. Our methodology, which can be used in both upstream (USP) and downstream processing (DSP), can be implemented at-line, gives results within minutes after sampling, and requires no manual intervention.

  12. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  13. Homogeneity and Entropy

    Science.gov (United States)

    Tignanelli, H. L.; Vazquez, R. A.; Mostaccio, C.; Gordillo, S.; Plastino, A.

    1990-11-01

    Standard concepts that underlie Information Theory are employed in order to design a methodology that enables one to analyze the homogeneity of a given sample of observational data. Key words: DATA ANALYSIS
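
    One simple way to turn Information Theory into a homogeneity measure, in the spirit of the abstract, is the normalized Shannon entropy of bin occupancies: values near 1 indicate a sample spread evenly over its support. The binning, the fixed support, and the example samples below are assumptions of this sketch, not the authors' methodology.

        # Sketch: normalized Shannon entropy of a binned sample as a homogeneity index.
        import numpy as np

        def normalized_entropy(sample, n_bins=10, support=(0.0, 1.0)):
            counts, _ = np.histogram(sample, bins=n_bins, range=support)
            p = counts[counts > 0] / counts.sum()
            return -(p * np.log(p)).sum() / np.log(n_bins)   # 1 = perfectly even

        rng = np.random.default_rng(1)
        uniform_sample = rng.uniform(0, 1, 1000)      # homogeneous over the support
        clumped_sample = rng.normal(0.5, 0.05, 1000)  # concentrated around 0.5

        print(f"uniform: H = {normalized_entropy(uniform_sample):.3f}")
        print(f"clumped: H = {normalized_entropy(clumped_sample):.3f}")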

  14. Analysis and optimization of gyrokinetic toroidal simulations on homogenous and heterogenous platforms

    International Nuclear Information System (INIS)

    Ibrahim, Khaled Z.; Madduri, Kamesh; Williams, Samuel; Wang, Bei; Oliker, Leonid

    2013-01-01

    The Gyrokinetic Toroidal Code (GTC) uses the particle-in-cell method to efficiently simulate plasma microturbulence. This paper presents novel analysis and optimization techniques to enhance the performance of GTC on large-scale machines. We introduce cell access analysis to better manage locality-versus-synchronization tradeoffs on CPU- and GPU-based architectures. Our optimized hybrid parallel implementation of GTC uses MPI, OpenMP, and NVIDIA CUDA, achieves up to a 2× speedup over the reference Fortran version on multiple parallel systems, and scales efficiently to tens of thousands of cores.

  15. A verification study and trend analysis of simulated boundary layer wind fields over Europe

    Energy Technology Data Exchange (ETDEWEB)

    Lindenberg, Janna

    2011-07-01

    Simulated wind fields from regional climate models (RCMs) are increasingly used as a surrogate for observations, which are costly and prone to homogeneity deficiencies. Compounding the problem, a lack of reliable observations makes the validation of the simulated wind fields a non-trivial exercise. Whilst the literature shows that RCMs tend to underestimate strong winds over land, these investigations mainly relied on comparisons with near-surface measurements and extrapolated model wind fields. In this study a new approach is proposed, using measurements from high towers and a robust validation process. Tower-height wind data are smoother and thus more representative of regional winds. As a benefit, this approach circumvents the need to extrapolate simulated wind fields. The performance of two models using different downscaling techniques is evaluated. The influence of the boundary conditions on the simulation of wind statistics is investigated. Both models demonstrate a reasonable performance over flat homogeneous terrain and deficiencies over complex terrain, such as the Upper Rhine Valley, due to a too coarse spatial resolution (∼50 km). When the spatial resolution is increased to 10 and 20 km respectively, a benefit is found for the simulation of the wind direction only. A sensitivity analysis shows major deviations between international land cover datasets. A time series analysis of dynamically downscaled simulations is conducted. While the annual cycle and the interannual variability are well simulated, the models are less effective at simulating small-scale fluctuations and the diurnal cycle. The hypothesis that strong winds are underestimated by RCMs is supported by means of a storm analysis. Only two-thirds of the observed storms are simulated by the model using a spectral nudging approach. In addition, "false alarms" are simulated which are not detected in the observations. A trend analysis over the period 1961-2000 is conducted.
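
    The storm comparison is, at heart, a categorical verification. A generic sketch (our illustration, not the author's code) of the usual contingency scores, hit rate and false-alarm ratio, computed from matched storm lists:

        def contingency_scores(observed: set, simulated: set):
            """Hit rate and false-alarm ratio for categorical event verification;
            the sets hold identifiers (here, dates) of detected storm events."""
            hits = len(observed & simulated)
            misses = len(observed - simulated)
            false_alarms = len(simulated - observed)
            hit_rate = hits / (hits + misses)
            far = false_alarms / (hits + false_alarms)
            return hit_rate, far

        # Hypothetical matched storm lists from observations and the RCM run
        obs = {"1990-01-25", "1990-02-26", "1999-12-03"}
        sim = {"1990-01-25", "1999-12-03", "1995-01-02"}
        print(contingency_scores(obs, sim))  # (0.667, 0.333): 2/3 hit, 1 false alarm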

  16. Verification and validation of the THYTAN code for the graphite oxidation analysis in the HTGR systems

    International Nuclear Information System (INIS)

    Shimazaki, Yosuke; Isaka, Kazuyoshi; Nomoto, Yasunobu; Seki, Tomokazu; Ohashi, Hirofumi

    2014-12-01

    The analytical models for the evaluation of graphite oxidation were implemented into the THYTAN code, which employs a mass balance and a node-link computational scheme to evaluate tritium behavior in High Temperature Gas-cooled Reactor (HTGR) systems for hydrogen production, in order to analyze graphite oxidation during air or water ingress accidents in HTGR systems. This report describes the analytical models of the THYTAN code for graphite oxidation analysis and its verification and validation (V and V) results. Mass transfer from the gas mixture in the coolant channel to the graphite surface, diffusion in the graphite, graphite oxidation by air or water, chemical reaction, and release from the primary circuit to the containment vessel by a safety valve were modeled to calculate the mass balance in the graphite and in the gas mixture in the coolant channel. The computed solutions using the THYTAN code for simple questions were compared to analytical results obtained by hand calculation to verify the algorithms of each implemented model. A representative graphite oxidation experiment was then analyzed using the THYTAN code, and the results were compared to the experimental data and to computed solutions using the GRACE code, which was used for the safety analysis of the High Temperature Engineering Test Reactor (HTTR), with regard to the corrosion depth of graphite and the oxygen concentration at the outlet of the test section, in order to validate the analytical models of the THYTAN code. The comparison of the THYTAN code results with the analytical solutions, the experimental data and the GRACE code results showed good agreement. (author)

  17. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    for states that have traditionally had 'less transparency' in their military sectors. As case studies, we first investigate how to apply verification measures, including remote sensing, off-site environmental sampling and on-site inspections, to monitor the shutdown status of plutonium production facilities, and what measures could be taken to prevent the disclosure of sensitive information at the site. We find that the most effective verification measure for monitoring the status of a reprocessing plant would be on-site environmental sampling. Some countries may worry that sample analysis could disclose sensitive information about their past plutonium production activities. However, we find that sample analysis at the reprocessing site need not reveal such information, as long as inspectors are not able to measure the total quantities of Cs-137 and Sr-90 in the HLW produced at former military plutonium production facilities. Secondly, we consider verification measures for shutdown gaseous diffusion uranium-enrichment plants (GDPs). The GDPs could be monitored effectively by satellite imagery, as one telltale operational signature of a GDP would be the water-vapor plume coming from the cooling tower, which should be easy to detect in satellite images. Furthermore, the hot roof of the enrichment building could be detectable in satellite thermal-infrared images. In addition, some on-site verification measures should be allowed, such as visual observation, surveillance and tamper-indicating seals. Finally, an FMCT verification regime would have to be designed to detect undeclared fissile material production activities and facilities. These verification measures could include something like special or challenge inspections or complementary access. There would need to be provisions to prevent the abuse of such inspections, especially at sensitive and non-proscribed military and nuclear activities. In particular, to protect sensitive

  18. Simple quasi-analytical holonomic homogenization model for the non-linear analysis of in-plane loaded masonry panels: Part 1, meso-scale

    Science.gov (United States)

    Milani, G.; Bertolesi, E.

    2017-07-01

    A simple quasi-analytical holonomic homogenization approach for the non-linear analysis of in-plane loaded masonry walls is presented. The elementary cell (REV) is discretized with 24 triangular elastic constant-stress elements (bricks) and non-linear interfaces (mortar). A holonomic behavior with softening is assumed for the mortar. It is shown how the mechanical problem in the unit cell is characterized by very few displacement variables and how the homogenized stress-strain behavior can be evaluated semi-analytically.

  19. Content analysis of age verification, purchase and delivery methods of internet e-cigarette vendors, 2013 and 2014.

    Science.gov (United States)

    Williams, Rebecca S; Derrick, Jason; Liebman, Aliza Kate; LaFleur, Kevin; Ribisl, Kurt M

    2018-05-01

    Identify the population of internet e-cigarette vendors (IEVs) and conduct content analyses of their age verification, purchase and delivery methods in 2013 and 2014. We used multiple sources to identify IEV websites, primarily complex search algorithms scanning more than 180 million websites. In 2013, we manually screened 32,446 websites, identifying 980 IEVs, and selected the 281 most popular for content analysis. This methodology yielded 31,239 websites for screening in 2014, identifying 3,096 IEVs, with 283 selected for content analysis. The proportion of vendors that sold online-only, with no retail store, dropped significantly from 2013 (74.7%) to 2014 (64.3%). Few vendors used online age verification services (7.1% in 2013 and 8.5% in 2014) or driving licences (1.8% in 2013 and 7.4% in 2014). Stronger regulation of online e-cigarette sales is needed, including strict age and identity verification requirements.

  20. Identification of homogeneous regions for rainfall regional frequency analysis considering typhoon event in South Korea

    Science.gov (United States)

    Heo, J. H.; Ahn, H.; Kjeldsen, T. R.

    2017-12-01

    South Korea is prone to large, and often disastrous, rainfall events caused by a mixture of monsoon and typhoon rainfall phenomena. Traditionally, however, regional frequency analysis models did not consider this mixture of phenomena when fitting probability distributions, potentially underestimating the risk posed by the more extreme typhoon events. Using long-term observed records of extreme rainfall from 56 sites, combined with detailed information on the timing and spatial impact of past typhoons from the Korea Meteorological Administration (KMA), this study developed and tested a new mixture model for frequency analysis of two different phenomena: events occurring regularly every year (monsoon) and events occurring only in some years (typhoon). The available annual maximum 24-hour rainfall data were divided into two sub-samples corresponding to years where the annual maximum came from either (1) a typhoon event, or (2) a non-typhoon event. A three-parameter GEV distribution was then fitted to each sub-sample, along with a weighting parameter characterizing the proportion of historical events associated with typhoons. Spatial patterns of the model parameters were analyzed and showed that typhoon events are less commonly associated with annual maximum rainfall in the north-western part of the country (Seoul area), and more prevalent in the southern and eastern parts, leading to the formation of two distinct typhoon regions: (1) North-West; and (2) Southern and Eastern. Using a leave-one-out procedure, the new regional frequency model was tested and compared to a more traditional index flood method. The results showed that the impact of typhoons on design events might previously have been underestimated in the Seoul area. This suggests that the mixture model should be preferred where the typhoon phenomenon is less frequent and can thus have a significant effect on the rainfall-frequency curve. This research was supported by a grant(2017-MPSS31
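
    A sketch of the two-component annual-maximum model described above (our reading of the record, with invented parameter values): with probability w the annual maximum comes from the typhoon population and otherwise from the non-typhoon population, so the non-exceedance probability is a weighted combination of two GEV distributions.

        from scipy.optimize import brentq
        from scipy.stats import genextreme

        # Hypothetical GEV parameters (note: scipy's c is the negated shape)
        non_typhoon = genextreme(c=-0.10, loc=120.0, scale=40.0)   # mm per 24 h
        typhoon     = genextreme(c=-0.25, loc=180.0, scale=70.0)
        w = 0.35   # fraction of years whose annual maximum is a typhoon event

        def mixture_cdf(x):
            """Non-exceedance probability of the annual maximum under the mixture."""
            return (1 - w) * non_typhoon.cdf(x) + w * typhoon.cdf(x)

        # 100-year design rainfall: solve F(x) = 1 - 1/100 numerically
        x100 = brentq(lambda x: mixture_cdf(x) - 0.99, 50.0, 2000.0)
        print(f"100-year 24 h rainfall: {x100:.0f} mm")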

  1. Taxonomic analysis of perceived risk: modeling individual and group perceptions within homogeneous hazard domains

    International Nuclear Information System (INIS)

    Kraus, N.N.; Slovic, P.

    1988-01-01

    Previous studies of risk perception have typically focused on the mean judgments of a group of people regarding the riskiness (or safety) of a diverse set of hazardous activities, substances, and technologies. This paper reports the results of two studies that take a different path. Study 1 investigated whether models within a single technological domain were similar to previous models based on group means and diverse hazards. Study 2 created a group taxonomy of perceived risk for only one technological domain, railroads, and examined whether the structure of that taxonomy corresponded with taxonomies derived from prior studies of diverse hazards. Results from Study 1 indicated that the importance of various risk characteristics in determining perceived risk differed across individuals and across hazards, but not so much as to invalidate the results of earlier studies based on group means and diverse hazards. In Study 2, the detailed analysis of railroad hazards produced a structure that had both important similarities to, and dissimilarities from, the structure obtained in prior research with diverse hazard domains. The data also indicated that railroad hazards are really quite diverse, with some approaching nuclear reactors in their perceived seriousness. These results suggest that information about the diversity of perceptions within a single domain of hazards could provide valuable input to risk-management decisions

  2. Annual and seasonal analysis of temperature and precipitation in Andorra (Pyrenees) from 1934 to 2008: quality check, homogenization and trends

    Science.gov (United States)

    Esteban, Pere; Prohom, Marc; Aguilar, Enric; Mestre, Olivier

    2010-05-01

    The analysis of temperature and precipitation change and variability at high elevations is a difficult issue due to the lack of long-term climatic series in those environments. Nonetheless, it is important to evaluate whether high elevations follow the same climate evolution as low-lying sites. In this work, using daily data from three Andorran weather stations (maintained by the power company Forces Elèctriques d'Andorra, FEDA), climate trends of annual and seasonal temperature and precipitation were obtained for the period 1934-2008. The series are complete (99.9%) and the stations are located in a mountainous area ranging from 1110 m to 1600 m asl. As a preliminary step to the analysis, data rescue, quality control and homogeneity tests were applied to the daily data. For quality control, several procedures were applied to identify and flag suspicious or erroneous data: duplicated days, outliers, excessive differences between consecutive days, flat-line checking, days with maximum temperature lower than minimum temperature, and rounding analysis. All the station sites were visited to gather the available metadata. Concerning homogeneity, a homogeneous climate time series is defined as one where variations are caused only by variations in climate and not by non-climatic factors (i.e., changes in site location, instruments, station environment…). As a result, homogeneity of the series was inspected with several methodologies that were used in a complementary and independent way in order to attain solid results: the C3-SNHT (with software developed under the Spanish Government Grant CGL2007-65546-C03-02) and Caussinus-Mestre (C-M) approaches. In both cases, tests were applied to the mean annual temperature and precipitation series, using Catalan and French series as references (provided respectively by the Meteorological Service of Catalonia and Météo-France, in the framework of the Action COST-ES0601: Advances in homogenisation methods of climate series: an integrated
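
    A sketch of what such daily quality-control flags can look like in practice (a minimal illustration with synthetic data and illustrative thresholds, not the C3-SNHT or Caussinus-Mestre software):

        import numpy as np
        import pandas as pd

        def qc_flags(df: pd.DataFrame) -> pd.DataFrame:
            """Flag suspicious daily records; df has columns 'tmax' and 'tmin'
            (deg C). Thresholds are illustrative, not those of the study."""
            flags = pd.DataFrame(index=df.index)
            flags["tmax_lt_tmin"] = df["tmax"] < df["tmin"]          # impossible day
            z = (df["tmax"] - df["tmax"].mean()) / df["tmax"].std()
            flags["outlier"] = z.abs() > 4                           # gross outlier
            flags["big_jump"] = df["tmax"].diff().abs() > 15         # step between days
            flags["flat_line"] = df["tmax"].rolling(5).std() == 0    # 5 identical values
            return flags

        # Synthetic year of daily data with one planted error
        rng = np.random.default_rng(0)
        idx = pd.date_range("2000-01-01", periods=365, freq="D")
        tmax = 15 + 10 * np.sin(2 * np.pi * idx.dayofyear / 365) + rng.normal(0, 2, 365)
        df = pd.DataFrame({"tmax": tmax, "tmin": tmax - 8}, index=idx)
        df.loc["2000-07-01", "tmin"] = 40.0                          # planted tmax < tmin
        print(qc_flags(df).sum())                                    # flagged days per test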

  3. Preparation of in-house calibration source for the use in radioactivity analysis of the environmental samples. Consideration of homogeneity

    International Nuclear Information System (INIS)

    Aba, A.; Ismaeel, A.

    2013-01-01

    An in-house reference material has been prepared in the Kuwait Institute for Scientific Research radioecology laboratory for quality control of gamma spectrometer systems. The material contains a known amount of a uranium ore reference material (prepared by the International Atomic Energy Agency and coded IAEA-RGU-1) mixed with marine sediment collected from Kuwait Bay. The IAEA-RGU-1 is certified to be in an equilibrium state with its decay daughters and stable enough to be used for quality control purposes. Nevertheless, the homogeneous distribution of the doped material within the prepared source had to be verified. This was examined using gamma spectrometry measurements in conjunction with analysis-of-variance statistical tools, Dixon tests, box plots and Grubbs tests. The calculated total uncertainty was utilized to establish the recommended specific activity ranges of the 226Ra, 234Th, 214Pb, 214Bi and 210Pb radioisotopes in the prepared source. The results showed that the uncertainty arising from sample inhomogeneity makes a significant contribution to the total uncertainty. The stability control charts of the ultra-low background gamma spectrometry system demonstrated the suitability of the prepared material for quality control purposes. The gamma rays emitted by the prepared source cover the energy range required for the determination of natural and artificial radionuclides in different types of environmental samples, such as marine sediment, soil, and samples contaminated by naturally occurring radioactive material produced by the oil industry. In addition, the material might be used for system calibration should its traceability be proven. The experimental data revealed the significance of homogeneity in the preparation of environmental samples for radioactivity measurements, in particular when only small sample quantities are to be analyzed. (author)
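
    For illustration, a minimal two-sided Grubbs test as it might be applied to replicate specific-activity measurements from sub-samples (a sketch with invented data; the laboratory's actual protocol is not given in the record):

        import numpy as np
        from scipy import stats

        def grubbs_test(x, alpha=0.05):
            """Two-sided Grubbs test for a single outlier in a normal sample.
            Returns (G statistic, critical value, outlier suspected?)."""
            x = np.asarray(x, dtype=float)
            n = len(x)
            g = np.max(np.abs(x - x.mean())) / x.std(ddof=1)
            t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
            g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
            return g, g_crit, g > g_crit

        # Hypothetical replicate 226Ra activities (Bq/kg) from one bottle
        acts = [487, 492, 481, 495, 489, 534, 488]
        print(grubbs_test(acts))   # the 534 reading is flagged as suspect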

  4. Study and characterization of arrays of detectors for dosimetric verification of radiotherapy, analysis of business solutions

    International Nuclear Information System (INIS)

    Gago Arias, A.; Brualla Gonzalez, L.; Gomez Rodriguez, F.; Gonzalez Castano, D. M.; Pardo Montero, J.; Luna Vega, V.; Mosquera Sueiro, J.; Sanchez Garcia, M.

    2011-01-01

    This paper presents a comparative study of the detector arrays developed by different manufacturers in response to the demand for devices that speed up the verification process. We analyze the effect of the spatial response of the individual detectors on the measurement of dose distributions, model that response, and assess the ability of the arrays to detect variations in a treatment.

  5. The dynamic flowgraph methodology as a safety analysis tool : programmable electronic system design and verification

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2002-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM, and

  6. Introduction to the Special Issue on Specification Analysis and Verification of Reactive Systems

    NARCIS (Netherlands)

    Delzanno, Giorgio; Etalle, Sandro; Gabbrielli, Maurizio

    2006-01-01

    This special issue is inspired by the homonymous ICLP workshops that took place during ICLP 2001 and ICLP 2002. Extending and shifting slightly from the scope of their predecessors (on verification and logic languages) held in the context of previous editions of ICLP, the aim of the SAVE workshops

  7. Influence of microstructure on the thermal creep behaviour of zirconium alloys: experimental analysis and implementation of homogenization approaches

    International Nuclear Information System (INIS)

    Brenner, R.

    2001-01-01

    Zirconium alloys widely used in the nuclear industry can present thermomechanical variability in their behavior (especially in thermal creep) as a function of their microstructure. To better control the mechanical behavior of these alloys, and also to take into account possible evolutions of their fabrication process (chemical composition, thermal treatments, ...), it is important to have a modeling tool that helps describe the relationship between the microstructure and the macroscopic behavior. This study contributes to establishing such predictive modeling, based on an experimental analysis coupled with a homogenization approach, for the thermal creep behavior of Zr alloys. The experimental analysis of the crystallographic texture effect for Zircaloy-4 alloys shows how the strain rate and stress exponent of the different glide systems are anisotropic. Transmission Electron Microscopy analyses have been undertaken in order to determine the link between the texture and the slip systems activated under various mechanical tests (loading paths). The experimental analysis for Zr-Nb-1%-O brings to evidence the solid-solution effect of Nb on the hardening of this alloy and the weak effect of the precipitate distribution on the thermal creep behavior. An elasto-viscoplastic micromechanical model has been developed, taking into account the microstructure effects on the macroscopic behavior of Zr alloys. The 'quasi-elastic' approximation of the self-consistent scheme based on the affine formulation is proposed and compared with other and earlier formulations. The suitability of this formulation for our study is demonstrated, both from the scale-transition point of view and in terms of its simple numerical resolution. A good agreement is found for the description of the thermal creep behavior of Zircaloy-4 and Zr-Nb-1%-O alloys. The analysis of the results at a local scale (especially the secondary activities of the slip systems) gives the current limit for the description of

  8. Groundwater flow code verification ''benchmarking'' activity (COVE-2A): Analysis of participants' work

    International Nuclear Information System (INIS)

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project

  9. Status on development and verification of reactivity initiated accident analysis code for PWR (NODAL3)

    International Nuclear Information System (INIS)

    Peng Hong Liem; Surian Pinem; Tagor Malem Sembiring; Tran Hoai Nam

    2015-01-01

    A coupled neutronics thermal-hydraulics code, NODAL3, has been developed based on nodal few-group neutron diffusion theory in 3-dimensional Cartesian geometry for static and transient analyses of typical pressurized water reactors (PWRs), especially for reactivity initiated accidents (RIA). The spatial variables are treated using a polynomial nodal method (PNM), while the adiabatic and improved quasi-static methods are adopted for the neutron dynamics solver. A simple single-channel thermal-hydraulics module and its steam table are implemented in the code. Verification work on static and transient benchmarks is being conducted to assess the accuracy of the code. For the static benchmark verification, the IAEA-2D, IAEA-3D, BIBLIS and KOEBERG light water reactor (LWR) benchmark problems were selected, while for the transient benchmark verification, the OECD NEACRP 3-D LWR Core Transient Benchmark and the NEA-NSC 3-D/1-D PWR Core Transient Benchmark (uncontrolled withdrawal of control rods at zero power) were selected. Excellent agreement of the NODAL3 results with the reference solutions and with other validated nodal codes was confirmed. (author)

  10. Verification and benchmarking of PORFLO: an equivalent porous continuum code for repository scale analysis

    International Nuclear Information System (INIS)

    Eyler, L.L.; Budden, M.J.

    1984-11-01

    The objective of this work was to perform an assessment of prediction capabilities and features of the PORFLO code in relation to its intended use in the Basalt Waste Isolation Project. This objective was to be accomplished through a code verification and benchmarking task. Results were to be documented which either support correctness of prediction capabilities or identify areas of intended application in which the code exhibits weaknesses. A test problem set consisting of 10 problems was developed. Results of PORFLO simulations of these problems were provided for use in this work. The 10 problems were designed to test the three basic computational capabilities or categories of the code. Broken down by physical process, these are heat transfer, fluid flow, and radionuclide transport. Two verification problems were included within each of these categories. They were problems designed to test basic features of PORFLO for which analytical solutions are available for use as a known comparison basis. Hence they are referred to as verification problems. Of the remaining four problems, one repository scale problem representative of intended PORFLO use within BWIP was included in each of the three basic capabilities categories. The remaining problem was a case specifically designed to test features of decay and retardation in radionuclide transport. These four problems are referred to as benchmarking problems, because results computed with an additional computer code were used as a basis for comparison. 38 figures

  11. Optimal design for crosstalk analysis in 12-core 5-LP mode homogeneous multicore fiber for different lattice structure

    Science.gov (United States)

    Kumar, Dablu; Ranjan, Rakesh

    2018-03-01

    12-core 5-LP-mode homogeneous multicore fibers are proposed for the analysis of inter-core crosstalk and dispersion, with four different lattice structures (circular, 2-ring, square lattice, and triangular lattice), a cladding diameter of 200 μm and a fixed cladding thickness of 35 μm. The core-to-core crosstalk has been studied numerically with respect to bending radius, core pitch, transmission distance, wavelength, and core diameter for all 5 LP modes. To further reduce the crosstalk levels, trench-assisted cores have been incorporated in all the respective designs. Ultra-low crosstalk (-138 dB/100 km) has been achieved with the triangular lattice arrangement, with trench depth Δ2 = -1.40%, for the fundamental (LP01) mode. The impact of mode polarization on the crosstalk behavior is minor, the difference in crosstalk levels between two polarized spatial modes being ≤0.2 dB. Moreover, the optimized cladding diameter has been obtained for all 5 LP modes, for a target crosstalk of -50 dB/100 km, with all the core arrangements. The dispersion characteristic has also been analyzed with respect to wavelength and is nearly 2.5 ps/(nm·km) at the operating wavelength of 1550 nm. The relative core multiplicity factor (RCMF) of the proposed design is 64.
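
    For context (a standard coupled-power estimate from the multicore-fiber literature, not quoted from this paper), the mean crosstalk between two identical cores of a bent homogeneous multicore fiber grows linearly with bending radius R and fiber length L and falls with core pitch Λ:

        XT \simeq \frac{2\kappa^{2} R L}{\beta \Lambda}

    where κ is the inter-core mode-coupling coefficient and β the propagation constant; this is consistent with the parametric trends reported above.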

  12. An Analysis of Nonlinear Elastic Deformations for a Homogeneous Beam at Varying Tip Loads and Pitch Angles

    National Research Council Canada - National Science Library

    McGraw, Robert J

    2006-01-01

    .... The recorded data, specifically for homogeneous beams of 7075 aluminum, have been referenced as a baseline for the past thirty years to validate numerous computer models and theories in an effort...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, ENVIRONMENTAL DECISION SUPPORT SOFTWARE, UNIVERSITY OF TENNESSEE RESEARCH CORPORATION, SPATIAL ANALYSIS AND DECISION ASSISTANCE (SADA)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  14. Quality Assurance in Environmental Technology Verification (ETV): Analysis and Impact on the EU ETV Pilot Programme Performance

    Science.gov (United States)

    Molenda, Michał; Ratman-Kłosińska, Izabela

    2018-03-01

    Many innovative environmental technologies never reach the market because they are new and cannot demonstrate a successful track record of previous applications. This fact is a serious obstacle on their way to the market. A lack of credible data on the performance of a technology causes mistrust of innovations among investors, especially from the public sector, who seek effective solutions but without compromising on the technical and financial risks associated with their implementation. Environmental technology verification (ETV) offers a credible, robust and transparent process that results in a third-party confirmation of the claims made by providers about the performance of novel environmental technologies. Verifications of performance are supported by high-quality, independent test data. In that way ETV, as a tool, helps establish vendor credibility and buyer confidence. Several countries across the world have implemented ETV in the form of national or regional programmes. ETV in the European Union was implemented as a voluntary scheme in the form of a pilot programme. The European Commission launched the Environmental Technology Verification Pilot Programme of the European Union (EU ETV) in 2011. The paper describes the European model of ETV as set up and put into operation under the EU ETV Pilot Programme. The goal, objectives, technological scope and entities involved are presented. An attempt has been made to summarise the results of the EU ETV scheme's performance for the period from 2012, when the programme became fully operational, until the first half of 2016. The study was aimed at analysing the overall organisation and efficiency of the EU ETV Pilot Programme. The study was based on an analysis of the documents governing the operation of the EU ETV system. For this purpose, a relevant statistical analysis of the data on the performance of the EU ETV system provided by the European Commission was carried out.

  15. Homogeneous turbulence dynamics

    CERN Document Server

    Sagaut, Pierre

    2018-01-01

    This book provides state-of-the-art results and theories in homogeneous turbulence, including anisotropy and compressibility effects, with extensions to quantum turbulence, magneto-hydrodynamic turbulence and turbulence in non-Newtonian fluids. Each chapter is devoted to a given type of interaction (strain, rotation, shear, etc.), and presents and compares experimental data, numerical results, analyses of the Reynolds stress budget equations and advanced multipoint spectral theories. The role of both linear and non-linear mechanisms is emphasized. The link between the statistical properties and the dynamics of coherent structures is also addressed. Despite its restriction to homogeneous turbulence, the book is of interest to all people working in turbulence, since the basic physical mechanisms which are present in all turbulent flows are explained. The reader will find a unified presentation of the results and a clear presentation of existing controversies. Special attention is given to bridge the results obta...

  16. Homogeneous crystal nucleation in polymers.

    Science.gov (United States)

    Schick, C; Androsch, R; Schmelzer, J W P

    2017-11-15

    The pathway of crystal nucleation significantly influences the structure and properties of semi-crystalline polymers. Crystal nucleation is normally heterogeneous at low supercooling, and homogeneous at high supercooling, of the polymer melt. Homogeneous nucleation in bulk polymers has been, so far, hardly accessible experimentally, and was even doubted to occur at all. This topical review summarizes experimental findings on homogeneous crystal nucleation in polymers. Recently developed fast scanning calorimetry, with cooling and heating rates up to 10⁶ K s⁻¹, allows for detailed investigations of nucleation near and even below the glass transition temperature, including analysis of nuclei stability. As for other materials, the maximum homogeneous nucleation rate for polymers is located close to the glass transition temperature. In the experiments discussed here, it is shown that polymer nucleation is homogeneous at such temperatures. Homogeneous nucleation in polymers is discussed in the framework of the classical nucleation theory. The majority of our observations are consistent with the theory. The discrepancies may guide further research, particularly experiments to progress theoretical development. Progress in the understanding of homogeneous nucleation is much needed, since most of the modelling approaches dealing with polymer crystallization exclusively consider homogeneous nucleation. This is also the basis for advancing theoretical approaches to the much more complex phenomena governing heterogeneous nucleation.
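
    The classical-nucleation-theory framework referred to above expresses the steady-state homogeneous nucleation rate in the standard textbook form (quoted here for orientation, not from the review itself):

        J = J_{0}\exp\!\left(-\frac{\Delta G^{*}}{k_{B}T}\right), \qquad \Delta G^{*} = \frac{16\pi\,\sigma^{3}}{3\,\Delta g_{v}^{2}}

    where σ is the crystal-melt interfacial free energy and Δg_v the bulk free-energy difference per unit volume; the competition between the shrinking barrier ΔG* and the slowing melt dynamics is what places the maximum nucleation rate near the glass transition temperature.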

  17. Whole-core thermal-hydraulic transient code development and verification for LMFBR analysis

    International Nuclear Information System (INIS)

    Spencer, D.R.

    1979-04-01

    Predicted performance during both steady-state and transient reactor operation determines the steady-state operating limits of LMFBRs. Unnecessary conservatism in performance predictions will not contribute to safety, but will restrict the reactor to more conservative, less economical steady-state operation. The most general method for reducing analytical conservatism in LMFBRs without compromising safety is to develop, validate and apply more sophisticated computer models to the limiting performance analyses. The purpose of the ongoing Natural Circulation Verification Program (NCVP) is to develop and validate computer codes to analyze natural circulation transients in LMFBRs, and thus replace unnecessary analytical conservatism with demonstrated calculational capability

  18. HOMOGENEOUS UGRIZ PHOTOMETRY FOR ACS VIRGO CLUSTER SURVEY GALAXIES: A NON-PARAMETRIC ANALYSIS FROM SDSS IMAGING

    International Nuclear Information System (INIS)

    Chen, Chin-Wei; Cote, Patrick; Ferrarese, Laura; West, Andrew A.; Peng, Eric W.

    2010-01-01

    We present photometric and structural parameters for 100 ACS Virgo Cluster Survey (ACSVCS) galaxies based on homogeneous, multi-wavelength (ugriz), wide-field SDSS (DR5) imaging. These early-type galaxies, which trace out the red sequence in the Virgo Cluster, span a factor of nearly ∼10³ in g-band luminosity. We describe an automated pipeline that generates background-subtracted mosaic images, masks field sources and measures mean shapes, total magnitudes, effective radii, and effective surface brightnesses using a model-independent approach. A parametric analysis of the surface brightness profiles is also carried out to obtain Sersic-based structural parameters and mean galaxy colors. We compare the galaxy parameters to those in the literature, including those from the ACSVCS, finding good agreement in most cases, although the sizes of the brightest, and most extended, galaxies are found to be most uncertain and model dependent. Our photometry provides an external measurement of the random errors on total magnitudes from the widely used Virgo Cluster Catalog, which we estimate to be σ(B_T) ∼ 0.13 mag for the brightest galaxies, rising to ∼0.3 mag for galaxies at the faint end of our sample (B_T ∼ 16). The distribution of axial ratios of low-mass ("dwarf") galaxies bears a strong resemblance to the one observed for the higher-mass ("giant") galaxies. The global structural parameters for the full galaxy sample (profile shape, effective radius, and mean surface brightness) are found to vary smoothly and systematically as a function of luminosity, with unmistakable evidence for changes in structural homology along the red sequence. As noted in previous studies, the ugriz galaxy colors show a nonlinear but smooth variation over a ∼7 mag range in absolute magnitude, with an enhanced scatter for the faintest systems that is likely the signature of their more diverse star formation histories.

  19. Statistical analysis of the phytocoenose homogeneity. IV. Species number and mean biomass value as functions of the area size

    Directory of Open Access Journals (Sweden)

    Anna J. Kwiatkowska

    2014-01-01

    Full Text Available Homogeneity of the Leucobryo-Pinetum phytocoenose was assessed on the grounds of the effect of area size on the species number and mean biomass value. It was confirmed that: (1) species number was a logarithmic function of the area size; (2) the relation of individual species biomass to the area size was, as a rule, other than rectilinear; (3) the size of the phytocoenose's floristically representative area differed from that determined with respect to the standing biomass; and (4) phytocoenose homogeneity is related to the scale defined by the size of the representative area.
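
    The first finding is the classical species-area relation. A sketch (with invented data, purely to illustrate the functional form) of fitting S = a + b ln A by least squares:

        import numpy as np

        # Invented species counts S on nested plots of area A (square metres)
        A = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)
        S = np.array([3, 5, 6, 8, 9, 11, 12], dtype=float)

        # Least-squares fit of the logarithmic species-area form S = a + b*ln(A)
        b, a = np.polyfit(np.log(A), S, 1)
        print(f"S ≈ {a:.2f} + {b:.2f} ln A")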

  20. Homogeneous Spaces and Equivariant Embeddings

    CERN Document Server

    Timashev, DA

    2011-01-01

    Homogeneous spaces of linear algebraic groups lie at the crossroads of algebraic geometry, theory of algebraic groups, classical projective and enumerative geometry, harmonic analysis, and representation theory. By standard reasons of algebraic geometry, in order to solve various problems on a homogeneous space it is natural and helpful to compactify it keeping track of the group action, i.e. to consider equivariant completions or, more generally, open embeddings of a given homogeneous space. Such equivariant embeddings are the subject of this book. We focus on classification of equivariant em

  1. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...

  2. Homogenization-based interval analysis for structural-acoustic problem involving periodical composites and multi-scale uncertain-but-bounded parameters.

    Science.gov (United States)

    Chen, Ning; Yu, Dejie; Xia, Baizhan; Liu, Jian; Ma, Zhengdong

    2017-04-01

    This paper presents a homogenization-based interval analysis method for the prediction of coupled structural-acoustic systems involving periodical composites and multi-scale uncertain-but-bounded parameters. In the structural-acoustic system, the macro plate structure is assumed to be composed of a periodically uniform microstructure. The equivalent macro material properties of the microstructure are computed using the homogenization method. By integrating the first-order Taylor expansion interval analysis method with the homogenization-based finite element method, a homogenization-based interval finite element method (HIFEM) is developed to solve a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters. The corresponding formulations of the HIFEM are deduced. A subinterval technique is also introduced into the HIFEM for higher accuracy. Numerical examples of a hexahedral box and an automobile passenger compartment are given to demonstrate the efficiency of the presented method for a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters.
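
    The first-order Taylor interval step at the heart of the HIFEM can be illustrated generically (a sketch under our own assumptions, not the authors' finite-element code): the response is expanded around the parameter midpoints, and the response radius is bounded by the absolute sensitivities times the parameter radii.

        import numpy as np

        def interval_bounds(u, p_lo, p_hi):
            """First-order Taylor interval estimate of a scalar response u(p)
            when each parameter p_i lies in [p_lo_i, p_hi_i]."""
            p_lo, p_hi = np.asarray(p_lo, float), np.asarray(p_hi, float)
            pc, pr = (p_lo + p_hi) / 2, (p_hi - p_lo) / 2     # midpoints, radii
            grad = np.empty_like(pc)
            for i in range(len(pc)):
                h = 1e-6 * max(1.0, abs(pc[i]))               # scaled central difference
                dp = np.zeros_like(pc); dp[i] = h
                grad[i] = (u(pc + dp) - u(pc - dp)) / (2 * h)
            r = np.abs(grad) @ pr                             # first-order radius bound
            return u(pc) - r, u(pc) + r

        # Toy response: natural frequency of a 1-DOF system, f = sqrt(k/m)/(2*pi)
        f = lambda p: np.sqrt(p[0] / p[1]) / (2 * np.pi)      # p = [k, m]
        print(interval_bounds(f, [0.9e6, 9.5], [1.1e6, 10.5]))  # uncertain-but-bounded k, m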

  3. Analysis of the differences in breeding ratio and fissile inventory between heterogeneous and homogeneous liquid-metal fast breeder reactors

    International Nuclear Information System (INIS)

    Tzanos, C.P.

    1980-01-01

    The differences in fissile inventory and breeding ratio, with respect to the differences in fertile inventory and neutron spectrum, between equivalent heterogeneous and homogeneous configurations were analyzed. To quantify the effect of spectral changes on reaction rate ratios, a calculational scheme based on properly prepared one-group cross-section sets was used

  4. Analysis of viticultural potential and delineation of homogeneous viticultural zones in a temperate climate region of Romania

    Directory of Open Access Journals (Sweden)

    Liviu Mihai Irimia

    2014-09-01

    Significance and impact of the study: This study provides the necessary information for viticultural zoning in the Huşi wine growing region in Romania. The methodology allows to evaluate viticultural potential and to delineate homogeneous viticultural zones in wine growing regions with a temperate continental climate.

  5. Assessment of homogeneity of candidate reference material at the nanogram level and investigation on representativeness of single particle analysis using electron probe X ray microanalysis

    International Nuclear Information System (INIS)

    Ro, Chul-Un; Hoornaerta, S.; Griekena, R. van

    2002-01-01

    Particulate samples of a candidate reference material are evaluated on their bottle-to-bottle homogeneity using the electron probe X-ray microanalysis technique. The evaluation of homogeneity is done by applying Kolmogorov-Smirnov statistics to the quantitative electron probe X-ray microanalysis data. Due to a limitation, existing even in computer-controlled electron probe X-ray microanalysis, in terms of analysis time and expense, the number of particles analyzed is much smaller than that in the sample. Therefore, it is investigated whether this technique provides representative results for the characteristics of the sample, even though only a very small portion of the sample is actually analyzed. Furthermore, the number of particles required for the analysis, to ensure a certain level of reproducibility, e.g. 5% relative standard deviation, is determined by applying the Ingamells sampling theory. (author)

  6. Application of FE-analysis in Design and Verification of Bolted Joints according to VDI 2230 at CERN

    CERN Document Server

    AUTHOR|(CDS)2225945; Dassa, Luca; Welo, Torgeir

    This thesis investigates how finite element analysis (FEA) can be used to simplify and improve the analysis of bolted joints according to the guideline VDI 2230. Some aspects of how FEA can be applied to aid the design and verification of bolted joints are given in the guideline, but not in a streamlined way that makes them simple and efficient to apply. The scope of this thesis is to clarify how FEA and VDI 2230 can be combined in the analysis of bolted joints, and to present a streamlined workflow. The goal is to lower the threshold for carrying out such combined analyses. The resulting benefits are improved analysis validity and quality, and improved analysis efficiency. A case from the engineering department at CERN, where FEA has been used in the analysis of bolted joints, is used as a basis to identify challenges in combining FEA and VDI 2230. This illustrates the need for a streamlined analysis strategy and a well-described workflow. The case in question is the Helium vessel (pressure vessel) for the DQW Crab Cavities, whi...

  7. Evolution of metastable phases in silicon during nanoindentation: mechanism analysis and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Mylvaganam, K [Centre for Advanced Materials Technology, University of Sydney, NSW 2006 (Australia); Zhang, L C [School of Mechanical and Manufacturing Engineering, University of New South Wales, NSW 2052 (Australia); Eyben, P; Vandervorst, W [IMEC, Kapeldreef 75, B-3001 Leuven (Belgium); Mody, J, E-mail: k.mylvaganam@usyd.edu.a, E-mail: Liangchi.zhang@unsw.edu.a, E-mail: eyben@imec.b, E-mail: jamody@imec.b, E-mail: vdvorst@imec.b [KU Leuven, Electrical Engineering Department, INSYS, Kasteelpark Arenberg 10, B-3001 Leuven (Belgium)

    2009-07-29

    This paper explores the evolution mechanisms of metastable phases during nanoindentation on monocrystalline silicon. Both molecular dynamics (MD) and in situ scanning spreading resistance microscopy (SSRM) analyses were carried out on the Si(100) orientation and, for the first time, experimental verification was achieved quantitatively at the same nanoscopic scale. It was found that under equivalent indentation loads, the MD prediction agrees extremely well with the result experimentally measured using SSRM, in terms of the depth of the residual indentation marks and the onset, evolution and dimension variation of the metastable phases, such as β-Sn. A new six-coordinated silicon phase, Si-XIII, transformed directly from Si-I, was discovered. The investigation showed that there is a critical size of contact between the indenter and silicon, beyond which a crystal particle of distorted diamond structure will emerge between the indenter and the amorphous phase upon unloading.

  8. Verification test for on-line diagnosis algorithm based on noise analysis

    International Nuclear Information System (INIS)

    Tamaoki, T.; Naito, N.; Tsunoda, T.; Sato, M.; Kameda, A.

    1980-01-01

    An on-line diagnosis algorithm was developed and its verification test was performed using a minicomputer. This algorithm identifies the plant state by analyzing various system noise patterns, such as power spectral densities, coherence functions, etc., in three procedural steps. Each obtained noise pattern is examined using its distances from reference patterns prepared for various plant states. The plant state is then identified by synthesizing each result with an evaluation weight. This weight is determined automatically from the reference noise patterns prior to on-line diagnosis. The test was performed with 50 MW(th) steam generator noise data recorded under various controller parameter values. The algorithm's performance was evaluated based on a newly devised index. The results obtained with one kind of weight showed the algorithm's efficiency under a proper selection of noise patterns. Results for another kind of weight showed the robustness of the algorithm to this selection. (orig.)
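
    A generic sketch of the pattern-distance idea (our illustration, not the original minicomputer implementation): estimate the power spectral density of a new noise record and score it against stored reference PSDs with a simple distance.

        import numpy as np
        from scipy.signal import welch

        def diagnose(record, references, fs=100.0):
            """Identify the plant state whose reference PSD is closest (mean
            squared log-spectral distance) to the PSD of the new record."""
            _, pxx = welch(record, fs=fs, nperseg=256)
            dists = {state: float(np.mean((np.log10(pxx) - np.log10(ref)) ** 2))
                     for state, ref in references.items()}
            return min(dists, key=dists.get), dists

        # Hypothetical reference patterns built from labelled training records
        rng = np.random.default_rng(0)
        refs = {
            "normal":  welch(rng.normal(size=8192), fs=100.0, nperseg=256)[1],
            "anomaly": welch(np.cumsum(rng.normal(size=8192)), fs=100.0, nperseg=256)[1],
        }
        state, dists = diagnose(rng.normal(size=8192), refs)
        print(state)   # a fresh white-noise record matches the "normal" pattern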

  9. Initial verification and validation of RAZORBACK - A research reactor transient analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code work to date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.
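
    The point reactor kinetics backbone of such a code can be sketched in a few lines (one delayed-neutron group, constant reactivity step, illustrative parameter values; this is not Razorback itself):

        import numpy as np
        from scipy.integrate import solve_ivp

        # One-group point kinetics: dn/dt = ((rho - beta)/Lam) n + lam c,
        #                           dc/dt = (beta/Lam) n - lam c
        beta, Lam, lam = 0.0076, 1e-4, 0.08     # illustrative kinetics parameters
        rho = 0.5 * beta                        # 0.5 $ reactivity step

        def rhs(t, y):
            n, c = y
            return [((rho - beta) / Lam) * n + lam * c,
                    (beta / Lam) * n - lam * c]

        y0 = [1.0, beta / (Lam * lam)]          # steady state at n = 1
        sol = solve_ivp(rhs, (0.0, 5.0), y0, method="LSODA", rtol=1e-8)
        print(f"relative power after 5 s: {sol.y[0, -1]:.2f}")  # about 3x initial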

  10. The homogeneous geometries of real hyperbolic space

    DEFF Research Database (Denmark)

    Castrillón López, Marco; Gadea, Pedro Martínez; Swann, Andrew Francis

    We describe the holonomy algebras of all canonical connections of homogeneous structures on real hyperbolic spaces in all dimensions. The structural results obtained then lead to a determination of the types, in the sense of Tricerri and Vanhecke, of the corresponding homogeneous tensors. We use our analysis to show that the moduli space of homogeneous structures on real hyperbolic space has two connected components.

  11. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
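
    The statistical transfer from elementary tests to a full-scale structure can be made concrete with the standard weakest-link (Weibull) size scaling, assumed here in its textbook form since the guideline's exact formulation is not quoted:

        import numpy as np

        def weibull_pf(sigma, V, sigma0, m, V0):
            """Weakest-link failure probability of a uniformly stressed volume V,
            given coupon-level Weibull parameters (sigma0, m) measured on V0."""
            return 1.0 - np.exp(-(V / V0) * (sigma / sigma0) ** m)

        # Illustrative numbers only: coupon volume 1 cm^3, part volume 500 cm^3
        print(weibull_pf(sigma=60.0, V=500.0, sigma0=150.0, m=10, V0=1.0))  # ~0.051

    The same relation explains why proof testing and large Weibull moduli are so important for ceramics: failure probability grows with the stressed volume, so coupon data alone understate the risk for a full-scale structure.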

  12. Analysis and Verification of Message Sequence Charts of Distributed Systems with the Help of Coloured Petri Nets

    Directory of Open Access Journals (Sweden)

    S. A. Chernenok

    2014-01-01

    Full Text Available The standard language of message sequence charts (MSC) is intended to describe scenarios of object interaction. Due to their expressiveness and simplicity, MSC diagrams are widely used in practice at all stages of system design and development. In particular, the MSC language is used for describing communication behavior in distributed systems and communication protocols. In this paper, a method for the analysis and verification of MSC and HMSC diagrams is considered. The method is based on the translation of (H)MSC into coloured Petri nets. The translation algorithms cover most standard elements of the MSC, including data concepts. Size estimates are given for the CPN resulting from the translation. Properties of the resulting CPN are analyzed and verified using the well-known CPN Tools system and a CPN verifier based on the SPIN tool. The translation method is demonstrated by an example.

  13. Finite element program ARKAS: verification for IAEA benchmark problem analysis on core-wide mechanical analysis of LMFBR cores

    International Nuclear Information System (INIS)

    Nakagawa, M.; Tsuboi, Y.

    1990-01-01

    "ARKAS" code verification, with the problems set in the International Working Group on Fast Reactors (IWGFR) Coordinated Research Programme (CRP) on the inter-comparison between liquid metal cooled fast breeder reactor (LMFBR) core mechanics codes, is discussed. The CRP was coordinated by the IWGFR around problems set by Dr. R.G. Anderson (UKAEA) and arose from the IWGFR specialists' meeting on The Predictions and Experience of Core Distortion Behaviour (ref. 2). The problems for verification ("code against code") and validation ("code against experiment") were set and calculated by eleven core mechanics codes from nine countries. All the problems have been completed and were solved with the core structural mechanics code ARKAS. Predictions by ARKAS agreed very well with other solutions for the well-defined verification problems. For the validation problems, based on Japanese ex-reactor 2-D thermo-elastic experiments, the agreement between measured and calculated values was fairly good. This paper briefly describes the numerical model of the ARKAS code and discusses some typical results. (author)

  14. Analysis and verification of a prediction model of solar energetic proton events

    Science.gov (United States)

    Wang, J.; Zhong, Q.

    2017-12-01

    Solar energetic particle events can cause severe radiation damage near Earth. Alerts and summary products for solar energetic proton events are provided by the Space Environment Prediction Center (SEPC) according to the flux of greater-than-10 MeV protons measured by the GOES satellite in geosynchronous orbit. The start of a solar energetic proton event is defined as the time when the flux of greater-than-10 MeV protons equals or exceeds 10 proton flux units (pfu). In this study, a model was developed to predict solar energetic proton events and provide warnings at least minutes in advance, based on both the soft X-ray flux and the integral proton flux measured by GOES. The quality of the forecast model was measured with verification metrics of accuracy, reliability, discrimination capability, and forecast skill. The peak flux and rise time of the solar energetic proton events in the six channels >1 MeV, >5 MeV, >10 MeV, >30 MeV, >50 MeV and >100 MeV were also simulated and analyzed.
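
    The event definition above translates directly into a detection rule; a sketch with synthetic data (hypothetical layout, not SEPC's operational code):

        import numpy as np

        def sep_event_start(times, flux_gt10mev, threshold=10.0):
            """Return the first time the >10 MeV proton flux reaches `threshold`
            pfu, or None if no solar energetic proton event occurs."""
            above = np.asarray(flux_gt10mev) >= threshold
            if not above.any():
                return None
            return times[np.argmax(above)]       # index of the first True

        t = np.arange(0, 120, 5)                 # minutes, hypothetical cadence
        flux = 0.3 * np.exp(t / 18.0)            # synthetic rising proton flux
        print(sep_event_start(t, flux))          # -> 65 (first sample >= 10 pfu)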

  15. Pre-Analysis for Safety-Related Verification Test Using TASS/SMR Code

    Energy Technology Data Exchange (ETDEWEB)

    Ra, I. S.; Kim, H. J.; Jeon, G. H. [ACTS Ltd., Daejeon (Korea, Republic of)

    2010-01-15

    General trends of the TASS/SMR simulations were similar to those of both the ORNF test and the BENNETT test, which were conducted to verify the core heat transfer model in TASS/SMR. At high mass flux, however, the CHF location in the TASS/SMR result deviated greatly from the BENNETT test result. TASS/SMR gave better results with the heterogeneous option than with the homogeneous option in both the KIT test, a steady-state test with an inlet flow, and the GE level-swell test, a transient test without an inlet flow. The TASS/SMR simulations of the SMD long and short tests agreed well with the test results, indicating a reasonable predictive capability of the critical flow model. In the case of the Marviken test, however, the analytical result departed from the test result after the onset of vapor generation

  16. Particle size analysis of lamb meat: Effect of homogenization speed, comparison with myofibrillar fragmentation index and its relationship with shear force.

    Science.gov (United States)

    Karumendu, L U; Ven, R van de; Kerr, M J; Lanza, M; Hopkins, D L

    2009-08-01

    The impact of homogenization speed on particle size (PS) results was examined using samples from the M. longissimus thoracis et lumborum (LL) of 40 lambs. One-gram duplicate samples from meat aged for 1 and 5 days were homogenized at five different speeds: 11,000, 13,000, 16,000, 19,000 and 22,000 rpm. In addition, LL samples from 30 different lamb carcases, also aged for 1 and 5 days, were used to study the comparison between PS and myofibrillar fragmentation index (MFI) values. In this case, 1 g duplicate samples (n=30) were homogenized at 16,000 rpm and the other half (0.5 g samples) at 11,000 rpm (n=30). The homogenates were then subjected to the respective combinations of treatments, which included either PS analysis or the determination of MFI, both with or without three cycles of centrifugation. All 140 samples of LL included 65 g blocks for subsequent shear force (SF) testing. Homogenization at 16,000 rpm provided the greatest ability to detect ageing differences in particle size between samples aged for 1 and 5 days. Particle size at the 25% quantile provided the best result for detecting differences due to ageing. It was observed that as ageing increased the mean PS decreased, being significantly (P<0.001) lower for 5-day aged samples than for 1-day aged samples, while MFI values significantly increased (P<0.001) as the ageing period increased. When comparing the PS and MFI methods it became apparent that, as opposed to the MFI method, there was a greater coefficient of variation for the PS method, which warranted a quality assurance system. Given this requirement, and examination of the mean, standard deviation and the 25% quantile of the PS data, it was concluded that three cycles of centrifugation were not necessary; this also applied to the MFI method. There were significant correlations (P<0.001), within the same lamb loin sample aged for a given period, between mean MFI and mean PS (-0.53), mean MFI and mean SF (-0.38), and mean PS and mean SF (0.23). It was

  17. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software verification. Methods of software verification are designed to check software for compliance with the stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods vary both in their mode of operation and in the way they achieve their result. The article describes static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In the review of static analysis, the deductive method and model checking methods are discussed and described; the pros and cons of each method are emphasized, and a classification of testing techniques for each method is considered. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependences connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, along with some kinds of tools that can be applied to software when using dynamic analysis methods. Based on this work a conclusion is drawn describing the most relevant problems of these analysis techniques, methods for their solution and
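
    As a small illustration of one of the dynamic methods surveyed (runtime monitoring), the hedged sketch below turns a stated requirement into an executable check; the toy function and its contracts are invented for illustration, not taken from the article:

```python
# Minimal sketch of one dynamic-analysis technique mentioned in the article:
# runtime monitoring. A decorator checks a precondition and postcondition on
# every call. The example function and contracts are illustrative.
import functools

def monitor(pre, post):
    def wrap(f):
        @functools.wraps(f)
        def inner(*args):
            assert pre(*args), f"precondition violated for {f.__name__}{args}"
            result = f(*args)
            assert post(result, *args), f"postcondition violated: {result}"
            return result
        return inner
    return wrap

@monitor(pre=lambda x: x >= 0, post=lambda r, x: abs(r * r - x) < 1e-9)
def toy_sqrt(x):
    return x ** 0.5

print(toy_sqrt(2.0))   # passes both checks
# toy_sqrt(-1.0)       # would raise AssertionError at the precondition
```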

  18. Mechanical Homogenization Increases Bacterial Homogeneity in Sputum

    Science.gov (United States)

    Stokell, Joshua R.; Khan, Ammad

    2014-01-01

    Sputum obtained from patients with cystic fibrosis (CF) is highly viscous and often heterogeneous in bacterial distribution. Adding dithiothreitol (DTT) is the standard method for liquefaction prior to processing sputum for molecular detection assays. To determine if DTT treatment homogenizes the bacterial distribution within sputum, we measured the difference in mean total bacterial abundance and abundance of Burkholderia multivorans between aliquots of DTT-treated sputum samples with and without a mechanical homogenization (MH) step using a high-speed dispersing element. Additionally, we measured the effect of MH on bacterial abundance. We found a significant difference between the mean bacterial abundances in aliquots that were subjected to only DTT treatment and those of the aliquots which included an MH step (all bacteria, P = 0.04; B. multivorans, P = 0.05). There was no significant effect of MH on bacterial abundance in sputum. Although our results are from a single CF patient, they indicate that mechanical homogenization increases the homogeneity of bacteria in sputum. PMID:24759710

  19. Simple quasi-analytical holonomic homogenization model for the non-linear analysis of in-plane loaded masonry panels: Part 2, structural implementation and validation

    Science.gov (United States)

    Milani, G.; Bertolesi, E.

    2017-07-01

    The simple quasi-analytical holonomic homogenization approach for the non-linear analysis of in-plane loaded masonry presented in Part 1 is here implemented at the structural level and validated. For this implementation, a Rigid Body and Spring Mass model (RBSM) is adopted, relying on a numerical model of rigid elements interconnected by homogenized inelastic normal and shear springs placed at the interfaces between adjoining elements. This approach is also known as HRBSM. The inherent advantage is that it is not necessary to solve a homogenization problem at each load step in each Gauss point, and a direct implementation into commercial software by means of an external user-supplied subroutine is straightforward. In order to gain insight into the capability of the present approach to reproduce masonry behavior at a structural level, non-linear static analyses are conducted on a shear wall for which experimental and numerical data are available in the technical literature. Quite accurate results are obtained with a very limited computational effort.

  20. Performance Verification of GOSAT-2 FTS-2 Simulator and Sensitivity Analysis for Greenhouse Gases Retrieval

    Science.gov (United States)

    Kamei, A.; Yoshida, Y.; Dupuy, E.; Hiraki, K.; Matsunaga, T.

    2015-12-01

    performance verification of the GOSAT-2 FTS-2 simulator and describe the future prospects for Level 2 retrieval. In addition, we will present various sensitivity analyses relating to the engineering parameters and the atmospheric conditions in Level 1 processing for greenhouse gas retrieval.

  1. Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting

    National Research Council Canada - National Science Library

    Piskator, Gene

    1998-01-01

    ...) model and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simulation of a known production function and validated using sensitivity analysis and ex-post forecasts...
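
    The record does not give the DEA formulation used, but the standard input-oriented CCR envelopment model can be sketched as a linear program; the data below (two inputs, one output, four decision-making units) is invented for illustration:

```python
# Hedged sketch of an input-oriented CCR DEA model (the report's own DEA
# formulation is not reproduced here). For each decision-making unit (DMU)
# we solve: min theta  s.t.  X @ lam <= theta * x0,  Y @ lam >= y0, lam >= 0.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0, 4.0, 6.0],    # inputs: rows = input, cols = DMU
              [3.0, 1.0, 3.0, 2.0]])
Y = np.array([[1.0, 1.0, 2.0, 1.5]])   # outputs

def ccr_efficiency(j0):
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # minimize theta
    A_in = np.c_[-X[:, j0], X]                  # X lam - theta x0 <= 0
    A_out = np.c_[np.zeros(s), -Y]              # -Y lam <= -y0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    bounds = [(None, None)] + [(0, None)] * n   # theta free, lam >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[0]

for j in range(X.shape[1]):
    print(f"DMU {j}: efficiency = {ccr_efficiency(j):.3f}")
```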

  2. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
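
    The swarm idea — many small, independently randomized searches of the same state space run in parallel, with the first hit reported — can be sketched as follows. This is a conceptual illustration with a toy transition system, not the SPIN-based implementation the paper describes:

```python
# Conceptual sketch of swarm-style search: several workers explore one
# state space with different random seeds; any worker may find the error
# state ("needle"). The toy transition system is invented for illustration.
import random
from concurrent.futures import ProcessPoolExecutor

def successors(state):
    x, y = state
    return [((x + 1) % 7, y), (x, (y + 3) % 11)]   # toy transition relation

def is_error(state):
    return state == (5, 9)                          # the sought "needle"

def random_walk(seed, steps=10_000):
    rng = random.Random(seed)
    state = (0, 0)
    for _ in range(steps):
        if is_error(state):
            return seed, state
        state = rng.choice(successors(state))
    return seed, None

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        for seed, hit in pool.map(random_walk, range(8)):
            if hit is not None:
                print(f"worker {seed} found error state {hit}")
```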

  3. Functionality and homogeneity.

    NARCIS (Netherlands)

    2011-01-01

    Functionality and homogeneity are two of the five Sustainable Safety principles. The functionality principle aims for roads to have but one exclusive function and distinguishes between traffic function (flow) and access function (residence). The homogeneity principle aims at differences in mass,

  4. Homogenization of Mammalian Cells.

    Science.gov (United States)

    de Araújo, Mariana E G; Lamberti, Giorgia; Huber, Lukas A

    2015-11-02

    Homogenization is the name given to the methodological steps necessary for releasing organelles and other cellular constituents as a free suspension of intact individual components. Most homogenization procedures used for mammalian cells (e.g., cavitation pump and Dounce homogenizer) rely on mechanical force to break the plasma membrane and may be supplemented with osmotic or temperature alterations to facilitate membrane disruption. In this protocol, we describe a syringe-based homogenization method that does not require specialized equipment, is easy to handle, and gives reproducible results. The method may be adapted for cells that require hypotonic shock before homogenization. We routinely use it as part of our workflow to isolate endocytic organelles from mammalian cells. © 2015 Cold Spring Harbor Laboratory Press.

  5. Statistical homogeneity tests applied to large data sets from high energy physics experiments

    Science.gov (United States)

    Trusina, J.; Franc, J.; Kůs, V.

    2017-12-01

    Homogeneity tests are used in high energy physics to verify whether simulated Monte Carlo samples have the same distribution as the data measured by a particle detector. The Kolmogorov-Smirnov, χ 2, and Anderson-Darling tests are the techniques most often used to assess the samples' homogeneity. Since MC generators produce plenty of entries from different models, each entry has to be re-weighted to obtain the same sample size as the measured data. One approach to homogeneity testing is binning. If we do not want to lose any information, we can instead apply generalized tests based on weighted empirical distribution functions. In this paper, we propose such generalized weighted homogeneity tests and introduce some of their asymptotic properties. We present results based on a numerical analysis which focuses on estimation of the type-I error and the power of the test. Finally, we present an application of our homogeneity tests to data from the DØ experiment at Fermilab.
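
    A minimal sketch, under our own simplifications, of a weighted two-sample Kolmogorov-Smirnov statistic of the kind generalized in the paper; the critical values and asymptotic theory are not reproduced here, and the samples are synthetic:

```python
# Weighted two-sample KS homogeneity statistic: each Monte Carlo entry
# carries a weight, the two weighted empirical distribution functions are
# compared, and the statistic is the maximum absolute difference.
import numpy as np

def weighted_ecdf(x, w, grid):
    order = np.argsort(x)
    x, w = x[order], w[order]
    cum = np.cumsum(w) / np.sum(w)
    # ECDF value at each grid point: cumulative weight of entries <= g
    idx = np.searchsorted(x, grid, side="right")
    return np.where(idx > 0, cum[np.maximum(idx - 1, 0)], 0.0)

def weighted_ks(x1, w1, x2, w2):
    grid = np.sort(np.concatenate([x1, x2]))
    return np.max(np.abs(weighted_ecdf(x1, w1, grid) - weighted_ecdf(x2, w2, grid)))

rng = np.random.default_rng(0)
mc = rng.normal(0.0, 1.0, 5000)          # simulated sample
weights = rng.uniform(0.5, 1.5, 5000)    # per-event MC weights
data = rng.normal(0.1, 1.0, 3000)        # "measured" sample, unit weights
print("weighted KS statistic:", weighted_ks(mc, weights, data, np.ones(3000)))
```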

  6. Analysis of space and energy homogenization techniques for the solving of the neutron transport equation in nuclear reactor

    International Nuclear Information System (INIS)

    Magat, Ph.

    1997-04-01

    Today neutron transport in PWR cores is routinely computed through the two-group transport-diffusion scheme. This method gives satisfactory results for reactors operating in normal conditions, but the two-group diffusion approximation is unable to take into account interface effects or anisotropy. The improvement of this scheme is logically possible through the use of a simplified PN (SPN) method for the modeling of the core. The comparison between SN calculations and SPN calculations shows an excellent agreement on eigenvalues as well as on power maps. We can notice that: -) there is no use extending the development beyond P3, as there is no effect; -) the P1 development is adequate; and -) the P0 development is totally inappropriate. Calculations performed on the N4 core of the Chooz power plant have enabled us to compare diffusion operators with transport operators (SP1, SP3, SP5 and SP7). These calculations show that the implementation of the SPN method is feasible, but the extra costs in computation time and memory are significant. We recommend SP5P1 calculations for heterogeneous 2-dimension geometry and SP3P1 calculations for homogeneous 3-dimension geometry. (A.C.)

  7. Thermal Analysis of the Driving Component Based on the Thermal Network Method in a Lunar Drilling System and Experimental Verification

    Directory of Open Access Journals (Sweden)

    Dewei Tang

    2017-03-01

    Full Text Available The main task of the third Chinese lunar exploration project is to obtain soil samples that are greater than two meters in length and to acquire bedding information from the surface of the moon. The driving component is the power output unit of the drilling system in the lander; it provides drilling power for core drilling tools. High temperatures can cause the sensors, permanent magnet, gears, and bearings to suffer irreversible damage. In this paper, a thermal analysis model for this driving component based on the thermal network method (TNM) was established, and the model was solved using the quasi-Newton method. A vacuum test platform was built and an experimental verification method (EVM) was applied to measure the surface temperature of the driving component. Then, the TNM was optimized, based on the principle of heat distribution. Through comparative analyses, the reasonableness of the TNM is validated. Finally, the static temperature field of the driving component was predicted and the “safe working times” of every mode are given.
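
    The abstract names the two numerical ingredients, a thermal network and a quasi-Newton solver; the sketch below pairs them for a three-node toy network. The node layout, conductances, and heat inputs are invented, not the paper's driving-component model:

```python
# Illustrative thermal-network-method (TNM) sketch solved with a
# quasi-Newton (Broyden) iteration, as the abstract describes in outline.
import numpy as np
from scipy.optimize import broyden1

G = np.array([[0.0, 2.0, 0.5],          # node-to-node conductances (W/K)
              [2.0, 0.0, 1.0],
              [0.5, 1.0, 0.0]])
Q = np.array([30.0, 10.0, 0.0])          # internal heat generation (W)
G_env = np.array([0.2, 0.2, 1.5])        # conductance to 250 K environment
T_ENV = 250.0

def residual(T):
    # Steady-state balance at each node: conduction in + generation = loss
    exchange = (G * (T[None, :] - T[:, None])).sum(axis=1)
    return Q + exchange - G_env * (T - T_ENV)

T = broyden1(residual, np.full(3, 300.0), f_tol=1e-8)
print("node temperatures (K):", np.round(T, 2))
```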

  8. Verification of computed tomographic estimates of cochlear implant array position: a micro-CT and histologic analysis.

    Science.gov (United States)

    Teymouri, Jessica; Hullar, Timothy E; Holden, Timothy A; Chole, Richard A

    2011-08-01

    To determine the efficacy of clinical computed tomographic (CT) imaging to verify postoperative electrode array placement in cochlear implant (CI) patients. Nine fresh cadaver heads underwent clinical CT scanning, followed by bilateral CI insertion and postoperative clinical CT scanning. Temporal bones were removed, trimmed, and scanned using micro-CT. Specimens were then dehydrated, embedded in either methyl methacrylate or LR White resin, and sectioned with a diamond wafering saw. Histology sections were examined by 3 blinded observers to determine the position of individual electrodes relative to soft tissue structures within the cochlea. Electrodes were judged to be within the scala tympani, scala vestibuli, or in an intermediate position between scalae. The position of the array could be estimated accurately from clinical CT scans in all specimens using micro-CT and histology as a criterion standard. Verification using micro-CT yielded 97% agreement, and histologic analysis revealed 95% agreement with clinical CT results. A composite, 3-dimensional image derived from a patient's preoperative and postoperative CT images using a clinical scanner accurately estimates the position of the electrode array as determined by micro-CT imaging and histologic analyses. Information obtained using the CT method provides valuable insight into numerous variables of interest to patient performance such as surgical technique, array design, and processor programming and troubleshooting.

  9. Development and verification of local/global analysis techniques for laminated composites

    Science.gov (United States)

    Griffin, O. Hayden, Jr.

    1989-01-01

    Analysis and design methods for laminated composite materials have been the subject of considerable research over the past 20 years and are currently well developed. In performing the detailed three-dimensional analyses which are often required in proximity to discontinuities, however, analysts often encounter difficulties due to large models. Even with the current availability of powerful computers, models which are too large to run, from either a resource or a time standpoint, are often required. There are several approaches which can permit such analyses, including substructuring, use of superelements or transition elements, and the global/local approach. This effort is based on the so-called zoom technique for global/local analysis, in which a global analysis is run and its results are applied to a smaller region as boundary conditions, in as many iterations as required to attain an analysis of the desired region. Before beginning the global/local analyses, it was necessary to evaluate the accuracy of the three-dimensional elements currently implemented in the Computational Structural Mechanics (CSM) Testbed. It was also desired to install, using the Experimental Element Capability, a number of displacement-formulation elements which have well-known behavior when used for the analysis of laminated composites.
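
    A hedged one-dimensional illustration of the zoom idea: solve a coarse global problem, then impose the coarse solution as boundary conditions on a refined local region. The Poisson problem here is a stand-in for the laminated-composite analyses in the text:

```python
# One "zoom" iteration on a 1D finite-difference Poisson problem.
import numpy as np

def solve_poisson(x, u_left, u_right, f=1.0):
    """Solve u'' = -f on grid x with Dirichlet end values (dense solve)."""
    n = x.size
    h = x[1] - x[0]
    A = np.zeros((n, n)); b = np.full(n, -f * h * h)
    A[0, 0] = A[-1, -1] = 1.0; b[0], b[-1] = u_left, u_right
    for i in range(1, n - 1):
        A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0
    return np.linalg.solve(A, b)

# Global (coarse) pass over [0, 1]
xg = np.linspace(0.0, 1.0, 11)
ug = solve_poisson(xg, 0.0, 0.0)

# Local (fine) pass over [0.3, 0.5]; boundary values come from the
# global solution -- this is the "zoom" step
xl = np.linspace(0.3, 0.5, 41)
ul = solve_poisson(xl, np.interp(0.3, xg, ug), np.interp(0.5, xg, ug))
print("local solution sample:", np.round(ul[::10], 4))
```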

  10. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism...

  11. Thermal–structural analysis of ITER triangular support for dominant load verification

    International Nuclear Information System (INIS)

    Kim, Yu-Gyeong; Hwang, Jong-Hyun; Jung, Yung-Jin; Kim, Hyun-Soo; Ahn, Hee-Jae

    2014-01-01

    Highlights: • The load combination method is introduced for thermal–structural analysis of contradictory loads that occur simultaneously. • A one-way coupling analysis is also conducted for the thermal–structural analysis, and its validity is checked by comparison with the load combination. • The dominant load for the triangular support bracket is determined to be the baking condition. - Abstract: The triangular support is located on the lower inner shell of the ITER vacuum vessel and must be designed to withstand various loads such as nuclear heat, coolant pressure and so on. The appropriateness of its design is evaluated under the dominant load, that is, the load that represents the most conservative condition among the design loads. In order to determine the dominant load, a valid method for thermal–structural analysis is first verified, considering contradictory behaviors between heat and structural loads. In this paper, two approaches, one-way coupling and load combination, are introduced for the thermal–structural analysis. One-way coupling is a commonly used method but has limits when applied to contradictory conditions. Load combination can give a proper solution since it evaluates each load independently and then adds the results linearly. Based on the results of each case, a structural analysis for another load case, the baking condition with incident, is conducted to find out which load is dominant for the triangular support. Consequently, it is found that the baking condition is the dominant load for the triangular support bracket. The proposed load combination method gives a physically reasonable solution which can be used as a reference for checking the validity of other thermal–structural analyses. It is expected that these results can be applied to the manufacturing design of the triangular support under various load conditions.

  12. Thermal–structural analysis of ITER triangular support for dominant load verification

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yu-Gyeong, E-mail: aspirany@hhi.co.kr [Hyundai Heavy Industries Co., Ltd., 1000, Bangeojinsunhwando-ro, Dong-gu, Ulsan (Korea, Republic of); Hwang, Jong-Hyun; Jung, Yung-Jin [Hyundai Heavy Industries Co., Ltd., 1000, Bangeojinsunhwando-ro, Dong-gu, Ulsan (Korea, Republic of); Kim, Hyun-Soo; Ahn, Hee-Jae [National Fusion Research Institute, 113 Gwahangno, Yuseong-gu, Daejeon-si (Korea, Republic of)

    2014-12-15

    Highlights: • The load combination method is introduced for thermal–structural analysis of contradictory loads that occur simultaneously. • A one-way coupling analysis is also conducted for the thermal–structural analysis, and its validity is checked by comparison with the load combination. • The dominant load for the triangular support bracket is determined to be the baking condition. - Abstract: The triangular support is located on the lower inner shell of the ITER vacuum vessel and must be designed to withstand various loads such as nuclear heat, coolant pressure and so on. The appropriateness of its design is evaluated under the dominant load, that is, the load that represents the most conservative condition among the design loads. In order to determine the dominant load, a valid method for thermal–structural analysis is first verified, considering contradictory behaviors between heat and structural loads. In this paper, two approaches, one-way coupling and load combination, are introduced for the thermal–structural analysis. One-way coupling is a commonly used method but has limits when applied to contradictory conditions. Load combination can give a proper solution since it evaluates each load independently and then adds the results linearly. Based on the results of each case, a structural analysis for another load case, the baking condition with incident, is conducted to find out which load is dominant for the triangular support. Consequently, it is found that the baking condition is the dominant load for the triangular support bracket. The proposed load combination method gives a physically reasonable solution which can be used as a reference for checking the validity of other thermal–structural analyses. It is expected that these results can be applied to the manufacturing design of the triangular support under various load conditions.

  13. The SPH homogeneization method

    International Nuclear Information System (INIS)

    Kavenoky, Alain

    1978-01-01

    The homogenization of a uniform lattice is a rather well understood topic, while difficult problems arise if the lattice becomes irregular. The SPH homogenization method is an attempt to generate homogenized cross sections for an irregular lattice. Section 1 summarizes the treatment of an isolated cylindrical cell with an entering surface current (in one-velocity theory); Section 2 is devoted to the extension of the SPH method to assembly problems. Finally, Section 3 presents the generalization to general multigroup problems. Numerical results are obtained for a PXR rod bundle assembly in Section 4

  14. Homogeneity of Inorganic Glasses

    DEFF Research Database (Denmark)

    Jensen, Martin; Zhang, L.; Keding, Ralf

    2011-01-01

    Homogeneity of glasses is a key factor determining their physical and chemical properties and overall quality. However, quantification of the homogeneity of a variety of glasses is still a challenge for glass scientists and technologists. Here, we show a simple approach by which the homogeneity of different glass products can be quantified and ranked. This approach is based on determination of both the optical intensity and the dimension of the striations in glasses. These two characteristic values are obtained using the image processing method established recently. The logarithmic ratio between

  15. Experiment and analysis of CASTOR type model cask for verification of radiation shielding

    Energy Technology Data Exchange (ETDEWEB)

    Hattori, Seiichi; Ueki, Kohtaro.

    1988-08-01

    The radiation shielding system of the CASTOR type cask is composed of graphite cast iron and polyethylene rods. The former forms the cylindrical body of the cask to shield gamma rays, and the latter is embedded in the body to shield neutrons. A characteristic of the radiation shielding of the CASTOR type cask is that a zigzag arrangement of the polyethylene rods is adopted to make the penetrating dose rate uniform. A three-dimensional analysis code is needed to analyse the shielding performance of a cask with such a complicated shielding system precisely; however, this takes too much time and costs too much. Therefore, a two-dimensional analysis is usually applied, in which the three-dimensional model is equivalently transformed into a two-dimensional calculation. This research study, comprising both experiment and analysis with a CASTOR type model cask, was conducted to verify the applicability of the two-dimensional analysis. The model cask was manufactured by the GNS company in West Germany, and the shielding ability test facilities at CRIEPI were used. It was judged from the study that the two-dimensional analysis is a useful means for practical use.

  16. Homogeneity testing and quantitative analysis of manganese (Mn) in vitrified Mn-doped glasses by laser-induced breakdown spectroscopy (LIBS)

    Directory of Open Access Journals (Sweden)

    V. K. Unnikrishnan

    2014-09-01

    Full Text Available Laser-induced breakdown spectroscopy (LIBS), an atomic emission spectroscopy method, has rapidly grown into one of the best elemental analysis techniques over the past two decades. Homogeneity testing and quantitative analysis of manganese (Mn) in manganese-doped glasses have been carried out using an optimized LIBS system employing a nanosecond ultraviolet Nd:YAG laser as the source of excitation. The glass samples were prepared using conventional vitrification methods. The laser pulse irradiance on the surface of the glass samples, placed in air at atmospheric pressure, was about 1.7×10⁹ W/cm². The spatially integrated plasma emission was collected and imaged onto the spectrograph slit using an optical-fiber-based collection system. Homogeneity was checked by recording LIBS spectra from different sites on the sample surface and analyzing the elemental emission intensities for concentration determination. Validation of the observed LIBS results was done by comparison with scanning electron microscopy-energy dispersive X-ray spectroscopy (SEM-EDX) surface elemental mapping. The analytical performance of the LIBS system has been evaluated through the correlation of the LIBS-determined concentrations of Mn with its certified values. The results are found to be in very good agreement with the certified concentrations.
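
    The homogeneity check described — comparing emission intensities recorded at different sites — reduces to a relative-standard-deviation figure of merit; the intensities and the 5% acceptance limit in this sketch are illustrative assumptions:

```python
# Hedged sketch of a site-to-site homogeneity check: record the Mn
# emission-line intensity at several surface sites and use the relative
# standard deviation (RSD) as the homogeneity figure of merit.
import numpy as np

site_intensity = np.array([1520.0, 1498.0, 1535.0, 1511.0, 1489.0, 1527.0])

mean = site_intensity.mean()
rsd = 100.0 * site_intensity.std(ddof=1) / mean
print(f"mean intensity = {mean:.0f} counts, RSD = {rsd:.2f}%")
print("homogeneous" if rsd < 5.0 else "inhomogeneous")
```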

  17. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR was also demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident, and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguards applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ
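
    Stepped-frequency radars of this kind typically convert the measured complex frequency response into a range (depth) profile with an inverse FFT; the sketch below illustrates the principle with synthetic echoes. The bandwidth, concrete permittivity, and reflector depths are assumptions, not MIVR parameters:

```python
# Conceptual stepped-frequency processing: the frequency response sampled
# at N stepped frequencies is inverse-FFT'd into a coarse depth profile.
import numpy as np

c = 3e8                      # m/s
eps_r = 6.0                  # assumed relative permittivity of concrete
v = c / np.sqrt(eps_r)       # propagation speed in concrete
f = np.linspace(0.5e9, 2.5e9, 128)        # stepped frequencies (Hz)
depths = [0.05, 0.12]                     # two reflectors (m), e.g. rebar

# Synthetic frequency response: sum of echoes with round-trip delay 2d/v
H = sum(np.exp(-2j * np.pi * f * (2 * d / v)) for d in depths)

profile = np.abs(np.fft.ifft(H))
dr = v / (2 * (f[-1] - f[0]))             # range-bin spacing
rng_axis = np.arange(len(f)) * dr
for k in np.argsort(profile)[-2:]:
    print(f"reflector near {rng_axis[k]*100:.1f} cm")
```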

  18. Verification of hybrid analysis concept of soil-foundation interaction by field vibration tests. Pt. 2

    International Nuclear Information System (INIS)

    Katayama, I.; Niwa, A.; Kubo, Y.; Penzien, J.

    1987-01-01

    The paper describes the outline of the hybrid analysis code for soil-structure interaction (HASSI) and the results of numerical simulation of the responses obtained at the model 2C in both cases of the forced vibration test and the natural earthquake excitation. (orig./HP)

  19. Analysis, Test and Verification in The Presence of Variability (Dagstuhl Seminar 13091)

    DEFF Research Database (Denmark)

    2014-01-01

    -aware tool chains. We brought together 46 key researchers from three continents, working on quality assurance challenges that arise from introducing variability, and some who do not work with variability, but that are experts in their respective areas in the broader domain of software analysis or testing...

  20. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Hulgaard, Henrik

    2001-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models. It makes...

  1. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Behrmann, Gerd

    1999-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models...

  2. Camera calibration in a hazardous environment performed in situ with automated analysis and verification

    International Nuclear Information System (INIS)

    DePiero, F.W.; Kress, R.L.

    1993-01-01

    Camera calibration using the method of Two Planes is discussed. An implementation of the technique is described that may be performed in situ, e.g., in a hazardous or contaminated environment, thus eliminating the need for decontamination of camera systems before recalibration. Companion analysis techniques used for verifying the correctness of the calibration are presented.

  3. Benchmarking monthly homogenization algorithms

    Science.gov (United States)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data
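
    Two of the performance metrics named above are easy to state concretely; the sketch below computes the centered root-mean-square error and the linear-trend error for a synthetic stand-in series, not the benchmark data itself:

```python
# Hedged sketch of two metrics from the study: centered RMSE of a
# homogenized series against the true homogeneous series, and the error
# in the linear trend estimate.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1950, 2010)
truth = 0.01 * (years - 1950) + rng.normal(0, 0.3, years.size)
homogenized = truth + rng.normal(0, 0.1, years.size)   # imperfect correction

def centered_rmse(x, y):
    return np.sqrt(np.mean(((x - x.mean()) - (y - y.mean())) ** 2))

def trend_error(x, y, t):
    bx = np.polyfit(t, x, 1)[0]
    by = np.polyfit(t, y, 1)[0]
    return (bx - by) * 100.0          # per century

print(f"CRMSE = {centered_rmse(homogenized, truth):.3f}")
print(f"trend error = {trend_error(homogenized, truth, years):.3f} per century")
```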

  4. Computational combustion and emission analysis of hydrogen-diesel blends with experimental verification

    International Nuclear Information System (INIS)

    Masood, M.; Ishrat, M.M.; Reddy, A.S.

    2007-01-01

    The paper discusses the effect of blending hydrogen with diesel in different proportions on combustion and emissions. A comparative study was carried out to analyze the effect of direct injection of hydrogen into the combustion chamber versus induction through the inlet manifold for dual fueling. The percentage of hydrogen substitution was varied from 20% to 80%, with the diesel percentage reduced correspondingly. CFD analysis of dual fuel combustion and emissions was carried out for both of these methods using the CFD software FLUENT; meshing of the combustion chamber was carried out using GAMBIT. Standard combustion and emission models were used in the analysis. In the second part of the paper, the effect of the angle of injection on performance, combustion and emissions was analyzed for both methods of hydrogen admission. The experimental results were compared with the simulated values, and a good agreement between them was noticed. (author)

  5. Verification of hybrid analysis concept of soil-foundation interaction by field vibration tests - Analytical phase

    International Nuclear Information System (INIS)

    Katayama, I.; Niwa, A.; Kubo, Y.; Penzien, J.

    1987-01-01

    In connection with the previous paper under the same subject, which describes the results obtained by the field vibration tests of five different models, this paper describes the outline of the hybrid analysis code of soil-structure interaction (HASSI) and the results of numerical simulation of the responses obtained at the model 2C in both cases of the forced vibration test and the natural earthquake excitation

  6. Safety based on organisational learning (SOL) - Conceptual approach and verification of a method for event analysis

    International Nuclear Information System (INIS)

    Miller, R.; Wilpert, B.; Fahlbruch, B.

    1999-01-01

    This paper discusses a method for analysing safety-relevant events in NPPs which is known as 'SOL', safety based on organisational learning. After a discussion of the specific organisational and psychological problems examined in the event analysis, the analytic process using the SOL approach is explained, as well as the required general setting. The SOL approach has been tested both in scientific experiments and, from the practical perspective, by operators of NPPs and experts from other branches of industry. (orig./CB) [de

  7. In silico analysis and verification of S100 gene expression in gastric cancer

    International Nuclear Information System (INIS)

    Liu, Ji; Li, Xue; Dong, Guang-Long; Zhang, Hong-Wei; Chen, Dong-Li; Du, Jian-Jun; Zheng, Jian-Yong; Li, Ji-Peng; Wang, Wei-Zhong

    2008-01-01

    The S100 protein family comprises 22 members whose protein sequences encompass at least one EF-hand Ca²⁺ binding motif. They are involved in the regulation of a number of cellular processes such as cell cycle progression and differentiation. However, the expression status of S100 family members in gastric cancer was not yet known. Combining serial analysis of gene expression, virtual Northern blot and microarray data, the expression levels of S100 family members in normal and malignant stomach tissues were systematically investigated. The expression of S100A3 was further evaluated by quantitative RT-PCR. At least 5 S100 genes were found by in silico analysis to be upregulated in gastric cancer. Among them, four genes, S100A2, S100A4, S100A7 and S100A10, had previously been reported to be overexpressed in gastric cancer. The expression of S100A3 in eighty gastric cancer patients was further examined. The results showed that the mean expression level of S100A3 in gastric cancer tissues was 2.5 times as high as in adjacent non-tumorous tissues. S100A3 expression was correlated with tumor differentiation and TNM (Tumor-Node-Metastasis) stage of gastric cancer, being relatively highly expressed in poorly differentiated and advanced gastric cancer tissues (P < 0.05). To our knowledge this is the first report of a systematic evaluation of S100 gene expression in gastric cancer by multiple in silico analyses. The results indicate that overexpression of S100 gene family members is characteristic of gastric cancer and that S100A3 might play an important role in the differentiation and progression of gastric cancer
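
    A quantitative RT-PCR fold change like the 2.5× reported here is conventionally computed with the 2^-ΔΔCt method; the Ct values and reference gene in this sketch are invented, chosen only to reproduce a ratio of about 2.5×:

```python
# Hedged sketch of the standard 2^-ddCt relative-expression calculation.
tumor_ct_target, tumor_ct_ref = 24.1, 18.0      # S100A3 vs reference gene
normal_ct_target, normal_ct_ref = 25.8, 18.4

d_ct_tumor = tumor_ct_target - tumor_ct_ref
d_ct_normal = normal_ct_target - normal_ct_ref
fold_change = 2 ** -(d_ct_tumor - d_ct_normal)
print(f"relative S100A3 expression (tumor/normal): {fold_change:.2f}x")
```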

  8. Fault Tree Analysis for Safety/Security Verification in Aviation Software

    Directory of Open Access Journals (Sweden)

    Andrew J. Kornecki

    2013-01-01

    Full Text Available The Next Generation Air Traffic Management system (NextGen) is a blueprint of the future National Airspace System. Supporting NextGen is a nation-wide Aviation Simulation Network (ASN), which allows integration of a variety of real-time simulations to facilitate development and validation of the NextGen software by simulating a wide range of operational scenarios. The ASN system is an environment including both simulated and human-in-the-loop real-life components (pilots and air traffic controllers). Real Time Distributed Simulation (RTDS), developed at Embry Riddle Aeronautical University, a suite of applications providing low- and medium-fidelity en-route simulation capabilities, is one of the simulations contributing to the ASN. To support the interconnectivity with the ASN, we designed and implemented a dedicated gateway acting as an intermediary, providing logic for two-way communication and message transfer between RTDS and the ASN, and storage for the exchanged data. It has been necessary to develop and analyze safety/security requirements for the gateway software based on analysis of system assets, hazards, threats and attacks related to the ultimate real-life implementation. Due to the nature of the system, the focus was placed on communication security and the related safety of the impacted aircraft in the simulation scenario. To support development of the safety/security requirements, the well-established fault tree analysis technique was used. This fault tree model-based analysis, supported by a commercial tool, was a foundation for proposing mitigations that assure the gateway system's safety and security.
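
    At its core, fault tree analysis propagates basic-event probabilities through AND/OR gates to a top event; the sketch below shows that computation for a made-up gateway-style tree, assuming independent events, and is not the paper's actual model:

```python
# Minimal fault tree evaluation: top-event probability from basic-event
# probabilities through AND/OR gates (independent events assumed).
def evaluate(node):
    kind = node[0]
    if kind == "basic":
        return node[1]
    probs = [evaluate(child) for child in node[1]]
    if kind == "and":                       # all children must fail
        p = 1.0
        for q in probs:
            p *= q
        return p
    if kind == "or":                        # any child failing suffices
        p = 1.0
        for q in probs:
            p *= (1.0 - q)
        return 1.0 - p
    raise ValueError(kind)

tree = ("or", [
    ("and", [("basic", 1e-3),               # message spoofed (invented)
             ("basic", 5e-2)]),             # integrity check missed
    ("basic", 2e-4),                        # gateway storage corrupted
])
print(f"top event probability: {evaluate(tree):.2e}")
```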

  9. Verification of radiation heat transfer analysis in KSTAR PFC and vacuum vessel during baking

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, S.Y. [Chungnam National University, 79 Daehak-ro, Yuseong-gu, Daejeon 34167 (Korea, Republic of); Kim, Y.J., E-mail: k43689@nfri.re.kr [National Fusion Research Institute, 169-148 Gwahang-ro, Yuseong-gu, Daejeon 34133 (Korea, Republic of); Kim, S.T.; Jung, N.Y.; Im, D.S.; Gong, J.D.; Lee, J.M.; Park, K.R.; Oh, Y.K. [National Fusion Research Institute, 169-148 Gwahang-ro, Yuseong-gu, Daejeon 34133 (Korea, Republic of)

    2016-11-01

    Highlights: • A thermal network is used to analyze heat transfer from the PFC to the VV. • Three heat transfer rate equations are derived based on the thermal network. • The equations are verified using experimental data and design documents. • Most of the heat lost in the tokamak is transferred to the experimental room air. • The heat loss to the air is 101 kW of the total heat loss of 154 kW in the tokamak. - Abstract: The KSTAR PFC (Plasma Facing Component) and VV (Vacuum Vessel) did not reach their target temperatures in the bake-out phase, which are 300 °C and 110 °C, respectively. The purpose of this study is to find out why they did not reach the target temperatures. A thermal network analysis is used to investigate the radiation heat transfer from the PFC to the VV, and the thermal network is drawn up based on the actual KSTAR tokamak. The analysis model consists of three equations and is solved using EES (Engineering Equation Solver). The heat transfer rates obtained with the analysis model are verified using the experimental data from the KSTAR bake-out phase. The analyzed radiation heat transfer rates from the PFC to the VV agree quite well with those of the experiment throughout the bake-out phase. Heat loss from the PFC to the experimental room air via the flanges of the VV is also calculated and compared, and is found to be the main reason for the gap between the target temperatures and the actually attained temperatures of the KSTAR PFC.
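
    The paper's three network equations are not reproduced in the abstract, but the dominant PFC-to-VV term is the standard two-gray-surface enclosure exchange; the areas, emissivities, and temperatures below are assumptions for illustration only:

```python
# Illustrative two-surface radiation exchange estimate in the spirit of
# the paper's thermal network (KSTAR's actual geometry is not reproduced).
SIGMA = 5.670e-8            # Stefan-Boltzmann constant, W/m^2 K^4
A_pfc, A_vv = 90.0, 120.0   # assumed radiating areas (m^2)
e_pfc, e_vv = 0.35, 0.30    # assumed emissivities
F = 1.0                     # view factor, PFC enclosed by VV

def q_rad(T_pfc, T_vv):
    # Net radiation between two gray surfaces forming an enclosure
    denom = (1 - e_pfc) / (e_pfc * A_pfc) + 1 / (A_pfc * F) \
            + (1 - e_vv) / (e_vv * A_vv)
    return SIGMA * (T_pfc**4 - T_vv**4) / denom

# Bake-out-like temperatures: PFC at 230 C, VV at 100 C (below targets)
print(f"net PFC->VV radiation: {q_rad(503.15, 373.15)/1e3:.1f} kW")
```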

  10. Verification of radiation heat transfer analysis in KSTAR PFC and vacuum vessel during baking

    International Nuclear Information System (INIS)

    Yoo, S.Y.; Kim, Y.J.; Kim, S.T.; Jung, N.Y.; Im, D.S.; Gong, J.D.; Lee, J.M.; Park, K.R.; Oh, Y.K.

    2016-01-01

    Highlights: • A thermal network is used to analyze heat transfer from the PFC to the VV. • Three heat transfer rate equations are derived based on the thermal network. • The equations are verified using experimental data and design documents. • Most of the heat lost in the tokamak is transferred to the experimental room air. • The heat loss to the air is 101 kW of the total heat loss of 154 kW in the tokamak. - Abstract: The KSTAR PFC (Plasma Facing Component) and VV (Vacuum Vessel) did not reach their target temperatures in the bake-out phase, which are 300 °C and 110 °C, respectively. The purpose of this study is to find out why they did not reach the target temperatures. A thermal network analysis is used to investigate the radiation heat transfer from the PFC to the VV, and the thermal network is drawn up based on the actual KSTAR tokamak. The analysis model consists of three equations and is solved using EES (Engineering Equation Solver). The heat transfer rates obtained with the analysis model are verified using the experimental data from the KSTAR bake-out phase. The analyzed radiation heat transfer rates from the PFC to the VV agree quite well with those of the experiment throughout the bake-out phase. Heat loss from the PFC to the experimental room air via the flanges of the VV is also calculated and compared, and is found to be the main reason for the gap between the target temperatures and the actually attained temperatures of the KSTAR PFC.

  11. Slideline verification for multilayer pressure vessel and piping analysis including tangential motion

    International Nuclear Information System (INIS)

    Van Gulick, L.A.

    1984-01-01

    Nonlinear finite element method (FEM) computer codes with slideline algorithm implementations should be useful for the analysis of prestressed multilayer pressure vessels and piping. This paper presents closed-form solutions, including the effects of tangential motion, useful for verifying slideline implementations for this purpose. The solutions describe stresses and displacements of a long internally pressurized elastic-plastic cylinder initially separated from an elastic outer cylinder by a uniform gap. Comparison of closed-form and FEM results evaluates the usefulness of the closed-form solution and the validity of the slideline implementation used
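
    The elastic part of such a closed-form solution is the classical Lamé thick-walled cylinder; a sketch with assumed geometry and pressure follows (the paper's elastic-plastic and two-cylinder contact extensions are not reproduced):

```python
# Lame stresses for an internally pressurized thick-walled elastic cylinder.
import numpy as np

def lame_stresses(r, a, b, p):
    """Radial and hoop stress at radius r for inner radius a, outer
    radius b, internal pressure p."""
    k = p * a**2 / (b**2 - a**2)
    sigma_r = k * (1.0 - b**2 / r**2)       # equals -p at r = a, 0 at r = b
    sigma_theta = k * (1.0 + b**2 / r**2)
    return sigma_r, sigma_theta

a, b, p = 0.10, 0.15, 50e6                  # m, m, Pa (assumed geometry/load)
for r in np.linspace(a, b, 5):
    sr, st = lame_stresses(r, a, b, p)
    print(f"r = {r:.3f} m: sigma_r = {sr/1e6:6.1f} MPa, sigma_theta = {st/1e6:6.1f} MPa")
```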

  12. Verification of Serpent code for the fuel analysis of a PBMR

    International Nuclear Information System (INIS)

    Bastida O, G. E.; Francois L, J. L.

    2015-09-01

    In this paper the models and simulations performed with the Monte Carlo code Serpent are presented, as well as the results obtained for the different cases analyzed, in order to verify the suitability and reliability of this code for the neutronic analysis of fuel for a Pebble Bed Modular Reactor (PBMR). Comparisons were made with the results reported in an OECD/NEA report on a pebble-bed high-temperature reactor fueled with reactor-grade plutonium. The results show that the use of Serpent is appropriate, as they are comparable with those of the report. (Author)

  13. Verification of the thermal module in the ELESIM code and the associated uncertainty analysis

    International Nuclear Information System (INIS)

    Arimescu, V.I.; Williams, A.F.; Klein, M.E.; Richmond, W.R.; Couture, M.

    1997-09-01

    Temperature is a critical parameter in fuel modelling because most of the physical processes that occur in fuel elements during irradiation are thermally activated. The focus of this paper is the temperature distribution calculation used in the computer code ELESIM, developed at AECL to model the steady-state behaviour of CANDU fuel. A validation procedure for fuel codes is described and applied to ELESIM's thermal calculation. The effects of uncertainties in model parameters, such as uranium dioxide thermal conductivity, and input variables, such as fuel element linear power, are accounted for through an uncertainty analysis using response surface and Monte Carlo techniques
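
    The Monte Carlo half of such an uncertainty analysis can be sketched in a few lines: sample the uncertain inputs, push them through a thermal model, and read off percentiles. The one-line centerline-temperature relation and the input distributions below are illustrative stand-ins, not ELESIM's thermal module:

```python
# Monte Carlo propagation of input uncertainties to a fuel temperature.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
k_fuel = rng.normal(3.0, 0.2, n)        # W/m K, assumed uncertainty
q_lin = rng.normal(45e3, 2e3, n)        # W/m, element linear power
T_surf = 600.0                          # K, assumed fixed surface temp

# Idealized centerline temperature for a cylindrical rod with uniform
# conductivity: T0 = Ts + q' / (4 pi k)
T_center = T_surf + q_lin / (4.0 * np.pi * k_fuel)

lo, med, hi = np.percentile(T_center, [2.5, 50, 97.5])
print(f"centerline T: median {med:.0f} K, 95% band [{lo:.0f}, {hi:.0f}] K")
```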

  14. Video Analysis Verification of Head Impact Events Measured by Wearable Sensors.

    Science.gov (United States)

    Cortes, Nelson; Lincoln, Andrew E; Myer, Gregory D; Hepburn, Lisa; Higgins, Michael; Putukian, Margot; Caswell, Shane V

    2017-08-01

    Wearable sensors are increasingly used to quantify the frequency and magnitude of head impact events in multiple sports. There is a paucity of evidence that verifies head impact events recorded by wearable sensors. To utilize video analysis to verify head impact events recorded by wearable sensors and describe the respective frequency and magnitude. Cohort study (diagnosis); Level of evidence, 2. Thirty male (mean age, 16.6 ± 1.2 years; mean height, 1.77 ± 0.06 m; mean weight, 73.4 ± 12.2 kg) and 35 female (mean age, 16.2 ± 1.3 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) players volunteered to participate in this study during the 2014 and 2015 lacrosse seasons. Participants were instrumented with GForceTracker (GFT; boys) and X-Patch sensors (girls). Simultaneous game video was recorded by a trained videographer using a single camera located at the highest midfield location. One-third of the field was framed and panned to follow the ball during games. Videographic and accelerometer data were time synchronized. Head impact counts were compared with video recordings and were deemed valid if (1) the linear acceleration was ≥20 g, (2) the player was identified on the field, (3) the player was in camera view, and (4) the head impact mechanism could be clearly identified. Descriptive statistics of peak linear acceleration (PLA) and peak rotational velocity (PRV) for all verified head impacts ≥20 g were calculated. For the boys, a total of 1063 impacts (2014: n = 545; 2015: n = 518) were logged by the GFT between game start and end times (mean PLA, 46 ± 31 g; mean PRV, 1093 ± 661 deg/s) during 368 player-games. Of these impacts, 690 were verified via video analysis (65%; mean PLA, 48 ± 34 g; mean PRV, 1242 ± 617 deg/s). The X-Patch sensors, worn by the girls, recorded a total of 180 impacts during the course of the games, and 58 (2014: n = 33; 2015: n = 25) were verified via video analysis (32%; mean PLA, 39 ± 21 g; mean PRV, 1664

  15. Analysis, Verification, and Application of Equations and Procedures for Design of Exhaust-pipe Shrouds

    Science.gov (United States)

    Ellerbrock, Herman H.; Wcislo, Chester R.; Dexter, Howard E.

    1947-01-01

    Investigations were made to develop a simplified method for designing exhaust-pipe shrouds to provide desired or maximum cooling of exhaust installations. Analysis of the heat exchange and pressure drop of an adequate exhaust-pipe shroud system requires equations for predicting design temperatures and the pressure drop on the cooling-air side of the system. The present experiments derive such equations for the usual straight annular exhaust-pipe shroud systems for both parallel flow and counterflow. The equations and methods presented are believed to be applicable, under certain conditions, to the design of shrouds for the tail pipes of jet engines.
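
    The report's own design equations are not reproduced here, but the parallel-flow/counterflow distinction it draws can be illustrated with the standard effectiveness-NTU relations for constant-property heat exchangers; the capacity-rate ratio is an assumption:

```python
# Standard effectiveness-NTU relations comparing the two flow arrangements.
import numpy as np

def eff_parallel(ntu, cr):
    return (1.0 - np.exp(-ntu * (1.0 + cr))) / (1.0 + cr)

def eff_counter(ntu, cr):
    if np.isclose(cr, 1.0):
        return ntu / (1.0 + ntu)
    return (1.0 - np.exp(-ntu * (1.0 - cr))) / (1.0 - cr * np.exp(-ntu * (1.0 - cr)))

cr = 0.6                                  # heat-capacity-rate ratio (assumed)
for ntu in (0.5, 1.0, 2.0, 4.0):
    print(f"NTU={ntu:3.1f}: parallel {eff_parallel(ntu, cr):.3f}, "
          f"counterflow {eff_counter(ntu, cr):.3f}")
```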

  16. Thermal design, analysis, and experimental verification for a DIII-D cryogenic pump

    International Nuclear Information System (INIS)

    Baxi, C.B.; Anderson, P.; Langhorn, A.; Schaubel, K.; Smith, J.

    1991-01-01

    As part of the advanced divertor program, it is planned to install a 50 m³/s capacity cryopump for particle removal in the DIII-D tokamak. The cryopump will be located in the outer bottom corner of the vacuum vessel. The pump will consist of a surface at liquid helium temperature (helium panel) with a surface area of about 1 m², a surface at liquid nitrogen temperature (nitrogen shield) to reduce radiation heat load on the helium panel, and a secondary shield around the nitrogen shield. The cryopump design poses a number of thermal hydraulic problems such as estimation of heat loads on helium and nitrogen panels, stability of the two-phase helium flow, performance of the pump components during high temperature bakeout, and cooldown performance of the helium panel from ambient temperatures. This paper presents the thermal analysis done to resolve these issues. A prototypic experiment performed at General Atomics verified the analysis and increased the confidence in the design. The experimental results are also summarized in this paper. (orig.)

  17. Thermal design, analysis, and experimental verification for a DIII-D cryogenic pump

    International Nuclear Information System (INIS)

    Baxi, C.B.; Anderson, P.; Langhorn, A.; Schaubel, K.; Smith, J.

    1991-08-01

    As part of the advanced divertor program, it is planned to install a 50 m³/s capacity cryopump for particle removal in the DIII-D tokamak. The cryopump will be located in the outer bottom corner of the vacuum vessel. The pump will consist of a surface at liquid helium temperature (helium panel) with a surface area of about 1 m², a surface at liquid nitrogen temperature (nitrogen shield) to reduce radiation heat load on the helium panel, and a secondary shield around the nitrogen shield. The cryopump design poses a number of thermal hydraulic problems such as estimation of heat loads on helium and nitrogen panels, stability of the two-phase helium flow, performance of the pump components during high temperature bakeout, and cooldown performance of the helium panel from ambient temperatures. This paper presents the thermal analysis done to resolve these issues. A prototypic experiment performed at General Atomics verified the analysis and increased the confidence in the design. The experimental results are also summarized in this paper. 7 refs., 5 figs., 1 tab

  18. The use of the hybrid K-edge densitometer for routine analysis of safeguards verification samples of reprocessing input liquor

    International Nuclear Information System (INIS)

    Ottmar, H.; Eberle, H.

    1991-01-01

    Following successful tests of a hybrid K-edge instrument at TUI Karlsruhe and the routine use of a K-edge densitometer for safeguards verification at the same laboratory, the Euratom Safeguards Directorate of the Commission of the European Communities decided to install the first such instrument into a large industrial reprocessing plant for the routine verification of samples taken from the input accountancy tanks. This paper reports on the installation, calibration, sample handling procedure and the performance of this instrument after one year of routine operation

  19. Computer model verification for seismic analysis of vertical pumps and motors

    International Nuclear Information System (INIS)

    McDonald, C.K.

    1993-01-01

    The general principles of modeling vertical pumps and motors are discussed, and two examples of verifying the models are presented in detail. The first example is a vertical pump and motor assembly. The model and computer analysis are presented, and the first four modes (frequencies) calculated are compared to the values of the same modes obtained from a shaker test. The model used for this example is a lumped-mass model connected by massless beams. The shaker test was performed by National Technical Services, Los Angeles, CA. The second example is a larger vertical motor. The model used for this example is a three-dimensional finite element shell model. The first frequency obtained from this model is compared to the first frequency obtained from shop tests for several different motors. The shop tests were performed by Reliance Electric, Stratford, Ontario and Siemens-Allis, Inc., Norwood, Ohio
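
    A minimal sketch of the lumped-mass/massless-beam idea: assemble mass and stiffness matrices for a three-mass shaft and extract natural frequencies with a generalized eigensolver. The mass and stiffness numbers are invented, not the paper's pump data:

```python
# Natural frequencies of a 3-mass chain model via K phi = w^2 M phi.
import numpy as np
from scipy.linalg import eigh

m = np.diag([120.0, 90.0, 60.0])              # kg, lumped masses
k1, k2, k3 = 8e7, 6e7, 4e7                    # N/m, interstation stiffness
k = np.array([[k1 + k2, -k2, 0.0],
              [-k2, k2 + k3, -k3],
              [0.0, -k3, k3]])

w2, phi = eigh(k, m)                          # generalized eigenproblem
freqs_hz = np.sqrt(w2) / (2.0 * np.pi)
print("first modes (Hz):", np.round(freqs_hz, 1))
```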

  20. Radionuclide analysis and scaling factors verification for LLRW of Taipower Reactor

    International Nuclear Information System (INIS)

    King, J.-Y.; Liu, K.-T.; Chen, S.-C.; Chang, T.-M.; Pung, T.-C.; Men, L.-C.; Wang, S.-J.

    2004-01-01

    The final disposal policy of the Atomic Energy Council of the Republic of China (CAEC) for low-level radwaste (LLRW) was to be carried out in 1996. The Institute of Nuclear Energy Research was contracted to develop the radionuclide analysis method and to establish the scaling factors for LLRW from Taipower reactors. The radionuclides analyzed include Co-60, Cs-137, Ce-144 and other γ-nuclides, as well as H-3, C-14, Fe-55, Ni-59, Ni-63, Sr-90, Nb-94, Tc-99, I-129, Pu-238, Pu-239/240, Pu-241, Am-241, Cm-242 and Cm-244 among the α, β and low-energy γ nuclides. 120 samples taken from 21 waste streams were analyzed and the database was collected within 2 years. The scaling factors for the different kinds of waste streams were computed with the weighted log-mean average method. In 1993, the scaling factors for each waste stream were verified using actual station samples. (author)
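
    A weighted log-mean (i.e., weighted geometric mean) scaling factor of the kind mentioned can be sketched as follows; the activity ratios, weights, and nuclide pair are invented for illustration:

```python
# Hedged sketch: scaling factor relating a hard-to-measure nuclide
# (e.g. Ni-63) to a key gamma nuclide (Co-60) as a weighted log-mean of
# measured activity ratios.
import numpy as np

ratio = np.array([0.8, 1.1, 0.6, 0.9, 1.4])     # Ni-63 / Co-60 per sample
weight = np.array([1.0, 2.0, 1.0, 1.5, 0.5])    # e.g. inverse variance

sf = np.exp(np.sum(weight * np.log(ratio)) / np.sum(weight))
print(f"scaling factor Ni-63/Co-60 = {sf:.2f}")

# Applying it: inferred Ni-63 activity from a measured Co-60 activity
co60 = 3.2e4    # Bq/g, hypothetical
print(f"estimated Ni-63 activity: {sf * co60:.2e} Bq/g")
```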

  1. Indian Point Nuclear Power Station: verification analysis of County Radiological Emergency-Response Plans

    International Nuclear Information System (INIS)

    Nagle, J.; Whitfield, R.

    1983-05-01

    This report was developed as a management tool for use by the Federal Emergency Management Agency (FEMA) Region II staff. The analysis summarized in this report was undertaken to verify the extent to which procedures, training programs, and resources set forth in the County Radiological Emergency Response Plans (CRERPs) for Orange, Putnam, and Westchester counties in New York had been realized prior to the March 9, 1983, exercise of the Indian Point Nuclear Power Station near Buchanan, New York. To this end, a telephone survey of county emergency response organizations was conducted between January 19 and February 22, 1983. This report presents the results of responses obtained from this survey of county emergency response organizations

  2. Control analysis and experimental verification of a practical dc–dc boost converter

    Directory of Open Access Journals (Sweden)

    Saswati Swapna Dash

    2015-12-01

    Full Text Available This paper presents a detailed open loop and closed loop analysis of a boost dc–dc converter for both voltage mode control and current mode control. Here the boost dc–dc converter is a practical converter, considering all possible parasitic elements such as ESR and on-state voltage drops. The open loop control, closed loop current mode control and closed loop voltage mode control are verified, and a comparative study of all control techniques is presented. The PI compensator for closed loop current mode control is designed using classical techniques such as the root locus technique and the Bode diagram. The simulation results are validated against the experimental results of voltage mode control for both open loop and closed loop control.
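
    The effect of the parasitics the paper includes can be illustrated with the standard non-ideal boost conversion ratio that accounts for a series (inductor/ESR) resistance; the component values are assumptions, not the paper's hardware:

```python
# Non-ideal boost converter gain with a series resistance r_l (standard
# textbook result); gain collapses at high duty cycle, unlike 1/(1-D).
def boost_gain(duty, r_l, r_load):
    """Vout/Vin of a boost converter with series resistance r_l."""
    dp = 1.0 - duty                         # D' = 1 - D
    return (1.0 / dp) * 1.0 / (1.0 + r_l / (dp**2 * r_load))

r_l, r_load = 0.3, 50.0                     # ohm (assumed)
for d in (0.3, 0.5, 0.7, 0.9):
    ideal = 1.0 / (1.0 - d)
    print(f"D={d}: ideal gain {ideal:.2f}, with ESR {boost_gain(d, r_l, r_load):.2f}")
```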

  3. Verification and sensitivity analysis on the elastic stiffness of the leaf type holddown spring assembly

    International Nuclear Information System (INIS)

    Song, Kee Nam

    1998-01-01

    The elastic stiffness formula of the leaf-type holddown spring (HDS) assembly is verified by comparing the values of elastic stiffness with the characteristic test results of HDS specimens. The comparisons show that the derived elastic stiffness formula is useful for reliably estimating the elastic stiffness of the leaf-type HDS assembly. The elastic stiffness sensitivity of the leaf-type HDS assembly is analyzed using the formula and its gradient vectors obtained from the mid-point formula. As a result of the sensitivity analysis, the elastic stiffness sensitivity with respect to each design variable is quantified and the design variables of large sensitivity are identified. Among the design variables, leaf thickness is identified as the design variable to which the elastic stiffness of the leaf-type HDS assembly is most sensitive. In addition, the elastic stiffness sensitivity with respect to each design variable shows a power-law-type correlation with the base thickness of the leaf. (author)
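
    The abstract's gradient-by-mid-point-formula step is plain central differencing; the sketch below applies it to a stand-in stiffness function (the paper's actual leaf-spring formula is not reproduced) and recovers thickness as the most sensitive variable:

```python
# Central-difference ("mid-point formula") sensitivity of a stiffness
# function. The cantilever-like formula is an illustrative stand-in.
import numpy as np

def stiffness(x):
    t, b, L = x                     # leaf thickness, width, length (assumed)
    E = 200e9                       # Pa, steel
    return E * b * t**3 / (4.0 * L**3)

def midpoint_gradient(f, x, rel_h=1e-6):
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        h = rel_h * max(abs(x[i]), 1.0)
        xp, xm = x.copy(), x.copy()
        xp[i] += h
        xm[i] -= h
        g[i] = (f(xp) - f(xm)) / (2.0 * h)
    return g

x0 = np.array([1.2e-3, 20e-3, 150e-3])      # m
grad = midpoint_gradient(stiffness, x0)
# Normalized (logarithmic) sensitivities come out near [3, 1, -3]:
# stiffness is most sensitive to leaf thickness, as the abstract reports.
print("normalized sensitivities:", np.round(grad * x0 / stiffness(x0), 2))
```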

  4. Logic analysis and verification of n-input genetic logic circuits

    DEFF Research Database (Denmark)

    Baig, Hasan; Madsen, Jan

    2017-01-01

    Nature uses genetic logic circuits to regulate the fundamental processes of life. These genetic logic circuits are triggered by a combination of external signals, such as chemicals, proteins, light and temperature, to emit signals that control other gene expressions or metabolic pathways accordingly. As compared to electronic circuits, genetic circuits exhibit stochastic behavior and do not always behave as intended. There is therefore a growing interest in being able to analyze and verify the logical behavior of a genetic circuit model prior to its physical implementation in a laboratory. In this paper, we present an approach to analyze and verify the Boolean logic of a genetic circuit from the data obtained through stochastic analog circuit simulations. The usefulness of this analysis is demonstrated through different case studies illustrating how our approach can be used to verify the expected behavior of genetic logic circuits.
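
    A minimal sketch of this kind of check (not the authors' tool): digitize the settled tail of each stochastic output trace with a threshold and compare the result against the expected truth table, here for a hypothetical AND gate with synthetic noisy data.

    ```python
    import numpy as np

    def boolean_from_trace(trace, threshold, settle=0.8):
        """Digitize a stochastic analog trace: average the last 20% of the
        simulation (assumed settled) and compare against a threshold."""
        tail = np.asarray(trace)[int(settle * len(trace)):]
        return tail.mean() > threshold

    def verify_gate(traces, expected, threshold):
        """traces/expected: dicts mapping an input tuple to an output trace
        and to the expected Boolean output, respectively."""
        return all(boolean_from_trace(traces[inp], threshold) == expected[inp]
                   for inp in expected)

    # Hypothetical noisy AND-gate simulation outputs (arbitrary units)
    rng = np.random.default_rng(0)
    mk = lambda level: level + 5.0 * rng.standard_normal(500)
    traces = {(0, 0): mk(10), (0, 1): mk(12), (1, 0): mk(11), (1, 1): mk(95)}
    expected = {(0, 0): False, (0, 1): False, (1, 0): False, (1, 1): True}
    print(verify_gate(traces, expected, threshold=50.0))  # True
    ```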

  5. Verification and validation of one-dimensional models used in subcooled flow boiling analysis

    International Nuclear Information System (INIS)

    Braz Filho, Francisco A.; Caldeira, Alexandre D.; Borges, Eduardo M.; Sabundjian, Gaiane

    2009-01-01

    Subcooled flow boiling occurs in many industrial applications and is characterized by large heat transfer coefficients. However, this efficient heat transfer mechanism is limited by the critical heat flux, beyond which the heat transfer coefficient decreases, leading to a fast heater temperature excursion that can potentially melt and destroy the heater. Subcooled flow boiling is especially important in water-cooled nuclear power reactors, where the presence of vapor bubbles in the core influences the behavior of the reactor system under operating and accident conditions. With the aim of verifying the subcooled flow boiling models of the most important nuclear reactor thermal-hydraulic computer codes, such as RELAP5, COBRA-EN and COTHA-2tp, the main purpose of this work is to compare experimental data with results from these codes in the pressure range between 15 and 45 bar. At a pressure of 45 bar the results are in good agreement, while at lower pressures (15 and 30 bar) the results begin to diverge. In addition, as a by-product of this analysis, a comparison among the models is also presented. (author)

  6. Verification of fire and explosion accident analysis codes (facility design and preliminary results)

    International Nuclear Information System (INIS)

    Gregory, W.S.; Nichols, B.D.; Talbott, D.V.; Smith, P.R.; Fenton, D.L.

    1985-01-01

    For several years, the US Nuclear Regulatory Commission has sponsored the development of methods for improving capabilities to analyze the effects of postulated accidents in nuclear facilities; the accidents of interest are those that could occur during nuclear materials handling. At the Los Alamos National Laboratory, this program has resulted in three computer codes: FIRAC, EXPAC, and TORAC. These codes are designed to predict the effects of fires, explosions, and tornadoes in nuclear facilities. Particular emphasis is placed on the movement of airborne radioactive material through the gaseous effluent treatment system of a nuclear installation. The design, construction, and calibration of an experimental ventilation system to verify the fire and explosion accident analysis codes are described. The facility features a large industrial heater and several aerosol smoke generators that are used to simulate fires. Both injected thermal energy and aerosol mass can be controlled using this equipment. Explosions are simulated with H2/O2 balloons and small explosive charges. Experimental measurements of temperature, energy, aerosol release rates, smoke concentration, and mass accumulation on HEPA filters can be made. Volumetric flow rate and differential pressures also are monitored. The initial experiments involve varying parameters such as thermal and aerosol rate and ventilation flow rate. FIRAC prediction results are presented. 10 figs

  7. Seismic analysis methods for LMFBR core and verification with mock-up vibration tests

    International Nuclear Information System (INIS)

    Sasaki, Y.; Kobayashi, T.; Fujimoto, S.

    1988-01-01

    This paper deals with the vibration behavior of a cluster of core elements with hexagonal cross section in a barrel under dynamic excitation due to seismic events. When a strong earthquake excitation is applied to the core support, the cluster of core elements displaces to a geometrical limit determined by the restraint rings in the barrel, and collisions can occur between adjacent elements as a result of their relative motion. For these reasons, seismic analysis of LMFBR core elements is a complicated non-linear vibration problem that includes collisions and fluid interactions. In an actual core design, it is hard to include hundreds of elements in the numerical calculations. In order to study the seismic behavior of the core elements, experiments with a single row of 29 elements (17 core fuel assemblies, 4 radial blanket assemblies, and 8 neutron shield assemblies), simulating all elements in the central row of the MONJU core, and experiments with 7 clustered rows of 37 core fuel assemblies at the core center were performed in a fluid-filled tank on a large shaking table. Moreover, numerical analyses of these experiments were performed to validate the simplified and detailed analytical methods. 4 refs, 18 figs

  8. An analysis of depressive symptoms in stroke survivors: verification of a moderating effect of demographic characteristics.

    Science.gov (United States)

    Park, Eun-Young; Kim, Jung-Hee

    2017-04-08

    The rehabilitation of depressed stroke patients is more difficult because poststroke depression is associated with disruption of daily activities, functioning, and quality of life. However, research on depression in stroke patients is limited. The aim of our study was to evaluate the interaction of demographic characteristics, including gender, age, education level, the presence of a spouse, and income status, on depressive symptoms in stroke patients and to identify groups that may need more attention with respect to depressive symptoms. We performed a secondary analysis of data from a cross-sectional study of people with stroke. Depression was measured using the Center for Epidemiologic Studies Depression Scale. In this study, depressive symptoms in women living with a spouse were less severe than among those without a spouse. For those with insufficient income, depressive symptom scores were higher in the above-high-school group than in the below-high-school group, but were lower in patients who were living with a spouse than in those living without a spouse. Assessment of depressive symptoms after stroke should consider the interaction of gender, economic status, education level, and the presence or absence of a spouse. These results will help in comprehensively understanding the importance of screening for and treating depressive symptoms during rehabilitation after stroke.

  9. Kinematic analysis and experimental verification of an eccentric wheel based precision alignment mechanism for LINAC

    International Nuclear Information System (INIS)

    Mundra, G.; Jain, V.; Singh, K.K.; Saxena, P.; Khare, R.K.; Bagre, M.

    2011-01-01

    An eccentric-wheel-based precision alignment system was designed for the remote motorized alignment of the proposed proton injector LINAC (SFDTL). As part of further development of the alignment and monitoring scheme, a menu-driven alignment system is being developed. The paper describes a general kinematic equation (with base-line tilt correction) based on the various parameters of the mechanism, such as the eccentricity, the wheel diameter, the distance between the wheels, and the diameter of the cylindrical accelerator component. Based on this equation, the extent of the alignment range for the 4 degrees of freedom is evaluated, and the effect of variations in some of the parameters, along with the theoretical accuracy/resolution, is computed. For this purpose a computer program was written which computes the position for each discrete setting of the two motor combinations. The paper also describes the experimentally evaluated values of these positions (over the full extent of the range) and the matching/comparison of the two data sets. These data can now be used to compute the movements of the four motors (two front and two rear motors of the support structure) required for alignment. (author)

  10. Analysis and experimental verification of a control scheme for unified power quality conditioner

    Energy Technology Data Exchange (ETDEWEB)

    Peng Cheng Zhu; Xun Li; Yong Kang; Jian Chen [Huazhong Univ. of Science and Techmnology, Wuhan (China). Dept. of Electrical Engineering

    2005-07-01

    Improving the power quality seen by a sensitive load by means of a Unified Power Quality Conditioner (UPQC) in a distributed generation system is presented in this paper. The power balance of a UPQC, consisting of back-to-back connected series and shunt Active Filters (AFs), is analysed. Based on the analysis, a novel control scheme is established in a two-phase Synchronous Rotating d-q Frame (SRF). In this control scheme, the series AF is controlled as a current source that keeps the input current sinusoidal, while the shunt AF is controlled as a voltage source that keeps the load voltage at its nominal value. With the proposed control strategy, the UPQC is capable of compensating not only the harmonic and reactive currents of the load but also grid voltage distortion. There is no harmonic interference between harmonic-producing loads and harmonic-sensitive loads connected on the common bus. The performance of a UPQC with the proposed control scheme under nonlinear load and grid voltage distortion is investigated both in simulation and experimentally. (Author)
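
    The synchronous rotating d-q frame at the heart of the control scheme is obtained with the Park transform. The sketch below shows the amplitude-invariant form, under which a balanced three-phase set becomes a constant d-q pair; it illustrates the frame only, not the authors' full UPQC controller.

    ```python
    import numpy as np

    def abc_to_dq(va, vb, vc, theta):
        """Amplitude-invariant Park transform: project three-phase
        quantities onto a rotating d-q frame at angle theta (rad)."""
        d = (2.0 / 3.0) * (va * np.cos(theta)
                           + vb * np.cos(theta - 2.0 * np.pi / 3.0)
                           + vc * np.cos(theta + 2.0 * np.pi / 3.0))
        q = -(2.0 / 3.0) * (va * np.sin(theta)
                            + vb * np.sin(theta - 2.0 * np.pi / 3.0)
                            + vc * np.sin(theta + 2.0 * np.pi / 3.0))
        return d, q

    # A balanced 50 Hz set maps to (d, q) = (amplitude, 0) in its own frame
    t = np.linspace(0.0, 0.04, 1000)
    theta = 2.0 * np.pi * 50.0 * t
    va = 311.0 * np.cos(theta)
    vb = 311.0 * np.cos(theta - 2.0 * np.pi / 3.0)
    vc = 311.0 * np.cos(theta + 2.0 * np.pi / 3.0)
    d, q = abc_to_dq(va, vb, vc, theta)
    print(f"{d.mean():.1f} {q.mean():.1f}")  # ~311.0, ~0.0
    ```

    In such a frame the fundamental component appears as a dc quantity, so ordinary PI regulators can track it without steady-state error, which is the usual reason for controlling active filters in an SRF.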

  11. Thermal Analysis of MIRIS Space Observation Camera for Verification of Passive Cooling

    Directory of Open Access Journals (Sweden)

    Duk-Hang Lee

    2012-09-01

    We conducted thermal analyses and cooling tests of the space observation camera (SOC) of the multi-purpose infrared imaging system (MIRIS) to verify its passive cooling. The thermal analyses were conducted with NX 7.0 TMG for two attitude cases of the MIRIS: the worst hot case and the normal case. Through the thermal analyses of the flight model, it was found that even in the worst case the telescope could be cooled to less than 206 K, similar to the result of the passive cooling test (~200.2 K). For the normal attitude case, on the other hand, the SOC telescope was cooled to about 160 K in 10 days. Based on the results of these analyses and the test, it was determined that the telescope of the MIRIS SOC can be successfully cooled below 200 K by passive cooling. The SOC is therefore expected to show optimal performance under cooled conditions in orbit.

  12. New approach to accuracy verification of 3D surface models: An analysis of point cloud coordinates.

    Science.gov (United States)

    Lee, Wan-Sun; Park, Jong-Kyoung; Kim, Ji-Hwan; Kim, Hae-Young; Kim, Woong-Chul; Yu, Chin-Ho

    2016-04-01

    The precision of two types of surface digitization devices, a contact probe scanner and an optical scanner, and the trueness of two types of stone replicas, one without imaging powder (SR/NP) and one with imaging powder (SR/P), were evaluated using computer-aided analysis. A master die was fabricated from stainless steel. Ten impressions were taken, and ten stone replicas were prepared from Type IV stone (Fujirock EP, GC, Leuven, Belgium). The precision of the two scanners was analyzed using the root mean square (RMS), measurement error (ME), and limits of agreement (LoA) at each coordinate. The trueness of the stone replicas was evaluated using the total deviation. A Student's t-test was applied to compare the discrepancies between the CAD reference model of the master die (m-CRM) and the point clouds of the two types of stone replicas (α = .05). The RMS values for precision were 1.58, 1.28, and 0.98 μm along the x-, y-, and z-axes for the contact probe scanner and 1.97, 1.32, and 1.33 μm along the x-, y-, and z-axes for the optical scanner. A comparison with the m-CRM revealed a trueness of 7.10 μm for SR/NP and 8.65 μm for SR/P. The precision at each coordinate (x-, y-, and z-axes) was higher than that assessed by the previous method (overall offset differences). A comparison between the m-CRM and the 3D surface models of the stone replicas revealed a greater dimensional change in SR/P than in SR/NP. Copyright © 2015 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
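
    The per-axis precision descriptors named here (RMS, ME, LoA) can be computed as below; the paper's exact definitions may differ in detail, and the deviation values are invented. The LoA follow the common Bland-Altman convention of mean ± 1.96 SD.

    ```python
    import numpy as np

    def precision_metrics(errors):
        """Root mean square (RMS), measurement error (ME, taken here as the
        mean deviation) and 95% limits of agreement (LoA) for one axis."""
        e = np.asarray(errors, dtype=float)
        rms = np.sqrt(np.mean(e**2))
        me = e.mean()
        sd = e.std(ddof=1)
        return rms, me, (me - 1.96 * sd, me + 1.96 * sd)

    # Hypothetical x-axis deviations (micrometres) between repeated scans
    dx = [1.4, -0.9, 2.1, -1.2, 0.8, -1.6, 1.9, -0.4, 1.1, -1.3]
    rms, me, loa = precision_metrics(dx)
    print(f"RMS={rms:.2f} um, ME={me:.2f} um, LoA=({loa[0]:.2f}, {loa[1]:.2f}) um")
    ```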

  13. Synthetic spider silk sustainability verification by techno-economic and life cycle analysis

    Science.gov (United States)

    Edlund, Alan

    Major ampullate spider silk represents a promising biomaterial with diverse commercial potential, ranging from textiles to medical devices, owing to the excellent physical and thermal properties arising from its protein structure. Recent advancements in synthetic biology have facilitated the development of recombinant spider silk proteins from Escherichia coli (E. coli), alfalfa, and goats. This study specifically investigates the economic feasibility and environmental impact of synthetic spider silk manufacturing. Pilot-scale data were used to validate an engineering process model that includes all of the required sub-processing steps for synthetic fiber manufacture: production, harvesting, purification, drying, and spinning. The model was constructed modularly to support assessment of alternative protein production methods (alfalfa and goats) as well as alternative downstream processing technologies. The techno-economic analysis indicates a minimum sale price from pioneer and optimized E. coli plants of $761 kg-1 and $23 kg-1, with greenhouse gas emissions of 572 kg CO2-eq. kg-1 and 55 kg CO2-eq. kg-1, respectively. Spider silk sale price estimates for pioneer and optimized goat plants are $730 kg-1 and $54 kg-1, respectively, while pioneer and optimized alfalfa plants yield $207 kg-1 and $9.22 kg-1, respectively. The elevated costs and emissions of the pioneer plant can be tied directly to its high material consumption and low protein yield. The decreased production costs of the optimized plants reflect improved protein yield, process optimization, and an Nth-plant assumption. The discussion focuses on the commercial potential of spider silk, the production performance required for commercialization, and the impact of alternative technologies on the sustainability of the system.

  14. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  15. Verification of the model of predisposition in triathlon – structural model of confirmative factor analysis

    Directory of Open Access Journals (Sweden)

    Lenka Kovářová

    2012-09-01

    BACKGROUND: The triathlon is a combination of three different types of sport – swimming, cycling, and running. Each of these requires different top-level predispositions, making a complex approach to talent selection a rather difficult process. Attempts to identify predispositions in the triathlon have so far been specific and focused only on some groups of predispositions (physiology, motor tests, and psychology). The latest studies missed the structural approach and were based on determinants of sport performance, theory of sports training and expert assessment. OBJECTIVE: The aim of our study was to verify the model of predispositions in the short triathlon for talent assessment of young male athletes aged 17–20 years. METHODS: The research sample consisted of 55 top-level male triathletes who were included in the government-supported sports talent programme in the Czech Republic at the age of 17–20 years. We used confirmatory factor analysis (FA) and a path diagram to verify the model, which allows us to explain the mutual relationships among the observed variables. For statistical data processing we used structural equation modeling (SEM) with the software Lisrel L88. RESULTS: The study confirms the best structural model for talent selection in the triathlon for 17–20-year-old men, which comprised seventeen indicators (tests) and explained 91% of all cross-correlations (Goodness of Fit Index /GFI/ 0.91, Root Mean Square Residual /RMSR/ 0.13). Tests for predispositions in the triathlon were grouped into five items: three motor predispositions (swimming, cycling and running skills), aerobic predispositions, and psychological predispositions. Aerobic predispositions showed the highest importance for the general factor (1.00; 0). Running predispositions were a very significant factor (–0.85; 0.28), which confirms the importance of this critical stage of the race. Lower factor weights were shown by the clusters of swimming (–0.61; 0.63) and cycling (0.53; 0

  16. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The reasons why verification of software products throughout the software life cycle is necessary are considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  17. Dynamic model based novel findings in power systems analysis and frequency measurement verification

    Science.gov (United States)

    Kook, Kyung Soo

    power system engineering and, for doing this, new models and better application of the simulation should be proposed. Conducting extensive simulation studies, this dissertation verified that the actual X/R ratio of bulk power systems is much lower than what has been assumed as its typical value, showed the effectiveness of ESS control in mitigating the intermittency of wind power from the perspective of the power grid, using a newly proposed simulation model of an ESS connected to wind power, and identified many characteristics of wide-area frequency wave propagation. The possibility of using simulated power system responses in place of measured data was also confirmed, which is promising for the future application of simulation to the on-line analysis of power systems based on FNET measurements.

  18. Altered pattern of spontaneous brain activity in the patients with end-stage renal disease: a resting-state functional MRI study with regional homogeneity analysis.

    Directory of Open Access Journals (Sweden)

    Xue Liang

    PURPOSE: To investigate the pattern of spontaneous neural activity in patients with end-stage renal disease (ESRD) with and without neurocognitive dysfunction using resting-state functional magnetic resonance imaging (rs-fMRI) with a regional homogeneity (ReHo) algorithm. MATERIALS AND METHODS: rs-fMRI data were acquired in 36 ESRD patients (minimal nephro-encephalopathy [MNE]: n = 19, 13 male, 37 ± 12.07 years; non-nephro-encephalopathy [non-NE]: n = 17, 11 male, 38 ± 12.13 years) and 20 healthy controls (13 male, 7 female, 36 ± 10.27 years). Neuropsychological (number connection test type A [NCT-A], digit symbol test [DST]) and laboratory tests were performed in all patients. The Kendall's coefficient of concordance (KCC) was used to measure regional homogeneity for each subject. The regional homogeneity maps were compared using ANOVA tests among the MNE, non-NE, and healthy control groups and post hoc t-tests between each pair in a voxel-wise way. A multiple regression analysis was performed to evaluate the relationships between the ReHo index and NCT-A and DST scores, serum creatinine and urea levels, and disease and dialysis duration. RESULTS: Compared with healthy controls, both MNE and non-NE patients showed decreased ReHo in multiple areas of the bilateral frontal, parietal and temporal lobes. Compared with non-NE patients, MNE patients showed decreased ReHo in the right inferior parietal lobe (IPL), medial frontal cortex (MFC) and left precuneus (PCu). The NCT-A scores and serum urea levels of ESRD patients correlated negatively with ReHo values in the frontal and parietal lobes, while DST scores correlated positively with ReHo values in the bilateral PCC/precuneus, MFC and inferior parietal lobe (IPL) (all P < 0.05, AlphaSim corrected). CONCLUSION: Diffusely decreased ReHo values were found in both MNE and non-NE patients. The progressively decreased ReHo in the default mode network (DMN), frontal and parietal lobes might be trait-related in MNE. The Re
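
    ReHo assigns to each voxel the Kendall's coefficient of concordance (KCC) of its time course with those of its neighbours. A minimal sketch of the KCC computation (without tie correction) on a toy cluster:

    ```python
    import numpy as np
    from scipy.stats import rankdata

    def kendalls_w(ts):
        """Kendall's W across K time series of length n (shape (K, n)),
        e.g. a voxel and its neighbours; ReHo stores W at the centre voxel."""
        K, n = ts.shape
        ranks = np.apply_along_axis(rankdata, 1, ts)  # rank each series in time
        R = ranks.sum(axis=0)                         # rank sums per time point
        S = ((R - R.mean())**2).sum()
        return 12.0 * S / (K**2 * (n**3 - n))

    # A perfectly concordant neighbourhood gives W = 1
    base = np.sin(np.linspace(0.0, 6.0 * np.pi, 100))
    cluster = np.vstack([base * a for a in (1.0, 0.7, 1.3)])
    print(kendalls_w(cluster))  # ~1.0
    ```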

  19. Homogenized approach for the non linear dynamic analysis of entire masonry buildings by means of rigid plate elements and damaging interfaces

    Science.gov (United States)

    Bertolesi, Elisa; Milani, Gabriele

    2017-07-01

    The present paper is devoted to the analysis of entire 3D masonry structures adopting a Homogenized Rigid Body and Spring-Mass (HRBSM) model. A series of non-linear static and dynamic analyses are conducted on two structures of technical relevance. The elementary cell is discretized by means of three-noded plane-stress elements and non-linear interfaces. At the structural level, the non-linear analyses are performed replacing the homogenized orthotropic continuum with a rigid element and non-linear spring assemblage (RBSM), by means of which both in-plane and out-of-plane mechanisms are allowed. In order to validate the proposed model for the analysis of full-scale structures subjected to seismic actions, two different examples are critically discussed, namely a church façade and an in-scale masonry building, both subjected to dynamic excitation. The results obtained are compared with experimental or numerical results available in the literature.

  20. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  1. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  2. Homogenization approach in engineering

    International Nuclear Information System (INIS)

    Babuska, I.

    1975-10-01

    Homogenization is an approach which studies the macrobehavior of a medium in terms of its microproperties. Problems with a microstructure play an essential role in such fields as mechanics, chemistry, physics, and reactor engineering. Attention is concentrated on a simple specific model problem to illustrate results and problems typical of the homogenization approach. Only the diffusion problem is treated here, but some statements are made about the elasticity of composite materials. 3 figures, 1 table
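
    The simplest diffusion illustration of macro-from-micro (an elementary textbook case, not the report's specific model problem) is a 1D medium of alternating layers: the homogenized coefficient is the volume-weighted harmonic mean of the layer values, which can differ sharply from the arithmetic mean.

    ```python
    import numpy as np

    def effective_diffusivity_series(D, fractions):
        """Homogenized coefficient for 1D diffusion through layers in
        series: the volume-weighted harmonic mean of the layer values."""
        D = np.asarray(D, dtype=float)
        f = np.asarray(fractions, dtype=float)
        return 1.0 / np.sum(f / D)

    D_layers = [1.0, 0.01]     # a fast and a slow layer
    fractions = [0.5, 0.5]
    print(effective_diffusivity_series(D_layers, fractions))  # ~0.0198
    print(np.dot(fractions, D_layers))                        # naive mean: 0.505
    ```

    The slow layer dominates the effective behavior, which is exactly the kind of macroscopic consequence of microstructure that homogenization makes precise.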

  3. Homogeneity analysis of high yield manufacturing process of mems-based pzt thick film vibrational energy harvesters

    DEFF Research Database (Denmark)

    Lei, Anders; Xu, Ruichao; Pedersen, C.M.

    2011-01-01

    This work presents a high-yield wafer-scale fabrication of MEMS-based unimorph silicon/PZT thick-film vibrational energy harvesters aimed at vibration sources with peak frequencies in the range of a few hundred Hz. By combining KOH etching with mechanical front-side protection of SOI wafers, to accurately define the thickness of the silicon part of the harvester, and a silicon-compatible PZT thick-film screen-printing technique, we are able to fabricate energy harvesters on wafer scale with a yield higher than 90%. The characterization of the fabricated harvesters is focused on the full-wafer/mass-production aspect, hence the analysis of uniformity in harvested power and resonant frequency.

  4. Review of the technical basis and verification of current analysis methods used to predict seismic response of spent fuel storage racks

    International Nuclear Information System (INIS)

    DeGrassi, G.

    1992-10-01

    This report presents the results of a literature review on spent fuel rack seismic analysis methods and modeling procedures. The analysis of the current generation of free standing high density spent fuel racks requires careful consideration of complex phenomena such as rigid body sliding and tilting motions; impacts between adjacent racks, between fuel assemblies and racks, and between racks and pool walls and floor; fluid coupling and frictional effects. The complexity of the potential seismic response of these systems raises questions regarding the levels of uncertainty and ranges of validity of the analytical results. BNL has undertaken a program to investigate and assess the strengths and weaknesses of current fuel rack seismic analysis methods. The first phase of this program involved a review of technical literature to identify the extent of experimental and analytical verification of the analysis methods and assumptions. Numerous papers describing analysis methods for free standing fuel racks were reviewed. However, the extent of experimental verification of these methods was found to be limited. Based on the information obtained from the literature review, the report provides an assessment of the significance of the issues of concern and makes recommendations for additional studies

  5. Dynamics of homogeneous nucleation

    DEFF Research Database (Denmark)

    Toxværd, Søren

    2015-01-01

    The classical nucleation theory for homogeneous nucleation is formulated as a theory for a density fluctuation in a supersaturated gas at a given temperature. But molecular dynamics simulations reveal that it is small cold clusters which initiate the nucleation. The temperature in the nucleating...

  6. Homogeneous bilateral block shifts

    Indian Academy of Sciences (India)

    Douglas class were classified in [3]; they are unilateral block shifts of arbitrary block size (i.e. dim H(n) can be anything). However, no examples of irreducible homogeneous bilateral block shifts of block size larger than 1 were known until now.

  7. Homogeneous Poisson structures

    International Nuclear Information System (INIS)

    Shafei Deh Abad, A.; Malek, F.

    1993-09-01

    We provide an algebraic definition for the Schouten product and give a decomposition for any homogeneous Poisson structure in an n-dimensional vector space. A large class of n-homogeneous Poisson structures in R^k is also characterized. (author). 4 refs

  8. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. It is a living document that will track progress on CASL verification and validation for both the CASL codes (including MPACT, CTF, BISON, and MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and challenge problems are at differing levels of maturity with respect to verification and validation; the gap analysis summarizes the additional work that needs to be done, and further VVUQ work will be done as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.

  9. Homogenization and Control of Lattice Structures

    National Research Council Canada - National Science Library

    Blankenship, G. L

    1985-01-01

    ... (e.g., trusses may be modeled by beam equations). Using a technique from the mathematics of asymptotic analysis called "homogenization," the author shows how such approximations may be derived in a systematic way that avoids errors made using...

  10. Nuclear-coupled thermal-hydraulic nonlinear stability analysis using a novel BWR reduced order model. Pt. 1. The effects of using drift flux versus homogeneous equilibrium models

    International Nuclear Information System (INIS)

    Dokhane, A.; Henning, D.; Chawla, R.; Rizwan-Uddin

    2003-01-01

    BWR stability analysis at PSI, as at other research centres, is usually carried out employing complex system codes. However, these do not allow a detailed investigation of the complete manifold of all possible solutions of the associated nonlinear differential equation set. A novel analytical, reduced order model for BWR stability has been developed at PSI in several successive steps. In the first step, the thermal-hydraulic model was used for studying thermal-hydraulic instabilities. A study was then conducted of the one-channel nuclear-coupled thermal-hydraulic dynamics in a BWR by adding a simple point kinetics model for the neutron kinetics and a model for the fuel heat conduction dynamics. In this paper, a two-channel nuclear-coupled thermal-hydraulic model is introduced to simulate the out-of-phase oscillations in a BWR. This model comprises three parts: spatial-mode neutron kinetics with the fundamental and first azimuthal modes; fuel heat conduction dynamics; and a thermal-hydraulics model. The present model extends that of Karve et al.: a drift flux model is used instead of the homogeneous equilibrium model for two-phase flow, and lambda modes are used instead of omega modes for the neutron kinetics. This two-channel model is employed in stability and bifurcation analyses, carried out using the bifurcation code BIFDD. The stability boundary (SB) and the nature of the Poincare-Andronov-Hopf bifurcation (PAH-B) are determined and visualized in a suitable two-dimensional parameter/state space. A comparative study of the homogeneous equilibrium model (HEM) and the drift flux model (DFM) is carried out to investigate the effects of the DFM parameters, the void distribution parameter C0 and the drift velocity Vgi, on the SB, the nature of the PAH bifurcation, and the type of oscillation mode (in-phase or out-of-phase). (author)

  11. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present a pla...

  12. Development of a FBR fuel bundle-duct interaction analysis code-BAMBOO. Analysis model and verification by Phenix high burn-up fuel subassemblies

    International Nuclear Information System (INIS)

    Uwaba, Tomoyuki; Ito, Masahiro; Ukai, Shigeharu

    2005-01-01

    The bundle-duct interaction analysis code BAMBOO has been developed for the purpose of predicting the deformation of a wire-wrapped fuel pin bundle of a fast breeder reactor (FBR). The BAMBOO code calculates the helical bowing and oval distortion of all the fuel pins in a fuel subassembly. We developed deformation models in order to analyze the irradiation-induced deformation precisely with the code: a model for fuel pin self-bowing induced by the circumferential gradient of void swelling as well as thermal expansion, and a model for dispersion of the orderly arrangement of the fuel pin bundle. We performed deformation analyses of high burn-up fuel subassemblies in the Phenix reactor and compared the calculated results with the post-irradiation examination data of these subassemblies to verify the models. From the comparison we confirmed that the calculated values of oval distortion and bowing agree reasonably with the PIE results when these models are used in the code. (author)

  13. 3D DVH-based metric analysis versus per-beam planar analysis in IMRT pretreatment verification

    International Nuclear Information System (INIS)

    Carrasco, Pablo; Jornet, Núria; Latorre, Artur; Eudaldo, Teresa; Ruiz, Agustí; Ribas, Montserrat

    2012-01-01

    Purpose: To evaluate methods of pretreatment IMRT analysis, using real measurements performed with a commercial 2D detector array, for clinical relevance and accuracy by comparing clinical DVH parameters. Methods: We divided the work into two parts. The first part consisted of six in-phantom tests aimed at studying the sensitivity of the different analysis methods. The beam fluences, 3D dose distribution, and DVH of an unaltered original plan were compared to those of the delivered plan, into which an error had been intentionally introduced. The second part consisted of comparing gamma analysis with DVH metrics for 17 patient plans from various sites. Beam fluences were measured with the MapCHECK 2 detector, per-beam planar analysis was performed with the MapCHECK software, and the 3D gamma analysis and DVH evaluation were performed using 3DVH software. Results: In the per-beam gamma analysis some of the tests yielded false positives or false negatives. However, the 3DVH software correctly described the DVH of the plan that included the error. The measured DVH from the plan with the controlled error agreed with the planned DVH within 2% dose or 2% volume. We also found that a gamma criterion of 3%/3 mm was too lax to detect some of the forced errors. Global analysis masked some problems, while local analysis magnified irrelevant errors at low doses. Small hotspots were missed by all metrics owing to the spatial resolution of the detector panel. DVH analysis of the patient plans revealed small differences between the treatment plan calculations and the 3DVH results, with the exception of very-small-volume structures such as the eyes and the lenses. Target coverage (D98 and D95) of the measured plan was systematically lower than that predicted by the treatment planning system, while other DVH characteristics varied depending on the parameter and organ. Conclusions: We found no correlation between the gamma index and the clinical impact of a discrepancy for any of the gamma index evaluation methods.
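
    For reference, a simplified 1D global gamma computation in the spirit of the 3%/3 mm criterion discussed above; per-beam and 3DVH tools implement far more elaborate (2D/3D, interpolated) versions, and the profiles here are synthetic.

    ```python
    import numpy as np

    def gamma_index_1d(dose_ref, dose_eval, x, dose_tol=0.03, dist_tol=3.0):
        """Global gamma on a shared 1D grid: dose_tol is a fraction of the
        global reference maximum; dist_tol and x share the same unit (mm)."""
        dd_norm = dose_tol * dose_ref.max()
        gamma = np.empty_like(dose_ref)
        for i, (xi, di) in enumerate(zip(x, dose_ref)):
            dd = (dose_eval - di) / dd_norm          # dose-difference term
            dx = (x - xi) / dist_tol                 # distance-to-agreement term
            gamma[i] = np.sqrt(dx**2 + dd**2).min()  # minimize over positions
        return gamma

    x = np.linspace(-50.0, 50.0, 201)                # mm
    ref = 100.0 * np.exp(-x**2 / 800.0)              # reference profile
    ev = 101.0 * np.exp(-(x - 1.0)**2 / 800.0)       # shifted, rescaled copy
    g = gamma_index_1d(ref, ev, x)
    print(f"pass rate: {100.0 * np.mean(g <= 1.0):.1f}%")
    ```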

  14. Personal Verification/Identification via Analysis of the Peripheral ECG Leads: Influence of the Personal Health Status on the Accuracy

    Directory of Open Access Journals (Sweden)

    Irena Jekova

    2015-01-01

    Traditional means of identity validation (PIN codes, passwords) and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on the correlation of present-to-previous limb ECG leads: lead I (rI), lead II (rII), the first principal ECG component calculated from them (rPCA), and linear and nonlinear combinations of rI, rII, and rPCA. For the verification task, a one-to-one scenario is applied and threshold values for rI, rII, rPCA and their combinations are derived. The identification task supposes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population-based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) was considered. In addition, the common reference PTB dataset (14 healthy individuals) with a short time interval between the two acquisitions was taken into account. The results on the ECG-ILSA database were satisfactory for healthy people, without a significant decrease for non-healthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%.
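
    A minimal sketch of the two correlation-based decision rules described (the threshold and the toy waveforms are illustrative, not the paper's):

    ```python
    import numpy as np

    def verify_identity(ecg_now, ecg_stored, threshold=0.85):
        """One-to-one verification: accept the claimed identity when the
        present-to-previous correlation exceeds a threshold."""
        r = np.corrcoef(ecg_now, ecg_stored)[0, 1]
        return r, r >= threshold

    def identify(ecg_now, database):
        """One-to-many identification: choose the enrolled subject whose
        stored ECG correlates best with the presented recording."""
        scores = {sid: np.corrcoef(ecg_now, rec)[0, 1]
                  for sid, rec in database.items()}
        return max(scores, key=scores.get), scores

    # Toy beat templates standing in for averaged lead-I recordings
    t = np.linspace(0.0, 1.0, 300)
    subj_a = np.exp(-((t - 0.40) / 0.02)**2)   # narrow, early R wave
    subj_b = np.exp(-((t - 0.45) / 0.05)**2)   # broader, later peak
    probe = subj_a + 0.05 * np.random.default_rng(1).standard_normal(t.size)
    print(identify(probe, {"A": subj_a, "B": subj_b})[0])  # "A"
    print(verify_identity(probe, subj_a)[1])               # True
    ```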

  15. Verification of Representative Sampling in RI waste

    International Nuclear Information System (INIS)

    Ahn, Hong Joo; Song, Byung Cheul; Sohn, Se Cheul; Song, Kyu Seok; Jee, Kwang Yong; Choi, Kwang Seop

    2009-01-01

    For evaluating the radionuclide inventories of RI wastes, representative sampling is one of the most important parts of the radiochemical assay process. Sampling to characterize RI waste conditions has typically been based on judgment or convenience sampling of individual drums or groups. However, it is difficult to obtain a representative sample from among the numerous drums. In addition, RI waste drums may be classified as heterogeneous wastes because they contain cotton, glass, vinyl, gloves, etc. In order to obtain representative samples, the sample to be analyzed must be collected from every selected drum. Considering the expense and time of analysis, however, the number of samples has to be minimized. In this study, RI waste drums were classified by various conditions: half-life, surface dose, acceptance date, waste form, generator, etc. A sample for radiochemical assay was obtained by mixing samples from each drum. The sample has to be prepared for radiochemical assay, and although it should be reasonably uniform, it is rare that a completely homogeneous material is received. Every sample is shredded to a size of about 1∼2 cm2 and a representative aliquot is taken for the required analysis. For verification of representative sampling, every classified group is tested to evaluate 'selection of a representative drum in a group' and 'representative sampling in a drum'

  16. SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check

    International Nuclear Information System (INIS)

    Tachibana, H; Tachibana, R

    2015-01-01

    Purpose: Lung SBRT planning has shifted to volume prescription techniques. However, point dose agreement is still verified using independent dose verification as the secondary check. Volume dose verification is more affected by the inhomogeneity correction than the point dose verification currently used as the check. A feasibility study of volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected in our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm is implemented and the radiological path length is computed from the CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with/without the correction and the SMU was assessed. Results: In the point dose evaluation for the center of the GTV, the comparison with the AC with inhomogeneity correction showed a systematic shift (4.5% ± 1.9%); on the other hand, there was good agreement, 0.2 ± 0.9%, between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2 ± 5.1%) but also for the GTV (8.0 ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5 ± 0.9%) as well as the PTV (0.9% ± 1.0%). Conclusion: Volume evaluation for the secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would show larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account

  17. Homogeneous group, research, institution

    Directory of Open Access Journals (Sweden)

    Francesca Natascia Vasta

    2014-09-01

    The work outlines the complex connection among empirical research, therapeutic programs, and the host institution, and considers the current state of research in Italy. The Italian research field is analyzed and critical gaps are outlined: a lack of results regarding both the therapeutic processes and the effectiveness of group-analytic treatment of eating disorders. The work investigates a homogeneous eating disorders group conducted in an eating disorders outpatient service. First we present the methodological steps on which the research is based, including the strong connection between theory and clinical tools. Secondly, the clinical tools are described and the results commented upon. Finally, our results suggest the necessity of validating some more specific hypotheses: verifying the relationship between clinical improvement (reduction of the sense of exclusion and of painful emotions) and specific group therapeutic processes, and verifying the relationship between depressive feelings, relapses, and transition through a more differentiated group field. Keywords: Homogeneous group; Eating disorders; Institutional field; Therapeutic outcome

  18. Homogen Mur - a development project

    DEFF Research Database (Denmark)

    Dahl, Torben; Beim, Anne; Sørensen, Peter

    1997-01-01

    Mølletorvet in Slagelse is the first building project in Denmark in which the outer walls are built of homogeneous load-bearing and insulating clay blocks. The project demonstrates a number of the possibilities that the use of homogeneous block masonry offers with respect to structure, energy performance, and architecture.

  19. Homogenization of resonant chiral metamaterials

    DEFF Research Database (Denmark)

    Andryieuski, Andrei; Menzel, C.; Rockstuhl, Carsten

    2010-01-01

    Homogenization of metamaterials is a crucial issue, as it allows their optical response to be described in terms of effective wave parameters such as propagation constants. In this paper we consider the possible homogenization of chiral metamaterials. We show that for meta-atoms of a certain size ... an analytical criterion for performing the homogenization and a tool to predict the homogenization limit. We show that strong coupling between meta-atoms of chiral metamaterials may prevent their homogenization altogether.

  20. Quantitative Analysis of Homogeneous Electrocatalytic Reactions at IDA Electrodes: The Example of [Ni(PPh2NBn2)2]2+

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Fei; Parkinson, B. A.; Divan, Ralu; Roberts, John; Liang, Yanping

    2016-12-01

    Interdigitated array (IDA) electrodes have been applied to study EC' reactions (an electron transfer reaction followed by a catalytic reaction), and a new method for the quantitative analysis of IDA results was developed. In this new method, the currents on the IDA generator and collector electrodes for an EC' mechanism are derived from the number of redox cycles and the contribution of the non-catalytic current, and the fractions of bipotential recycling species and catalytically active species are calculated, which helps in understanding the catalytic reaction mechanism. The homogeneous hydrogen evolution reaction catalyzed by the [Ni(PPh2NBn2)2]2+ electrocatalyst (where PPh2NBn2 is 1,5-dibenzyl-3,7-diphenyl-1,5-diaza-3,7-diphosphacyclooctane) was examined and analyzed with IDA electrodes. In addition, the existence of reaction intermediates in the catalytic cycle is inferred from the electrochemical behavior of glassy carbon disk electrodes and carbon IDA electrodes. This quantitative analysis of IDA electrode cyclic voltammetry currents can be used as a simple and straightforward method for determining the reaction mechanism in other catalytic systems as well.

  1. LSENS: A General Chemical Kinetics and Sensitivity Analysis Code for homogeneous gas-phase reactions. Part 1: Theory and numerical solution procedures

    Science.gov (United States)

    Radhakrishnan, Krishnan

    1994-01-01

    LSENS, the Lewis General Chemical Kinetics and Sensitivity Analysis Code, has been developed for solving complex, homogeneous, gas-phase chemical kinetics problems and provides sensitivity analysis for a variety of problems, including nonisothermal situations. This report is part 1 of a series of three reference publications that describe LSENS, provide a detailed guide to its usage, and present many example problems. Part 1 derives the governing equations and describes the numerical solution procedures for the types of problems that can be solved. The accuracy and efficiency of LSENS are examined by means of various test problems, and comparisons with other methods and codes are presented. LSENS is a flexible, convenient, accurate, and efficient solver for chemical reaction problems such as a static system; steady, one-dimensional, inviscid flow; reaction behind an incident shock wave, including boundary layer correction; and a perfectly stirred (highly backmixed) reactor. In addition, the chemical equilibrium state can be computed for the following assigned states: temperature and pressure, enthalpy and pressure, temperature and volume, and internal energy and volume. For static problems the code computes the sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of the chemical reactions.
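
    To make the notion of a sensitivity coefficient concrete: for a toy isothermal reaction A → B, the derivative of a dependent variable with respect to a rate parameter (the kind of quantity LSENS reports) can be approximated by brute-force finite differences around an ODE solve. LSENS itself integrates dedicated sensitivity equations rather than differencing, so this is only an illustration.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def conc_A(k, t_end=2.0, A0=1.0):
        """Concentration of A at t_end for the first-order decay A -> B."""
        sol = solve_ivp(lambda t, y: [-k * y[0]], (0.0, t_end), [A0],
                        rtol=1e-10, atol=1e-12)
        return sol.y[0, -1]

    k = 1.5
    h = 1e-6 * k
    sens = (conc_A(k + h) - conc_A(k - h)) / (2.0 * h)   # d[A]/dk at t = 2
    print(f"{sens:.4f} (analytic: {-2.0 * np.exp(-2.0 * k):.4f})")
    ```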

  2. Quantitative Analysis of Homogeneous Electrocatalytic Reactions at IDA Electrodes: The Example of [Ni(PPh2NBn2)2]2+

    International Nuclear Information System (INIS)

    Liu, Fei; Parkinson, B.A.; Divan, Ralu; Roberts, John; Liang, Yanping

    2016-01-01

    Interdigitated array (IDA) electrodes have been applied to study EC' reactions (an electron transfer reaction followed by a catalytic reaction), and a new method for the quantitative analysis of IDA results was developed. In this new method, the currents on the IDA generator and collector electrodes for an EC' mechanism are derived from the number of redox cycles and the contribution of the non-catalytic current, and the fractions of bipotential recycling species and catalytically active species are calculated, which helps in understanding the catalytic reaction mechanism. The homogeneous hydrogen evolution reaction catalyzed by the [Ni(PPh2NBn2)2]2+ electrocatalyst (where PPh2NBn2 is 1,5-dibenzyl-3,7-diphenyl-1,5-diaza-3,7-diphosphacyclooctane) was examined and analyzed with IDA electrodes. In addition, the existence of reaction intermediates in the catalytic cycle is inferred from the electrochemical behavior of glassy carbon disk electrodes and carbon IDA electrodes. This quantitative analysis of IDA electrode cyclic voltammetry currents can be used as a simple and straightforward method for determining the reaction mechanism in other catalytic systems as well.

  3. Homogeneous M2 duals

    International Nuclear Information System (INIS)

    Figueroa-O’Farrill, José; Ungureanu, Mara

    2016-01-01

    Motivated by the search for new gravity duals to M2 branes with N>4 supersymmetry — equivalently, M-theory backgrounds with Killing superalgebra osp(N|4) for N>4 — we classify (except for a small gap) homogeneous M-theory backgrounds with symmetry Lie algebra so(n)⊕so(3,2) for n=5,6,7. We find that there are no new backgrounds with n=6,7 but we do find a number of new (to us) backgrounds with n=5. All backgrounds are metrically products of the form AdS4 × P7, with P Riemannian and homogeneous under the action of SO(5), or S4 × Q7, with Q Lorentzian and homogeneous under the action of SO(3,2). At least one of the new backgrounds is supersymmetric (albeit with only N=2) and we show that it can be constructed from a supersymmetric Freund-Rubin background via a Wick rotation. Two of the new backgrounds have only been approximated numerically.

  4. Homogeneous M2 duals

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa-O’Farrill, José [School of Mathematics and Maxwell Institute for Mathematical Sciences,The University of Edinburgh,James Clerk Maxwell Building, The King’s Buildings, Peter Guthrie Tait Road,Edinburgh EH9 3FD, Scotland (United Kingdom); Ungureanu, Mara [Humboldt-Universität zu Berlin, Institut für Mathematik,Unter den Linden 6, 10099 Berlin (Germany)

    2016-01-25

    Motivated by the search for new gravity duals to M2 branes with N>4 supersymmetry — equivalently, M-theory backgrounds with Killing superalgebra osp(N|4) for N>4 — we classify (except for a small gap) homogeneous M-theory backgrounds with symmetry Lie algebra so(n)⊕so(3,2) for n=5,6,7. We find that there are no new backgrounds with n=6,7 but we do find a number of new (to us) backgrounds with n=5. All backgrounds are metrically products of the form AdS4 × P7, with P Riemannian and homogeneous under the action of SO(5), or S4 × Q7, with Q Lorentzian and homogeneous under the action of SO(3,2). At least one of the new backgrounds is supersymmetric (albeit with only N=2) and we show that it can be constructed from a supersymmetric Freund-Rubin background via a Wick rotation. Two of the new backgrounds have only been approximated numerically.

  5. Design for rock grouting based on analysis of grout penetration. Verification using Aespoe HRL data and parameter analysis

    International Nuclear Information System (INIS)

    Kobayashi, Shinji; Stille, Haakan

    2007-01-01

    Grouting as a method to reduce the inflow of water into underground facilities will be important in both the construction and operation of the deep repository. SKB has been studying grouting design based on characterization of fractured rock and prediction of grout spread. However, as in other Scandinavian tunnels, stop criteria have been set empirically, so that grouting is completed when the grout flow is less than a certain value at maximum pressure or the grout take is above a certain value. Since empirically based stop criteria are determined without a theoretical basis and are not related to grout penetration, the grouting result may be inadequate or uneconomical. In order to permit the choice of adequate and cost-effective grouting methods, stop criteria can be designed based on a theoretical analysis of grout penetration. The relationship between grout penetration and grouting time has been studied at the Royal Institute of Technology and Chalmers University of Technology. Based on these studies, the theory has been further developed in order to apply it to real grouting work. Another aspect is using the developed method for parameter analysis. The purpose of parameter analysis is to evaluate the influence of different grouting parameters on the result. Since the grouting strategy is composed of many different components, the selection of a grouting method is complex. Even if the theoretically most suitable grouting method is selected, it is difficult to carry out grouting exactly as planned because grouting parameters such as grout properties can easily vary during the grouting operation. In addition, knowing the parameters precisely beforehand is impossible because of the uncertainties inherent in the rock mass. Therefore, it is important to assess the effects of variations in grouting parameters. The parameter analysis can serve as a guide in choosing an effective grouting method. The objectives of this report are to: Further develop the theory concerning

  6. Analysis of the moments of the sensitivity function for resistivity over a homogeneous half-space: Rules of thumb for pseudoposition, offline sensitivity and resolution

    Science.gov (United States)

    Butler, S. L.

    2017-08-01

    It is instructive to consider the sensitivity function for a homogeneous half space for resistivity since it has a simple mathematical formula and it does not require a priori knowledge of the resistivity of the ground. Past analyses of this function have allowed visualization of the regions that contribute most to apparent resistivity measurements with given array configurations. The horizontally integrated form of this equation gives the sensitivity function for an infinitesimally thick horizontal slab with a small resistivity contrast and analysis of this function has admitted estimates of the depth of investigation for a given electrode array. Recently, it has been shown that the average of the vertical coordinate over this function yields a simple formula that can be used to estimate the depth of investigation. The sensitivity function for a vertical inline slab has also been previously calculated. In this contribution, I show that the sensitivity function for a homogeneous half-space can also be integrated so as to give sensitivity functions to semi-infinite vertical slabs that are perpendicular to the array axis. These horizontal sensitivity functions can, in turn, be integrated over the spatial coordinates to give the mean horizontal positions of the sensitivity functions. The mean horizontal positions give estimates for the centres of the regions that affect apparent resistivity measurements for arbitrary array configuration and can be used as horizontal positions when plotting pseudosections even for non-collinear arrays. The mean of the horizontal coordinate that is perpendicular to a collinear array also gives a simple formula for estimating the distance over which offline resistivity anomalies will have a significant effect. The root mean square (rms) widths of the sensitivity functions are also calculated in each of the coordinate directions as an estimate of the inverse of the resolution of a given array. For depth and in the direction perpendicular

  7. Progress in the analysis of non-axisymmetric wave propagation in a homogeneous solid circular cylinder of a piezoelectric transversely isotropic material

    CSIR Research Space (South Africa)

    Every, AG

    2010-01-01

    Full Text Available Non-axisymmetric waves in a free homogeneous piezoelectric cylinder of transversely isotropic material with axial polarization are investigated on the basis of the linear theory of elasticity and linear electromechanical coupling. The solution...

  8. HOMOGENEOUS NUCLEAR POWER REACTOR

    Science.gov (United States)

    King, L.D.P.

    1959-09-01

    A homogeneous nuclear power reactor utilizing forced circulation of the liquid fuel is described. The reactor does not require fuel handling outside of the reactor vessel during any normal operation, including complete shutdown to room temperature; the reactor is self-regulating under extreme operating conditions and is controlled by the thermal expansion of the liquid fuel. The liquid fuel utilized is a solution of uranium, phosphoric acid, and water, which requires no gas exhaust system or independent gas recombining system, thereby eliminating the handling of radiolytic gas.

  9. Homogeneous Finsler Spaces

    CERN Document Server

    Deng, Shaoqiang

    2012-01-01

    "Homogeneous Finsler Spaces" is the first book to emphasize the relationship between Lie groups and Finsler geometry, and the first to show the validity in using Lie theory for the study of Finsler geometry problems. This book contains a series of new results obtained by the author and collaborators during the last decade. The topic of Finsler geometry has developed rapidly in recent years. One of the main reasons for its surge in development is its use in many scientific fields, such as general relativity, mathematical biology, and phycology (study of algae). This monograph introduc

  10. Homogeneity spoil spectroscopy

    International Nuclear Information System (INIS)

    Hennig, J.; Boesch, C.; Martin, E.; Grutter, R.

    1987-01-01

    One of the problems of in vivo P-31 MR spectroscopy is spectrum localization. Surface coil spectroscopy, which is the method of choice for clinical applications, suffers from the high-intensity signal from subcutaneous muscle tissue, which masks the spectrum of interest from deeper structures. In order to suppress this signal while maintaining the simplicity of surface coil spectroscopy, the authors introduced a small sheet of ferromagnetically dotted plastic between the surface coil and the body. This sheet locally destroys the field homogeneity, and with it all signal from structures immediately around the coil. The very high reproducibility of this simple experimental procedure allows long-term studies, which are important for monitoring tumor therapy.

  11. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  12. Comparison of detection limits in environmental analysis--is it possible? An approach on quality assurance in the lower working range by verification.

    Science.gov (United States)

    Geiss, S; Einax, J W

    2001-07-01

    Detection limit, reporting limit and limit of quantitation are analytical parameters which describe the power of analytical methods. These parameters are used internally for quality assurance and externally for comparisons between competing laboratories, especially in the case of trace analysis in environmental compartments. The wide variety of possibilities for computing or obtaining these measures in the literature and in legislative rules makes any comparison difficult. Additionally, a host of terms have been used within the analytical community to describe detection and quantitation capabilities. Without trying to impose an order on this variety of terms, this paper aims to provide a practical proposal for answering the analysts' main questions concerning the quality measures above. These main questions and the related parameters are explained and graphically demonstrated. Estimation and verification of these parameters are the two steps needed to obtain realistic measures. A rule for practical verification is given in a table, from which the analyst can read what to measure, what to estimate and which criteria have to be fulfilled. In this manner, the verified parameters detection limit, reporting limit and limit of quantitation become comparable, and the analyst himself is responsible for the unambiguity and reliability of these measures.
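
    As a hedged illustration of the estimate-then-verify workflow described here (this is the generic 3.3s/10s calibration-based route, not necessarily the paper's verification rule, and all data are invented):

```python
import numpy as np

# Hypothetical calibration data near the lower working range
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])      # e.g. microgram/L
signal = np.array([0.8, 2.1, 3.4, 6.2, 11.9, 23.6])  # instrument response

# Least-squares line and residual standard deviation
slope, intercept = np.polyfit(conc, signal, 1)
resid = signal - (slope * conc + intercept)
s_res = np.sqrt(np.sum(resid**2) / (len(conc) - 2))

lod = 3.3 * s_res / slope    # estimated detection limit
loq = 10.0 * s_res / slope   # estimated limit of quantitation
print(f"LOD ~ {lod:.2f}, LOQ ~ {loq:.2f} (same units as conc)")

# Verification step in the spirit of the paper: remeasure a low-level
# spiked sample and check the precision against a preset criterion
spikes = np.array([2.9, 3.3, 3.1, 2.8, 3.2])  # replicate results (hypothetical)
rsd = spikes.std(ddof=1) / spikes.mean() * 100.0
print(f"RSD at the low spike level ~ {rsd:.1f}% (accept if below the criterion)")
```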

  13. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  14. Spatial homogenization method based on the inverse problem

    International Nuclear Information System (INIS)

    Tóta, Ádám; Makai, Mihály

    2015-01-01

    Highlights: • We derive a spatial homogenization method in slab and cylindrical geometries. • The fluxes and the currents on the boundary are preserved. • The reaction rates and the integral of the fluxes are preserved. • We present verification computations utilizing two- and four-energy groups. - Abstract: We present a method for deriving homogeneous multi-group cross sections to replace a heterogeneous region's multi-group cross sections, provided that the fluxes, the currents on the external boundary, the reaction rates and the integral of the fluxes are preserved. We consider one-dimensional geometries: a symmetric slab and a homogeneous cylinder. Assuming that the boundary fluxes are given, two response matrices (RMs) can be defined concerning the current and the flux integral. The first derives the boundary currents from the boundary fluxes, while the second derives the flux integrals from the boundary fluxes. Further RMs can be defined that connect reaction rates to the boundary fluxes. Assuming that these matrices are known, we present formulae that reconstruct the multi-group diffusion cross-section matrix, the diffusion coefficients and the reaction cross sections in the case of one-dimensional (1D) homogeneous regions. We apply these formulae to 1D heterogeneous regions and thus obtain a homogenization method. This method produces an equivalent homogeneous material such that the fluxes and the currents on the external boundary, the reaction rates and the integral of the fluxes are preserved for any boundary fluxes. We carry out the exact derivations in 1D slab and cylindrical geometries. Verification computations for the presented homogenization method were performed using two- and four-group material cross sections, both in slab and in cylindrical geometry.
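
    The paper's multi-group formulae are not reproduced in the abstract, so the following is only a one-group toy version of the same idea, with every material value invented: solve a heterogeneous two-material slab for a unit boundary flux, record the two responses named above (boundary current and flux integral), then invert the homogeneous slab's analytic responses for an equivalent pair (D, Sigma_a):

```python
import numpy as np
from scipy.optimize import fsolve

# "Heterogeneous" reference region: a symmetric two-material slab
a = 10.0                                  # half-width (cm), assumed
N = 401
x = np.linspace(-a, a, N)
h = x[1] - x[0]
D_het = np.where(np.abs(x) < 5.0, 1.4, 0.9)     # hypothetical materials
Sa_het = np.where(np.abs(x) < 5.0, 0.02, 0.10)

# One-group finite-difference diffusion with unit boundary flux phi(+-a) = 1
A = np.zeros((N, N))
b = np.zeros(N)
A[0, 0] = A[-1, -1] = 1.0
b[0] = b[-1] = 1.0
for i in range(1, N - 1):
    Dw = 2 * D_het[i] * D_het[i - 1] / (D_het[i] + D_het[i - 1])  # harmonic mean
    De = 2 * D_het[i] * D_het[i + 1] / (D_het[i] + D_het[i + 1])
    A[i, i - 1] = -Dw / h**2
    A[i, i + 1] = -De / h**2
    A[i, i] = (Dw + De) / h**2 + Sa_het[i]
phi = np.linalg.solve(A, b)

# Responses to the unit boundary flux: inward current and flux integral
J_in = D_het[-1] * (phi[-1] - phi[-2]) / h        # inward current magnitude
V = h * (phi.sum() - 0.5 * (phi[0] + phi[-1]))    # trapezoidal flux integral

# Homogeneous slab with phi_b = 1: phi = cosh(kx)/cosh(ka), k = sqrt(Sa/D),
# giving inward current D*k*tanh(ka) and flux integral 2*tanh(ka)/k
def residual(p):
    D, Sa = np.abs(p)
    k = np.sqrt(Sa / D)
    return [D * k * np.tanh(k * a) - J_in,
            2.0 * np.tanh(k * a) / k - V]

D_hom, Sa_hom = np.abs(fsolve(residual, [1.0, 0.05]))
print(f"equivalent homogeneous D = {D_hom:.4f} cm, Sigma_a = {Sa_hom:.4f} 1/cm")
```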

  15. Verification of in-core thermal and hydraulic analysis code FLOWNET/TRUMP for the high temperature engineering test reactor (HTTR) at JAERI

    International Nuclear Information System (INIS)

    Maruyama, Soh; Sudo, Yukio; Saito, Shinzo; Kiso, Yoshihiro; Hayakawa, Hitoshi

    1989-01-01

    The FLOWNET/TRUMP code couples a flow network analysis code, FLOWNET, for calculation of the coolant flow distribution and coolant temperature distribution in the core, with a thermal conduction analysis code, TRUMP, for calculation of the temperature distribution in solid structures. The verification of FLOWNET/TRUMP was carried out by comparing the analytical results with the results of steady-state experiments on the HENDEL multichannel test rig, T1-M, which consisted of twelve electrically heated simulated fuel rods and eleven hexagonal graphite fuel blocks. The T1-M rig simulated one fuel column of the core. The analytical results agreed well with the results of the experiment in which the HTTR operating conditions were simulated. (orig.)

  16. Development of advanced earthquake resistant performance verification on reinforced concrete underground structure. Pt. 2. Verification of the ground modeling methods applied to non-linear soil-structure interaction analysis

    International Nuclear Information System (INIS)

    Kawai, Tadashi; Kanatani, Mamoru; Ohtomo, Keizo; Matsui, Jun; Matsuo, Toyofumi

    2003-01-01

    In order to develop an advanced verification method for the earthquake resistant performance of reinforced concrete underground structures, the applicability of two different soil modeling methods in numerical analysis was verified through non-linear dynamic numerical simulations of large shaking table tests, conducted using a model comprising free-field soil and a reinforced concrete two-box culvert structure system. In these simulations, the structure was modeled by a beam-type element with a tri-linear moment-curvature relation. The soil was modeled by the Ramberg-Osgood model as well as by an elasto-plastic constitutive model. The former captures only the non-linearity of the shear modulus with respect to strain and initial stress conditions, whereas the latter can express the non-linearity of the shear modulus caused by changes of mean effective stress during ground excitation and by the dilatancy of the soil. The elasto-plastic constitutive model could therefore precisely simulate the vertical acceleration and displacement response at the ground surface, which were produced by soil dilation during shaking events with a horizontal base input in the model tests. In addition, this model can explain the distinctive dynamic earth pressure acting on the vertical walls of the structure, which was also confirmed to be related to soil dilation. However, since both modeling methods could express the shear force on the upper slab surface of the model structure, which plays the predominant role in structural deformation, both were equally applicable to the evaluation of the seismic performance of structures similar to the model structure of this study. (author)
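
    The Ramberg-Osgood model named above is a strain-only stiffness degradation law. As a hedged illustration (one common soil-mechanics form of the model; all parameter values below are hypothetical), the sketch computes how the secant shear modulus falls with increasing shear stress, which is exactly the behaviour the abstract says this model captures, and shows nothing of the mean-stress and dilatancy effects it cannot:

```python
import numpy as np

# One common Ramberg-Osgood form for soils (parameters are hypothetical):
#   gamma = tau/G0 * (1 + alpha * (tau/tau_ref)**(r - 1))
G0, tau_ref = 80.0e6, 60.0e3   # small-strain shear modulus (Pa), reference stress (Pa)
alpha, r = 1.0, 2.4            # fitting parameters

tau = np.linspace(1.0, 120.0e3, 6)
gamma = tau / G0 * (1.0 + alpha * (tau / tau_ref) ** (r - 1.0))
G_sec = tau / gamma            # secant shear modulus degrades with strain

for t, g, G in zip(tau, gamma, G_sec):
    print(f"tau = {t / 1e3:6.1f} kPa  gamma = {g:.2e}  G/G0 = {G / G0:.3f}")
```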

  17. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with development international standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tools and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or report of Quality Assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  18. Experimental analysis of volumetric wear behavioural and mechanical properties study of as cast and 1Hr homogenized Al-25Mg2Si2Cu4Ni alloy at constant load

    Science.gov (United States)

    Harlapur, M. D.; Mallapur, D. G.; Udupa, K. Rajendra

    2018-04-01

    In the current study, an experimental analysis of the volumetric wear behaviour and mechanical properties of an aluminium (Al-25Mg2Si2Cu4Ni) alloy, in the as-cast condition and after 1 h homogenization with T6 heat treatment, was carried out at constant load. A pin-on-disc apparatus was used for the sliding wear tests. Mechanical properties such as tensile strength, hardness and compressive strength of the as-cast and 1 h homogenized samples were measured. A universal testing machine was used for the tensile and compression tests at room temperature, and a Brinell hardness tester for the hardness tests. A scanning electron microscope was used to analyze the worn wear surfaces. The wear results and mechanical properties show that the 1 h homogenized, T6-treated Al-25Mg2Si2Cu4Ni samples had better volumetric wear resistance, hardness, tensile strength and compressive strength than the as-cast samples.

  19. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    ... qualitative and quantitative measurements of nuclear material; familiarity with and access to sensitive technologies related to detection, unattended verification systems, containment/surveillance and sensors; examination and verification of design information of large and complex facilities; theoretical and practical aspects of technologies relevant to verification objectives; analysis of inspection findings and evaluation of their mutual consistency; and negotiations on technical issues with facility operators and State authorities. This experience is reflected in the IAEA Safeguards Manual, which sets out the policies and procedures to be followed in the inspection process, as well as in the Safeguards Criteria, which provide guidance for verification, evaluation and analysis of the inspection findings. The IAEA infrastructure and its experience with verification permitted the organization in 1991 to respond immediately and successfully to the tasks required by Security Council Resolution 687 (1991) for Iraq, as well as to the tasks related to the verification of the completeness and correctness of the initial declarations in the cases of the DPRK and of South Africa. In the case of Iraq, the discovery of its undeclared programs was made possible through the existing verification system, enhanced by additional access rights, information and the application of modern detection technology. Such discoveries made it evident that an intensive development effort was needed to strengthen the safeguards system and to develop a capability to detect undeclared activities. For this purpose it was recognized that there was a need for additional and extended (a) access to information and (b) access to locations. It was also obvious that access to the Security Council, bringing the IAEA closer to the body responsible for the maintenance of international peace and security, would be a requirement for reporting periodically on non-proliferation and the results of the IAEA's verification activities. While the case...

  20. Homogeneous instantons in bigravity

    International Nuclear Information System (INIS)

    Zhang, Ying-li; Sasaki, Misao; Yeom, Dong-han

    2015-01-01

    We study homogeneous gravitational instantons, conventionally called the Hawking-Moss (HM) instantons, in bigravity theory. The HM instantons describe the amplitude of quantum tunneling from a false vacuum to the true vacuum. Corrections to General Relativity (GR) are found in a closed form. Using the result, we discuss the following two issues: reduction to the de Rham-Gabadadze-Tolley (dRGT) massive gravity and the possibility of preference for a large e-folding number in the context of the Hartle-Hawking (HH) no-boundary proposal. In particular, concerning the dRGT limit, it is found that the tunneling through the so-called self-accelerating branch is exponentially suppressed relative to the normal branch, and the probability becomes zero in the dRGT limit. As far as HM instantons are concerned, this could imply that the reduction from bigravity to the dRGT massive gravity is ill-defined.

  1. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  2. RAZORBACK - A Research Reactor Transient Analysis Code Version 1.0 - Volume 3: Verification and Validation Report.

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G.

    2017-04-01

    This report describes the work and results of the verification and validation (V&V) of the version 1.0 release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, the equation of motion for fuel element thermal expansion, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This V&V effort was intended to confirm that the code shows good agreement between simulation and actual ACRR operations.
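
    Razorback's numerical scheme is not given in the abstract; as a minimal stand-in for the first ingredient it names, here is a one-delayed-group point reactor kinetics solver, with illustrative (assumed) kinetics parameters and a 0.5 $ step reactivity insertion:

```python
import numpy as np
from scipy.integrate import solve_ivp

# One-delayed-group point kinetics (illustrative parameters, not ACRR data)
beta, Lambda, lam = 0.0073, 4.0e-5, 0.08  # beta, generation time (s), decay (1/s)
rho = 0.5 * beta                          # step insertion of 0.5 $

def rhs(t, y):
    n, C = y  # relative power and precursor concentration
    return [(rho - beta) / Lambda * n + lam * C,
            beta / Lambda * n - lam * C]

y0 = [1.0, beta / (Lambda * lam)]          # steady state at n = 1 before the step
sol = solve_ivp(rhs, (0.0, 5.0), y0, method="LSODA", rtol=1e-8, atol=1e-10)
print(f"relative power after 5 s: {sol.y[0, -1]:.2f}")
```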

  3. Debye series analysis of internal and near-surface fields for a homogeneous sphere illuminated by an axicon-generated vector Bessel beam

    International Nuclear Information System (INIS)

    Qin, Shitong; Li, Renxian; Yang, Ruiping; Ding, Chunying

    2017-01-01

    The interaction of an axicon-generated vector Bessel beam (AGVBB) with a homogeneous sphere is investigated in the framework of generalized Lorenz-Mie theory (GLMT). An analytical expression for the beam shape coefficients (BSCs) is derived using the angular spectrum decomposition method (ASDM), and the scattering coefficients are expanded using the Debye series (DSE) in order to isolate the contributions of single scattering processes. The internal and near-surface electric fields are numerically analyzed, and the effects of beam location, polarization, beam order, half-cone angle, and scattering process (namely, Debye mode p) are discussed. Numerical results show that a curve formed by extreme peaks can be observed, and that the electric fields can be locally enhanced after the interaction of AGVBBs with the particle. Internal and near-surface fields, especially their local enhancement, are very sensitive to the beam parameters, including polarization, order, half-cone angle, etc. The internal fields can also be enhanced by various scattering processes (i.e., Debye modes p). Such results have important applications in various fields, including particle sizing, optical tweezers, etc. - Highlights: • The Debye series is employed in the analysis of internal and near-surface fields for a sphere illuminated by a vector Bessel beam. • Analytical expressions of BSCs for vector Bessel beams with selected polarizations are derived using ASDM. • The local enhancement of internal and near-surface fields is investigated. • The polarization, order and half-cone angle of the beam affect the local enhancement. • The local enhancement of internal fields is sensitive to the scattering process.

  4. AlphaScreen-based homogeneous assay using a pair of 25-residue artificial proteins for high-throughput analysis of non-native IgG.

    Science.gov (United States)

    Senga, Yukako; Imamura, Hiroshi; Miyafusa, Takamitsu; Watanabe, Hideki; Honda, Shinya

    2017-09-29

    Therapeutic IgG becomes unstable under various stresses in the manufacturing process. The resulting non-native IgG molecules tend to associate with each other and form aggregates. Because such aggregates not only decrease the pharmacological effect but also become a potential risk factor for immunogenicity, rapid analysis of aggregation is required for quality control of therapeutic IgG. In this study, we developed a homogeneous assay using AlphaScreen and AF.2A1. AF.2A1 is a 25-residue artificial protein that binds specifically to non-native IgG generated under chemical and physical stresses, and the assay is performed in a short period of time. Our results show that AF.2A1-AlphaScreen may be used to evaluate various types of IgG, as AF.2A1 recognizes the non-native structure in the constant region (Fc region) of IgG. The assay was effective for the detection of non-native IgG, with particle sizes up to ca. 500 nm, generated under acid, heat, and stirring conditions. In addition, the technique is suitable for analyzing non-native IgG in CHO cell culture supernatant and in mixtures with large amounts of native IgG. These results indicate the potential of AF.2A1-AlphaScreen as a high-throughput evaluation method for process monitoring as well as quality testing in the manufacturing of therapeutic IgG.

  5. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  6. Open Source Analysis in Support to Nonproliferation Monitoring and Verification Activities: Using the New Media to Derive Unknown New Information

    International Nuclear Information System (INIS)

    Pabian, F.; Renda, G.; Jungwirth, R.; Kim, L.; Wolfart, E.; Cojazzi, G.G.M.; )

    2015-01-01

    This paper will describe evolving techniques that leverage freely available open source social media venues, sometimes referred to as the ''New Media,'' together with geospatial tools and commercial satellite imagery (with its ever improving spatial, spectral, and temporal resolutions), to expand the existing nuclear non-proliferation knowledge base by way of a review of some recent exemplar cases. The application of such techniques can enhance more general data mining, as those techniques can be more directly tailored to IAEA Safeguards monitoring and other non-proliferation verification activities to improve the possibility of the remote detection of undeclared nuclear related facilities and/or activities. As part of what might be called the new ''Societal Verification'' regime, these techniques have enlisted either the passive or active involvement of interested parties (NGOs, academics, and even hobbyists) using open sources and collaboration networks together with previously highlighted geospatial visualization tools and techniques. This paper will show how new significant, and unprecedented, information discoveries have already been made (and published in open source) in the last four years, i.e., since the last IAEA Safeguards Symposium. With respect to the possibility of soliciting active participation (e.g., ''crowd-sourcing'') via social media, one can envision scenarios (one example from open source will be provided) whereby a previously unknown nuclear related facility could be identified or located through the online posting of reports, line drawings, and/or ground photographs. Nonetheless, these techniques should not be viewed as a panacea, as examples of both deception and human error will also be provided. This paper will highlight the use of these remote-means of discovery techniques, and how they have shed entirely new light on important nuclear non-proliferation relevant issues in

  7. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  8. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focusing on the verification of scheduling analysis parameters. The proposal is part of a process based on Model-Driven Engineering to automate the verification and validation of software on board satellites, and it is implemented in the software control unit of the Energetic Particle Detector, a payload of the Solar Orbiter mission. A scheduling analysis model and its verification model are generated from the design model. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, the verification evidence is extracted from instrumentation points; the constraints are fed with this evidence, and if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
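
    As a simplified, hypothetical illustration of the final step described above (the actual work uses finite timed automata rather than ad hoc checks), the sketch below feeds instrumentation timestamps into deadline and period constraints taken from a scheduling analysis model; all task names and numbers are invented:

```python
# Hypothetical instrumentation trace: (task, release_time_us, completion_time_us)
trace = [
    ("acq_task",    0,  180), ("acq_task", 1000, 1210), ("acq_task", 2000, 2160),
    ("tm_task",   500,  930), ("tm_task",  2500, 2980),
]

# Constraints extracted from the scheduling analysis model (assumed values)
deadline_us = {"acq_task": 250, "tm_task": 500}
period_us = {"acq_task": 1000, "tm_task": 2000}

ok = True
last_release = {}
for task, rel, comp in trace:
    if comp - rel > deadline_us[task]:            # response-time constraint
        print(f"VIOLATION: {task} response {comp - rel} us > {deadline_us[task]} us")
        ok = False
    if task in last_release and rel - last_release[task] < period_us[task]:
        print(f"VIOLATION: {task} re-released after {rel - last_release[task]} us")
        ok = False
    last_release[task] = rel

print("scheduling evidence consistent" if ok else "scheduling analysis not valid")
```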

  9. The relationship between continuum homogeneity and statistical homogeneity in cosmology

    International Nuclear Information System (INIS)

    Stoeger, W.R.; Ellis, G.F.R.; Hellaby, C.

    1987-01-01

    Although the standard Friedmann-Lemaitre-Robertson-Walker (FLRW) Universe models are based on the concept that the Universe is spatially homogeneous, up to the present time no definition of this concept has been proposed that could in principle be tested by observation. Such a definition is here proposed, based on a simple spatial averaging procedure, which relates observable properties of the Universe to the continuum homogeneity idea that underlies the FLRW models. It turns out that the statistical homogeneity often used to describe the distribution of matter on a large scale does not imply spatial homogeneity according to this definition, and so cannot be simply related to a FLRW Universe model. Values are proposed for the homogeneity parameter and length scale of homogeneity of the Universe. (author)

  10. Verification of analysis methods for predicting the behaviour of seismically isolated nuclear structures. Final report of a co-ordinated research project 1996-1999

    International Nuclear Information System (INIS)

    2002-06-01

    This report is a summary of the work performed under a co-ordinated research project (CRP) entitled Verification of Analysis Methods for Predicting the Behaviour of Seismically isolated Nuclear Structures. The project was organized by the IAEA on the recommendation of the IAEA's Technical Working Group on Fast Reactors (TWGFR) and carried out from 1996 to 1999. One of the primary requirements for nuclear power plants and facilities is to ensure safety and the absence of damage under strong external dynamic loading from, for example, earthquakes. The designs of liquid metal cooled fast reactors (LMFRs) include systems which operate at low pressure and include components which are thin-walled and flexible. These systems and components could be considerably affected by earthquakes in seismic zones. Therefore, the IAEA through its advanced reactor technology development programme supports the activities of Member States to apply seismic isolation technology to LMFRs. The application of this technology to LMFRs and other nuclear plants and related facilities would offer the advantage that standard designs may be safely used in areas with a seismic risk. The technology may also provide a means of seismically upgrading nuclear facilities. Design analyses applied to such critical structures need to be firmly established, and the CRP provided a valuable tool in assessing their reliability. Ten organizations from India, Italy, Japan, the Republic of Korea, the Russian Federation, the United Kingdom, the United States of America and the European Commission co-operated in this CRP. This report documents the CRP activities, provides the main results and recommendations and includes the work carried out by the research groups at the participating institutes within the CRP on verification of their analysis methods for predicting the behaviour of seismically isolated nuclear structures

  11. Steady- and transient-state analysis of fully ceramic microencapsulated fuel with randomly dispersed tristructural isotropic particles via two-temperature homogenized model-II: Applications by coupling with COREDAX

    International Nuclear Information System (INIS)

    Lee, Yoon Hee; Cho, Bum Hee; Cho, Nam Zin

    2016-01-01

    In Part I of this paper, the two-temperature homogenized model for the fully ceramic microencapsulated fuel, in which tristructural isotropic particles are randomly dispersed in a fine lattice stochastic structure, was discussed. In this model, the fuel-kernel and silicon carbide matrix temperatures are distinguished. Moreover, the obtained temperature profiles are more realistic than those obtained using other models. Using the temperature-dependent thermal conductivities of uranium nitride and the silicon carbide matrix, temperature-dependent homogenized parameters were obtained. In Part II of the paper, coupled with the COREDAX code, a reactor core loaded by fully ceramic microencapsulated fuel in which tristructural isotropic particles are randomly dispersed in the fine lattice stochastic structure is analyzed via a two-temperature homogenized model at steady and transient states. The results are compared with those from harmonic- and volumetric-average thermal conductivity models; i.e., we compare keff eigenvalues, power distributions, and temperature profiles in the hottest single channel at a steady state. At transient states, we compare total power, average energy deposition, and maximum temperatures in the hottest single channel obtained by the different thermal analysis models. The different thermal analysis models and the availability of fuel-kernel temperatures in the two-temperature homogenized model for Doppler temperature feedback lead to significant differences

  12. Chemical state analysis of iron(III) compounds precipitated homogeneously from solutions containing urea by means of Moessbauer spectrometry and x-ray diffractometry

    International Nuclear Information System (INIS)

    Ujihira, Yusuke; Ohyabu, Matashige; Murakami, Tetsuro; Horie, Tsuyoshi.

    1978-01-01

    The chemical states of iron(III) compounds, precipitated homogeneously by heating iron(III) salt solutions at 363 K in the presence of urea, were studied by means of Moessbauer spectrometry and X-ray diffractometry. The pH-time relation of urea hydrolysis revealed that the precipitation process from homogeneous solution is identical to the hydrolysis of iron(III) ion at pH around 2 under the homogeneous supply of OH- ion generated by the hydrolysis of urea. Accordingly, iron(III) oxide hydroxide, or compounds similar to the hydrolysis products of iron(III) ion, were precipitated by the precipitation-from-homogeneous-solution method. Akaganeite (β-FeOOH) was crystallized from 0.1 M iron(III) chloride solution. Goethite (α-FeOOH) and hematite (α-Fe2O3) were precipitated from 0.1 M iron(III) nitrate solution, vigorous liberation of OH- ion favoring the crystallization of hematite. The addition of chloride ion to the solution resulted in the formation of akaganeite. Basic iron(III) sulfate [NH4Fe3(OH)6(SO4)2] and goethite were formed from 0.1 M iron(III) sulfate solution, the former being obtained under the more moderate conditions of urea hydrolysis (363 K). (author)

  13. Steady- and transient-state analysis of fully ceramic microencapsulated fuel with randomly dispersed tristructural isotropic particles via two-temperature homogenized model-I: Theory and method

    International Nuclear Information System (INIS)

    Lee, Yoon Hee; Cho, Bum Hee; Cho, Nam Zin

    2016-01-01

    As a type of accident-tolerant fuel, fully ceramic microencapsulated (FCM) fuel was proposed after the Fukushima accident in Japan. The FCM fuel consists of tristructural isotropic particles randomly dispersed in a silicon carbide (SiC) matrix. For a fuel element with such high heterogeneity, we have proposed a two-temperature homogenized model using the particle transport Monte Carlo method for the heat conduction problem. This model distinguishes between fuel-kernel and SiC matrix temperatures. Moreover, the obtained temperature profiles are more realistic than those of other models. In Part I of the paper, homogenized parameters for the FCM fuel in which tristructural isotropic particles are randomly dispersed in the fine lattice stochastic structure are obtained by (1) matching steady-state analytic solutions of the model with the results of particle transport Monte Carlo method for heat conduction problems, and (2) preserving total enthalpies in fuel kernels and SiC matrix. The homogenized parameters have two desirable properties: (1) they are insensitive to boundary conditions such as coolant bulk temperatures and thickness of cladding, and (2) they are independent of operating power density. By performing the Monte Carlo calculations with the temperature-dependent thermal properties of the constituent materials of the FCM fuel, temperature-dependent homogenized parameters are obtained
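
    The published homogenized parameters come from Monte Carlo matching, which is not reproduced here. Purely as an illustration of what distinguishing the two temperatures buys, the sketch below solves a trivial steady 1-D analogue in closed form: all heat is generated in the kernels and reaches the matrix through an assumed coupling coefficient h, so the kernel temperature sits a constant q'''/h above the matrix profile. Every value is hypothetical:

```python
import numpy as np

# Toy steady 1-D two-temperature picture of a homogenized FCM slab:
#   matrix:  k_m * T_m'' + q''' = 0,  T_m(+-L) = T_wall
#   kernels: heat q''' deposited in the kernels is transferred to the
#            matrix via a coupling coefficient h, so T_f = T_m + q'''/h
L, k_m, h = 0.005, 30.0, 2.0e6   # half-thickness (m), W/m-K, W/m^3-K (assumed)
q, T_wall = 3.0e8, 900.0         # power density (W/m^3), wall temperature (K)

x = np.linspace(-L, L, 11)
T_m = T_wall + q * (L**2 - x**2) / (2.0 * k_m)  # parabolic matrix temperature
T_f = T_m + q / h                               # fuel-kernel temperature

print(f"peak matrix temperature: {T_m.max():7.1f} K")
print(f"peak kernel temperature: {T_f.max():7.1f} K (offset {q / h:.0f} K)")
```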

  14. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan, Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification, but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  15. Homogenization of resonant chiral metamaterials

    OpenAIRE

    Andryieuski, Andrei; Menzel, Christoph; Rockstuhl, Carsten; Malureanu, Radu; Lederer, Falk; Lavrinenko, Andrei

    2010-01-01

    Homogenization of metamaterials is a crucial issue as it allows one to describe their optical response in terms of effective wave parameters, e.g. propagation constants. In this paper we consider the possible homogenization of chiral metamaterials. We show that for meta-atoms of a certain size a critical density exists above which increasing coupling between neighboring meta-atoms prevents a reasonable homogenization. On the contrary, a dilution in excess will induce features reminiscent of pho...

  16. Bilipschitz embedding of homogeneous fractals

    OpenAIRE

    Lü, Fan; Lou, Man-Li; Wen, Zhi-Ying; Xi, Li-Feng

    2014-01-01

    In this paper, we introduce a class of fractals named homogeneous sets, based on measure-theoretic versions of homogeneity, uniform perfectness and doubling. This fractal class includes all Ahlfors-David regular sets, but most of its members are irregular in the sense that they may have different Hausdorff and packing dimensions. Using Moran sets as the main tool, we study the dimensions, bilipschitz embedding and quasi-Lipschitz equivalence of homogeneous fractals.

  17. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  18. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  19. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components, and successful testing leads to the software being ... Formal verification is based on formal methods, which are mathematically based ... scenario under which a similar error could occur. There are various other ...

  20. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  1. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  2. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code, which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes.

  3. Validation and verification plan for safety and PRA codes

    International Nuclear Information System (INIS)

    Ades, M.J.; Crowe, R.D.; Toffer, H.

    1991-04-01

    This report discusses a verification and validation (V&V) plan for computer codes used for safety analysis and probabilistic risk assessment calculations. The present plan fulfills the commitments by Westinghouse Savannah River Company (WSRC) to the Department of Energy Savannah River Office (DOE-SRO) to bring the essential safety analysis and probabilistic risk assessment codes into compliance with verification and validation requirements.

  4. Classical entropy generation analysis in cooled homogeneous and functionally graded material slabs with variation of internal heat generation with temperature, and convective–radiative boundary conditions

    International Nuclear Information System (INIS)

    Torabi, Mohsen; Zhang, Kaili

    2014-01-01

    This article investigates the classical entropy generation in cooled slabs. Two types of materials are assumed for the slab: a homogeneous material and an FGM (functionally graded material). For the homogeneous material, the thermal conductivity is assumed to be a linear function of temperature, while for the FGM slab the thermal conductivity is modeled to vary in accordance with the rule of mixtures. The boundary conditions are assumed to be convective and radiative concurrently, and the internal heat generation of the slab is a linear function of temperature. Using the DTM (differential transformation method) and the resultant temperature fields from the DTM, the local and total entropy generation rates within the slabs are derived. The effects of physically applicable parameters such as the thermal conductivity parameter for the homogeneous slab, β, the thermal conductivity parameter for the FGM slab, γ, the gradient index, j, the internal heat generation parameter, Q, the Biot number at the right side, Nc2, the conduction–radiation parameter, Nr2, the dimensionless convection sink temperature, δ, and the dimensionless radiation sink temperature, η, on the local and total entropy generation rates are illustrated and explained. The results demonstrate that considering temperature- or coordinate-dependent thermal conductivity and radiation heat transfer at both sides of the slab has great effects on the entropy generation. - Highlights: • The paper investigates entropy generation in a slab due to heat generation and convective–radiative boundary conditions. • Both homogeneous material and FGM (functionally graded material) slabs were considered. • The calculations are carried out using the differential transformation method, which is a well-tested analytical technique.
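
    The DTM solution itself is lengthy, but the post-processing step the abstract describes, obtaining local and total entropy generation from a temperature field, is compact. A hedged sketch follows (the temperature profile and all parameter values below are stand-ins, not the paper's solutions), using the classical conduction entropy generation rate k(T)*(dT/dx)^2 / T^2:

```python
import numpy as np

# Post-processing sketch: given a temperature profile T(x) in a slab,
# compute the local and total entropy generation rates for a
# temperature-dependent conductivity k(T) = k0 * (1 + beta*T).
k0, beta = 2.0, 1.0e-3                      # W/m-K and 1/K (hypothetical)
L = 0.1                                     # slab thickness (m)
x = np.linspace(0.0, L, 501)
T = 400.0 + 150.0 * np.sin(np.pi * x / L)   # stand-in for a DTM solution (K)

k = k0 * (1.0 + beta * T)
dTdx = np.gradient(T, x)
S_local = k * dTdx**2 / T**2                # local entropy generation, W/m^3-K

h = x[1] - x[0]
S_total = h * (S_local.sum() - 0.5 * (S_local[0] + S_local[-1]))  # trapezoid
print(f"total entropy generation rate: {S_total:.4f} W/m^2-K (per unit area)")
```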

  5. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Generally, progress in multilateral arms control verification is painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so the preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  6. Systems analysis programs for Hands-on integrated reliability evaluations (SAPHIRE) Version 5.0: Verification and validation (V ampersand V) manual. Volume 9

    International Nuclear Information System (INIS)

    Jones, J.L.; Calley, M.B.; Capps, E.L.; Zeigler, S.L.; Galyean, W.J.; Novack, S.D.; Smith, C.L.; Wolfram, L.M.

    1995-03-01

    A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) Version 5.0. SAPHIRE is a set of four computer programs that the NRC developed for performing probabilistic risk assessments. They allow an analyst to perform many of the functions necessary to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs are the Integrated Reliability and Risk Analysis System (IRRAS), System Analysis and Risk Assessment (SARA), Models And Results Database (MAR-D), and the Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. The intent of this program is to perform a V&V of successive versions of SAPHIRE; a previous effort was the V&V of SAPHIRE Version 4.0. The SAPHIRE 5.0 V&V plan is based on the SAPHIRE 4.0 V&V plan, with revisions to incorporate lessons learned from the previous effort. Likewise, the SAPHIRE 5.0 vital and nonvital test procedures are based on the test procedures from SAPHIRE 4.0, with revisions to include the new SAPHIRE 5.0 features as well as to incorporate lessons learned from the previous effort. Most results from the testing were acceptable; however, some discrepancies between expected code operation and actual code operation were identified. Modifications made to SAPHIRE are identified.

  7. Homogeneous versus heterogeneous zeolite nucleation

    NARCIS (Netherlands)

    Dokter, W.H.; Garderen, van H.F.; Beelen, T.P.M.; Santen, van R.A.; Bras, W.

    1995-01-01

    Fractal aggregates were found in the intermediate gel phases that organize prior to nucleation and crystallization of silicalite from a homogeneous reaction mixture. Small- and wide-angle X-ray scattering studies prove that for zeolites nucleation may be homogeneous or...

  8. Homogenization theory in reactor lattices

    International Nuclear Information System (INIS)

    Benoist, P.

    1986-02-01

    The purpose of the theory of homogenization of reactor lattices is to determine, by means of transport theory, the constants of a homogeneous medium equivalent to a given lattice, which allows the reactor to be treated as a whole by diffusion theory. In this note, the problem is presented with emphasis on simplicity, as far as possible. [fr]
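
    Benoist's theory addresses transport-level refinements (notably directional leakage effects) that a short snippet cannot do justice to; the sketch below shows only the elementary flux-volume weighting step from which any such equivalence starts, with made-up region data:

```python
import numpy as np

# Flux-volume weighting: the homogenized cross section preserves the
# region-wise reaction rates for the given lattice flux. Values are
# hypothetical three-region (fuel, clad, moderator) data.
V = np.array([0.35, 0.15, 0.50])       # region volumes (cm^3)
phi = np.array([0.80, 0.95, 1.10])     # region-averaged fluxes (relative)
sigma = np.array([0.30, 0.01, 0.02])   # absorption cross sections (1/cm)

sigma_hom = np.sum(sigma * phi * V) / np.sum(phi * V)
print(f"homogenized absorption cross section: {sigma_hom:.4f} 1/cm")
```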

  9. Monte Carlo investigation of collapsed versus rotated IMRT plan verification.

    Science.gov (United States)

    Conneely, Elaine; Alexander, Andrew; Ruo, Russell; Chung, Eunah; Seuntjens, Jan; Foley, Mark J

    2014-05-08

    IMRT QA requires, among other tests, a time-consuming process of measuring the absorbed dose, at least at a point, in a high-dose, low-dose-gradient region. Some clinics measure this dose with all beams delivered at a single gantry angle (collapsed delivery), as opposed to the beams delivered at the planned gantry angles (rotated delivery). We examined, established, and optimized Monte Carlo simulations of the dosimetry for IMRT verification of treatment plans for these two delivery modes (collapsed versus rotated). The results of the simulations were compared to the treatment planning system dose calculations for the two delivery modes, as well as to measurements, in order to investigate the validity of the collapsed delivery technique for IMRT QA. The BEAMnrc, DOSXYZnrc, and egs_chamber codes were utilized for the Monte Carlo simulations, along with the MMCTP system. A number of plan complexity metrics were also used in the analysis of the dose distributions, in a bid to qualify why verification in a collapsed delivery may or may not be optimal for IMRT QA. Following the Alfonso et al. formalism, the k_{Qclin,Qref}^{fclin,fref} correction factor was calculated to correct for the deviation of small fields from the reference conditions used for beam calibration. We report the results obtained for a cohort of 20 patients. The plan complexity was investigated for each plan using the complexity metrics of homogeneity index, conformity index, modulation complexity score, and the fraction of beams from a particular plan that intersect the chamber when performing the QA. Rotated QA gives more consistent results than the collapsed QA technique, and the k_{Qclin,Qref}^{fclin,fref} factor deviates less from unity for rotated QA than for collapsed QA. If the homogeneity index is less than 0.05, then the k_{Qclin,Qref}^{fclin,fref} factor does not deviate from unity by more than 1%. A value this low for the homogeneity index can only be obtained...
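
    Several definitions of the homogeneity index exist; the sketch below uses one common dose-volume form, HI = (D2% - D98%)/D50%, on hypothetical target-voxel doses, together with the 0.05 screening threshold reported above (the study's own definition may differ):

```python
import numpy as np

def homogeneity_index(dose):
    """One common definition: HI = (D2% - D98%) / D50%, where Dx% is the
    dose received by the hottest x% of the target volume."""
    d2, d50, d98 = np.percentile(dose, [98.0, 50.0, 2.0])
    return (d2 - d98) / d50

rng = np.random.default_rng(1)
target_dose = rng.normal(60.0, 0.5, 10_000)  # hypothetical PTV voxel doses (Gy)
hi = homogeneity_index(target_dose)
verdict = "collapsed QA likely adequate" if hi < 0.05 else "prefer rotated QA"
print(f"HI = {hi:.3f}  ->  {verdict}")
```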

  10. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety-critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging: dynamic, static, and hybrid verification. This book discusses the current industrial embedded software verification flow, as well as emerging trends, with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  11. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks consistently, from knowledge acquisition through knowledge verification.

  12. 9 CFR 417.8 - Agency verification.

    Science.gov (United States)

    2010-01-01

    Animals and Animal Products; FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE ... HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS, § 417.8 Agency verification. FSIS will verify the ... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...

  13. Experimental preparation and verification of quantum money

    Science.gov (United States)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10^6 states in one verification round, limiting the forging probability to 10^-7 based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.
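
    The underlying idea is easiest to see in Wiesner's original scheme, on which such demonstrations build. Below is a toy classical simulation of that scheme (not the paper's actual states or security analysis; qubit count and seed are arbitrary): the bank records a secret basis and bit per qubit, an honest note verifies perfectly, and a forger who must measure without knowing the bases matches only about 75% of positions per copied note:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1000                                 # qubits per banknote (toy number)

# Bank's secret record: a random BB84 basis (0: Z, 1: X) and bit per qubit
bank_basis = rng.integers(0, 2, n)
bank_bit = rng.integers(0, 2, n)

def bank_verifies(prep_basis, prep_bit):
    """Measure each submitted qubit in the bank's recorded basis: a qubit
    prepared in the same basis returns its prepared bit, otherwise the
    outcome is uniformly random. Returns the fraction of matches."""
    same = prep_basis == bank_basis
    outcome = np.where(same, prep_bit, rng.integers(0, 2, n))
    return np.mean(outcome == bank_bit)

# An honest note is prepared exactly as recorded and passes perfectly
print("honest note match rate:", bank_verifies(bank_basis, bank_bit))

# A forger measures the genuine note in guessed bases and submits a copy;
# each position then matches with probability 3/4, so a strict threshold
# near 1 exposes the forgery
guess = rng.integers(0, 2, n)
copied = np.where(guess == bank_basis, bank_bit, rng.integers(0, 2, n))
print("forged copy match rate:", bank_verifies(guess, copied))   # ~0.75
```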

  14. Homogenate-assisted Vacuum-powered Bubble Extraction of Moso Bamboo Flavonoids for On-line Scavenging Free Radical Capacity Analysis.

    Science.gov (United States)

    Sun, Yinnan; Yang, Kui; Cao, Qin; Sun, Jinde; Xia, Yu; Wang, Yinhang; Li, Wei; Ma, Chunhui; Liu, Shouxin

    2017-07-11

    A homogenate-assisted vacuum-powered bubble extraction (HVBE) method using ethanol was applied for the extraction of flavonoids from Phyllostachys pubescens (P. pubescens) leaves. The mechanisms of homogenate-assisted extraction and vacuum-powered bubble generation are discussed in detail. Furthermore, a method for the rapid determination of flavonoids by HPLC was established. HVBE followed by HPLC was successfully applied for the extraction and quantification of four flavonoids in P. pubescens: orientin, isoorientin, vitexin, and isovitexin. This method provides a fast and effective means for the preparation and determination of active plant components. Moreover, the on-line antioxidant capacity of different fractions of the bamboo flavonoid extract, including the capacity to scavenge positively and negatively charged free radicals, was evaluated. Results showed that the DPPH˙ radical scavenging capacity of vitexin and isovitexin was larger than that of isoorientin and orientin. On the contrary, the ABTS⁺˙ radical scavenging capacity of isoorientin and orientin was larger than that of vitexin and isovitexin.

  15. LC/MS/MS analysis of α-tocopherol and coenzyme Q10 content in lyophilized royal jelly, beebread and drone homogenate.

    Science.gov (United States)

    Hryniewicka, Marta; Karpinska, Agnieszka; Kijewska, Marta; Turkowicz, Monika Joanna; Karpinska, Joanna

    2016-11-01

    This study shows the results of applying liquid chromatography-tandem mass spectrometry (LC/MS/MS) to assay the content of α-tocopherol and coenzyme Q10 in bee products of animal origin, i.e. royal jelly, beebread and drone homogenate. The biological matrix was removed by extraction with n-hexane. It was found that drone homogenate is a rich source of coenzyme Q10: it contains only 8 ± 1 µg/g of α-tocopherol but 20 ± 2 µg/g of coenzyme Q10. The contents of the assayed compounds in royal jelly were 16 ± 3 and 8 ± 0.2 µg/g of α-tocopherol and coenzyme Q10, respectively. Beebread appeared to be the richest in α-tocopherol: its level was 80 ± 30 µg/g, while the level of coenzyme Q10 was only 11.5 ± 0.3 µg/g. Copyright © 2016 John Wiley & Sons, Ltd.

  16. Verification of LOCA/ECCS analysis codes ALARM-B2 and THYDE-B1 by comparison with RELAP4/MOD6/U4/J3

    International Nuclear Information System (INIS)

    Shimizu, Takashi

    1982-08-01

    For a verification study of the ALARM-B2 and THYDE-B1 codes, which are components of the JAERI code system for evaluation of BWR ECCS performance, calculations for typical small and large break LOCAs in a BWR were performed and compared with those of the RELAP4/MOD6/U4/J3 code. This report describes the influence of differences between the analytical models incorporated in the individual codes, and the problems identified by this verification study. (author)

  17. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper reports on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparison of records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality checks. The analysis of the different measurement methods and their limits of accuracy is discussed in the paper.

  18. A verification environment for bigraphs

    DEFF Research Database (Denmark)

    Perrone, Gian David; Debois, Søren; Hildebrandt, Thomas

    2013-01-01

    We present the BigMC tool for bigraphical reactive systems that may be instantiated as a verification tool for any formalism or domain-specific modelling language encoded as a bigraphical reactive system. We introduce the syntax and use of BigMC, and exemplify its use with two small examples: a textbook “philosophers” example, and an example motivated by a ubiquitous computing application. We give a tractable heuristic with which to approximate interference between reaction rules, and prove this analysis to be safe. We provide a mechanism for state reachability checking of bigraphical reactive...

  19. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the potential for human error in complex, high-stress situations.

  20. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: converting nuclear-weapons production complexes, eliminating and monitoring nuclear-weapons delivery systems, disabling and destroying nuclear warheads, demilitarizing or non-military utilization of special nuclear materials, and inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  1. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
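
    As a concrete illustration of code verification by manufactured solutions (a generic sketch, not taken from the paper): pick an exact solution, derive the source term it implies, solve the discrete problem, and confirm that the observed order of accuracy matches the scheme's formal order.

```python
# Sketch of code verification by the method of manufactured solutions (MMS):
# manufacture u(x) = sin(pi x), derive the source f = -u'' = pi^2 sin(pi x),
# solve the discrete problem, and confirm the observed order of accuracy (~2).
import numpy as np

def solve_poisson(n):
    """Second-order finite-difference solve of -u'' = f, u(0) = u(1) = 0."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)                 # interior nodes
    f = np.pi**2 * np.sin(np.pi * x)               # manufactured source term
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    return x, np.linalg.solve(A, f)

ns, errors = [16, 32, 64, 128], []
for n in ns:
    x, u = solve_poisson(n)
    errors.append(np.max(np.abs(u - np.sin(np.pi * x))))   # exact solution known

for i in range(1, len(ns)):
    order = np.log(errors[i - 1] / errors[i]) / np.log(2.0)
    print(f"n={ns[i]:4d}  max error={errors[i]:.3e}  observed order={order:.2f}")
```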

  2. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  3. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  4. A second stage homogenization method

    International Nuclear Information System (INIS)

    Makai, M.

    1981-01-01

    A second homogenization is needed before the diffusion calculation of the core of large reactors. Such a second-stage homogenization is outlined here. Our starting point is the Floquet theorem, which states that the diffusion equation for a periodic core always has a particular solution of the form e^(jBx) u(x). It is pointed out that the perturbation series expansion of the function u can be derived by solving eigenvalue problems, and the eigenvalues serve to define homogenized cross sections. With the help of these eigenvalues a homogenized diffusion equation can be derived, the solution of which is cos(Bx), the macroflux. It is shown that the flux can be expressed as a series in the buckling. The leading term in this series is the well-known Wigner-Seitz formula. Finally three examples are given: periodic absorption, a cell with an absorber pin in the cell centre, and a cell of three regions. (orig.)

  5. Homogenization methods for heterogeneous assemblies

    International Nuclear Information System (INIS)

    Wagner, M.R.

    1980-01-01

    The third session of the IAEA Technical Committee Meeting is concerned with the problem of homogenization of heterogeneous assemblies. Six papers will be presented on the theory of homogenization and on practical procedures for deriving homogenized group cross sections and diffusion coefficients. The problem of finding so-called ''equivalent'' diffusion theory parameters for use in global reactor calculations is of great practical importance. In spite of this, it is fair to say that the present state of the theory of second homogenization is far from satisfactory. In fact, there is not even a uniquely accepted approach to the problem of deriving equivalent group diffusion parameters. Common agreement exists only about the fact that the conventional flux-weighting technique provides only a first approximation, which might lead to acceptable results in certain cases, but certainly does not guarantee the basic requirement of conservation of reaction rates.
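
    A minimal numerical sketch of the conventional flux-weighting technique referred to above (all numbers invented for illustration; not from any of the meeting papers):

```python
# Flux-volume weighting of a two-region cell and a reaction-rate check.
# All numbers are invented for illustration.
import numpy as np

volumes = np.array([0.6, 0.4])    # region volumes (e.g., fuel, moderator), cm^3
sigma_a = np.array([0.30, 0.01])  # region absorption cross sections, 1/cm
flux    = np.array([0.8, 1.2])    # region-averaged scalar fluxes (fine solution)

# Homogenized cross section: weight each region by flux * volume
sigma_hom = np.sum(sigma_a * flux * volumes) / np.sum(flux * volumes)

# Reaction rates: heterogeneous cell vs homogenized cell with the average flux
rr_het = np.sum(sigma_a * flux * volumes)
phi_avg = np.sum(flux * volumes) / np.sum(volumes)
rr_hom = sigma_hom * phi_avg * np.sum(volumes)

print(f"homogenized sigma_a = {sigma_hom:.4f} 1/cm")
print(f"reaction rates: heterogeneous {rr_het:.4f}, homogenized {rr_hom:.4f}")
# The two rates agree by construction; in practice the fine flux is only an
# approximation, which is why flux weighting is just a first approximation.
```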

  6. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. The comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. An introduction of two level criteria for evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  7. Spinor structures on homogeneous spaces

    International Nuclear Information System (INIS)

    Lyakhovskii, V.D.; Mudrov, A.I.

    1993-01-01

    For multidimensional models of the interaction of elementary particles, the problem of constructing and classifying spinor fields on homogeneous spaces is exceptionally important. An algebraic criterion for the existence of spinor structures on homogeneous spaces used in multidimensional models is developed. A method of explicit construction of spinor structures is proposed, and its effectiveness is demonstrated in examples. The results are of particular importance for harmonic decomposition of spinor fields

  8. A personal view on homogenization

    International Nuclear Information System (INIS)

    Tartar, L.

    1987-02-01

    The evolution of some ideas is first described. Under the name homogenization are collected all the mathematical results that help in understanding the relations between the microstructure of a material and its macroscopic properties. Homogenization results are surveyed through a critically detailed bibliography. The mathematical models considered are systems of partial differential equations supposed to describe some properties at a scale ε; the aim is to understand what happens to the solutions as ε tends to 0.
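
    The flavor of such results can be seen in the simplest one-dimensional case: for -(a(x/ε)u')' = f, the homogenized coefficient is the harmonic mean of a over one period, not the arithmetic mean. A short sketch with illustrative coefficients:

```python
# 1-D homogenization: the effective coefficient of a two-phase layered
# material is the harmonic mean, not the arithmetic mean. Illustrative values.
import numpy as np

a1, a2, theta = 1.0, 10.0, 0.5      # phase coefficients and volume fraction
a_arith = theta * a1 + (1.0 - theta) * a2
a_harm = 1.0 / (theta / a1 + (1.0 - theta) / a2)
print(f"arithmetic mean {a_arith:.3f}, harmonic mean {a_harm:.3f}")

# Check: solve -(a(x/eps) u')' = 1 on (0,1), u(0) = u(1) = 0, by quadrature,
# and compare with the homogenized solution u(x) = x(1-x)/(2 a_eff).
eps = 1.0 / 64
x = np.linspace(0.0, 1.0, 20001)
dx = x[1] - x[0]
a = np.where((x / eps) % 1.0 < theta, a1, a2)   # periodic two-phase coefficient
# Flux q = -a u' satisfies q' = 1, so u' = -(x - C)/a with C fixed by u(1) = 0
C = np.sum(x / a) / np.sum(1.0 / a)
u = np.concatenate(([0.0], np.cumsum(-(x[1:] - C) / a[1:]) * dx))
u_hom = x * (1.0 - x) / (2.0 * a_harm)
print(f"max |u_eps - u_hom| = {np.max(np.abs(u - u_hom)):.2e}")
```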

  9. Sampling and Analysis Plan for Verification Sampling of LANL-Derived Residual Radionuclides in Soils within Tract A-18-2 for Land Conveyance

    Energy Technology Data Exchange (ETDEWEB)

    Ruedig, Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-30

    Public Law 105-119 directs the U.S. Department of Energy (DOE) to convey or transfer parcels of land to the Incorporated County of Los Alamos or their designees and to the Department of Interior, Bureau of Indian Affairs, in trust for the Pueblo de San Ildefonso. Los Alamos National Security is tasked to support DOE in conveyance and/or transfer of identified land parcels no later than September 2022. Under DOE Order 458.1, Radiation Protection of the Public and the Environment (O458.1, 2013) and Los Alamos National Laboratory (LANL or the Laboratory) implementing Policy 412 (P412, 2014), real property with the potential to contain residual radioactive material must meet the criteria for clearance and release to the public. This Sampling and Analysis Plan (SAP) covers a second investigation of Tract A-18-2 for the purpose of verifying the previous sampling results (LANL 2017). The plan requires 18 project-specific soil samples for use in radiological clearance decisions consistent with LANL Procedure ENV-ES-TP-238 (2015a) and guidance in the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM, 2000). The sampling work will be conducted by LANL, and samples will be evaluated by a LANL-contracted independent lab. However, there will be federal review (verification) of all steps of the sampling process.

  10. Verification of the FBR fuel bundle-duct interaction analysis code BAMBOO by the out-of-pile bundle compression test with large diameter pins

    Science.gov (United States)

    Uwaba, Tomoyuki; Ito, Masahiro; Nemoto, Junichi; Ichikawa, Shoichi; Katsuyama, Kozo

    2014-09-01

    The BAMBOO computer code was verified against results from an out-of-pile bundle compression test in which large-diameter pin bundles deformed under the bundle-duct interaction (BDI) condition. The pin diameters of the examined test bundles were 8.5 mm and 10.4 mm, which are targeted as preliminary fuel pin diameters for the upgraded core of the prototype fast breeder reactor (FBR) and for demonstration and commercial FBRs studied in the FaCT project. In the bundle compression test, bundle cross-sectional views were obtained from X-ray computer tomography (CT) images, and local parameters of bundle deformation such as pin-to-duct and pin-to-pin clearances were measured by CT image analyses. In the verification, calculation results of bundle deformation obtained by the BAMBOO code analyses were compared with the experimental results from the CT image analyses. The comparison showed that the BAMBOO code reasonably predicts deformation of large diameter pin bundles under the BDI condition by assuming that pin bowing and cladding oval distortion are the major deformation mechanisms, the same as in the case of small diameter pin bundles. In addition, the BAMBOO analysis results confirmed that cladding oval distortion effectively suppresses BDI in large diameter pin bundles as well as in small diameter pin bundles.

  11. Verification of the FBR fuel bundle–duct interaction analysis code BAMBOO by the out-of-pile bundle compression test with large diameter pins

    Energy Technology Data Exchange (ETDEWEB)

    Uwaba, Tomoyuki, E-mail: uwaba.tomoyuki@jaea.go.jp [Japan Atomic Energy Agency, 4002, Narita-cho, Oarai-machi, Ibaraki 311-1393 (Japan); Ito, Masahiro; Nemoto, Junichi [Japan Atomic Energy Agency, 4002, Narita-cho, Oarai-machi, Ibaraki 311-1393 (Japan); Ichikawa, Shoichi [Japan Atomic Energy Agency, 2-1, Shiraki, Tsuruga-shi, Fukui 919-1279 (Japan); Katsuyama, Kozo [Japan Atomic Energy Agency, 4002, Narita-cho, Oarai-machi, Ibaraki 311-1393 (Japan)

    2014-09-15

    The BAMBOO computer code was verified against results from an out-of-pile bundle compression test in which large-diameter pin bundles deformed under the bundle–duct interaction (BDI) condition. The pin diameters of the examined test bundles were 8.5 mm and 10.4 mm, which are targeted as preliminary fuel pin diameters for the upgraded core of the prototype fast breeder reactor (FBR) and for demonstration and commercial FBRs studied in the FaCT project. In the bundle compression test, bundle cross-sectional views were obtained from X-ray computer tomography (CT) images, and local parameters of bundle deformation such as pin-to-duct and pin-to-pin clearances were measured by CT image analyses. In the verification, calculation results of bundle deformation obtained by the BAMBOO code analyses were compared with the experimental results from the CT image analyses. The comparison showed that the BAMBOO code reasonably predicts deformation of large diameter pin bundles under the BDI condition by assuming that pin bowing and cladding oval distortion are the major deformation mechanisms, the same as in the case of small diameter pin bundles. In addition, the BAMBOO analysis results confirmed that cladding oval distortion effectively suppresses BDI in large diameter pin bundles as well as in small diameter pin bundles.

  12. Methodology for Thermal Behaviour Assessment of Homogeneous Façades in Heritage Buildings

    Directory of Open Access Journals (Sweden)

    Enrique Gil

    2017-01-01

    It is fundamental to study the thermal behaviour of all architectural constructions throughout their useful life, in order to detect early deterioration and ensure durability, in addition to achieving and maintaining interior comfort with the minimum energy consumption possible. This research has developed a methodology to assess the thermal behaviour of façades in heritage buildings. This paper presents methodology validation and verification (V & V) through a laboratory experiment. Guidelines and conclusions are extracted with the employment of three techniques in this experiment (thermal sensors, thermal imaging camera, and 3D thermal simulation in finite element software). A small portion of a homogeneous façade has been reproduced with indoor and outdoor thermal conditions. A closed chamber was constructed with wood panels and thermal insulation, leaving only one face exposed to the outside conditions, with a heat source inside the chamber that induces a temperature gradient in the wall. With this methodology, it is possible to better understand the thermal behaviour of the façade and to detect possible damage through the calibration and comparison of the results obtained by the experimental and theoretical techniques. This methodology can be extrapolated to the analysis of the thermal behaviour of façades in heritage buildings, usually made up of homogeneous material.
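
    For orientation, the baseline physics being calibrated here is one-dimensional steady conduction through the homogeneous wall; a minimal sketch with assumed material values (not the paper's):

```python
# Steady 1-D conduction through a homogeneous wall layer: the analytical
# baseline against which sensor and thermal-camera readings can be compared.
k = 0.8                     # thermal conductivity, W/(m K)   (assumed)
L = 0.24                    # wall thickness, m               (assumed)
T_in, T_out = 35.0, 18.0    # hot-face and cold-face temperatures, degC (assumed)

q = k * (T_in - T_out) / L  # heat flux through the wall, W/m^2
for x in (0.0, L / 2, L):   # temperature varies linearly across the thickness
    print(f"x = {x:.2f} m : T = {T_in - q * x / k:.1f} degC")
print(f"heat flux q = {q:.1f} W/m^2")
```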

  13. Homogenized rigid body and spring-mass (HRBSM) model for the pushover analysis of out-of-plane loaded unreinforced and FRP reinforced walls

    Science.gov (United States)

    Bertolesi, Elisa; Milani, Gabriele

    2017-07-01

    The present paper is devoted to the discussion of a series of unreinforced and FRP-retrofitted panels analyzed with the Homogenized Rigid Body and Spring-Mass (HRBSM) model developed by the authors. To this end, a total of four out-of-plane loaded masonry walls tested up to failure are considered. At the structural level, the non-linear analyses are conducted by replacing the homogenized orthotropic continuum with a rigid element and non-linear spring assemblage, by means of which out-of-plane mechanisms are allowed. FRP retrofitting is modeled with two-noded truss elements whose mechanical properties are selected in order to describe possible debonding phenomena or tensile rupture of the strengthening. The numerical results are compared with the experimental ones, showing satisfactory agreement in terms of global pressure-deflection curves and failure mechanisms.

  14. Effects of Direct Fuel Injection Strategies on Cycle-by-Cycle Variability in a Gasoline Homogeneous Charge Compression Ignition Engine: Sample Entropy Analysis

    Directory of Open Access Journals (Sweden)

    Jacek Hunicz

    2015-01-01

    In this study we summarize and analyze experimental observations of cyclic variability in homogeneous charge compression ignition (HCCI) combustion in a single-cylinder gasoline engine. The engine was configured with negative valve overlap (NVO) to trap residual gases from prior cycles and thus enable auto-ignition in successive cycles. Correlations were developed between different fuel injection strategies and cycle-average combustion and work output profiles. Hypothesized physical mechanisms based on these correlations were then compared with trends in cycle-by-cycle predictability as revealed by sample entropy. The results of these comparisons help to clarify how fuel injection strategy can interact with prior-cycle effects to affect combustion stability, and so contribute to the design of control methods for HCCI engines.
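
    Sample entropy itself is a small, well-defined algorithm; the following is a generic sketch of SampEn(m, r) as commonly defined (not the authors' code), applied to an invented cycle-by-cycle series:

```python
# Sample entropy (SampEn) quantifies cycle-by-cycle predictability of a
# series such as per-cycle work output; lower values mean more regularity.
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn(m, r) with r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    n = len(x)

    def count_matches(mm):
        # use N - m templates for both lengths, per the standard definition;
        # Chebyshev distance, self-matches excluded
        templates = np.array([x[i:i + mm] for i in range(n - m)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
regular = np.sin(0.5 * np.arange(300))              # predictable cycle series
noisy = regular + rng.normal(0.0, 0.5, 300)         # erratic cycle series
print(f"SampEn regular: {sample_entropy(regular):.3f}")
print(f"SampEn noisy:   {sample_entropy(noisy):.3f}")
```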

  15. Sensitivity analysis for thermo-hydraulics model of a Westinghouse type PWR. Verification of the simulation results

    Energy Technology Data Exchange (ETDEWEB)

    Farahani, Aref Zarnooshe [Islamic Azad Univ., Tehran (Iran, Islamic Republic of). Dept. of Nuclear Engineering, Science and Research Branch; Yousefpour, Faramarz [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Islamic Azad Univ., Tehran (Iran, Islamic Republic of). Dept. of Basic Sciences; Islamic Azad Univ., Tehran (Iran, Islamic Republic of). Young Researchers and Elite Club

    2017-07-15

    Development of a steady-state model is the first step in nuclear safety analysis. The developed model should first be analyzed qualitatively; then a sensitivity analysis on the number of nodes is required for the models of the different systems to ensure the reliability of the obtained results. This contribution aims to show, through sensitivity analysis, the independence of the modeling results from the number of nodes in a qualified MELCOR model of a Westinghouse-type pressurized water plant. For this purpose, and to minimize user error, the nuclear analysis software SNAP is employed. Different sensitivity cases were developed by modifying the existing model and refining the nodes of the simulated systems, including the steam generators, the reactor coolant system, and the reactor core with its connecting flow paths. Comparison of the obtained results with those of the original model shows no significant difference, which indicates that the model is independent of finer nodalization.

  16. Thermal properties Forsmark. Modelling stage 2.3 Complementary analysis and verification of the thermal bedrock model, stage 2.

    Energy Technology Data Exchange (ETDEWEB)

    Sundberg, Jan; Wrafter, John; Laendell, Maerta (Geo Innova AB (Sweden)); Back, Paer-Erik; Rosen, Lars (Sweco AB (Sweden))

    2008-11-15

    … This temperature dependence tends to decrease as the thermal conductivity decreases.
    - Heat capacity: Domains RFM029 and RFM045 have a mean heat capacity of 2.06 MJ/(m³K) and 2.15 MJ/(m³K), respectively.
    - The mean in situ temperatures at 400 m, 500 m and 600 m depth are estimated at 10.5 °C, 11.6 °C, and 12.8 °C respectively, and are therefore unchanged compared to model stage 2.2.
    - The estimates of the TRC (thermal rock class) proportions in domain RFM029 are considerably more reliable than those for domain RFM045. For the latter, the small number of boreholes in combination with the higher degree of lithological heterogeneity results in rather large uncertainties in the estimated proportions.
    - The aspect of the thermal model with the highest confidence is the thermal conductivity distribution of domain RFM029, because of its higher degree of lithological and thermal homogeneity compared to domain RFM045.
    - The aspect of the thermal model with the lowest confidence is the lower tail of the thermal conductivity distribution for rock domain RFM045. This uncertainty is related to the spatial and size distribution of amphibolite in domain RFM045.

  17. Non-linear waves in heterogeneous elastic rods via homogenization

    KAUST Repository

    Quezada de Luna, Manuel

    2012-03-01

    We consider the propagation of a planar loop on a heterogeneous elastic rod with a periodic microstructure consisting of two alternating homogeneous regions with different material properties. The analysis is carried out using a second-order homogenization theory based on a multiple scale asymptotic expansion. © 2011 Elsevier Ltd. All rights reserved.

  18. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification includes the performance, the efficiencies and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.

  19. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper.
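
    For context, the basic accountancy statistic underlying such tests is the material balance; a toy sketch with invented numbers and an assumed measurement-error standard deviation:

```python
# Material balance test: MUF = BI + R - S - EI, computed from accounting data
# alone (no physical flow verification). All numbers invented for illustration.
beginning_inventory = 100.0   # kg
receipts            =  40.0   # kg (operator-declared)
shipments           =  35.0   # kg (operator-declared)
ending_inventory    = 104.2   # kg

muf = beginning_inventory + receipts - shipments - ending_inventory
sigma_muf = 0.5               # assumed std. dev. of measurement error, kg

# Flag the balance if MUF exceeds ~3 sigma (a common rule-of-thumb test)
print(f"MUF = {muf:.2f} kg ({muf / sigma_muf:.1f} sigma)")
if abs(muf) > 3 * sigma_muf:
    print("balance inconsistent with measurement error alone")
```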

  20. Homogenization of neutronic diffusion models

    International Nuclear Information System (INIS)

    Capdebosq, Y.

    1999-09-01

    In order to study and simulate nuclear reactor cores, one needs access to the neutron distribution in the core. In practice, the description of this density of neutrons is given by a system of diffusion equations coupled by non-differential exchange terms. The strong heterogeneity of the medium constitutes a major obstacle to the numerical computation of these models at reasonable cost, so homogenization appears compulsory. Heuristic methods have been developed since the origin by nuclear physicists, under a periodicity assumption on the coefficients. They consist in performing a fine computation on a single periodicity cell, solving the system on the whole domain with homogenized coefficients, and reconstructing the neutron density by multiplying the solutions of the two computations. The objectives of this work are to provide a mathematically rigorous basis for this factorization method, to obtain the exact formulas for the homogenized coefficients, and to make a start on geometries where two periodic media are placed side by side. The first result of this thesis concerns the eigenvalue problem models which are used to characterize the state of criticality of the reactor, under a symmetry assumption on the coefficients. The convergence of the homogenization process is proved, and formulas for the homogenized coefficients are given. We then show that without symmetry assumptions a drift phenomenon appears; it is characterized by means of a real Bloch-wave method, which gives the homogenized limit in the general case. These results for the critical problem are then adapted to the evolution model. Finally, the homogenization of the critical problem in the case of two side-by-side periodic media is studied on a one-dimensional, one-equation model. (authors)

  1. 7 CFR 58.920 - Homogenization.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Homogenization. 58.920 Section 58.920 Agriculture... Procedures § 58.920 Homogenization. Where applicable concentrated products shall be homogenized for the... homogenization and the pressure at which homogenization is accomplished will be that which accomplishes the most...

  2. Performance analysis and experimental verification of mid-range wireless energy transfer through non-resonant magnetic coupling

    DEFF Research Database (Denmark)

    Peng, Liang; Wang, Jingyu; Zhejiang University, Hangzhou, China, L.

    2011-01-01

    In this paper, an efficiency analysis of a mid-range wireless energy transfer system through non-resonant magnetic coupling is performed. It is shown that the self-resistance of the coils and the mutual inductance are critical in achieving a high efficiency, which is indicated by our theoretical…

  3. Development of core design/analysis technology for integral reactor; verification of SMART nuclear design by Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Hyo; Hong, In Seob; Han, Beom Seok; Jeong, Jong Seong [Seoul National University, Seoul (Korea)

    2002-03-01

    The objective of this project is to verify the neutronics characteristics of the SMART core design by comparing computational results of the MCNAP code with those of the MASTER code. To achieve this goal, we analyzed the neutronics characteristics of the SMART core using the MCNAP code and compared these results with those of the MASTER code. We improved the parallel computing module and developed an error analysis module for the MCNAP code. We analyzed the mechanism of error propagation through depletion computations and developed a calculation module for quantifying these errors. We performed depletion analyses for fuel pins and assemblies of the SMART core. We modeled the 3-D structure of the SMART core, considered the variation of material compositions due to control-rod operation, and performed a depletion analysis for the SMART core. We computed control-rod worths of assemblies and of the reactor core for operation of individual control-rod groups. We computed core reactivity coefficients (MTC, FTC) and compared these results with computational results of the MASTER code. To verify the error analysis module of the MCNAP code, we analyzed error propagation through depletion of the SMART B-type assembly. 18 refs., 102 figs., 36 tabs. (Author)

  4. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    …additional paintbrushes. Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50… Four games were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify…

  5. Using timing information in speaker verification

    CSIR Research Space (South Africa)

    Van Heerden, CJ

    2005-11-01

    This paper presents an analysis of temporal information as a feature for use in speaker verification systems. The relevance of temporal information in a speaker’s utterances is investigated, both with regard to improving the robustness of modern…

  6. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    Science.gov (United States)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method uses state-of-the-art uncertainty analysis applying different turbulence models and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward-facing step.

  7. AQUEOUS HOMOGENEOUS REACTORTECHNICAL PANEL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Diamond, D.J.; Bajorek, S.; Bakel, A.; Flanagan, G.; Mubayi, V.; Skarda, R.; Staudenmeier, J.; Taiwo, T.; Tonoike, K.; Tripp, C.; Wei, T.; Yarsky, P.

    2010-12-03

    Considerable interest has been expressed in developing a stable U.S. production capacity for medical isotopes, particularly for molybdenum-99 (99Mo). This is motivated by recent reductions in production and supply worldwide. Consistent with U.S. nonproliferation objectives, any new production capability should not use highly enriched uranium fuel or targets. Consequently, Aqueous Homogeneous Reactors (AHRs) are under consideration for potential 99Mo production using low-enriched uranium. Although the Nuclear Regulatory Commission (NRC) has guidance to facilitate the licensing process for non-power reactors, that guidance is focused on reactors with fixed, solid fuel and is hence not applicable to an AHR. A panel was convened to study the technical issues associated with normal operation and potential transients and accidents of an AHR that might be designed for isotope production. The panel has produced the requisite AHR licensing guidance for three chapters that exist now for non-power reactor licensing: Reactor Description, Reactor Coolant Systems, and Accident Analysis. The guidance is in two parts for each chapter: 1) the standard format and content a licensee would use and 2) the standard review plan the NRC staff would use. This guidance takes into account the unique features of an AHR, such as the fuel being in solution; the fission product barriers being the vessel and attached systems; the production and release of radiolytic and fission product gases and their impact on operations and their control by a gas management system; and the movement of fuel into and out of the reactor vessel.

  8. Mechanized syringe homogenization of human and animal tissues.

    Science.gov (United States)

    Kurien, Biji T; Porter, Andrew C; Patel, Nisha C; Kurono, Sadamu; Matsumoto, Hiroyuki; Scofield, R Hal

    2004-06-01

    Tissue homogenization is a prerequisite to any fractionation schedule. A plethora of hands-on methods are available to homogenize tissues. Here we report a mechanized method for homogenizing animal and human tissues rapidly and easily. The Bio-Mixer 1200 (manufactured by Innovative Products, Inc., Oklahoma City, OK) utilizes the back-and-forth movement of two motor-driven disposable syringes, connected to each other through a three-way stopcock, to homogenize animal or human tissue. Using this method, we were able to homogenize human or mouse tissues (brain, liver, heart, and salivary glands) in 5 min. From sodium dodecyl sulfate-polyacrylamide gel electrophoresis analysis and a matrix-assisted laser desorption/ionization time-of-flight mass spectrometric enzyme assay for prolidase, we found that the homogenates obtained were as good as or even better than those obtained using a manual glass-on-Teflon (DuPont, Wilmington, DE) homogenization protocol (all-glass tube and Teflon pestle). Use of the Bio-Mixer 1200 to homogenize animal or human tissue precludes the need to stay in the cold room, as is the case with the other hands-on homogenization methods available, in addition to freeing up time for other experiments.

  9. Genetic Homogenization of Composite Materials

    Directory of Open Access Journals (Sweden)

    P. Tobola

    2009-04-01

    The paper is focused on numerical studies of the electromagnetic properties of composite materials used for the construction of small airplanes. Discussions concentrate on the genetic homogenization of composite layers and of composite layers with a slot. The homogenization aims to reduce the CPU-time demands of EMC computational models of electrically large airplanes. First, a methodology for creating a 3-dimensional numerical model of a composite material in CST Microwave Studio is proposed, focusing on sufficient accuracy of the model. Second, a proper implementation of a genetic optimization in Matlab is discussed. Third, an association of the optimization script with a simplified 2-dimensional homogeneous equivalent model in Comsol Multiphysics is proposed, with EMC issues in mind. The results of the computations are experimentally verified.
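
    A rough sketch of the genetic-optimization step: a generic genetic algorithm fits the parameters of a homogeneous equivalent to reference data. The surrogate "response" function below is a toy stand-in for the CST/Comsol simulations used in the paper, and all parameter names and values are invented.

```python
# Generic GA fitting two "equivalent layer" parameters so a surrogate
# response matches reference data (toy model, invented parameters).
import numpy as np

rng = np.random.default_rng(1)
freqs = np.linspace(1e8, 1e9, 20)

def response(params, f):
    eps_r, sigma = params                      # hypothetical parameters
    return eps_r / (1.0 + sigma * f * 1e-9)    # toy surrogate model

target = response((4.0, 2.0), freqs)           # pretend reference data

def fitness(p):
    return -np.sum((response(p, freqs) - target) ** 2)

pop = rng.uniform([1.0, 0.1], [10.0, 5.0], size=(30, 2))
for _ in range(200):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-10:]]                    # keep the best 10
    pop = parents[rng.integers(0, 10, 30)] + rng.normal(0, 0.1, (30, 2))
    pop[:5] = parents[-5:]                                     # elitism

best = max(pop, key=fitness)
print(f"recovered parameters: eps_r = {best[0]:.2f}, sigma = {best[1]:.2f}")
```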

  10. Development of a multi-dimensional realistic thermal-hydraulic system analysis code, MARS 1.3 and its verification

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    A multi-dimensional realistic thermal-hydraulic system analysis code, MARS version 1.3, has been developed. The main purpose of the MARS 1.3 development is to provide a realistic analysis capability for the transient two-phase thermal-hydraulics of Pressurized Water Reactors (PWRs), especially during Large Break Loss of Coolant Accidents (LBLOCAs) where multi-dimensional phenomena dominate the transients. The MARS code is a unified version of the USNRC-developed COBRA-TF, a three-dimensional (3D) reactor vessel analysis code, and RELAP5/MOD3.2.1.2, a one-dimensional (1D) reactor system analysis code. The developmental requirements for MARS were chosen not only to best utilize the existing capability of the codes but also to provide enhanced capability in code maintenance, user accessibility, user friendliness, code portability, code readability, and code flexibility. To maintain the existing capability of the codes and to enhance code maintenance, user accessibility and user friendliness, MARS has been unified into a single code consisting of a 1D module (RELAP5) and a 3D module (COBRA-TF). This is realized by implicitly integrating the system pressure matrix equations of the hydrodynamic models and solving them simultaneously, by modifying the 1D/3D calculation sequence to operate under a single Central Processor Unit (CPU), and by unifying the input structure and the light water property routines of both modules. In addition, the code structure of the 1D module has been completely restructured using the modular data structure of standard FORTRAN 90, which greatly improves the code maintenance capability, readability and portability. For code flexibility, a dynamic memory management scheme is applied in both modules. MARS 1.3 now runs on PC/Windows and HP/UNIX platforms having a single CPU, and users have the option to select the 3D module to model the 3D thermal-hydraulics in the reactor vessel or other…

  11. Analysis and Experimental Verification of New Power Flow Control for Grid-Connected Inverter with LCL Filter in Microgrid

    Science.gov (United States)

    Gu, Herong; Guan, Yajuan; Wang, Huaibao; Wei, Baoze; Guo, Xiaoqiang

    2014-01-01

    Microgrid is an effective way to integrate distributed energy resources into the utility networks. One of the most important issues is the power flow control of the grid-connected voltage-source inverter in a microgrid. In this paper, the small-signal model of the power flow control for the grid-connected inverter is established, from which it can be observed that the conventional power flow control may suffer from poor damping and slow transient response, while the new power flow control can mitigate these problems without affecting the steady-state power flow regulation. Results of continuous-domain simulations in MATLAB and digital control experiments based on a 32-bit fixed-point TMS320F2812 DSP are in good agreement, which verifies the small-signal model analysis and the effectiveness of the proposed method. PMID:24672304
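
    For context, the steady-state power flow being regulated follows the familiar two-source relations across a predominantly inductive link; a back-of-the-envelope sketch with illustrative values (not from the paper):

```python
# Steady-state power transfer from an inverter (V1 at angle delta) to the
# grid (V2 at angle 0) through a mostly inductive coupling of reactance X.
import numpy as np

V1, V2 = 230.0, 230.0     # RMS voltages, V          (illustrative)
X = 1.5                   # coupling reactance, ohm  (illustrative)
delta = np.deg2rad(5.0)   # power angle

P = V1 * V2 * np.sin(delta) / X                 # active power to the grid, W
Q = (V1**2 - V1 * V2 * np.cos(delta)) / X       # reactive power (inverter side), var
print(f"P = {P:.0f} W, Q = {Q:.0f} var")
# For small delta, P ~ V1*V2*delta/X: the angle sets the active power, while
# the voltage magnitude difference mainly sets the reactive power.
```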

  12. Outlines and verifications of the codes used in the safety analysis of High Temperature Engineering Test Reactor (HTTR)

    International Nuclear Information System (INIS)

    Shiina, Yasuaki; Kunitomi, Kazuhiko; Maruyama, Soh; Fujita, Shigeki; Nakagawa, Shigeaki; Iyoku, Tatsuo; Shindoh, Masami; Sudo, Yukio; Hirano, Masashi.

    1990-03-01

    This paper presents a brief description of the computer codes used in the safety analysis of the High Temperature Engineering Test Reactor. The codes are: 1. BLOOST-J2, 2. THYDE-HTGR, 3. TAC-NC, 4. RATSAM6, 5. COMPARE-MOD1, 6. GRACE, 7. OXIDE-3F, 8. FLOWNET/TRUMP. Of the codes described above, 1, 3, 4, 5, 6 and 7 were developed for the multi-hole type gas-cooled reactor and improved for the HTTR, while 2 originated from the THYDE codes, which were developed to treat the transient thermo-hydraulics during a LOCA of an LWR. Each code adopts models and properties which yield conservative analytical results. The adequacy of each code was verified by comparison with experimental results and/or analytical results obtained from other codes which were already proven. (author)

  13. Analysis and experimental verification of new power flow control for grid-connected inverter with LCL filter in microgrid.

    Science.gov (United States)

    Gu, Herong; Guan, Yajuan; Wang, Huaibao; Wei, Baoze; Guo, Xiaoqiang

    2014-01-01

    Microgrid is an effective way to integrate distributed energy resources into the utility networks. One of the most important issues is the power flow control of the grid-connected voltage-source inverter in a microgrid. In this paper, the small-signal model of the power flow control for the grid-connected inverter is established, from which it can be observed that the conventional power flow control may suffer from poor damping and slow transient response, while the new power flow control can mitigate these problems without affecting the steady-state power flow regulation. Results of continuous-domain simulations in MATLAB and digital control experiments based on a 32-bit fixed-point TMS320F2812 DSP are in good agreement, which verifies the small-signal model analysis and the effectiveness of the proposed method.

  14. Multileaf collimator leaf position verification and analysis for adaptive radiation therapy using a video-optical method

    Science.gov (United States)

    Sethna, Sohrab B.

    External beam radiation therapy is commonly used to eliminate and control cancerous tumors. High-energy beams are shaped to match the patient's specific tumor volume, thereby maximizing the radiation dose to malignant cells and limiting the dose to normal tissue. A multileaf collimator (MLC) consisting of multiple pairs of tungsten leaves is used to conform the radiation beam to the desired treatment field. Advanced treatment methods utilize dynamic MLC settings to conform to multiple treatment fields and provide intensity modulated radiation therapy (IMRT). Future methods would further increase conformity by actively tracking tumor motion caused by patient cardiac and respiratory motion. Leaf position quality assurance for a dynamic MLC is critical, as variation between the planned and actual leaf positions could induce significant errors in radiation dose. The goal of this research project is to prototype a video-optical quality assurance system for MLC leaf positions. The system captures light-field images of MLC leaf sequences during dynamic therapy. Image acquisition and analysis software was developed to determine leaf edge positions. The mean absolute difference between the positions predicted by the QA prototype and those measured by caliper was found to be 0.6 mm, with an uncertainty of +/- 0.3 mm. Maximum errors in predicted positions were below 1.0 mm for static fields. The prototype served as a proof of concept for quality assurance of future tumor tracking methods. Specifically, a lung tumor phantom was created to mimic a lung tumor's motion from respiration. The lung tumor video images were superimposed on MLC field video images for visualization and analysis. The toolbox is capable of displaying leaf position, leaf velocity, and tumor position, and of determining errors between planned and actual treatment fields for dynamic radiation therapy.
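
    A simplified sketch of the image-analysis step: threshold a light-field frame and scan each leaf row for the dark-to-bright transition. The synthetic frame and pixel calibration below are assumptions for illustration, not the authors' software.

```python
# Locate MLC leaf edges in a light-field image by thresholding each leaf-pair
# row and finding the first bright pixel. A synthetic frame stands in for the
# camera image; pixel_pitch_mm is an assumed camera calibration.
import numpy as np

frame = np.full((10, 200), 200, dtype=np.uint8)   # bright open field
true_edges = np.array([40, 45, 50, 55, 60, 60, 55, 50, 45, 40])  # pixels
for row, edge in enumerate(true_edges):
    frame[row, :edge] = 30                        # leaf blocks the light field

threshold = 115                                   # between dark and bright levels
pixel_pitch_mm = 0.25                             # assumed mm per pixel

for row in range(frame.shape[0]):
    edge_px = int(np.argmax(frame[row] > threshold))   # first bright pixel
    error_mm = (edge_px - true_edges[row]) * pixel_pitch_mm
    print(f"leaf row {row}: edge at {edge_px * pixel_pitch_mm:.2f} mm "
          f"(error {error_mm:+.2f} mm)")
```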

  15. Modification and verification of the program COBRA-RERTR for the application in the thermohydraulic analysis of research reactors

    International Nuclear Information System (INIS)

    Hainoun, A.; Ghazi, N.

    2005-01-01

    In the framework of the testing, evaluation and application of computer codes in the design and safety analysis of research reactors, the thermal-hydraulic code COBRA-RERTR (Reduced Enrichment Research and Test Reactor) has been tested and partially validated. COBRA-RERTR was selected for its available options, which are suitable for the analysis of research reactors that are operated at low temperatures and may use plate-type fuel elements and heavy water as the coolant. In addition, the code enables the consideration of cross-flow, which is important in the case of parallel, open coolant channels. Testing of the code shows an overestimation of the wall temperature, in addition to some fluctuation from node to node. This results from the solution scheme, which uses an explicit, non-iterative solution for heat conduction and heat transfer to the coolant. The code evaluation regarding the basic thermal-hydraulic phenomena indicates the necessity to modify and extend the physical models dealing with the estimation of the slip ratio and the simulation of void content in subcooled boiling. The code has been validated by recalculation of specific experiments on axial void distribution and thermal-hydraulic instability in the subcooled boiling regime. The validation indicates a significant improvement of the code in predicting the axial void distribution in subcooled boiling: the discrepancy between calculation and experiment was about 20% after the modification, compared to 100% with the original models. On the other hand, the validation shows the capability of the modified code to simulate thermal-hydraulic flow instability characterized by the critical inlet flow velocity at which the flow just becomes unstable; this point is identical to the minimum of the integral pressure drop curve. The code results are, from the viewpoint of reactor safety, conservative, since the predicted values of the critical inlet velocity are higher than the experimental…

  16. Modification and verification of the program COBRA-RERTR for the application in the thermohydraulic analysis of research reactors

    International Nuclear Information System (INIS)

    Hainoun, A.; Ghazi, N.

    2004-02-01

    In the framework of the testing, evaluation and application of computer codes in the design and safety analysis of research reactors, the thermal-hydraulic code COBRA-RERTR (Reduced Enrichment Research and Test Reactor) has been tested and partially validated. COBRA-RERTR was selected for its available options, which are suitable for the analysis of research reactors that are operated at low temperatures and may use plate-type fuel elements and heavy water as the coolant. In addition, the code enables the consideration of cross-flow, which is important in the case of parallel, open coolant channels. Testing of the code shows an overestimation of the wall temperature, in addition to some fluctuation from node to node. This results from the solution scheme, which uses an explicit, non-iterative solution for heat conduction and heat transfer to the coolant. The code evaluation regarding the basic thermal-hydraulic phenomena indicates the necessity to modify and extend the physical models dealing with the estimation of the slip ratio and the simulation of void content in subcooled boiling. The code has been validated by recalculation of specific experiments on axial void distribution and thermal-hydraulic instability in the subcooled boiling regime. The validation indicates a significant improvement of the code in predicting the axial void distribution in subcooled boiling: the discrepancy between calculation and experiment was about 20% after the modification, compared to 100% with the original models. On the other hand, the validation shows the capability of the modified code to simulate thermal-hydraulic flow instability characterized by the critical inlet flow velocity at which the flow just becomes unstable; this point is identical to the minimum of the integral pressure drop curve. The code results are, from the viewpoint of reactor safety, conservative, since the predicted values of the critical inlet velocity are higher than the experimental…

  17. Verification and Analysis of Implementing Virtual Electric Devices in Circuit Simulation of Pulsed DC Electrical Devices in the NI MULTISIM 10.1 Environment

    Directory of Open Access Journals (Sweden)

    V. A. Solov'ev

    2015-01-01

    The paper presents the results of an analysis of the implementation potential and an evaluation of the reliability of virtual electric devices when conducting circuit simulation of pulsed DC electrical devices in the NI Multisim 10.1 environment. It analyses the metrological properties of the electric measuring devices and sensors of the NI Multisim 10.1 environment. Mathematical expressions have been defined to calculate reliable parameters of periodic non-sinusoidal electrical quantities based on their physical feasibility. To verify the virtual electric devices, a circuit model of the power section of a buck DC converter, with the devices under consideration enabled at its input and output, is used as a consumer of pulsed current of trapezoidal or triangular form. It serves as an example to show a technique for verifying the readings of virtual electric measuring devices in the NI Multisim 10.1 environment. It is found that when simulating pulsed DC electric devices, it is advisable to use the probe to measure average and RMS values of supply voltage and consumed current. The electric device power consumption read from the virtual power meter is equal to its average value, and its displayed power factor is inversely proportional to the input current form factor. To determine the RMS pulsed DC current with an ammeter or multimeter, it is necessary to measure the current with these devices in DC and AC modes, and then determine the RMS value from the two measurement results. Verification of the virtual electric devices has proved the possibility of their application to determine the energy performance of transistor converters for various purposes in circuit simulation in the NI Multisim 10.1 environment, thus saving design time.
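
    The two-mode measurement rule quoted above can be checked directly: for any periodic signal, RMS² = DC² + (AC RMS)². A worked check on an illustrative trapezoidal current (waveform invented for the example):

```python
# Verify RMS^2 = DC^2 + AC_RMS^2 on a toy trapezoidal pulsed-DC current:
# ramp up, flat top, ramp down, zero, over one period.
import numpy as np

t = np.linspace(0.0, 1.0, 100001)
i = np.interp(t, [0.0, 0.1, 0.4, 0.5, 1.0], [0.0, 10.0, 10.0, 0.0, 0.0])

i_dc = np.mean(i)                              # what the meter reads in DC mode
i_ac_rms = np.sqrt(np.mean((i - i_dc) ** 2))   # what it reads in AC mode
i_rms_combined = np.hypot(i_dc, i_ac_rms)      # reconstructed true RMS
i_rms_direct = np.sqrt(np.mean(i ** 2))        # direct definition

print(f"DC reading {i_dc:.3f} A, AC reading {i_ac_rms:.3f} A")
print(f"combined RMS {i_rms_combined:.4f} A vs direct RMS {i_rms_direct:.4f} A")
```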

  18. Development and experimental verification of a finite element method for accurate analysis of a surface acoustic wave device

    Science.gov (United States)

    Mohibul Kabir, K. M.; Matthews, Glenn I.; Sabri, Ylias M.; Russo, Salvy P.; Ippolito, Samuel J.; Bhargava, Suresh K.

    2016-03-01

    Accurate analysis of surface acoustic wave (SAW) devices is highly important due to their use in ever-growing applications in electronics, telecommunication and chemical sensing. In this study, a novel approach for analyzing SAW devices was developed based on a series of two-dimensional finite element method (FEM) simulations, and was experimentally verified. It was found that the frequency response of two SAW device structures, each having slightly different bandwidth and center-lobe characteristics, can be successfully obtained from the current density of the electrodes via FEM simulations. The two SAW structures were based on XY lithium niobate (LiNbO3) substrates and had two and four electrode finger pairs in their interdigital transducers, respectively. SAW devices were then fabricated in accordance with the simulated models, and their measured frequency responses were found to correlate well with the simulation results. The results indicated that a better match between calculated and measured frequency response can be obtained when one of the input electrode finger pairs is set at zero volts and all the current density components are taken into account when calculating the frequency response of the simulated SAW device structures.

  19. EPRI's on-site soil-structure interaction research and its application to design/analysis verification

    Energy Technology Data Exchange (ETDEWEB)

    Stepp, J C; Tang, H T [Seismic Center, Electric Power Research Institute, Palo Alto, CA (United States)

    1988-07-01

    Soil-structure interaction (SSI) research at the Electric Power Research Institute (EPRI) is focused on validating modeling and computational procedures. A database has been obtained with instrumented scale models of stiff structures founded both on unsaturated alluvial soils and on rock. Explosives were used to induce strong ground motion for two experiments, one on rock and the other on alluvium. A third experiment, a one-fourth scale containment structure on saturated alluvium, relies on earthquakes as the energy source. Analysis of the explosion-induced SSI data shows a marked shift of the fundamental frequency of the soil-structure system to a lower frequency. The magnitude of the shift is a function of foundation conditions and level of excitation. Analytical simulation was found to require more sophisticated soil constitutive models and computer codes than are used in current practice. The current phase of the program concentrates on evaluating SSI models used in current design practice by comparing predicted with recorded data at points in the soil-structure system. (author)
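
    The reported downward frequency shift can be illustrated with the simplest SSI idealization: foundation flexibility acts as a spring in series with the structural stiffness, lowering the system frequency. A sketch with illustrative parameters only:

```python
# Single-degree-of-freedom illustration of the SSI frequency shift: a softer
# foundation spring in series with the structure lowers the fundamental
# frequency relative to the fixed-base case. Parameters are illustrative.
import numpy as np

m = 2.0e5          # structure mass, kg
k_struct = 8.0e8   # fixed-base lateral stiffness, N/m

f_fixed = np.sqrt(k_struct / m) / (2.0 * np.pi)
for k_soil in (1.0e10, 2.0e9, 5.0e8):              # stiff -> soft foundation
    k_eff = 1.0 / (1.0 / k_struct + 1.0 / k_soil)  # springs in series
    f_ssi = np.sqrt(k_eff / m) / (2.0 * np.pi)
    print(f"k_soil = {k_soil:.1e} N/m : f = {f_ssi:.2f} Hz "
          f"(fixed-base {f_fixed:.2f} Hz)")
```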

  20. EPRI's on-site soil-structure interaction research and its application to design/analysis verification

    International Nuclear Information System (INIS)

    Stepp, J.C.; Tang, H.T.

    1988-01-01

    Soil-structure interaction (SSI) research at the Electric Power Research Institute (EPRI) is focused on validating modeling and computational procedures. A database has been obtained with instrumented scale models of stiff structures founded both on unsaturated alluvial soils and on rock. Explosives were used to induce strong ground motion for two experiments, one on rock and the other on alluvium. A third experiment, a one-fourth scale containment structure on saturated alluvium, relies on earthquakes as the energy source. Analysis of the explosion-induced SSI data shows a marked shift of the fundamental frequency of the soil-structure system to a lower frequency. The magnitude of the shift is a function of foundation conditions and level of excitation. Analytical simulation was found to require more sophisticated soil constitutive models and computer codes than are used in current practice. The current phase of the program concentrates on evaluating SSI models used in current design practice by comparing predicted with recorded data at points in the soil-structure system. (author)

  1. Improvement and verification of fast reactor safety analysis techniques. Annual summary, March 1, 1975--February 29, 1976

    International Nuclear Information System (INIS)

    Jackson, J.F.; Bott, T.F.

    1976-01-01

    Analyses of the Kiwi-TNT and SNAPTRAN-2 experiments have been performed with the VENUS-II fast-reactor disassembly code. The results show that VENUS-II provides an adequate characterization of these experiments. As is the case for LMFBRs, the excursions were initially turned over by temperature feedback effects, with ultimate shutdown coming from core disassembly. The calculated fission energies agree with the experimental values to within about 50 percent for the Kiwi excursion and 10 percent for the SNAPTRAN-2 experiment. The results of the analyses are being evaluated to determine the reasons for the remaining differences. It appears that part of the difference observed in the Kiwi-TNT analysis could relate to not explicitly treating the heat-transfer from the beaded fuel (a problem not present in LMFBR calculations). Both analyses also have uncertainties associated with the new equation-of-state that had to be added to VENUS-II to allow treatment of the core materials not used in fast reactors. Finally, there are uncertainties in the temperature feedback coefficients being used. In general, the uncertainties associated with applying VENUS-II to LMFBR excursions should be even smaller than those encountered in these experimental comparisons. This is because the temperature (Doppler) coefficients and core material equations-of-state are better known, and the complications associated with heat transfer from the beaded fuel are not present

  2. Design of SC solenoid with high homogeneity

    International Nuclear Information System (INIS)

    Yang Xiaoliang; Liu Zhong; Luo Min; Luo Guangyao; Kang Qiang; Tan Jie; Wu Wei

    2014-01-01

    A novel kind of SC (superconducting) solenoid coil is designed to satisfy the homogeneity requirement of the magnetic field. In this paper, we first calculate the current density distribution over the solenoid coil section by the linear programming method. Then a traditional solenoid and a nonrectangular-section solenoid are designed to produce a central field of up to 7 T with the highest achievable homogeneity. After comparing the two designs with respect to magnetic field quality, fabrication cost and other aspects, we show that the nonrectangular-section coil can be realized by improving the framework fabrication and winding techniques. Finally, the outlook and error analysis of this kind of SC magnet coil are also discussed briefly. (authors)
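
    As a rough illustration of the current-density step described above, the sketch below poses a toy coil-section design as a linear program: each candidate winding cell contributes a known on-axis field per ampere, and the total current is minimized subject to a homogeneity band around the 7 T target. All geometry, bounds and tolerances are invented for the example; this is not the authors' actual formulation.

      import numpy as np
      from scipy.optimize import linprog

      mu0 = 4e-7 * np.pi
      radii = np.linspace(0.10, 0.14, 5)      # candidate cell radii, m (assumed)
      zpos = np.linspace(-0.10, 0.10, 9)      # candidate cell axial positions, m (assumed)
      cells = [(a, z0) for a in radii for z0 in zpos]

      def bz_per_amp(a, z0, z):
          # on-axis field (T/A) at z of a single loop of radius a centred at z0
          return mu0 * a**2 / (2.0 * (a**2 + (z - z0)**2) ** 1.5)

      ztest = np.linspace(-0.02, 0.02, 11)    # homogeneity test points near the centre
      A = np.array([[bz_per_amp(a, z0, z) for (a, z0) in cells] for z in ztest])

      B0, tol = 7.0, 7.0e-4                   # 7 T target, 100 ppm band (assumed)
      A_ub = np.vstack([A, -A])               # encode |A x - B0| <= tol as two blocks
      b_ub = np.hstack([np.full(len(ztest), B0 + tol),
                        np.full(len(ztest), -(B0 - tol))])
      res = linprog(np.ones(len(cells)), A_ub=A_ub, b_ub=b_ub, bounds=(0.0, 2.0e5))
      print(res.status, res.fun)              # status 0 means an optimal layout was found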

  3. Smooth homogeneous structures in operator theory

    CERN Document Server

    Beltita, Daniel

    2005-01-01

    Geometric ideas and techniques play an important role in operator theory and the theory of operator algebras. Smooth Homogeneous Structures in Operator Theory builds the background needed to understand this circle of ideas and reports on recent developments in this fruitful field of research. Requiring only a moderate familiarity with functional analysis and general topology, the author begins with an introduction to infinite dimensional Lie theory with emphasis on the relationship between Lie groups and Lie algebras. A detailed examination of smooth homogeneous spaces follows. This study is illustrated by familiar examples from operator theory and develops methods that allow endowing such spaces with structures of complex manifolds. The final section of the book explores equivariant monotone operators and Kähler structures. It examines certain symmetry properties of abstract reproducing kernels and arrives at a very general version of the construction of restricted Grassmann manifolds from the theory of loo...

  4. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is developed, and a cost-sensitive classifier is found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. Such a system plays an important role in forensic and civilian applications.
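
    The final, cost-sensitive decision stage can be sketched in a few lines. The fragment below trains a logistic classifier on synthetic match scores and weights false accepts more heavily than false rejects; the score distributions and weights are invented placeholders, not the paper's actual matcher output.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      genuine = rng.normal(0.8, 0.10, (200, 1))    # assumed genuine-pair match scores
      impostor = rng.normal(0.4, 0.15, (200, 1))   # assumed impostor-pair match scores
      X = np.vstack([genuine, impostor])
      y = np.hstack([np.ones(200), np.zeros(200)])

      # class_weight makes accepting an impostor 5x costlier than rejecting a genuine user
      clf = LogisticRegression(class_weight={0: 5.0, 1: 1.0}).fit(X, y)
      print("verification rate on the toy data:", clf.score(X, y))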

  5. Spontaneous compactification to homogeneous spaces

    International Nuclear Information System (INIS)

    Mourao, J.M.

    1988-01-01

    The spontaneous compactification of extra dimensions to compact homogeneous spaces is studied. The methods developed within the framework of the coset space dimensional reduction scheme, together with the most general form of invariant metrics, are used to find solutions of the spontaneous compactification equations

  6. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified
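
    The spirit of such external verification can be shown with a trivial example: re-computing one well-defined step (here, generic first-order decay of the source activity) outside the code under test and comparing the two answers. The half-lives are standard physical data, but the equation below is a generic stand-in, not an actual RESRAD-BUILD pathway formula.

      import math

      half_life_yr = {"H-3": 12.32, "Co-60": 5.27}   # published half-lives, years

      def activity_fraction(nuclide, t_yr):
          # fraction of the initial source activity remaining after t_yr years
          return math.exp(-math.log(2.0) * t_yr / half_life_yr[nuclide])

      for nuc in half_life_yr:
          for t in (1.0, 10.0, 30.0):
              print(f"{nuc:5s} t = {t:4.0f} yr  remaining fraction = {activity_fraction(nuc, t):.4f}")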

  7. Analysis, testing and verification of the behavior of composite pavements under Florida conditions using a heavy vehicle simulator

    Science.gov (United States)

    Tapia Gutierrez, Patricio Enrique

    Whitetopping (WT) is a rehabilitation method to resurface deteriorated asphalt pavements. While some of these composite pavements have performed very well carrying heavy loads, others have shown poor performance with early cracking. With the objective of analyzing the applicability of WT pavements under Florida conditions, a total of nine full-scale WT test sections were constructed and tested using a Heavy Vehicle Simulator (HVS) in the APT facility at the FDOT Material Research Park. The test sections were instrumented to monitor both strain and temperature. A 3-D finite element model was developed to analyze the WT test sections. The model was calibrated and verified using measured FWD deflections and HVS load-induced strains from the test sections. The model was then used to evaluate the potential performance of these test sections under the critical temperature-load condition in Florida. Six of the WT pavement test sections had a bonded concrete-asphalt interface achieved by milling, cleaning and spraying the asphalt surface with water. This method produced excellent bonding at the interface, with shear strengths of 195 to 220 psi. Three of the test sections were intended to have an unbonded concrete-asphalt interface by applying a debonding agent to the asphalt surface. However, shear strengths between 119 and 135 psi and a careful analysis of the strain and temperature data indicated a partial bond condition. The computer model was able to satisfactorily model the behavior of the composite pavement by mainly considering material properties from standard laboratory tests and calibrating the spring elements used to model the interface. Reasonable matches between the measured and the calculated strains were achieved when a temperature-dependent AC elastic modulus was included in the analytical model. The expected numbers of repetitions of the 24-kip single axle loads at the critical thermal condition were computed for the nine test sections based on maximum tensile stresses

  8. Verification and Validation of Tropospheric Model/Database

    National Research Council Canada - National Science Library

    Choi, Junho

    1998-01-01

    A verification and validation of tropospheric models and databases has been performed based on a ray tracing algorithm, statistical analysis, tests on real-time system operation, and other technical evaluation processes...

  9. Electro-magnetostatic homogenization of bianisotropic metamaterials

    OpenAIRE

    Fietz, Chris

    2012-01-01

    We apply the method of asymptotic homogenization to metamaterials with microscopically bianisotropic inclusions to calculate a full set of constitutive parameters in the long wavelength limit. Two different implementations of electromagnetic asymptotic homogenization are presented. We test the homogenization procedure on two different metamaterial examples. Finally, the analytical solution for long wavelength homogenization of a one dimensional metamaterial with microscopically bi-isotropic i...

  10. Homogeneous Biosensing Based on Magnetic Particle Labels

    KAUST Repository

    Schrittwieser, Stefan

    2016-06-06

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation.

  11. Homogeneous Biosensing Based on Magnetic Particle Labels

    Science.gov (United States)

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J.; Lentijo-Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation. PMID:27275824

  12. Homogeneous Biosensing Based on Magnetic Particle Labels

    KAUST Repository

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang; Lentijo Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation.

  13. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  14. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it

  15. Observational homogeneity of the Universe

    International Nuclear Information System (INIS)

    Bonnor, W.B.; Ellis, G.F.R.

    1986-01-01

    A new approach to observational homogeneity is presented. The observation that stars and galaxies in distant regions appear similar to those nearby may be taken to imply that matter has had a similar thermodynamic history in widely separated parts of the Universe (the Postulate of Uniform Thermal Histories, or PUTH). The supposition is now made that similar thermodynamic histories imply similar dynamical histories. Then the distant apparent similarity is evidence for spatial homogeneity of the Universe. General Relativity is used to test this idea, taking a perfect fluid model and implementing PUTH by the condition that the density and entropy per baryon shall be the same function of the proper time along all galaxy world-lines. (author)

  16. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques that seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as of approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. The book outlines both theoretical and practical issues

  17. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  18. The verification of ethnographic data.

    Science.gov (United States)

    Pool, Robert

    2017-09-01

    Anthropologists are increasingly required to account for the data on which they base their interpretations and to make it available for public scrutiny and re-analysis. While this may seem straightforward (why not place our data in online repositories?), it is not. Ethnographic 'data' may consist of everything from verbatim transcripts ('hard data') to memories and impressions ('soft data'). Hard data can be archived and re-analysed; soft data cannot. The focus on hard 'objective' data contributes to the delegitimizing of the soft data that are essential for ethnographic understanding, and without which hard data cannot be properly interpreted. However, the credibility of ethnographic interpretation requires the possibility of verification. This could be achieved by obligatory, standardised forms of personal storage with the option for audit if required, and by being more explicit in publications about the nature and status of the data and the process of interpretation.

  19. Conclusions about homogeneity and devitrification

    International Nuclear Information System (INIS)

    Larche, F.

    1997-01-01

    A large body of experimental data concerning the homogeneity and devitrification of R7T7 glass has been published. It appears that: - the crystallization process is very limited, - the interfaces due to bubbles and the container wall favor crystallization locally, but the ratio of crystallized volume always remains below a few per cent, and - crystallization has no damaging long-term effects as far as leaching tests can be trusted. (A.C.)

  20. Is charity a homogeneous good?

    OpenAIRE

    Backus, Peter

    2010-01-01

    In this paper I estimate income and price elasticities of donations to six different charitable causes to test the assumption that charity is a homogeneous good. In the US, charitable donations can be deducted from taxable income. This has long been recognized as producing a price, or taxprice, of giving equal to one minus the marginal tax rate faced by the donor. A substantial portion of the economic literature on giving has focused on estimating price and income elasticities of giving as th...
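
    The estimation strategy described here reduces to a log-log regression of gifts on the tax-price and income. The sketch below simulates data with known elasticities and recovers them by ordinary least squares; all numbers are fabricated purely to show the specification, not results from the paper.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 500
      income = rng.lognormal(11.0, 0.5, n)            # simulated incomes
      price = 1.0 - rng.uniform(0.1, 0.4, n)          # tax-price = 1 - marginal tax rate
      gifts = np.exp(-1.2 * np.log(price) + 0.7 * np.log(income)
                     - 5.0 + rng.normal(0.0, 0.3, n)) # true elasticities: -1.2 and 0.7

      X = np.column_stack([np.ones(n), np.log(price), np.log(income)])
      beta, *_ = np.linalg.lstsq(X, np.log(gifts), rcond=None)
      print(f"price elasticity ~ {beta[1]:.2f}, income elasticity ~ {beta[2]:.2f}")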

  1. Intercomparison of liquid metal fast reactor seismic analysis codes. V. 2: Verification and improvement of reactor core seismic analysis codes using core mock-up experiments. Proceedings of a research co-ordination meeting held in Vienna, 26-28 September 1994

    International Nuclear Information System (INIS)

    1995-10-01

    This report (Volume II) contains the papers summarizing the verification of and improvement to the codes on the basis of the French and Japanese data. Volume I, "Validation of the Seismic Analysis Codes Using the Reactor Core Experiments" (IAEA-TECDOC-798), included the Italian PEC reactor data. Refs, figs and tabs

  2. Intercomparison of liquid metal fast reactor seismic analysis codes. V. 2: Verification and improvement of reactor core seismic analysis codes using core mock-up experiments. Proceedings of a research co-ordination meeting held in Vienna, 26-28 September 1994

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-10-01

    This report (Volume II) contains the papers summarizing the verification of and improvement to the codes on the basis of the French and Japanese data. Volume I, "Validation of the Seismic Analysis Codes Using the Reactor Core Experiments" (IAEA-TECDOC-798), included the Italian PEC reactor data. Refs, figs and tabs.

  3. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  4. STEALTH: a Lagrange explicit finite difference code for solids, structural, and thermohydraulic analysis. Volume 2: sample and verification problems. Computer code manual

    International Nuclear Information System (INIS)

    Hofmann, R.

    1982-08-01

    STEALTH sample and verification problems are presented to help users become familiar with STEALTH capabilities, input, and output. Problems are grouped into articles which are completely self-contained. The pagination in each article is A.n, where A is a unique alphabetic-character article identifier and n is a sequential page number which starts from 1 on the first page of text for each article. Articles concerning new capabilities will be added as they become available. STEALTH sample and verification calculations are divided into the following general categories: transient mechanical calculations dealing with solids; transient mechanical calculations dealing with fluids; transient thermal calculations dealing with solids; transient thermal calculations dealing with fluids; static and quasi-static calculations; and complex boundary interaction calculations

  5. On stability and reflection-transmission analysis of the bipenalty method in contact-impact problems: A one-dimensional, homogeneous case study

    Czech Academy of Sciences Publication Activity Database

    Kopačka, Ján; Tkachuk, A.; Gabriel, Dušan; Kolman, Radek; Bischoff, M.; Plešek, Jiří

    2018-01-01

    Vol. 113, No. 10 (2018), pp. 1607-1629 ISSN 0029-5981 R&D Projects: GA MŠk(CZ) EF15_003/0000493; GA ČR(CZ) GA17-22615S; GA ČR GA17-12925S; GA ČR(CZ) GA16-03823S Grant - others: AV ČR(CZ) DAAD-16-12 Program: Bilateral cooperation Institutional support: RVO:61388998 Keywords: bipenalty method * explicit time integration * finite element method * penalty method * reflection-transmission analysis * stability analysis Subject RIV: JC - Computer Hardware; Software OECD field: Applied mechanics Impact factor: 2.162, year: 2016 http://onlinelibrary.wiley.com/doi/10.1002/nme.5712/full

  6. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    Fingerprint verification means matching a fingerprint against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  7. The laminar flow tube reactor as a quantitative tool for nucleation studies: Experimental results and theoretical analysis of homogeneous nucleation of dibutylphthalate

    International Nuclear Information System (INIS)

    Mikheev, Vladimir B.; Laulainen, Nels S.; Barlow, Stephan E.; Knott, Michael; Ford, Ian J.

    2000-01-01

    A laminar flow tube reactor was designed and constructed to provide an accurate, quantitative measurement of the nucleation rate as a function of supersaturation and temperature. Measurements of the nucleation of a supersaturated vapor of dibutylphthalate have been made over the temperature range from -30.3 to +19.1 °C. A thorough analysis of the possible sources of experimental uncertainty (such as defining the correct value of the initial vapor concentration, the temperature boundary conditions on the reactor walls, the accuracy of the calculations of the thermodynamic parameters of the nucleation zone, and the particle concentration measurement) is given. Both the isothermal and the isobaric nucleation rates were measured. The experimental data obtained were compared with the measurements of other experimental groups and with theoretical predictions made on the basis of the self-consistency correction nucleation theory. A theoretical analysis, based on the first and second nucleation theorems, is also presented. The critical cluster size and the excess internal energy of the critical cluster are obtained. © 2000 American Institute of Physics
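
    The first-nucleation-theorem analysis mentioned above has a compact numerical form: at fixed temperature the critical cluster size follows from the slope of ln J against ln S, n* ≈ d(ln J)/d(ln S) - 1. The rates in the sketch below are made-up placeholders, not the paper's measurements.

      import numpy as np

      S = np.array([6.0, 7.0, 8.0, 9.0])     # supersaturation at fixed T (assumed)
      J = np.array([1e2, 1e3, 6e3, 3e4])     # nucleation rate, cm^-3 s^-1 (assumed)

      slope = np.polyfit(np.log(S), np.log(J), 1)[0]   # d(ln J)/d(ln S)
      print(f"critical cluster size n* ~ {slope - 1:.1f} molecules")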

  8. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The ever-increasing number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  9. Shield verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    WSRC-RP-90-26, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification are an integral part of the certification process. This document identifies the work performed and the documentation generated to satisfy these action items for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system; it is not certification of the complete SHIELD system. Complete certification will follow at a later date. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but can be found in the references. The validation and verification effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system computer code is complete

  10. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Drawing on decades of experience in developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  11. GRIMHX verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Trumble, E.F.

    1991-12-01

    WSRC-RP-90-026, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification of the code are an integral part of this process. This document identifies the work performed and the documentation generated to satisfy these action items for the Reactor Physics computer code GRIMHX. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but are found in the references. The publication of this document signals that the validation and verification effort for the GRIMHX code is complete

  12. SU-E-T-455: Impact of Different Independent Dose Verification Software Programs for Secondary Check

    International Nuclear Information System (INIS)

    Itano, M; Yamazaki, T; Kosaka, M; Kobayashi, N; Yamashita, M; Ishibashi, S; Higuchi, Y; Tachibana, H

    2015-01-01

    Purpose: There have been many reports on different dose calculation algorithms for treatment planning systems (TPSs). An independent dose verification program (IndpPro) is essential to verify clinical plans from the TPS. However, the accuracy of different independent dose verification programs has not been evident. We conducted a multi-institutional study to reveal the impact of different IndpPros used with different TPSs. Methods: Three institutes participated in this study. They used two different IndpPros (RADCALC and Simple MU Analysis (SMU)), both of which implement the Clarkson algorithm. RADCALC requires as input the radiological path length (RPL) computed by the TPS (Eclipse or Pinnacle3); SMU uses CT images to compute the RPL independently of the TPS. An ion-chamber measurement in a water-equivalent phantom was performed to evaluate the accuracy of the two IndpPros and the TPS at each institute. Next, the accuracy of the dose calculated by the two IndpPros relative to the TPS was assessed for clinical plans. Results: The accuracy of the IndpPros and the TPSs in the homogeneous phantom was within ±1% of the measurement. 1543 treatment fields were collected from the patients treated at the institutes. RADCALC showed better agreement (0.9 ± 2.2%) than SMU (1.7 ± 2.1%). However, the agreement depended on the TPS (Eclipse: 0.5%, Pinnacle3: 1.0%). The accuracy of RADCALC with Eclipse was similar to that of SMU at one of the institutes. Conclusion: Depending on the independent dose verification program, the accuracy shows a systematic variation even though the measurement comparison showed a similar variation. The variation was driven by the radiological path length calculation: an IndpPro paired with Pinnacle3 shows a different variation because Pinnacle3 computes the RPL using physical density, whereas Eclipse and SMU use electron density.
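
    Both programs rest on Clarkson-style sector integration, which is easy to sketch: the scatter contribution at the calculation point is the average of a tabulated scatter-air ratio over equal angular sectors of the irregular field outline. The SAR table and field shape below are invented placeholders, not clinical data or either vendor's implementation.

      import numpy as np

      sar_radius = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])       # sector radius, cm
      sar_value = np.array([0.00, 0.08, 0.14, 0.18, 0.21, 0.23])   # assumed SAR at fixed depth

      def clarkson_sar(sector_radii_cm):
          # mean SAR over the equal-angle sectors describing the field boundary
          return np.interp(sector_radii_cm, sar_radius, sar_value).mean()

      # a mildly irregular field: radius varies with sector angle (36 sectors of 10 deg)
      angles = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
      radii = 5.0 + 1.5 * np.sin(angles)
      print(f"field-averaged SAR = {clarkson_sar(radii):.3f}")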

  13. Physical applications of homogeneous balls

    CERN Document Server

    Scarr, Tzvi

    2005-01-01

    One of the mathematical challenges of modern physics lies in the development of new tools to efficiently describe different branches of physics within one mathematical framework. This text introduces precisely such a broad mathematical model, one that gives a clear geometric expression of the symmetry of physical laws and is entirely determined by that symmetry. The first three chapters discuss the occurrence of bounded symmetric domains (BSDs) or homogeneous balls and their algebraic structure in physics. The book further provides a discussion of how to obtain a triple algebraic structure ass

  14. Heterotic strings on homogeneous spaces

    International Nuclear Information System (INIS)

    Israel, D.; Kounnas, C.; Orlando, D.; Petropoulos, P.M.

    2005-01-01

    We construct heterotic string backgrounds corresponding to families of homogeneous spaces as exact conformal field theories. They contain left cosets of compact groups by their maximal tori supported by NS-NS 2-forms and gauge field fluxes. We give the general formalism and modular-invariant partition functions, then we consider some examples such as SU(2)/U(1) ≃ S^2 (already described in a previous paper) and the SU(3)/U(1)^2 flag space. As an application we construct new supersymmetric string vacua with magnetic fluxes and a linear dilaton. (Abstract Copyright [2005], Wiley Periodicals, Inc.)

  15. Enhanced verification test suite for physics simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.; Cotrell, David L.; Johnson, Bryan; Knupp, Patrick; Rider, William J.; Trucano, Timothy G.; Weirs, V. Gregory

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.

  16. Engineering a static verification tool for GPU kernels

    OpenAIRE

    Bardsley, E; Betts, A; Chong, N; Collingbourne, P; Deligiannis, P; Donaldson, AF; Ketema, J; Liew, D; Qadeer, S

    2014-01-01

    We report on practical experiences over the last 2.5 years related to the engineering of GPUVerify, a static verification tool for OpenCL and CUDA GPU kernels, plotting the progress of GPUVerify from a prototype to a fully functional and relatively efficient analysis tool. Our hope is that this experience report will serve the verification community by helping to inform future tooling efforts. © 2014 Springer International Publishing.

  17. Peripheral nerve magnetic stimulation: influence of tissue non-homogeneity

    Directory of Open Access Journals (Sweden)

    Papazov Sava P

    2003-12-01

    Full Text Available Abstract Background: Peripheral nerves are situated in a highly non-homogeneous environment, including muscles, bones, blood vessels, etc. Time-varying magnetic field stimulation of the median and ulnar nerves in the carpal region is studied, with special consideration of the influence of non-homogeneities. Methods: A detailed three-dimensional finite element model (FEM) of the anatomy of the wrist region was built to assess the induced current distribution due to external magnetic stimulation. The electromagnetic field distribution in the non-homogeneous domain was defined as an internal Dirichlet problem using the finite element method. The boundary conditions were obtained by analysis of the vector potential field excited by external current-driven coils. Results: The results include evaluation and graphical representation of the induced current field distribution at various stimulation coil positions. A comparative study for the real non-homogeneous structure with anisotropic tissue conductivities and a mock homogeneous medium is also presented. The possibility of achieving selective stimulation of either of the two nerves is assessed. Conclusion: The model developed could be useful in theoretical prediction of the current distribution in the nerves during diagnostic stimulation and therapeutic procedures involving electromagnetic excitation. The errors introduced by applying homogeneous-domain modeling rather than real non-homogeneous biological structures are demonstrated. The practical implications of the applied approach are valid for any arbitrary weakly conductive medium.
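
    A scaled-down numerical analogue of the internal Dirichlet problem above is a two-dimensional relaxation of div(sigma grad V) = 0 with a conductive inclusion. The sketch below uses a finite-difference stencil with harmonic-mean face conductivities rather than the paper's 3-D FEM; the grid, conductivities and boundary values are invented for illustration.

      import numpy as np

      n = 31
      sigma = np.full((n, n), 0.2)          # background tissue conductivity, S/m (assumed)
      sigma[12:20, 12:20] = 2.0             # better-conducting inclusion (assumed)

      V = np.zeros((n, n))
      V[0, :] = 1.0                         # Dirichlet boundaries: top at 1 V,
      V[-1, :] = 0.0                        # bottom and sides held at 0 V

      def face(s1, s2):
          # harmonic mean gives the conductivity on the face between two cells
          return 2.0 * s1 * s2 / (s1 + s2)

      for _ in range(2000):                 # Gauss-Seidel-style relaxation sweeps
          for i in range(1, n - 1):
              for j in range(1, n - 1):
                  sN = face(sigma[i, j], sigma[i - 1, j])
                  sS = face(sigma[i, j], sigma[i + 1, j])
                  sW = face(sigma[i, j], sigma[i, j - 1])
                  sE = face(sigma[i, j], sigma[i, j + 1])
                  V[i, j] = (sN * V[i - 1, j] + sS * V[i + 1, j]
                             + sW * V[i, j - 1] + sE * V[i, j + 1]) / (sN + sS + sW + sE)

      print("potential at the centre of the inclusion:", round(V[n // 2, n // 2], 3))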

  18. VERIFICATION OF THE SENTINEL-4 FOCAL PLANE SUBSYSTEM

    Directory of Open Access Journals (Sweden)

    C. Williges

    2017-05-01

    Full Text Available The Sentinel-4 payload is a multi-spectral camera system designed to monitor atmospheric conditions over Europe. The German Aerospace Center (DLR) in Berlin, Germany, conducted the verification campaign of the Focal Plane Subsystem (FPS) on behalf of Airbus Defense and Space GmbH, Ottobrunn, Germany. The FPS consists, inter alia, of two Focal Plane Assemblies (FPAs): one for the UV-VIS spectral range (305 nm … 500 nm), the second for the NIR (750 nm … 775 nm). In this publication, we present in detail the opto-mechanical laboratory set-up of the verification campaign of the Sentinel-4 Qualification Model (QM), which will also be used for the upcoming Flight Model (FM) verification. The test campaign consists mainly of radiometric tests performed with an integrating sphere as a homogeneous light source. The FPAs have to be operated at 215 K ± 5 K, making it necessary to use a thermal vacuum chamber (TVC) for the tests. This publication focuses on the challenge of remotely illuminating both Sentinel-4 detectors as well as a reference detector homogeneously, over a distance of approximately 1 m, from outside the TVC. Furthermore, selected test analyses and results are presented, showing that the Sentinel-4 FPS meets its specifications.

  19. Analytical solutions of time-fractional models for homogeneous Gardner equation and non-homogeneous differential equations

    Directory of Open Access Journals (Sweden)

    Olaniyi Samuel Iyiola

    2014-09-01

    Full Text Available In this paper, we obtain analytical solutions of the homogeneous time-fractional Gardner equation and of non-homogeneous time-fractional models (including the Buckmaster equation) using the q-Homotopy Analysis Method (q-HAM). Our work displays the elegant nature of the application of q-HAM, which solves not only homogeneous non-linear fractional differential equations but also non-homogeneous ones. The presence of the auxiliary parameter h helps in an effective way to obtain better approximations comparable to exact solutions. The fraction factor in this method gives it an edge over other existing analytical methods for non-linear differential equations. Comparisons are made where exact solutions to these models exist. The analysis shows that our analytical solutions converge very rapidly to the exact solutions.
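
    For orientation, one commonly quoted form of the time-fractional Gardner equation, with the Caputo derivative of order α, is written below; the paper's coefficients or normalisation may differ.

      \[
        D_t^{\alpha} u + 6\left(u - \varepsilon^{2} u^{2}\right) u_{x} + u_{xxx} = 0,
        \qquad 0 < \alpha \le 1,
      \]
      % with the Caputo fractional derivative defined by
      \[
        D_t^{\alpha} u(x,t) = \frac{1}{\Gamma(1-\alpha)}
        \int_{0}^{t} (t-\tau)^{-\alpha} \, \frac{\partial u(x,\tau)}{\partial \tau}\, d\tau .
      \]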

  20. In Situ Object Counting System (ISOCS) Technique: Cost-Effective Tool for NDA Verification in IAEA Safeguards

    International Nuclear Information System (INIS)

    Braverman, E.; Lebrun, A.; Nizhnik, V.; Rorif, F.

    2010-01-01

    Uranium material measurements using the ISOCS technique play an increasing role in IAEA verification activities. This methodology provides high uranium/plutonium sensitivity and a low detection limit, together with the capability to measure items of different shapes and sizes. In addition, the numerical absolute efficiency calibration of the germanium detector used by the technique does not require any calibration standards or reference materials. The ISOCS modelling software allows absolute efficiency calibration for items of arbitrary container shape and wall material, matrix chemical composition, material fill height, uranium or plutonium weight fraction inside the matrix, and even non-homogeneous nuclear material/matrix distributions. Furthermore, in a number of cases, key parameters such as matrix density and U/Pu weight fraction can be determined along with the analysis of nuclear material mass and isotopic composition. These capabilities provide a verification solution suitable for the majority of cases where quantitative and isotopic analysis must be performed. Today, the basic tool for uranium and plutonium mass measurement used in Safeguards verification activities is the neutron counting technique, which employs neutron coincidence and multiplicity counters. Compared with the neutron counting technique, ISOCS-calibrated detectors have a relatively low cost. Taking its advantages into account, this methodology becomes a cost-effective solution for nuclear material NDA verification. At present, the Agency uses ISOCS for quantitative analysis in a wide range of applications: - uranium scrap materials; - uranium-contaminated solid wastes; - uranium fuel elements; - some specific verification cases, such as measurement of Pu-Be neutron sources, quantification of fission products in solid wastes, etc. For uranium hold-up measurements, ISOCS is the only available methodology for quantitative and isotopic composition analysis of nuclear materials deposited

  1. The stellar initial mass function of early-type galaxies from low to high stellar velocity dispersion: homogeneous analysis of ATLAS3D and Sloan Lens ACS galaxies

    Science.gov (United States)

    Posacki, Silvia; Cappellari, Michele; Treu, Tommaso; Pellegrini, Silvia; Ciotti, Luca

    2015-01-01

    We present an investigation of the shape of the initial mass function (IMF) of early-type galaxies (ETGs), based on a joint lensing and dynamical analysis and on stellar population synthesis models, for a sample of 55 lens ETGs identified by the Sloan Lens Advanced Camera for Surveys (SLACS). We construct axisymmetric dynamical models based on the Jeans equations which allow for orbital anisotropy and include a dark matter halo. The models reproduce in detail the observed Hubble Space Telescope photometry and are constrained by the total projected mass within the Einstein radius and the stellar velocity dispersion (σ) within the Sloan Digital Sky Survey fibres. Comparing the dynamically derived stellar mass-to-light ratios (M*/L)_dyn, obtained for an assumed halo slope ρ_h ∝ r^-1, to the stellar population ones (M*/L)_Salp, derived from full-spectrum fitting and assuming a Salpeter IMF, we infer the mass normalization of the IMF. Our results confirm the previous analysis by the SLACS team that the mass normalization of the IMF of high-σ galaxies is consistent on average with a Salpeter slope. Our study allows for a fully consistent study of the trend between IMF and σ for both the SLACS and ATLAS3D samples, which explore quite different σ ranges. The two samples are highly complementary, the first being essentially σ selected, and the latter volume-limited and nearly mass selected. We find that the two samples merge smoothly into a single trend of the form log α = (0.38 ± 0.04) × log(σ_e/200 km s^-1) − (0.06 ± 0.01), where α = (M*/L)_dyn/(M*/L)_Salp and σ_e is the luminosity-averaged σ within one effective radius R_e. This is consistent with a systematic variation of the IMF normalization from Kroupa to Salpeter in the interval σ_e ≈ 90-270 km s^-1.
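
    The quoted trend is easy to evaluate directly. The sketch below computes the IMF mismatch parameter α at a few velocity dispersions using the best-fit coefficients above (ignoring their uncertainties); the helper name is ours, not from the paper.

      import math

      def imf_mismatch(sigma_e_kms):
          # alpha = (M*/L)_dyn / (M*/L)_Salp from the fitted SLACS + ATLAS3D trend
          return 10.0 ** (0.38 * math.log10(sigma_e_kms / 200.0) - 0.06)

      for s in (90.0, 200.0, 270.0):
          print(f"sigma_e = {s:5.0f} km/s  ->  alpha = {imf_mismatch(s):.2f}")

    The trend returns α ≈ 0.64 at 90 km/s (roughly the Kroupa value) and α ≈ 0.98 (essentially Salpeter) at 270 km/s, matching the Kroupa-to-Salpeter transition stated above.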

  2. Homogeneous Analysis of the Dust Morphology of Transition Disks Observed with ALMA: Investigating Dust Trapping and the Origin of the Cavities

    Science.gov (United States)

    Pinilla, P.; Tazzari, M.; Pascucci, I.; Youdin, A. N.; Garufi, A.; Manara, C. F.; Testi, L.; van der Plas, G.; Barenfeld, S. A.; Canovas, H.; Cox, E. G.; Hendler, N. P.; Pérez, L. M.; van der Marel, N.

    2018-05-01

    We analyze the dust morphology of 29 transition disks (TDs) observed in (sub-)millimeter emission with the Atacama Large Millimeter/submillimeter Array (ALMA). We perform the analysis in the visibility plane to characterize the total flux, cavity size, and shape of the ring-like structure. First, we find that the M_dust-M_* relation is much flatter for TDs than the trends observed in samples of class II sources in different star-forming regions. This relation demonstrates that cavities open in high (dust) mass disks, independent of the stellar mass. The flatness of this relation contradicts the idea that TDs are a more evolved set of disks. Two potential reasons (not mutually exclusive) may explain this flat relation: the emission is optically thick or/and millimeter-sized particles are trapped in a pressure bump. Second, we discuss our results on the cavity size and ring width in the context of different physical processes for cavity formation. Photoevaporation is an unlikely leading mechanism for the origin of the cavity of any of the targets in the sample. Embedded giant planets or dead zones remain as potential explanations. Although both models predict correlations between the cavity size and the ring shape for different stellar and disk properties, we demonstrate that with the current resolution of the observations it is difficult to obtain these correlations. Future higher angular resolution observations of TDs with ALMA will help discern between the different potential origins of cavities in TDs.

  3. Homogenization scheme for acoustic metamaterials

    KAUST Repository

    Yang, Min

    2014-02-26

    We present a homogenization scheme for acoustic metamaterials that is based on reproducing the lowest orders of scattering amplitudes from a finite volume of metamaterials. This approach is noted to differ significantly from that of coherent potential approximation, which is based on adjusting the effective-medium parameters to minimize scatterings in the long-wavelength limit. With the aid of metamaterials' eigenstates, the effective parameters, such as mass density and elastic modulus, can be obtained by matching the surface responses of a metamaterial's structural unit cell with a piece of homogenized material. From Green's theorem applied to the exterior domain problem, matching the surface responses is noted to be the same as reproducing the scattering amplitudes. We verify our scheme by applying it to three different examples: a layered lattice, a two-dimensional hexagonal lattice, and a decorated-membrane system. It is shown that the predicted characteristics and wave fields agree almost exactly with numerical simulations and experiments, and that the scheme's validity is constrained by the number of dominant surface multipoles instead of the usual long-wavelength assumption. In particular, the validity extends to the full band in one dimension and to regimes near the boundaries of the Brillouin zone in two dimensions.

  4. ISOTOPE METHODS IN HOMOGENEOUS CATALYSIS.

    Energy Technology Data Exchange (ETDEWEB)

    Bullock, R.M.; Bender, B.R.

    2000-12-01

    The use of isotope labels has had a fundamentally important role in the determination of mechanisms of homogeneously catalyzed reactions. Mechanistic data are valuable since they can assist in the design and rational improvement of homogeneous catalysts. There are several ways to use isotopes in mechanistic chemistry. Isotopes can be introduced into controlled experiments and followed to see where they go or don't go; in this way, Libby, Calvin, Taube and others used isotopes to elucidate mechanistic pathways for very different, yet important chemistries. Another important isotope method is the study of kinetic isotope effects (KIEs) and equilibrium isotope effects (EIEs). Here the mere observation of where a label winds up is no longer enough - what matters is how much slower (or faster) a labeled molecule reacts than the unlabeled material. The most careful studies essentially involve the measurement of isotope fractionation between a reference ground state and the transition state. Thus kinetic isotope effects provide unique data unavailable from other methods, since information about the transition state of a reaction is obtained. Because getting an experimental glimpse of transition states is really tantamount to understanding catalysis, kinetic isotope effects are very powerful.

  5. Pressure and pressure derivative analysis for vertical gas and oil wells in stress sensitive homogeneous and naturally fractured formations without type-curve matching

    International Nuclear Information System (INIS)

    Escobar, Freddy Humberto; Cantillo, Jose Humberto; Montealegre M, Matilde

    2007-01-01

    Currently, rock mechanics plays an important role in the oil industry. The effects of reservoir subsidence, compaction and dilation are being taken into account in modern reservoir management of complex systems. On the other hand, pressure well tests run in stress-sensitive formations ought to be interpreted with non-conventional techniques. During the last three decades, several studies relating transient pressure analysis to the characterization of stress-sensitive reservoirs have been introduced in the literature. Some of them deal with type curves and/or automated history matching. However, due to the nature of the problem, no definitive study exists that focuses on the adequate characterization of reservoirs whose permeability changes as fluid withdrawal advances; in this paper, the permeability modulus concept introduced by Pedroso (1986) is taken as the starting basis. A great number of type curves were generated to study the behavior of the above-mentioned formations under stress influence. It was found that the permeability modulus, and therefore permeability changes, can be correlated with the slope of the pressure derivative trend during the radial flow regime when the reservoir suffers compaction. It is also worth mentioning that the time at which the minimum characteristic point of a naturally fractured formation (or the inflection point of a semilog plot) is found on the pressure derivative plot is practically the same as for formations without stress influence. This contributes to the extension of the TDS technique, Tiab (1993), so a new methodology to characterize this kind of reservoir is proposed here. It was verified by the solution of synthetic problems
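
    The diagnostic underlying this kind of analysis is the Bourdet pressure derivative, t·dP/dt = dP/d(ln t), whose level and slope during radial flow are what the extended TDS reading interprets. The sketch below computes it for a synthetic drawdown; the data are placeholders, not field measurements.

      import numpy as np

      t = np.logspace(-2, 2, 60)              # elapsed time, hours (synthetic)
      dP = 20.0 + 10.0 * np.log(t + 1.0)      # pressure drawdown, psi (synthetic)

      deriv = t * np.gradient(dP, t)          # Bourdet derivative t*dP/dt = dP/d(ln t)
      print("late-time derivative values:", deriv[-5:].round(2))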

  6. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and to the results of other Monte Carlo radiation transport codes, and found very good agreement across a variety of comparison measures. These include prediction of the critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation, we are confident that Shift can provide reference results for CASL benchmarking.

  7. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented, and the overall performance of the developed system is tested. The results show that this system can be used effectively for secure online verification applications.

  8. Homogeneity Study of UO2 Pellet Density for Quality Control

    International Nuclear Information System (INIS)

    Moon, Je Seon; Park, Chang Je; Kang, Kwon Ho; Moon, Heung Soo; Song, Kee Chan

    2005-01-01

    A homogeneity study has been performed on various densities of UO2 pellets as part of quality control work. The densities of the UO2 pellets are distributed randomly due to several factors, such as the milling conditions and sintering environment. After sintering, a total of fourteen bottles was chosen for UO2 density determination, with three samples per bottle. With these bottles, the between-bottle and within-bottle homogeneity were investigated via analysis of variance (ANOVA). From the ANOVA results, the calculated F-value is used to determine whether homogeneity is accepted or rejected at a given confidence level. All the homogeneity checks followed International Standard Guide 35
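
    A between-bottle check of this kind is a one-way ANOVA over bottles. The sketch below runs it on simulated densities for fourteen bottles of three samples each; the numbers are invented, not the study's data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      # 14 bottles x 3 density determinations each, g/cm^3 (simulated)
      bottles = [rng.normal(10.45, 0.02, 3) for _ in range(14)]

      F, p = stats.f_oneway(*bottles)
      verdict = "no between-bottle effect detected" if p > 0.05 else "between-bottle effect detected"
      print(f"F = {F:.2f}, p = {p:.3f} -> {verdict} at the 95% level")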

  9. Improving homogeneity by dynamic speed limit systems.

    NARCIS (Netherlands)

    Nes, N. van; Brandenberg, S.; Twisk, D.A.M.

    2010-01-01

    Homogeneity of driving speeds is an important variable in determining road safety; more homogeneous driving speeds increase road safety. This study investigates the effect of introducing dynamic speed limit systems on homogeneity of driving speeds. A total of 46 subjects twice drove a route along 12

  10. 7 CFR 58.636 - Homogenization.

    Science.gov (United States)

    2010-01-01

    ... 7 CFR 58.636 - Homogenization. Section 58.636, Agriculture Regulations of the Department of Agriculture (Continued), AGRICULTURAL MARKETING SERVICE (Standards...) Procedures § 58.636 Homogenization. Homogenization of the pasteurized mix shall be accomplished to...

  11. Orthogonality Measurement for Homogenous Projects-Bases

    Science.gov (United States)

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

    The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  12. Automatic analysis of intrinsic positional verification films in brachytherapy using MATLAB; Analisis automatico de peliculas de verificacion posicional intrinsica en braqueterapia mediante MATLAB

    Energy Technology Data Exchange (ETDEWEB)

    Quiros Higueras, J. D.; Marco Blancas, N. de; Ruiz Rodriguez, J. C.

    2011-07-01

    One of the essential tests in the quality control of brachytherapy afterloading equipment is verification of the intrinsic positioning of the radioactive source. A classic evaluation method is the use of x-ray film, measuring the distance between the marks left by autoradiography of the source and a reference. Our center has developed an automated measurement method based on scanning the radiochromic film and running a macro developed in MATLAB, in order to optimize time and reduce the uncertainty in the measurement. The purpose of this paper is to describe the method developed, assess its uncertainty and quantify its advantages over the manual method. (Author)
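
    A Python analogue of such a macro is short: threshold the scanned film, label the autoradiograph marks, and measure the distance between their centroids. The synthetic image and scanner resolution below are placeholders standing in for a real scan.

      import numpy as np
      from scipy import ndimage

      img = np.zeros((200, 200))
      img[95:105, 45:55] = 1.0        # reference mark (synthetic)
      img[95:105, 145:155] = 1.0      # source autoradiograph mark (synthetic)

      labels, n = ndimage.label(img > 0.5)
      (y0, x0), (y1, x1) = ndimage.center_of_mass(img, labels, list(range(1, n + 1)))

      px_mm = 0.2                     # assumed scanner resolution, mm per pixel
      print(f"mark separation = {np.hypot(y1 - y0, x1 - x0) * px_mm:.2f} mm")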

  13. Finite Countermodel Based Verification for Program Transformation (A Case Study

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

    Full Text Available Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for the verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, semantics-based unfold/fold program transformation methods pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. That means some general-purpose verification methods may be used to strengthen program transformation techniques. This paper considers how the finite-countermodel method for safety verification might be used in Turchin's supercompilation. We extract a number of supercompilation sub-algorithms that try to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of these problems.

  14. Radiochemical verification and validation in the environmental data collection process

    International Nuclear Information System (INIS)

    Rosano-Reece, D.; Bottrell, D.; Bath, R.J.

    1994-01-01

    A credible and cost effective environmental data collection process should produce analytical data which meets regulatory and program specific requirements. Analytical data, which support the sampling and analysis activities at hazardous waste sites, undergo verification and independent validation before the data are submitted to regulators. Understanding the difference between verification and validation and their respective roles in the sampling and analysis process is critical to the effectiveness of a program. Verification is deciding whether the measurement data obtained are what was requested. The verification process determines whether all the requirements were met. Validation is more complicated than verification. It attempts to assess the impacts on data use, especially when requirements are not met. Validation becomes part of the decision-making process. Radiochemical data consists of a sample result with an associated error. Therefore, radiochemical validation is different and more quantitative than is currently possible for the validation of hazardous chemical data. Radiochemical data include both results and uncertainty that can be statistically compared to identify significance of differences in a more technically defensible manner. Radiochemical validation makes decisions about analyte identification, detection, and uncertainty for a batch of data. The process focuses on the variability of the data in the context of the decision to be made. The objectives of this paper are to present radiochemical verification and validation for environmental data and to distinguish the differences between the two operations
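
    The quantitative comparison that radiochemical data permit can be made concrete: given two results with stated uncertainties, the significance of their difference reduces to a normalized-deviation test. A minimal sketch (the coverage factor k = 1.96, roughly the 95% level, is an illustrative choice):

        import math

        def differ_significantly(x1, u1, x2, u2, k=1.96):
            """True if two results (each with a 1-sigma uncertainty) differ by
            more than k combined standard uncertainties."""
            return abs(x1 - x2) > k * math.hypot(u1, u2)

        # e.g. duplicate analyses of 5.2 +/- 0.4 and 6.6 +/- 0.5 Bq/g:
        # differ_significantly(5.2, 0.4, 6.6, 0.5)  -> True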

  15. The evaporative vector: Homogeneous systems

    International Nuclear Information System (INIS)

    Klots, C.E.

    1987-05-01

    Molecular beams of van der Waals molecules are the subject of much current research. Among the methods used to form these beams, three (sputtering, laser ablation, and the sonic nozzle expansion of neat gases) yield what are now recognized to be "warm clusters." They contain enough internal energy to undergo a number of first-order processes, in particular evaporation. Because of this evaporation and its attendant cooling, the properties of such clusters are time-dependent. The states of matter which can be arrived at via an evaporative vector on a typical laboratory time-scale are discussed. Topics include (1) temperatures, (2) metastability, (3) phase transitions, (4) kinetic energies of fragmentation, and (5) the expression of magical properties, all for evaporating homogeneous clusters

  16. Development and verification of the CATHENA GUI

    International Nuclear Information System (INIS)

    Chin, T.

    2008-01-01

    This paper presents the development and verification of a graphical user interface for CATHENA MOD-3.5d. The thermalhydraulic computer code CATHENA has been developed to simulate the physical behaviour of the hydraulic components in nuclear reactors and experimental facilities. A representation of the facility is developed as an ASCII text file and used by CATHENA to perform the simulation. The existing method of manual generation of idealizations of a physical system for performing thermal hydraulic analysis is complex, time-consuming and prone to errors. An overview is presented of the CATHENA GUI and its depiction of a CATHENA idealization through the manipulation of a visual collection of objects. The methodologies and rigour involved in the verification of the CATHENA GUI will be discussed. (author)

  17. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  18. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  19. IMRT delivery verification using a spiral phantom

    International Nuclear Information System (INIS)

    Richardson, Susan L.; Tome, Wolfgang A.; Orton, Nigel P.; McNutt, Todd R.; Paliwal, Bhudatt R.

    2003-01-01

    In this paper we report on the testing and verification of a system for IMRT delivery quality assurance that uses a cylindrical solid-water phantom with a spiral trajectory for radiographic film placement. This spiral film technique provides more complete dosimetric verification of the entire IMRT treatment than perpendicular film methods, since it samples a three-dimensional dose subspace rather than using measurements at only one or two depths. As an example, the complete analysis of the predicted and measured spiral films is described for an intracranial IMRT treatment case. The results of this analysis are compared to those of a single-field perpendicular film technique that is typically used for IMRT QA. The comparison demonstrates that both methods result in a dosimetric error within a clinical tolerance of 5%; however, the spiral phantom QA technique provides a more complete dosimetric verification while being less time-consuming. To independently verify the dosimetry obtained with the spiral film, the same IMRT treatment was delivered to a similar phantom in which LiF thermoluminescent dosimeters were arranged along the spiral trajectory. The maximum difference between the predicted and measured TLD data for the 1.8 Gy fraction was 0.06 Gy, for a TLD located in a high-dose-gradient region. This further validates the ability of the spiral phantom QA process to accurately verify delivery of an IMRT plan
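
    As an illustration of the 5% clinical tolerance check applied to the predicted and measured spiral-film doses (the array names and the normalization choice are assumptions; a clinical workflow would typically add a full gamma analysis):

        import numpy as np

        def within_tolerance(measured, predicted, tol=0.05):
            # relative dose difference along the spiral trajectory,
            # normalized to the maximum predicted dose
            diff = np.abs(measured - predicted) / predicted.max()
            return bool(diff.max() <= tol), float(diff.max())

        # ok, worst = within_tolerance(film_dose, planned_dose)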

  20. Verification of the enrichment of fresh VVER-440 fuel assemblies at NPP Paks

    Energy Technology Data Exchange (ETDEWEB)

    Almasi, I.; Hlavathy, Z.; Nguyen, C. T. [Institute of Isotopes, Hungarian Academy of Sciences, Budapest (Hungary)]; and others

    2012-06-15

    A Non-Destructive Analysis (NDA) method was developed for the verification of the 235U enrichment of both homogeneous and profiled VVER-440 reactor fresh fuel assemblies by means of gamma spectrometry. A total of ca. 30 assemblies were tested, five of which were homogeneous, with 235U enrichment in the range 1.6% to 3.6%, while the others were profiled, with pins of 3.3% to 4.4% enrichment. Two types of gamma detector were used for the test measurements: two coaxial HPGe detectors and a miniature CdZnTe (CZT) detector fitting into the central tube of the assemblies. It was therefore possible to obtain information from both the inside and the outside of the assemblies. It was shown that it is possible to distinguish between different types of assemblies within a reasonable measurement time (about 1000 s). For the HPGe measurements the assemblies had to be lifted out of their storage rack, while for the CZT detector measurements the assemblies could be left in their storage position, as it was shown that the neighbouring assemblies do not affect measurements inside the assemblies' central tube. The measured values were compared with Monte Carlo simulations carried out using the MCNP code, and a recommendation for the optimal approach to verifying the 235U enrichment of fresh VVER-440 reactor fuel assemblies is given.
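
    The gamma-spectrometric verification rests on the standard enrichment-meter principle: for a sample that is effectively infinitely thick at 185.7 keV, the net count rate of that 235U line is proportional to the enrichment, so a single calibration standard fixes the scale. A hedged sketch (the numbers are placeholders, not values from the measurements reported above):

        def enrichment_percent(net_rate, cal_rate, cal_enrichment):
            """Enrichment-meter principle: the 185.7 keV net count rate is
            proportional to 235U enrichment for an 'infinitely thick' sample."""
            return cal_enrichment * net_rate / cal_rate

        # calibrated with a 3.6% standard giving 1200 counts/s:
        # enrichment_percent(610, 1200, 3.6)  -> about 1.8 %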

  1. TWO FERROMAGNETIC SPHERES IN HOMOGENEOUS MAGNETIC FIELD

    Directory of Open Access Journals (Sweden)

    Yury A. Krasnitsky

    2018-01-01

    Full Text Available The problem of two spherical conductors has been studied in considerable detail using bispherical coordinates and has numerous applications in electrostatics. Here the boundary-value problem of two ferromagnetic spheres embedded in a homogeneous infinite medium, in which a homogeneous magnetic field exists in the absence of the spheres, is considered. Solving Laplace's equation in the bispherical coordinate system allows the potential and field distribution to be found in all of space, including the region between the spheres. The boundary conditions used are continuity of the potential and of the normal component of the induction flux density at the sphere surfaces. It is assumed that the spheres are identical and that the magnetic permeability of their material satisfies μ >> μ0. The problem of a plane electromagnetic wave falling on the two-sphere system, which is electrically small, can be treated as quasistationary. The scalar potentials obtained from the solution of Laplace's equation are represented by series containing Legendre polynomials. The concept of the effective permeability of the two-sphere system is introduced: it equals the gain in the magnetic induction flux through a given cross-section of the system arising from its magnetic properties. The necessary relations for the effective permeability referred to the central cross-section of the system are obtained. In particular, the results can be used in the analysis of the air gap of a ferroxcube core, which influences the properties of magnetic antennas.
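
    For reference, the separable axisymmetric solution of Laplace's equation in bispherical coordinates (xi, eta), on which such an analysis rests, has the standard textbook form below (in LaTeX notation); the coefficients A_n, B_n follow from the boundary conditions stated above, and the paper's final expressions are not reproduced in this record.

        \Phi(\xi,\eta) = \sqrt{\cosh\xi - \cos\eta}\;
            \sum_{n=0}^{\infty}\left[ A_n e^{(n+\frac{1}{2})\xi}
                                    + B_n e^{-(n+\frac{1}{2})\xi} \right] P_n(\cos\eta)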

  2. Primary healthcare solo practices: homogeneous or heterogeneous?

    Science.gov (United States)

    Pineault, Raynald; Borgès Da Silva, Roxane; Provost, Sylvie; Beaulieu, Marie-Dominique; Boivin, Antoine; Couture, Audrey; Prud'homme, Alexandre

    2014-01-01

    Introduction. Solo practices have generally been viewed as forming a homogeneous group. However, they may differ on many characteristics. The objective of this paper is to identify different forms of solo practice and to determine the extent to which they are associated with patient experience of care. Methods. Two surveys were carried out in two regions of Quebec in 2010: a telephone survey of 9180 respondents from the general population and a postal survey of 606 primary healthcare (PHC) practices. Data from the two surveys were linked through the respondent's usual source of care. A taxonomy of solo practices was constructed (n = 213), using cluster analysis techniques. Bivariate and multilevel analyses were used to determine the relationship of the taxonomy with patient experience of care. Results. Four models were derived from the taxonomy. Practices in the "resourceful networked" model contrast with those of the "resourceless isolated" model to the extent that the experience of care reported by their patients is more favorable. Conclusion. Solo practice is not a homogeneous group. The four models identified have different organizational features and their patients' experience of care also differs. Some models seem to offer a better organizational potential in the context of current reforms.
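
    The taxonomy-construction step can be illustrated with a generic hierarchical cluster analysis. The feature columns below are invented placeholders; the paper's actual variables, distance measure and algorithm settings are not given in this record.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        # one row per solo practice (n = 213); columns are illustrative
        # organizational features (e.g. resources, networking, accessibility)
        rng = np.random.default_rng(1)
        practices = rng.normal(size=(213, 5))

        # Ward linkage, then cut the tree into four clusters (the four models)
        tree = linkage(practices, method="ward")
        model = fcluster(tree, t=4, criterion="maxclust")
        print(np.bincount(model)[1:])  # number of practices per model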

  3. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different

  4. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  5. Toward whole-core neutron transport without spatial homogenization

    International Nuclear Information System (INIS)

    Lewis, E. E.

    2009-01-01

    Full text of publication follows: A long-term goal of computational reactor physics is the deterministic analysis of power reactor core neutronics without incurring significant discretization errors in the energy, spatial or angular variables. In principle, given large enough parallel configurations with unlimited CPU time and memory, this goal could be achieved using existing three-dimensional neutron transport codes. In practice, however, solving the Boltzmann equation for neutrons over the six-dimensional phase space is made intractable by the nature of neutron cross-sections and the complexity and size of power reactor cores. Tens of thousands of energy groups would be required for faithful cross section representation. Likewise, the numerous material interfaces present in power reactor lattices require exceedingly fine spatial mesh structures; these ubiquitous interfaces preclude effective implementation of adaptive grid, mesh-less methods and related techniques that have been applied so successfully in other areas of engineering science. These challenges notwithstanding, substantial progress continues in the pursuit for more robust deterministic methods for whole-core neutronics analysis. This paper examines the progress over roughly the last decade, emphasizing the space-angle variables and the quest to eliminate errors attributable to spatial homogenization. As prolog we briefly assess 1990's methods used in light water reactor analysis and review the lessons learned from the C5G7 benchmark exercises which were originated in 1999 to appraise the ability of transport codes to perform core calculations without homogenization. We proceed by examining progress over the last decade much of which falls into three areas. These may be broadly characterized as reduced homogenization, dynamic homogenization and planar-axial synthesis. In the first, homogenization in three-dimensional calculations is reduced from the fuel assembly to the pin-cell level. In the second

  6. Structural Verification of the First Orbital Wonder of the World - The Structural Testing and Analysis of the International Space Station (ISS)

    Science.gov (United States)

    Zipay, John J.; Bernstein, Karen S.; Bruno, Erica E.; Deloo, Phillipe; Patin, Raymond

    2012-01-01

    The International Space Station (ISS) can be considered one of the structural engineering wonders of the world. On par with the World Trade Center, the Colossus of Rhodes, the Statue of Liberty, the Great Pyramids, the Petronas towers and the Burj Khalifa skyscraper of Dubai, the ambition and scope of the ISS structural design, verification and assembly effort is a truly global success story. With its on-orbit life projected to be from its beginning in 1998 to the year 2020 (and perhaps beyond), all of those who participated in its development can consider themselves part of an historic engineering achievement representing all of humanity. The structural design and verification of the ISS could be the subject of many scholarly papers. Several papers have been written on the structural dynamic characterization of the ISS once it was assembled on-orbit [1], but the ground-based activities required to assure structural integrity and structural life of the individual elements from delivery to orbit through assembly and planned on-orbit operations have never been totally summarized. This paper is intended to give the reader an overview of some of the key decisions made during the structural verification planning for the elements of the U.S. On-Orbit Segment (USOS) as well as to summarize the many structural tests and structural analyses that were performed on its major elements. An effort is made for this paper to be summarily comprehensive, but as with all knowledge capture efforts of this kind, there are bound to be errors of omission. Should the reader discover any of these, please feel free to contact the principal author. The ISS (Figure 1) is composed of pre-integrated truss segments and pressurized elements supplied by NASA, the Russian Federal Space Agency (RSA), the European Space Agency (ESA) and the Japanese Aerospace Exploration Agency (JAXA). Each of these elements was delivered to orbit by a launch vehicle and connected to one another either robotically or

  7. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka, Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems that are often used in applications that have requirements for safety. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely linked in the process of providing competitive products. Objectives: In relation to this, the MODUS project (Method and supporting toolset advancing embedded systems quality) aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice of existing open-source model verification engines, the model verification process produces inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project

  8. Experimental inventory verification system

    International Nuclear Information System (INIS)

    Steverson, C.A.; Angerman, M.I.

    1991-01-01

    As Low As Reasonably Achievable (ALARA) goals and Department of Energy (DOE) inventory requirements are frequently in conflict at facilities across the DOE complex. We wish, on one hand, to verify the presence of correct amounts of nuclear materials that are in storage or in process; yet on the other hand, we wish to achieve ALARA goals by keeping individual and collective exposures as low as social, technical, economic, practical, and public policy considerations permit. The Experimental Inventory Verification System (EIVSystem) is a computer-based, camera-driven system that utilizes image processing technology to detect change in vault areas. Currently in the test and evaluation phase at the Idaho National Engineering Laboratory, this system protects personnel by reducing the need for hands-on physical inventories: the EIVSystem continually monitors the vault, providing proof of changed status for objects stored within the vault. This paper reports that these data could provide the basis for reducing inventory requirements when no change has occurred, thus helping implement ALARA policy; the data will also help describe the target area of an inventory when change has been shown to occur

  9. Woodward Effect Experimental Verifications

    Science.gov (United States)

    March, Paul

    2004-02-01

    The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of "mass fluctuations" and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT, as well as gravitational/inertial Wheeler-Feynman radiation-reaction forces, holds, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations, the "Woodward effect" (W-E). Later, in collaboration with his ex-graduate student T. Mahood, they also pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT-based phenomenon per Mach's principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic-matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near- to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, will also be presented.

  10. Verification of hypergraph states

    Science.gov (United States)

    Morimae, Tomoyuki; Takeuchi, Yuki; Hayashi, Masahito

    2017-12-01

    Hypergraph states are generalizations of graph states where controlled-Z gates on edges are replaced with generalized controlled-Z gates on hyperedges. Hypergraph states have several advantages over graph states. For example, certain hypergraph states, such as the Union Jack states, are universal resource states for measurement-based quantum computing with only Pauli measurements, while graph state measurement-based quantum computing needs non-Clifford basis measurements. Furthermore, it is impossible to classically efficiently sample measurement results on hypergraph states unless the polynomial hierarchy collapses to the third level. Although several protocols have been proposed to verify graph states with only sequential single-qubit Pauli measurements, there was no verification method for hypergraph states. In this paper, we propose a method for verifying a certain class of hypergraph states with only sequential single-qubit Pauli measurements. Importantly, no i.i.d. property of samples is assumed in our protocol: any artificial entanglement among samples cannot fool the verifier. As applications of our protocol, we consider verified blind quantum computing with hypergraph states, and quantum computational supremacy demonstrations with hypergraph states.
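
    The generalization from graph states to hypergraph states can be shown in a few lines of linear algebra: start from |+>^n and apply a generalized controlled-Z, here a CCZ on the single hyperedge {0,1,2}. This is a toy state-vector illustration of the states being verified, not the verification protocol itself.

        import numpy as np

        n = 3
        plus = np.full(2 ** n, 2 ** (-n / 2))  # amplitudes of |+>^3

        # generalized controlled-Z on hyperedge {0,1,2}: flip the sign of |111>
        ccz = np.ones(2 ** n)
        ccz[-1] = -1.0
        hypergraph_state = ccz * plus

        # an ordinary graph state would instead apply two-qubit CZ phases,
        # one per edge {i,j}: sign -1 whenever bits i and j are both 1
        print(hypergraph_state)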

  11. Reciprocity theory of homogeneous reactions

    Science.gov (United States)

    Agbormbai, Adolf A.

    1990-03-01

    The reciprocity formalism is applied to the homogeneous gaseous reactions in which the structure of the participating molecules changes upon collision with one another, resulting in a change in the composition of the gas. The approach is applied to various classes of dissociation, recombination, rearrangement, ionizing, and photochemical reactions. It is shown that for the principle of reciprocity to be satisfied it is necessary that all chemical reactions exist in complementary pairs which consist of the forward and backward reactions. The backward reaction may be described by either the reverse or inverse process. The forward and backward processes must satisfy the same reciprocity equation. Because the number of dynamical variables is usually unbalanced on both sides of a chemical equation, it is necessary that this balance be established by including as many of the dynamical variables as needed before the reciprocity equation can be formulated. Statistical transformation models of the reactions are formulated. The models are classified under the titles free exchange, restricted exchange and simplified restricted exchange. The special equations for the forward and backward processes are obtained. The models are consistent with the H theorem and Le Chatelier's principle. The models are also formulated in the context of the direct simulation Monte Carlo method.

  12. Moral Beliefs and Cognitive Homogeneity

    Directory of Open Access Journals (Sweden)

    Nevia Dolcini

    2018-04-01

    Full Text Available The Emotional Perception Model of moral judgment intends to account for experientialism about morality and moral reasoning. In explaining how moral beliefs are formed and applied in practical reasoning, the model attempts to overcome the mismatch between reason and action/desire: morality isn't about reasons for action, yet moral beliefs, if caused by desires, may play a motivational role in (moral) agency. The account allows for two kinds of moral beliefs: genuine moral beliefs, which enjoy a relation to desire, and motivationally inert moral beliefs acquired in ways other than experience. Such an etiology-based dichotomy of concepts, I will argue, leads to the undesirable view of cognition as a non-homogeneous phenomenon. Moreover, the distinction between the two kinds of moral beliefs would entail a further dichotomy encompassing the domain of moral agency: one and the same action might possibly be either genuinely moral or not moral, if acted by individuals lacking the capacity for moral feelings, such as psychopaths.

  13. Homogeneous modes of cosmological instantons

    Energy Technology Data Exchange (ETDEWEB)

    Gratton, Steven; Turok, Neil

    2001-06-15

    We discuss the O(4) invariant perturbation modes of cosmological instantons. These modes are spatially homogeneous in Lorentzian spacetime and thus not relevant to density perturbations. But their properties are important in establishing the meaning of the Euclidean path integral. If negative modes are present, the Euclidean path integral is not well defined, but may nevertheless be useful in an approximate description of the decay of an unstable state. When gravitational dynamics is included, counting negative modes requires a careful treatment of the conformal factor problem. We demonstrate that for an appropriate choice of coordinate on phase space, the second order Euclidean action is bounded below for normalized perturbations and has a finite number of negative modes. We prove that there is a negative mode for many gravitational instantons of the Hawking-Moss or Coleman-De Luccia type, and discuss the associated spectral flow. We also investigate Hawking-Turok constrained instantons, which occur in a generic inflationary model. Implementing the regularization and constraint proposed by Kirklin, Turok and Wiseman, we find that those instantons leading to substantial inflation do not possess negative modes. Using an alternate regularization and constraint motivated by reduction from five dimensions, we find a negative mode is present. These investigations shed new light on the suitability of Euclidean quantum gravity as a potential description of our universe.

  14. Homogeneous modes of cosmological instantons

    International Nuclear Information System (INIS)

    Gratton, Steven; Turok, Neil

    2001-01-01

    We discuss the O(4) invariant perturbation modes of cosmological instantons. These modes are spatially homogeneous in Lorentzian spacetime and thus not relevant to density perturbations. But their properties are important in establishing the meaning of the Euclidean path integral. If negative modes are present, the Euclidean path integral is not well defined, but may nevertheless be useful in an approximate description of the decay of an unstable state. When gravitational dynamics is included, counting negative modes requires a careful treatment of the conformal factor problem. We demonstrate that for an appropriate choice of coordinate on phase space, the second order Euclidean action is bounded below for normalized perturbations and has a finite number of negative modes. We prove that there is a negative mode for many gravitational instantons of the Hawking-Moss or Coleman-De Luccia type, and discuss the associated spectral flow. We also investigate Hawking-Turok constrained instantons, which occur in a generic inflationary model. Implementing the regularization and constraint proposed by Kirklin, Turok and Wiseman, we find that those instantons leading to substantial inflation do not possess negative modes. Using an alternate regularization and constraint motivated by reduction from five dimensions, we find a negative mode is present. These investigations shed new light on the suitability of Euclidean quantum gravity as a potential description of our universe.

  15. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary, but not sufficient, step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of

  16. The structure and homogeneity of Psalm 32

    Directory of Open Access Journals (Sweden)

    J. Henk Potgieter

    2014-11-01

    Full Text Available Psalm 32 is widely regarded as a psalm of thanksgiving with elements of wisdom poetry intermingled into it. The wisdom elements are variously explained as having been present from the beginning, or as having been added to a foundational composition. Such views of the Gattung have had a decisive influence on the interpretation of the psalm. This article argues, on the basis of a structural analysis, that Psalm 32 should be understood as a homogeneous wisdom composition. The parallel and inverse structure of its two stanzas demonstrates that the aim of its author was to encourage the upright to foster an open, intimate relationship with Yahweh in which transgressions are confessed and Yahweh's benevolent guidance on the way of life is wisely accepted.

  17. Precipitation of plutonium oxalate from homogeneous solutions

    International Nuclear Information System (INIS)

    Rao, V.K.; Pius, I.C.; Subbarao, M.; Chinnusamy, A.; Natarajan, P.R.

    1986-01-01

    A method for the precipitation of plutonium(IV) oxalate from homogeneous solutions using diethyl oxalate is reported. The precipitate obtained is crystalline and easily filterable, with yields in the range of 92-98% for precipitations involving a few mg to g quantities of plutonium. Decontamination factors for common impurities such as U(VI), Am(III) and Fe(III) were determined. TGA and chemical analysis of the compound indicate its composition as Pu(C2O4)2·6H2O. Data are obtained on the solubility of the oxalate in nitric acid and in mixtures of nitric acid and oxalic acid of varying concentrations. Green PuO2 obtained by calcination of the oxalate has specifications within the recommended values for trace foreign substances such as chlorine, fluorine, carbon and nitrogen. (author)

  18. Modelling of an homogeneous equilibrium mixture model

    International Nuclear Information System (INIS)

    Bernard-Champmartin, A.; Poujade, O.; Mathiaud, J.; Ghidaglia, J.M.

    2014-01-01

    We present here a model for two-phase flows which is simpler than the 6-equation models (with two densities, two velocities, two temperatures) but more accurate than the standard mixture models with 4 equations (with two densities, one velocity and one temperature). We are interested in the case when the two phases have been interacting long enough for the drag force to be small but still not negligible. The so-called Homogeneous Equilibrium Mixture Model (HEM) that we present deals with both mixture and relative quantities, allowing in particular both a mixture velocity and a relative velocity to be followed. This relative velocity is not tracked by a conservation law but by a closure law (drift relation), whose expression is related to the drag force terms of the two-phase flow. After the derivation of the model, a stability analysis and numerical experiments are presented. (authors)
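
    Schematically, and hedged (the paper's exact drift relation is not reproduced in this record), the HEM variables combine the phase quantities in the usual mixture form, with alpha the volume fraction of phase 1 (in LaTeX notation):

        \rho = \alpha\rho_1 + (1-\alpha)\rho_2, \qquad
        \rho\,u = \alpha\rho_1 u_1 + (1-\alpha)\rho_2 u_2, \qquad
        u_r = u_1 - u_2 = f_{\mathrm{drift}}(\alpha,\rho_1,\rho_2,\ldots)

    where the closure f_drift plays the role of the drift relation and encodes the small-but-finite drag between the phases.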

  19. Development of advanced earthquake resistant performance verification on reinforced concrete underground structures. Pt. 3. Applicability of soil-structure Interaction analysis using nonlinear member model

    International Nuclear Information System (INIS)

    Matsui, Jun; Ohtomo, Keizo; Kawai, Tadashi; Kanatani, Mamoru; Matsuo, Toyofumi

    2003-01-01

    The objective of this study is to obtain verification data concerning the performance of RC duct-type underground structures subjected to strong earthquakes. This paper presents the results of numerical simulations of shaking-table tests of box-type structure models at a scale of about 1/2. We proposed practical nonlinear member models in which the mechanical properties of the RC members and the soil are defined as hysteresis models (RC: axial-force-dependent degrading tri-linear model; soil: modified Ramberg-Osgood model), and joint elements are used to evaluate the interaction along the soil-structure interface, including slippage and separation. The proposed models could simulate the test results on the deformation of the soil and the RC structure, as well as the damage to the RC structures, with practical accuracy, which is important in verifying their seismic performance. (author)

  20. Extension and Verification of the Cross-Section Library for the VVER-1000 Surveillance Specimen Region

    International Nuclear Information System (INIS)

    Kirilova, D.; Belousov, S.; Ilieva, K.

    2011-01-01

    The objective of this work is the generation of a new version of the BGL multigroup cross-section library that extends its region of applicability. The existing library version is problem-oriented for VVER-1000 type reactors and was generated by collapsing the problem-independent fine-group cross-section library VITAMIN-B6 with the VVER-1000 reactor midplane spectrum in cylindrical geometry. The new version, BGLex, additionally contains cross-sections averaged over the corresponding spectra of the surveillance specimen (SS) region of VVER-1000 type reactors. A comparative analysis of the neutron spectra for different one-dimensional geometry models that could be applied to the cross-section collapsing, performed with the SCALE software package, showed a high sensitivity of the results to the geometry model. Therefore a neutron importance assessment was done for the SS region using the adjoint solution calculated by the two-dimensional code DORT and the problem-independent library VITAMIN-B6. The one-dimensional geometry model applied to the cross-section collapsing was determined by the material limits above the reactor core in the axial direction z, with a homogenization in the radial direction performed for every material. The material homogenization in the radial direction was done by material weighting, taking into account the adjoint solution as well as the neutron source. The one-dimensional geometry model comprising the homogenized weighted materials was applied to collapse the fine-group library VITAMIN-B6 to the broad-group structure of the BGL library. The new version, BGLex, was thus extended with cross-sections for the SS region. Verification and validation of BGLex are forthcoming; they include comparison of results calculated with BGLex against the BGL and VITAMIN-B6 libraries and against experimental results. (author)

  1. Extension and Verification of the Cross-Section Library for the VVER-1000 Surveillance Specimen Region

    International Nuclear Information System (INIS)

    Kirilova, D.; Belousov, S.; Ilieva, K.

    2011-01-01

    The objective of this work is the generation of a new version of the BGL multigroup cross-section library that extends its region of applicability. The existing library version is problem-oriented for VVER-1000 type reactors and was generated by collapsing the problem-independent fine-group cross-section library VITAMIN-B6 with the VVER-1000 reactor midplane spectrum in cylindrical geometry. The new version, BGLex, additionally contains cross-sections averaged over the corresponding spectra of the surveillance specimen (SS) region of VVER-1000 type reactors. A comparative analysis of the neutron spectra for different one-dimensional geometry models that could be applied to the cross-section collapsing, performed with the SCALE software package, showed a high sensitivity of the results to the geometry model. Therefore a neutron importance assessment was done for the SS region using the adjoint solution calculated by the two-dimensional code DORT and the problem-independent library VITAMIN-B6. The one-dimensional geometry model applied to the cross-section collapsing was determined by the material limits above the reactor core in the axial direction z, with a homogenization in the radial direction performed for every material. The material homogenization in the radial direction was done by material weighting, taking into account the adjoint solution as well as the neutron source. The one-dimensional geometry model comprising the homogenized weighted materials was applied to collapse the fine-group library VITAMIN-B6 to the broad-group structure of the BGL library. The new version, BGLex, was thus extended with cross-sections for the SS region. Verification and validation of BGLex are forthcoming; they include comparison of results calculated with BGLex against the BGL and VITAMIN-B6 libraries and against experimental results. (author)

  2. Homogeneity and thermodynamic identities in geometrothermodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Quevedo, Hernando [Universidad Nacional Autonoma de Mexico, Instituto de Ciencias Nucleares (Mexico); Universita di Roma 'La Sapienza', Dipartimento di Fisica, Rome (Italy); ICRANet, Rome (Italy); Quevedo, Maria N. [Universidad Militar Nueva Granada, Departamento de Matematicas, Facultad de Ciencias Basicas, Bogota (Colombia); Sanchez, Alberto [CIIDET, Departamento de Posgrado, Queretaro (Mexico)

    2017-03-15

    We propose a classification of thermodynamic systems in terms of the homogeneity properties of their fundamental equations. Ordinary systems correspond to homogeneous functions and non-ordinary systems are given by generalized homogeneous functions. This affects the explicit form of the Gibbs-Duhem relation and Euler's identity. We show that these generalized relations can be implemented in the formalism of black hole geometrothermodynamics in order to completely fix the arbitrariness present in Legendre invariant metrics. (orig.)

  3. A literature review on biotic homogenization

    OpenAIRE

    Guangmei Wang; Jingcheng Yang; Chuangdao Jiang; Hongtao Zhao; Zhidong Zhang

    2009-01-01

    Biotic homogenization is the process whereby the genetic, taxonomic and functional similarity of two or more biotas increases over time. As a new research agenda for conservation biogeography, biotic homogenization has become a rapidly emerging topic of interest in ecology and evolution over the past decade. However, research on this topic is rare in China. Herein, we introduce the development of the concept of biotic homogenization, and then discuss methods to quantify its three components (...

  4. Hybrid diffusion–transport spatial homogenization method

    International Nuclear Information System (INIS)

    Kooreman, Gabriel; Rahnema, Farzad

    2014-01-01

    Highlights: • A new hybrid diffusion–transport homogenization method. • An extension of the consistent spatial homogenization (CSH) transport method. • Auxiliary cross section makes homogenized diffusion consistent with heterogeneous diffusion. • An on-the-fly re-homogenization in transport. • The method is faster than fine-mesh transport by 6–8 times. - Abstract: A new hybrid diffusion–transport homogenization method has been developed by extending the consistent spatial homogenization (CSH) transport method to include diffusion theory. As in the CSH method, an “auxiliary cross section” term is introduced into the source term, making the resulting homogenized diffusion equation consistent with its heterogeneous counterpart. The method then utilizes an on-the-fly re-homogenization in transport theory at the assembly level in order to correct for core environment effects on the homogenized cross sections and the auxiliary cross section. The method has been derived in general geometry and tested in a 1-D boiling water reactor (BWR) core benchmark problem for both controlled and uncontrolled configurations. The method has been shown to converge to the reference solution with less than 1.7% average flux error in less than one third the computational time of the CSH method – 6 to 8 times faster than fine-mesh transport

  5. Proceeding of the workshop on the results of the cooperative research between JAERI and CHESCIR concerning the study on assessment and analysis of environmental radiological consequences and verification of an assessment system

    Energy Technology Data Exchange (ETDEWEB)

    Amano, Hikaru; Saito, Kimiaki (eds.) [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-03-01

    This workshop was organized and sponsored by the Japan Atomic Energy Research Institute (JAERI) and the Chernobyl Science and Technology Center for International Research (CHESCIR). JAERI and CHESCIR conducted eight years of research cooperation, from 1992 to 1999, on the assessment and analysis of environmental radiological consequences and the verification of an assessment system, focusing on the Chernobyl contaminated area. The cooperation comprised three research subjects. Subject 1, initiated in 1992, focused on measurements and evaluation of environmental external exposure after a nuclear accident. Subject 2, initiated in 1992, focused on the validation of assessment models in an environmental consequence assessment methodology for nuclear accidents. Subject 3, initiated in 1995, focused on the migration of radionuclides released into the terrestrial and aquatic environment after nuclear accidents. This workshop was held to summarize the research cooperation between JAERI and CHESCIR and to discuss future research needs in this field. (author)

  6. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  7. 9 CFR 417.4 - Validation, Verification, Reassessment.

    Science.gov (United States)

    2010-01-01

    9 CFR 417.4 (Animals and Animal Products, FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF..., HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS): § 417.4 Validation, Verification... an establishment that does not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are...

  8. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies - the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW) - share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear-weapon-capable States - India, Pakistan and Israel - from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty. The world is

  9. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  11. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed using TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL
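
    To make the checklist cascade concrete, the sketch below renders a designation function in Python. It is a hypothetical reconstruction, not the actual TWRS control logic diagrams: the predicate names, their ordering, and the example control are all illustrative assumptions.

    # Hypothetical sketch of a TSR control-level designation cascade.
    # The predicates and their ordering are illustrative assumptions,
    # not the actual TWRS control logic diagrams or DOE criteria.

    from dataclasses import dataclass

    @dataclass
    class Control:
        name: str
        protects_offsite_public: bool   # failure could exceed offsite dose limits?
        is_process_variable: bool       # monitored/settable process parameter?
        is_passive_feature: bool        # physical design characteristic?
        needs_operator_action: bool     # relies on administrative programs?

    def designate(c: Control) -> str:
        """Walk the checklist from most to least restrictive level."""
        if c.protects_offsite_public and c.is_process_variable:
            return "Safety Limit (SL)"
        if c.is_process_variable:
            return "Limiting Control Setting (LCS)"
        if c.is_passive_feature:
            return "Design Feature"
        if c.needs_operator_action:
            return "Administrative Control (AC)"
        return "Limiting Condition for Operation (LCO)"

    print(designate(Control("tank headspace flammable-gas limit",
                            True, True, False, False)))   # -> Safety Limit (SL)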

  13. Self-consolidating concrete homogeneity

    Directory of Open Access Journals (Sweden)

    Jarque, J. C.

    2007-08-01

    Full Text Available Concrete instability may lead to a non-uniform distribution of its properties. The homogeneity of self-consolidating concrete in vertically cast members was therefore explored in this study, analyzing both resistance to segregation and pore structure uniformity. To this end, two series of concretes were prepared, self-consolidating and traditional vibrated materials, with different w/c ratios and types of cement. The results showed that self-consolidating concretes exhibit high resistance to segregation, albeit slightly lower than that found in the traditional mixtures. The pore structure in the former, however, tended to be slightly more uniform, probably as a result of less intense bleeding. Such concretes are also characterized by greater bulk density, lower porosity and smaller mean pore size, which translates into a higher resistance to pressurized water. For pore diameters over about 0.5 μm, however, the pore size distribution was found to be similar to that of traditional concretes, with similar absorption rates.
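
    A segregation check of the kind described, comparing a property between slices of a vertically cast member, can be sketched in a few lines. The snippet below is a generic illustration on made-up density readings; the segregation_index function, the 0.95 criterion, and the numbers are assumptions, not the study's procedure.

    # Illustrative homogeneity check for a vertically cast member:
    # compare a measured property (here bulk density, kg/m^3) between
    # top and bottom slices. Values and the 0.95 threshold are
    # hypothetical, not taken from the study.

    def segregation_index(top: float, bottom: float) -> float:
        """Ratio of top to bottom readings; 1.0 means perfectly homogeneous."""
        return top / bottom

    densities = {"top": 2310.0, "bottom": 2360.0}
    idx = segregation_index(densities["top"], densities["bottom"])
    print(f"segregation index = {idx:.3f}")
    if idx >= 0.95:
        print("mixture judged homogeneous (illustrative criterion)")
    else:
        print("possible segregation")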

  14. Proposition for a logical formalism of expression and of temporal properties verification on repetitive events devoted to the analysis of execution traces; Proposition pour un formalisme logique d'expression et de vérification de propriétés temporelles sur des événements répétitifs destiné à l'analyse de traces d'exécution

    Energy Technology Data Exchange (ETDEWEB)

    Le Campion, J.M.

    1996-01-30

    To support the safety analysis of the complex real-time systems developed for the protection of French nuclear plants, the CEA is interested in software testing and validation techniques. These test series are carried out by a purely software simulation of the system. The purpose is to establish the truth of certain critical properties of the programs, either during the simulation run or after its execution. The operator can describe the variation of some input parameters of the programs and display the results graphically. An important need was to describe formally certain categories of properties, expressed in terms of academic examples. A logical textual language was considered appropriate for this formal expression. This thesis describes a new data-flow language called EFRI, which extends the semantics of interval temporal logics. It then describes a calculus over regular languages on arrays that associates a regular expression with each formula of the EFRI language. With this method, the verification of a property described by an EFRI formula can be viewed as a classical problem of language theory: does a word belong to a regular language? A finite automaton can then be built to recognize complex temporal diagrams. (author). 38 refs., 7 tabs., 4 appends.
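
    The reduction named in the abstract, property checking as regular-language membership, is easy to illustrate in miniature. The example below is not EFRI (the thesis language is not reproduced here); it hand-codes a two-state finite automaton in Python for the toy property that every alarm event in a trace is eventually acknowledged, with the event alphabet chosen for illustration.

    # Toy illustration of "property checking = regular-language membership".
    # This is NOT the EFRI language from the thesis; it hand-codes a DFA
    # for the property: "every 'alarm' in the trace is followed by an 'ack'".

    DFA = {
        # state -> {event -> next state}
        "ok":      {"alarm": "pending", "ack": "ok", "tick": "ok"},
        "pending": {"alarm": "pending", "ack": "ok", "tick": "pending"},
    }
    ACCEPTING = {"ok"}

    def holds(trace: list[str]) -> bool:
        """Run the trace through the DFA; the property holds iff we end accepting."""
        state = "ok"
        for event in trace:
            state = DFA[state][event]
        return state in ACCEPTING

    print(holds(["tick", "alarm", "tick", "ack"]))   # True
    print(holds(["alarm", "tick", "alarm"]))         # False: alarm never acked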

  15. Homogeneity of Moral Judgment? Apprentices Solving Business Conflicts.

    Science.gov (United States)

    Beck, Klaus; Heinrichs, Karin; Minnameier, Gerhard; Parche-Kawik, Kirsten

    In an ongoing longitudinal project started in 1994, the moral development of business apprentices is being examined. The focal point of the project is a critical analysis of L. Kohlberg's thesis of homogeneity, according to which people should judge every moral issue from the point of view of their "modal" stage (the most frequently…

  16. Homogeneous axisymmetric model with a limiting stiff equation of state

    International Nuclear Information System (INIS)

    Korkina, M.P.; Martynenko, V.G.

    1976-01-01

    A solution of Einstein's equations is obtained in which all metric coefficients are functions of time, for a limiting stiff equation of state of the matter. The solution describes a homogeneous cosmological model with cylindrical symmetry. It is shown that the same metric can be induced by a massless, purely time-dependent scalar field. An analysis of this solution is presented
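
    For reference, the correspondence the abstract asserts can be written out in standard notation. A limiting stiff fluid has pressure equal to energy density, and a massless scalar field depending only on time reproduces exactly that stress-energy; the conventions below (units c = G = 1, signature (+,-,-,-)) are my assumptions, not taken from the paper.

    % Limiting stiff equation of state and its scalar-field realization.
    % Conventions assumed here (not from the paper): c = G = 1, signature (+,-,-,-).
    p = \varepsilon \qquad \text{(limiting stiff equation of state)}

    T_{\mu\nu} = \partial_\mu\phi\,\partial_\nu\phi
               - \tfrac{1}{2}\, g_{\mu\nu}\, g^{\alpha\beta}\,
                 \partial_\alpha\phi\,\partial_\beta\phi

    \phi = \phi(t) \;\Longrightarrow\;
    \varepsilon = \tfrac{1}{2}\dot\phi^{2}, \qquad
    p = \tfrac{1}{2}\dot\phi^{2}, \qquad
    \text{hence } p = \varepsilon .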

  17. Early capillary flux homogenization in response to neural activation.

    Science.gov (United States)

    Lee, Jonghwan; Wu, Weicheng; Boas, David A

    2016-02-01

    This Brief Communication reports early homogenization of capillary network flow during somatosensory activation in the rat cerebral cortex. We used optical coherence tomography and statistical intensity variation analysis to trace changes in red blood cell flux over hundreds of capillaries nearly simultaneously, at 1-s resolution. We observed that while the mean capillary flux exhibited a typical increase during activation, the standard deviation of the capillary flux exhibited an early decrease that preceded the mean flux increase. These network-level data are consistent with the theoretical hypothesis that capillary flow homogenizes during activation to improve oxygen delivery. © The Author(s) 2015.
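
    The analysis rests on two population statistics computed at each time point: the mean red blood cell flux across capillaries and its standard deviation. A minimal sketch of that computation on synthetic data follows; the array shapes, stimulus timing, and thresholds are assumptions for illustration, not the study's data or parameters.

    # Minimal sketch of the population statistics behind the finding:
    # at each time point, compute the mean and standard deviation of
    # RBC flux across capillaries. Data here are synthetic; shapes and
    # stimulus timing are assumptions for illustration only.

    import numpy as np

    rng = np.random.default_rng(0)
    n_capillaries, n_seconds = 300, 40
    baseline = rng.uniform(20, 80, size=(n_capillaries, 1))  # RBC/s per capillary

    # Synthetic activation at t = 10 s: flow homogenizes (spread shrinks)
    # slightly before the mean rises at t = 12 s.
    t = np.arange(n_seconds)
    spread = np.where(t >= 10, 0.6, 1.0)       # early narrowing of the distribution
    gain   = np.where(t >= 12, 1.15, 1.0)      # later mean increase
    flux = (baseline + (baseline - baseline.mean()) * (spread - 1)) * gain
    flux += rng.normal(0, 2, size=flux.shape)  # measurement noise

    mean_flux = flux.mean(axis=0)  # population mean vs time
    std_flux  = flux.std(axis=0)   # population heterogeneity vs time

    print("std drops at t =", int(np.argmax(std_flux < 0.8 * std_flux[:10].mean())))
    print("mean rises at t =", int(np.argmax(mean_flux > 1.05 * mean_flux[:10].mean())))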

  18. A Proof-checked Verification of a Real-Time Communication Protocol

    NARCIS (Netherlands)

    Polak, I.

    We present an analysis of a protocol developed by Philips to connect several components of an audio system. The verification of the protocol is carried out using the timed I/O-automata model of Lynch and Vaandrager. The verification has been partially proof-checked with the interactive proof

  19. The backfitting process and its verification

    International Nuclear Information System (INIS)

    Del Nero, G.; Grimaldi, G.

    1990-01-01

    Backfitting of plants in operation is based on: - compliance with new standards and regulations, - lessons learned from operating experience. This goal can be achieved more effectively on the basis of a valid methodology of analysis and a consistent process for collecting, storing and retrieving operating data. The general backfitting problem, the verification process and the use of TPA as a means of assessing backfitting are illustrated. The results of the analyses performed on the Caorso plant are also presented, using specially designed software tools; the focus is on management rather than hardware problems. Some general conclusions are then presented as the final results of the work

  20. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion focuses on the technical issues raised by warhead arms control. Technical complications arise from several sources; these are discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, identifying limitations and vulnerabilities along the way. They expect that these considerations will play a large role in any future arms reduction effort and should therefore be addressed in a timely fashion
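
    Of the headings listed, chain-of-custody lends itself to a miniature illustration. One generic technique for tamper-evident custody records, chosen here for illustration and not named by the authors, is a hash-chained log in which each entry commits to its predecessor, so any later alteration of a past record breaks the chain.

    # Generic tamper-evident custody log: each entry hashes its
    # predecessor, so altering any past record invalidates the chain.
    # This technique is illustrative and is not drawn from the source.

    import hashlib, json

    def add_entry(log: list[dict], event: str) -> None:
        prev = log[-1]["digest"] if log else "genesis"
        body = {"seq": len(log), "event": event, "prev": prev}
        body["digest"] = hashlib.sha256(
            json.dumps({k: body[k] for k in ("seq", "event", "prev")},
                       sort_keys=True).encode()).hexdigest()
        log.append(body)

    def verify_chain(log: list[dict]) -> bool:
        prev = "genesis"
        for i, e in enumerate(log):
            expect = hashlib.sha256(json.dumps(
                {"seq": i, "event": e["event"], "prev": prev},
                sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["digest"] != expect:
                return False
            prev = e["digest"]
        return True

    log: list[dict] = []
    for step in ["warhead sealed", "container moved", "dismantlement begun"]:
        add_entry(log, step)
    print(verify_chain(log))            # True
    log[1]["event"] = "container swapped"
    print(verify_chain(log))            # False: tampering detected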