WorldWideScience

Sample records for homogenization methods physical

  1. Homogenization versus homogenization-free method to measure muscle glycogen fractions.

    Science.gov (United States)

    Mojibi, N; Rasouli, M

    2016-12-01

    Glycogen is extracted from animal tissues with or without homogenization, using cold perchloric acid. Three methods for determining glycogen in rat muscle were compared at different physiological states. Two groups of five rats were kept at rest or subjected to 45 minutes of muscular activity. The glycogen fractions were then extracted and measured by the three methods. The data from the homogenization method show that total glycogen decreased following 45 min of physical activity and that the change occurred entirely in the acid-soluble glycogen (ASG), while the acid-insoluble glycogen (AIG) did not change significantly. Similar results were obtained using the "total glycogen fractionation" method. The findings of the "homogenization-free" method indicate instead that AIG was the main portion of muscle glycogen and that the majority of the changes occurred in the AIG fraction. Thus, the results of the "homogenization" method are identical to those of "total glycogen fractionation" but differ from those of the "homogenization-free" protocol. The ASG fraction is the major portion of muscle glycogen and is the more metabolically active form.

  2. DNA Dynamics Studied Using the Homogeneous Balance Method

    International Nuclear Information System (INIS)

    Zayed, E. M. E.; Arnous, A. H.

    2012-01-01

    We employ the homogeneous balance method to construct the traveling waves of a nonlinear model of DNA vibrational dynamics. Some new explicit forms of traveling waves are given. It is shown that this method provides a powerful mathematical tool for solving nonlinear evolution equations in mathematical physics. Strengths and weaknesses of the proposed method are discussed. (general)

  3. Physical applications of homogeneous balls

    CERN Document Server

    Scarr, Tzvi

    2005-01-01

    One of the mathematical challenges of modern physics lies in the development of new tools to efficiently describe different branches of physics within one mathematical framework. This text introduces precisely such a broad mathematical model, one that gives a clear geometric expression of the symmetry of physical laws and is entirely determined by that symmetry. The first three chapters discuss the occurrence of bounded symmetric domains (BSDs) or homogeneous balls and their algebraic structure in physics. The book further provides a discussion of how to obtain a triple algebraic structure ass...

  4. Hybrid diffusion–transport spatial homogenization method

    International Nuclear Information System (INIS)

    Kooreman, Gabriel; Rahnema, Farzad

    2014-01-01

    Highlights: • A new hybrid diffusion–transport homogenization method. • An extension of the consistent spatial homogenization (CSH) transport method. • An auxiliary cross section makes the homogenized diffusion equation consistent with its heterogeneous counterpart. • On-the-fly re-homogenization is performed in transport theory. • The method is 6–8 times faster than fine-mesh transport. - Abstract: A new hybrid diffusion–transport homogenization method has been developed by extending the consistent spatial homogenization (CSH) transport method to include diffusion theory. As in the CSH method, an "auxiliary cross section" term is introduced into the source term, making the resulting homogenized diffusion equation consistent with its heterogeneous counterpart. The method then uses on-the-fly re-homogenization in transport theory at the assembly level in order to correct for core-environment effects on the homogenized cross sections and the auxiliary cross section. The method has been derived in general geometry and tested on a 1-D boiling water reactor (BWR) core benchmark problem in both controlled and uncontrolled configurations. The method has been shown to converge to the reference solution with less than 1.7% average flux error in less than one third of the computational time of the CSH method, i.e. 6 to 8 times faster than fine-mesh transport.

  5. Research on reactor physics analysis method based on Monte Carlo homogenization

    International Nuclear Information System (INIS)

    Ye Zhimin; Zhang Peng

    2014-01-01

    In order to meet the demands of the nuclear energy market in the future, many new concepts of nuclear energy systems have been put forward. The traditional deterministic neutronics analysis method has been challenged in two aspects: one is the ability to process generic geometry; the other is the multi-spectrum applicability of the multigroup cross section libraries. Due to its strong geometry modeling capability and its use of continuous-energy cross section libraries, the Monte Carlo method has been widely used in reactor physics calculations, and more and more research on the Monte Carlo method has been carried out. Neutronics-thermal hydraulics coupling analysis based on the Monte Carlo method has been realized. However, it still faces the problems of long computation times and slow convergence, which make it inapplicable to reactor core fuel management simulations. Drawing on the deterministic core analysis method, a new two-step core analysis scheme is proposed in this work. First, Monte Carlo simulations are performed for each assembly, and the assembly-homogenized multi-group cross sections are tallied at the same time. Second, the core diffusion calculations are done with these multigroup cross sections. The new scheme can achieve high efficiency while maintaining acceptable precision, so it can be used as an effective tool for the design and analysis of innovative nuclear energy systems. Numerical tests have been performed in this work to verify the new scheme. (authors)
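    The first step of such a two-step scheme typically produces homogenized constants by flux-volume weighting of the tallied region-wise cross sections. A minimal sketch of that weighting, with hypothetical numbers rather than the authors' data:

```python
import numpy as np

def homogenize_xs(sigma, flux, volume):
    """Flux-volume weighted homogenization of region-wise multigroup cross
    sections: Sigma_g = sum_r(sigma_gr * phi_gr * V_r) / sum_r(phi_gr * V_r)."""
    w = flux * volume                     # weight per (group, region)
    return (sigma * w).sum(axis=1) / w.sum(axis=1)

# Hypothetical 2-group, 3-region assembly tallies (fuel, gap, moderator)
sigma = np.array([[0.010, 0.001, 0.004],   # fast-group absorption [1/cm]
                  [0.100, 0.002, 0.020]])  # thermal-group absorption [1/cm]
flux = np.array([[1.00, 0.95, 0.90],       # region-averaged scalar flux
                 [0.60, 0.80, 1.10]])
volume = np.array([2.0, 0.5, 4.0])         # region volumes [cm^3]

sigma_hom = homogenize_xs(sigma, flux, volume)  # one value per group
```

The homogenized value for each group is a convex combination of the region values, so it always lies between the minimum and maximum region cross section.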

  6. The SPH homogeneization method

    International Nuclear Information System (INIS)

    Kavenoky, Alain

    1978-01-01

    The homogenization of a uniform lattice is a rather well understood topic, while difficult problems arise if the lattice becomes irregular. The SPH homogenization method is an attempt to generate homogenized cross sections for an irregular lattice. Section 1 summarizes the treatment of an isolated cylindrical cell with an entering surface current (in one-velocity theory); Section 2 is devoted to the extension of the SPH method to assembly problems. Finally, Section 3 presents the generalization to general multigroup problems. Numerical results are obtained for a PXR rod bundle assembly in Section 4.

  7. Study on critical effect in lattice homogenization via Monte Carlo method

    International Nuclear Information System (INIS)

    Li Mancang; Wang Kan; Yao Dong

    2012-01-01

    In contrast to the traditional deterministic lattice codes, generating the homogenized multigroup constants via the Monte Carlo method overcomes the difficulties in geometry and treats energy in continuum, thus providing more accurate parameters. An infinite lattice of identical symmetric motifs is usually assumed when performing the homogenization. However, a real reactor is finite in size, and this should influence the lattice calculation. In practical homogenization with the Monte Carlo method, B_N theory is applied to take the leakage effect into account. The fundamental mode with buckling B is used as a measure of the finite size. The critical spectrum obtained from the solution of the 0-dimensional fine-group B_1 equations is used to correct the weighting spectrum for homogenization. A PWR prototype core is examined to verify that the presented method indeed generates few-group constants effectively. In addition, a zero-power physics experiment is used for verification. The results show that B_N theory is adequate for the leakage correction in multigroup constants generation via the Monte Carlo method. (authors)

  8. Comparison of cell homogenization methods considering interaction effect between fuel cells and control rod cells

    International Nuclear Information System (INIS)

    Takeda, T.; Uto, N.

    1988-01-01

    Several methods for determining cell-averaged group cross sections and anisotropic diffusion coefficients that take into account the interaction effect between core fuel cells and control rods or control rod followers have been compared, in order to discuss the physical meaning of cell homogenization. The cell homogenization methods considered are the commonly used flux-weighting method, the reaction rate preservation method and the reactivity preservation method. These homogenization methods have been applied to control rod worth calculations in 1-D slab cores to investigate their applicability. (author). 6 refs, 2 figs, 9 tabs

  9. Developing a multi-physics solver in APOLLO3 and applications to cross section homogenization

    International Nuclear Information System (INIS)

    Dugan, Kevin-James

    2016-01-01

    Multi-physics coupling is attracting wide interest in the nuclear engineering and computational science fields. The ability to obtain accurate solutions to realistic models is important for the design and licensing of novel reactor designs, especially in design-basis accident situations. The physical models involved in calculating accident behavior in nuclear reactors include neutron transport, thermal conduction/convection, thermo-mechanics in the fuel and support structure, and fuel stoichiometry, among others. This thesis, however, focuses on the coupling between two models: neutron transport and thermal conduction/convection. The goal of this thesis is to develop a multi-physics solver for simulating accidents in nuclear reactors; the focus is both on the simulation environment and on the data treatment used in such simulations. This work discusses the development of a multi-physics framework based on the Jacobian-Free Newton-Krylov (JFNK) method. The framework includes linear and nonlinear solvers, along with interfaces to existing numerical codes that solve the neutron transport and thermal hydraulics models (APOLLO3 and MCTH, respectively) through the computation of residuals. A new formulation of the neutron transport residual is explored, which reduces the solution size and the search space by a large factor: instead of being based on the angular flux, the residual is based on the fission source. The question of whether using a fundamental-mode distribution of the neutron flux for cross section homogenization is sufficiently accurate during fast transients is also explored. It is shown that in an infinite homogeneous medium, the solution obtained with homogenized cross sections produced from a fundamental-mode flux differs significantly from a reference solution. The error is remedied by using an alternative weighting flux taken from a time-dependent calculation: either a time-integrated flux or an asymptotic solution. The time-integrated flux comes from the multi-physics solution of the...

  10. A Modified Homogeneous Balance Method and Its Applications

    International Nuclear Information System (INIS)

    Liu Chunping

    2011-01-01

    A modified homogeneous balance method is proposed by improving some key steps in the homogeneous balance method. Bilinear equations of some nonlinear evolution equations are derived by using the modified homogeneous balance method. Generalized Boussinesq equation, KP equation, and mKdV equation are chosen as examples to illustrate our method. This approach is also applicable to a large variety of nonlinear evolution equations. (general)

  11. Study of titanium nitride elasticity characteristics in the homogeneity range by ultrasonic resonance method

    International Nuclear Information System (INIS)

    Khidirov, I.; Khajdarov, T.

    1995-01-01

    The elasticity characteristics of the cubic and tetragonal phases of titanium nitride in the homogeneity range were studied for the first time by the ultrasonic resonance method. It is established that the Young's modulus and the shear and bulk moduli of cubic titanium nitride in the homogeneity range change nonlinearly with decreasing nitrogen concentration and correlate with the concentration dependences of other physical properties. 15 refs., 2 figs

  12. Continuous energy Monte Carlo method based homogenization multi-group constants calculation

    International Nuclear Information System (INIS)

    Li Mancang; Wang Kan; Yao Dong

    2012-01-01

    The efficiency of the standard two-step reactor physics calculation relies on the accuracy of the multi-group constants from the assembly-level homogenization process. In contrast to the traditional deterministic methods, generating the homogenized cross sections via the Monte Carlo method overcomes the difficulties in geometry and treats energy in continuum, thus providing more accurate parameters. Besides, the same code and data bank can be used for a wide range of applications, resulting in great versatility when Monte Carlo codes are used for homogenization. As the first stage in realizing Monte Carlo based lattice homogenization, the track-length scheme is used as the foundation of cross section generation, which is straightforward. The scattering matrix and its Legendre components, however, require special techniques; the Scattering Event method was proposed to solve this problem. There are no continuous-energy counterparts in the Monte Carlo calculation for neutron diffusion coefficients, so P_1 cross sections were used to calculate the diffusion coefficients for diffusion reactor simulator codes. B_N theory is applied to take the leakage effect into account when an infinite lattice of identical symmetric motifs is assumed. The MCMC code was developed and applied to four assembly configurations to assess its accuracy and applicability. At the core level, a PWR prototype core is examined. The results show that the Monte Carlo based multi-group constants behave well on average. The method could be applied to nuclear reactor cores with complicated configurations to gain higher accuracy. (authors)
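    The track-length scheme mentioned above estimates the group flux from particle path lengths, so a homogenized group cross section falls out as the ratio of a reaction-rate tally to a flux tally. A toy sketch with synthetic track segments (purely illustrative; not the MCMC code):

```python
import numpy as np

rng = np.random.default_rng(42)
G = 2                                   # number of energy groups
sigma_true = np.array([0.02, 0.15])     # 'true' group cross sections [1/cm]

flux_tally = np.zeros(G)                # sum of w * l  (track-length flux estimate)
rate_tally = np.zeros(G)                # sum of w * l * sigma (reaction-rate estimate)

for _ in range(10_000):                 # synthetic track segments
    g = rng.integers(G)                 # energy group of the segment
    w = rng.uniform(0.5, 1.0)           # particle statistical weight
    l = rng.exponential(1.0)            # segment length [cm]
    flux_tally[g] += w * l
    rate_tally[g] += w * l * sigma_true[g]

sigma_est = rate_tally / flux_tally     # homogenized group cross sections
```

Because the cross section is constant within each group here, the ratio of the two tallies recovers it exactly; in a real heterogeneous problem the ratio gives the flux-weighted average.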

  13. A numerical homogenization method for heterogeneous, anisotropic elastic media based on multiscale theory

    KAUST Repository

    Gao, Kai

    2015-06-05

    The development of reliable methods for upscaling fine-scale models of elastic media has long been an important topic for rock physics and applied seismology. Several effective medium theories have been developed to provide elastic parameters for materials such as finely layered media or randomly oriented or aligned fractures. In such cases, the analytic solutions for upscaled properties can be used for accurate prediction of wave propagation. However, such theories cannot be applied directly to homogenize elastic media with more complex, arbitrary spatial heterogeneity. Therefore, we have proposed a numerical homogenization algorithm based on multiscale finite-element methods for simulating elastic wave propagation in heterogeneous, anisotropic elastic media. Specifically, our method used multiscale basis functions obtained from a local linear elasticity problem with appropriately defined boundary conditions. Homogenized, effective medium parameters were then computed using these basis functions, and the approach applied a numerical discretization that was similar to the rotated staggered-grid finite-difference scheme. Comparisons of the results from our method and from conventional, analytical approaches for finely layered media showed that the homogenization reliably estimated elastic parameters for this simple geometry. Additional tests examined anisotropic models with arbitrary spatial heterogeneity in which the average size of the heterogeneities ranged from several centimeters to several meters, and the ratio between the dominant wavelength and the average size of the arbitrary heterogeneities ranged from 10 to 100. Comparisons to finite-difference simulations proved that the numerical homogenization was equally accurate for these complex cases.

  14. Spatial homogenization method based on the inverse problem

    International Nuclear Information System (INIS)

    Tóta, Ádám; Makai, Mihály

    2015-01-01

    Highlights: • We derive a spatial homogenization method in slab and cylindrical geometries. • The fluxes and the currents on the boundary are preserved. • The reaction rates and the integrals of the fluxes are preserved. • We present verification computations using two and four energy groups. - Abstract: We present a method for deriving homogeneous multi-group cross sections to replace a heterogeneous region's multi-group cross sections, such that the fluxes and the currents on the external boundary, the reaction rates and the integrals of the fluxes are preserved. We consider one-dimensional geometries: a symmetric slab and a homogeneous cylinder. Assuming that the boundary fluxes are given, two response matrices (RMs) can be defined, concerning the current and the flux integral: the first derives the boundary currents from the boundary fluxes, while the second derives the flux integrals from the boundary fluxes. Further RMs can be defined that connect the reaction rates to the boundary fluxes. Assuming that these matrices are known, we present formulae that reconstruct the multi-group diffusion cross-section matrix, the diffusion coefficients and the reaction cross sections in the case of one-dimensional (1D) homogeneous regions. We apply these formulae to 1D heterogeneous regions and thus obtain a homogenization method. This method produces an equivalent homogeneous material such that the fluxes and the currents on the external boundary, the reaction rates and the integrals of the fluxes are preserved for any boundary fluxes. We carry out the exact derivations in 1D slab and cylindrical geometries. Verification computations for the presented homogenization method were performed using two- and four-group material cross sections, both in slab and in cylindrical geometry.

  15. On some methods in homogenization and their applications

    International Nuclear Information System (INIS)

    Allaire, Gregoire

    1993-01-01

    This report (which reproduces an 'Habilitation' thesis) is concerned with homogenization theory, which can be defined as the union of all mathematical techniques that allow passing from the microscopic behavior to the macroscopic (or averaged, or effective) behavior of a physical phenomenon modeled by one or several partial differential equations. Some new results are discussed, both from the point of view of methods and from that of applications. The first chapter deals with viscous incompressible fluid flows in porous media and, in particular, contains a derivation of Darcy's and Brinkman's laws. The second chapter is dedicated to the two-scale convergence method. The third chapter focuses on the problem of optimal bounds for the effective properties of composite materials. Finally, in the fourth chapter the previous results are applied to the optimal design problem for elastic shapes. (author)

  16. Layout optimization using the homogenization method

    Science.gov (United States)

    Suzuki, Katsuyuki; Kikuchi, Noboru

    1993-01-01

    A generalized layout problem involving sizing, shape, and topology optimization is solved by using the homogenization method for three-dimensional linearly elastic shell structures, in order to explore the possibility of establishing an integrated design system for automotive car bodies, as an extension of the previous work by Bendsoe and Kikuchi. A formulation of a three-dimensional homogenized shell, a solution algorithm, and several examples of computing the optimum layout are presented in this first of two articles.

  17. Computational Method for Atomistic-Continuum Homogenization

    National Research Council Canada - National Science Library

    Chung, Peter

    2002-01-01

    The homogenization method is used as a framework for developing a multiscale system of equations involving atoms at zero temperature at the small scale and continuum mechanics at the very large scale...

  18. Homogeneity of Inorganic Glasses

    DEFF Research Database (Denmark)

    Jensen, Martin; Zhang, L.; Keding, Ralf

    2011-01-01

    Homogeneity of glasses is a key factor determining their physical and chemical properties and overall quality. However, quantification of the homogeneity of a variety of glasses is still a challenge for glass scientists and technologists. Here, we show a simple approach by which the homogeneity of different glass products can be quantified and ranked. This approach is based on the determination of both the optical intensity and the dimension of the striations in glasses. These two characteristic values are obtained using the image processing method established recently. The logarithmic ratio between...

  19. Modelling of the dynamical behaviour of LWR internals by homogeneization methods

    International Nuclear Information System (INIS)

    Brochard, D.; Lepareux, M.; Gibert, R.J.; Delaigue, D.; Planchard, J.

    1987-01-01

    The upper plenum of the internals of a PWR, the steam generator bundle and the nuclear reactor core may be schematically represented by a beam bundle immersed in a fluid. The dynamical study of such a system needs to take fluid-structure interaction into account. A refined model at the scale of the individual tubes can be used, but it leads to problems of very large size that are difficult to solve even on the biggest computers. The homogenization method provides an approximation of the fluid-structure interaction for the global behaviour of the bundle. It consists in replacing the heterogeneous physical medium (tubes and fluid) by an equivalent homogeneous medium whose characteristics are determined from the resolution of a set of problems on the elementary cell. The aim of this paper is to present the main steps of the determination of this equivalent medium in the case of small displacements (acoustic behaviour of the fluid), using displacement variables for both the fluid and the tubes. Some details about the implementation of this method in computer codes are then given. (orig.)

  20. ISOTOPE METHODS IN HOMOGENEOUS CATALYSIS.

    Energy Technology Data Exchange (ETDEWEB)

    BULLOCK,R.M.; BENDER,B.R.

    2000-12-01

    The use of isotope labels has had a fundamentally important role in the determination of the mechanisms of homogeneously catalyzed reactions. Mechanistic data are valuable since they can assist in the design and rational improvement of homogeneous catalysts. There are several ways to use isotopes in mechanistic chemistry. Isotopes can be introduced into controlled experiments and followed to see where they go or don't go; in this way, Libby, Calvin, Taube and others used isotopes to elucidate mechanistic pathways for very different, yet important chemistries. Another important isotope method is the study of kinetic isotope effects (KIEs) and equilibrium isotope effects (EIEs). Here the mere observation of where a label winds up is no longer enough - what matters is how much slower (or faster) a labeled molecule reacts than the unlabeled material. The most careful studies essentially involve the measurement of isotope fractionation between a reference ground state and the transition state. Thus kinetic isotope effects provide unique data unavailable from other methods, since information about the transition state of a reaction is obtained. Because getting an experimental glimpse of transition states is really tantamount to understanding catalysis, kinetic isotope effects are very powerful.
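    As a worked illustration of a primary kinetic isotope effect of the kind discussed, the standard semiclassical zero-point-energy estimate gives k_H/k_D ~ exp(hc Δν̃ / 2k_BT) when a C-H versus C-D stretch is lost in the transition state. The numbers below (2900 cm⁻¹ stretch, 298 K) are textbook illustrative values, not data from this report:

```python
import math

# Semiclassical KIE from the zero-point-energy difference of the broken bond:
#   k_H / k_D ~ exp(h*c*(nu_H - nu_D) / (2*k_B*T))
hc_over_kB = 1.4388           # second radiation constant h*c/k_B [cm*K]
nu_H = 2900.0                 # typical C-H stretch wavenumber [1/cm]
nu_D = nu_H / math.sqrt(2)    # C-D stretch from the ~1/sqrt(2) reduced-mass scaling
T = 298.0                     # temperature [K]

kie = math.exp(hc_over_kB * (nu_H - nu_D) / (2 * T))
# kie comes out near the familiar 'maximum' primary H/D effect of ~7 at room temperature
```

Observed KIEs much larger than this semiclassical bound are a classic signature of hydrogen tunneling.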

  1. A homogenization method for the analysis of a coupled fluid-structure interaction problem with inner solid structure

    International Nuclear Information System (INIS)

    Sigrist, Jean-Francois; Laine, Christian; Broc, Daniel

    2006-01-01

    The present paper presents a homogenization method developed in order to perform the seismic analysis of a nuclear reactor, with the internal structures modelled and fluid-structure interaction effects taken into account. The numerical resolution of fluid-structure interactions has made tremendous progress over the past decades, and some applications of the various techniques developed in the industrial field can be found in the literature. As a builder of nuclear naval propulsion reactors (ground prototype reactors or reactors embarked on submarines), DCN Propulsion has been working with the French nuclear committee CEA for several years in order to integrate fluid-structure analysis into the design stage of current projects. In previous papers, modal and seismic analyses of a nuclear reactor with fluid-structure interaction effects were presented. Those studies highlighted the importance of fluid-structure coupling phenomena in the industrial case and focussed on added-mass and added-stiffness effects. The numerical model used in the previous studies did not take into account the presence of internal structures within the pressure vessel. The present study aims at improving the numerical model of the nuclear reactor to take the internal structures into account. As the internal structures are periodic within the inner and outer structures of the pressure vessel, the proposed model is based on a homogenization method: the presence of the internal structures and their effect on the physical fluid-structure interaction are taken into account, although they are not geometrically modelled. The basic theory of the proposed homogenization method is recalled, leading to the modification of the fluid-structure coupling operator in the finite element model. The physical consistency of the method is proved by an evaluation of the system mass with the various mass operators (structure, fluid and fluid-structure operators). The method is presented and validated in a 2-D case.

  2. Lambda-Cyhalothrin Nanosuspension Prepared by the Melt Emulsification-High Pressure Homogenization Method

    Directory of Open Access Journals (Sweden)

    Zhenzhong Pan

    2015-01-01

    A nanosuspension of 5% lambda-cyhalothrin with 0.2% surfactants was prepared by the melt emulsification-high pressure homogenization method. The surfactant composition, content, and homogenization process were optimized. An anionic surfactant (1-dodecanesulfonic acid sodium salt) and a polymeric surfactant (maleic rosin-polyoxypropylene-polyoxyethylene ether sulfonate), screened from 12 types of commonly used commercial surfactants, were used to prepare a lambda-cyhalothrin nanosuspension with high dispersity and stability. The mean particle size and polydispersity index of the nanosuspension were 16.01 ± 0.11 nm and 0.266 ± 0.002, respectively. The high zeta potential value of −41.7 ± 1.3 mV and the stable crystalline state of the nanoparticles indicated excellent physical and chemical stability. The method could be widely used for preparing nanosuspensions of various pesticides with melting points below the boiling point of water. This formulation may avoid the use of organic solvents, reduces the amount of surfactants, and is promising for improving bioavailability and reducing residual pollution of pesticides in agricultural products and the environment.

  3. Investigation of methods for hydroclimatic data homogenization

    Science.gov (United States)

    Steirou, E.; Koutsoyiannis, D.

    2012-04-01

    We investigate the methods used for the adjustment of inhomogeneities in temperature time series covering the last 100 years. Based on a systematic study of the scientific literature, we classify and evaluate the observed inhomogeneities in historical and modern time series, as well as their adjustment methods. It turns out that these methods are mainly statistical, are not well justified by experiments, and are rarely supported by metadata. In many of the cases studied, the proposed corrections are not even statistically significant. From the global database GHCN-Monthly Version 2, we examine all stations containing both raw and adjusted data that satisfy certain criteria of continuity and distribution over the globe. In the United States of America, because of the large number of available stations, stations were chosen after a suitable sampling. In total we analyzed 181 stations globally. For these stations we calculated the differences between the adjusted and non-adjusted linear 100-year trends. It was found that in two thirds of the cases the homogenization procedure increased the positive or decreased the negative temperature trends. One of the most common homogenization methods, 'SNHT for single shifts', was applied to synthetic time series with selected statistical characteristics, occasionally with offsets. The method was satisfactory when applied to independent, normally distributed data, but not to data with long-term persistence. The above results cast some doubt on the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is between 0.4°C and 0.7°C, where these two values are the estimates derived from raw and adjusted data, respectively.
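    The 'SNHT for single shifts' test used in that experiment compares the means of the standardized series before and after each candidate break point and takes the maximum of the resulting statistic. A minimal sketch (our own implementation of the standard formula, not the authors' code), applied to a synthetic series with an artificial offset:

```python
import numpy as np

def snht(x):
    """Standard Normal Homogeneity Test for a single shift.
    T(k) = k*z1^2 + (n-k)*z2^2, where z1 and z2 are the means of the
    standardized series before and after candidate break point k.
    Returns (max of T, estimated break index k)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    z = (x - x.mean()) / x.std()
    T = np.array([k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
                  for k in range(1, n)])
    return T.max(), int(np.argmax(T)) + 1

# Synthetic 100-point series with a +5 sigma shift after point 50
rng = np.random.default_rng(1)
series = rng.normal(0.0, 1.0, 100)
series[50:] += 5.0
T_max, k_hat = snht(series)   # large T_max, break located near 50
```

For a homogeneous normal series of this length the statistic stays far below such values (published critical values are around 9-10 at the 95% level), which is how a shift is declared significant.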

  4. Cell homogenization methods for pin-by-pin core calculations tested in slab geometry

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Kitamura, Yasunori; Yamane, Yoshihiro

    2004-01-01

    In this paper, the performance of spatial homogenization methods for fuel and non-fuel cells is compared in slab geometry in order to facilitate pin-by-pin core calculations. Since spatial homogenization methods were mainly developed for fuel assemblies, a systematic study of their performance for cell-level homogenization has not been carried out. The importance of cell-level homogenization has recently been increasing, since pin-by-pin mesh core calculation in actual three-dimensional geometry, a less approximate approach than current advanced nodal methods, is becoming feasible. Four homogenization methods were investigated in this paper: flux-volume weighting, the generalized equivalence theory, the superhomogenization (SPH) method and the nonlinear iteration method. The last one, the nonlinear iteration method, was tested as a homogenization method for the first time. The calculations were carried out in simplified colorset assembly configurations of a PWR, simulated by slab geometries, and homogenization performance was evaluated through comparison with reference cell-heterogeneous calculations. The results revealed that the generalized equivalence theory showed the best performance. Though the nonlinear iteration method can significantly reduce the homogenization error, its performance was not as good as that of the generalized equivalence theory. Comparison of the results obtained by the generalized equivalence theory and the superhomogenization method yielded an important byproduct: a deficiency of the current superhomogenization method was clarified, which could be remedied by incorporating a 'cell-level discontinuity factor between assemblies'.

  5. Statistical methods for assessment of blend homogeneity

    DEFF Research Database (Denmark)

    Madsen, Camilla

    2002-01-01

    In this thesis the use of various statistical methods to address some of the problems related to the assessment of the homogeneity of powder blends in tablet production is discussed. It is not straightforward to assess the homogeneity of a powder blend; the reason is partly that in bulk materials… it is shown how to set up parametric acceptance criteria for the batch that give high confidence that future samples will, with a probability larger than a specified value, pass the USP three-class criteria. Properties and robustness of proposed changes to the USP test for content uniformity are investigated...
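    The USP content-uniformity test mentioned above is, in its harmonized form, an acceptance-value criterion AV = |M - x̄| + k·s compared against a limit L1 = 15.0 (with k = 2.4 for ten units). The sketch below is a simplified rendering of that formula for the common case of label claim ≤ 101.5%, with hypothetical assay values; consult the current compendium for the full rules:

```python
def acceptance_value(assays, k=2.4, low=98.5, high=101.5):
    """Simplified USP-style acceptance value AV = |M - xbar| + k*s,
    where the reference value M is the sample mean clamped to the
    [98.5, 101.5]% window and s is the sample standard deviation.
    Sketch only; the compendial test has additional cases and stages."""
    n = len(assays)
    xbar = sum(assays) / n
    s = (sum((a - xbar) ** 2 for a in assays) / (n - 1)) ** 0.5
    M = min(max(xbar, low), high)   # reference value
    return abs(M - xbar) + k * s

# Ten hypothetical assay results in % of label claim
batch = [99.1, 100.4, 98.7, 101.0, 99.8, 100.2, 99.5, 100.9, 99.9, 100.3]
av = acceptance_value(batch)        # batch passes if AV <= L1 = 15.0
```

A tight batch like this one yields a small acceptance value, while either a mean drifting outside the reference window or a large spread inflates AV toward the limit.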

  6. Applications of a systematic homogenization theory for nodal diffusion methods

    International Nuclear Information System (INIS)

    Zhang, Hong-bin; Dorning, J.J.

    1992-01-01

    The authors have recently developed a self-consistent and systematic lattice cell and fuel bundle homogenization theory, based on a multiple-spatial-scales asymptotic expansion of the transport equation in the ratio of the mean free path to the reactor's characteristic dimension, for use with nodal diffusion methods. The mathematical development leads naturally to self-consistent analytical expressions for homogenized diffusion coefficients and cross sections and for flux discontinuity factors to be used in nodal diffusion calculations. The expressions for the homogenized nuclear parameters that follow from the systematic homogenization theory (SHT) are different from those for the traditional flux- and volume-weighted (FVW) parameters. The calculations summarized here show that the systematic homogenization theory developed recently for nodal diffusion methods yields accurate values for k_eff and assembly powers even when compared with the results of a fine-mesh transport calculation. Thus, it provides a practical alternative to equivalence theory and GET (Ref. 3) and to simplified equivalence theory, which requires auxiliary fine-mesh calculations for assemblies embedded in a typical environment to determine the discontinuity factors and the equivalent diffusion coefficient for a homogenized assembly.
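    The flux discontinuity factors referred to above are, in generalized equivalence theory, ratios of heterogeneous to homogeneous surface fluxes; in the simplified variant where the homogeneous intranodal flux is taken as flat, the factor reduces to surface flux over node-average flux. A toy sketch under that flat-flux assumption, with hypothetical numbers:

```python
def discontinuity_factor(phi_surface_het, phi_node_avg):
    """Simplified-equivalence-theory discontinuity factor: assuming the
    homogeneous intranodal flux is flat and equal to the node average,
    f = phi_het(surface) / phi_hom(surface) = phi_het(surface) / phi_avg."""
    return phi_surface_het / phi_node_avg

# Hypothetical fine-mesh results for one face of a homogenized node:
# the heterogeneous surface flux dips below the node-average flux
f = discontinuity_factor(phi_surface_het=0.92, phi_node_avg=1.00)
```

A factor below one, as here, tells the nodal solver that the reconstructed homogeneous flux should be allowed to jump down at that interface to match the heterogeneous reference.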

  7. Statistical homogeneity tests applied to large data sets from high energy physics experiments

    Science.gov (United States)

    Trusina, J.; Franc, J.; Kůs, V.

    2017-12-01

    Homogeneity tests are used in high energy physics to verify whether simulated Monte Carlo (MC) samples have the same distribution as the data measured by a particle detector. The Kolmogorov-Smirnov, χ², and Anderson-Darling tests are the most widely used techniques to assess the samples' homogeneity. Since MC generators produce plenty of entries from different models, each entry has to be re-weighted to obtain the same sample size as that of the measured data. One approach to homogeneity testing is through binning. If we do not want to lose any information, we can apply generalized tests based on weighted empirical distribution functions. In this paper, we propose such generalized weighted homogeneity tests and introduce some of their asymptotic properties. We present results based on a numerical analysis which focuses on estimates of the type-I error and the power of the tests. Finally, we present an application of our homogeneity tests to data from the DØ experiment at Fermilab.
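    A minimal sketch of a homogeneity test based on weighted empirical distribution functions, in the spirit described above (a weighted two-sample Kolmogorov-Smirnov distance; the function names, weights, and samples are hypothetical, and the authors' asymptotic theory is not reproduced):

    ```python
    import numpy as np

    def weighted_edf(x, w, grid):
        """Weighted empirical distribution function
        F(t) = sum_{x_i <= t} w_i / sum_i w_i, evaluated on `grid`."""
        x, w = np.asarray(x, float), np.asarray(w, float)
        order = np.argsort(x)
        x, w = x[order], w[order]
        cw = np.cumsum(w) / np.sum(w)
        idx = np.searchsorted(x, grid, side="right")
        return np.where(idx > 0, cw[np.minimum(idx - 1, len(cw) - 1)], 0.0)

    def weighted_ks(x1, w1, x2, w2):
        """Kolmogorov-Smirnov distance between two weighted samples."""
        grid = np.concatenate([x1, x2])  # the sup is attained at a sample point
        return float(np.max(np.abs(weighted_edf(x1, w1, grid)
                                   - weighted_edf(x2, w2, grid))))

    rng = np.random.default_rng(0)
    data = rng.normal(size=1000)   # "measured" sample, unit weights
    mc = rng.normal(size=2000)     # "Monte Carlo" sample from the same model
    mc_w = np.full(2000, 0.5)      # re-weighted to the measured sample size
    print(weighted_ks(data, np.ones(1000), mc, mc_w))
    ```

    For two samples drawn from the same distribution the statistic is small; its critical values under weighting are exactly what the paper's asymptotic analysis addresses.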

  8. A homogenization method applied to the seismic analysis of LMFBR cores

    International Nuclear Information System (INIS)

    Brochard, D.; Hammami, L.

    1991-01-01

    Important structures like nuclear reactor cores and steam generator bundles are schematically composed of a great number of beams immersed in a fluid. Fluid-structure interaction is an important phenomenon influencing the dynamical response of the bundle. Studying this interaction with classical methods would require a refined model at the scale of the individual beams and lead to problems of prohibitive size. The homogenization method constitutes an alternative approach when we are mainly interested in the global behaviour of the bundle. Similar approaches have already been used for other types of industrial structures (Sanchez-Palencia 1980, Bergman et al. 1985, Theodory 1984, Benner et al. 1981). This method consists in replacing the physically heterogeneous medium by a homogeneous medium whose characteristics are determined from the resolution of a set of problems on the elementary cell. The first part of this paper summarizes the main assumptions of the method. Moreover, other important phenomena may contribute to the dynamical behaviour of the industrial structures mentioned above: the impacts between the beams. These impacts may be due to supports limiting the displacements of the beams or to differences in the vibratory characteristics of the various beams. The second part of the paper concerns the way impacts are taken into account in the linear homogeneous formalism. Finally, an application to the seismic analysis of the FBR core mock-up RAPSODIE is presented

  9. Uncertainty measurement in the homogenization and sample reduction in the physical classification of rice and beans

    Directory of Open Access Journals (Sweden)

    Dieisson Pivoto

    2016-04-01

    Full Text Available ABSTRACT: The study aimed to (i) quantify the measurement uncertainty in the physical tests of rice and beans for a hypothetical defect, (ii) verify whether homogenization and sample reduction in the physical classification tests of rice and beans are effective in reducing the measurement uncertainty of the process, and (iii) determine whether increasing the size of the bean sample significantly increases accuracy and reduces measurement uncertainty. Hypothetical defects in rice and beans with different damage levels were simulated according to the testing methodology determined by the Normative Ruling for each product. The homogenization and sample reduction in the physical classification of rice and beans are not effective, transferring a high measurement uncertainty to the final test result. The sample size indicated by the Normative Ruling did not allow appropriate homogenization and should be increased.

  10. Methods study of homogeneity and stability test from cerium oxide CRM candidate

    International Nuclear Information System (INIS)

    Samin; Susanna TS

    2016-01-01

    Methods for the homogeneity and stability testing of a cerium oxide CRM candidate have been studied based on ISO 13528 and KAN DP.01.34. The purpose of this study was to select robust homogeneity and stability test methods for the preparation of a cerium oxide CRM. Ten randomly selected sub-samples of cerium oxide were prepared, with analytes representing two compounds, CeO_2 and La_2O_3. In the 10 sub-samples, the CeO_2 and La_2O_3 contents were analyzed in duplicate with the same analytical method, by the same analyst, and in the same laboratory. The results were evaluated statistically according to ISO 13528 and KAN DP.01.34. According to ISO 13528, the cerium oxide sample is considered homogeneous if Ss ≤ 0.3σ and stable if |Xr - Yr| ≤ 0.3σ. In this study, the homogeneity test for CeO_2 gave Ss = 2.073 x 10^-4, smaller than 0.3σ (0.5476), and the stability test gave |Xr - Yr| = 0.225, which is also < 0.3σ. For La_2O_3, the homogeneity test gave Ss = 1.649 x 10^-4, smaller than 0.3σ (0.4865), and the stability test gave |Xr - Yr| = 0.2185, again < 0.3σ. Evaluated with the KAN method, the cerium oxide sample was likewise homogeneous, since Fcalc < Ftable, and stable, since |Xi - Xhm| < 0.3 x n IQR. Given that the evaluation of the homogeneity and stability test data for the CeO_2 CRM candidate processed with the ISO 13528 statistics does not differ significantly from that obtained with the KAN DP.01.34 statistics, both meeting the requirements of homogeneity and stability, the homogeneity and stability test methods based on ISO 13528 can be used in the preparation of a cerium oxide CRM. (author)
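    The ISO 13528 homogeneity criterion quoted above (Ss ≤ 0.3σ) can be sketched for duplicate measurements on g sub-samples. The decomposition below is the standard between-sample/within-sample one; the function name and the numbers are hypothetical, not the cerium oxide data:

    ```python
    import numpy as np

    def homogeneity_check(duplicates, sigma_p):
        """ISO 13528-style homogeneity check on duplicate measurements.

        `duplicates`: (g, 2) array of duplicate results for g sub-samples.
        Returns the between-sample standard deviation Ss and whether
        Ss <= 0.3 * sigma_p (sigma_p: std dev for proficiency assessment)."""
        d = np.asarray(duplicates, float)
        means = d.mean(axis=1)           # sub-sample averages
        ranges = d[:, 0] - d[:, 1]       # within-sample differences
        sx2 = np.var(means, ddof=1)      # variance of the sub-sample averages
        sw2 = np.mean(ranges**2) / 2.0   # within-sample variance
        ss = np.sqrt(max(sx2 - sw2 / 2.0, 0.0))  # between-sample std dev
        return ss, ss <= 0.3 * sigma_p

    # Three sub-samples measured in duplicate (hypothetical data).
    ss, ok = homogeneity_check([[10.0, 10.2], [10.1, 10.3], [9.9, 10.1]],
                               sigma_p=0.5)
    print(ss, ok)
    ```

    When the spread of the sub-sample averages is fully explained by the within-sample repeatability, Ss collapses to zero and the material passes the criterion.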

  11. Homogenized description and retrieval method of nonlinear metasurfaces

    Science.gov (United States)

    Liu, Xiaojun; Larouche, Stéphane; Smith, David R.

    2018-03-01

    A patterned, plasmonic metasurface can strongly scatter incident light, functioning as an extremely low-profile lens, filter, reflector or other optical device. When the metasurface is patterned uniformly, its linear optical properties can be expressed using effective surface electric and magnetic polarizabilities obtained through a homogenization procedure. The homogenized description of a nonlinear metasurface, however, presents challenges, both because of the inherent anisotropy of the medium and because of the much larger set of potential wave interactions available, making it challenging to assign effective nonlinear parameters to the otherwise inhomogeneous layer of metamaterial elements. Here we show that a homogenization procedure can be developed to describe nonlinear metasurfaces, which derive their nonlinear response from the enhanced local fields arising within the structured plasmonic elements. With the proposed homogenization procedure, we are able to assign effective nonlinear surface polarization densities to a nonlinear metasurface, and link these densities to the effective nonlinear surface susceptibilities and averaged macroscopic pumping fields across the metasurface. These effective nonlinear surface polarization densities are further linked to macroscopic nonlinear fields through the generalized sheet transition conditions (GSTCs). By inverting the GSTCs, the effective nonlinear surface susceptibilities of the metasurface can be solved for, leading to a generalized retrieval method for nonlinear metasurfaces. The application of the homogenization procedure and the GSTCs is demonstrated by retrieving the nonlinear susceptibilities of a SiO2 nonlinear slab. As an example, we investigate a nonlinear metasurface that presents nonlinear magnetoelectric coupling in the near-infrared regime. The method is expected to apply to any patterned metasurface whose thickness is much smaller than the wavelengths of operation, with inclusions of arbitrary geometry

  12. [Methods for enzymatic determination of triglycerides in liver homogenates].

    Science.gov (United States)

    Höhn, H; Gartzke, J; Burck, D

    1987-10-01

    An enzymatic method is described for the determination of triacylglycerols in liver homogenate. In contrast to usual methods, higher reliability and selectivity are achieved by omitting the extraction step.

  13. Hydrogen storage materials and method of making by dry homogenation

    Science.gov (United States)

    Jensen, Craig M.; Zidan, Ragaiy A.

    2002-01-01

    Dry homogenized metal hydrides, in particular aluminum hydride compounds, are provided as a material for reversible hydrogen storage. The reversible hydrogen storage material comprises a dry homogenized material having transition metal catalytic sites on a metal aluminum hydride compound, or mixtures of metal aluminum hydride compounds. A method of making such reversible hydrogen storage materials by dry doping is also provided and comprises the steps of dry homogenizing metal hydrides by mechanical mixing, such as by crushing or ball milling a powder of a metal aluminum hydride with a transition metal catalyst. In another aspect of the invention, a method of powering a vehicle apparatus with the reversible hydrogen storage material is provided.

  14. Core homogenization method for pebble bed reactors

    International Nuclear Information System (INIS)

    Kulik, V.; Sanchez, R.

    2005-01-01

    This work presents a core homogenization scheme for treating a stochastic pebble bed loading in pebble bed reactors. The reactor core is decomposed into macro-domains that contain several pebble types characterized by different degrees of burnup. A stochastic description is introduced to account for pebble-to-pebble and pebble-to-helium interactions within a macro-domain as well as for interactions between macro-domains. Performance of the proposed method is tested for the PROTEUS and ASTRA critical reactor facilities. Numerical simulations accomplished with the APOLLO2 transport lattice code show good agreement with the experimental data for the PROTEUS reactor facility and with the TRIPOLI4 Monte Carlo simulations for the ASTRA reactor configuration. The difference between the proposed method and the traditional volume-averaged homogenization technique is negligible when only one type of fuel pebble is present in the system, but it grows rapidly with the level of pebble heterogeneity. (authors)

  15. Homogenization methods for heterogeneous assemblies

    International Nuclear Information System (INIS)

    Wagner, M.R.

    1980-01-01

    The third session of the IAEA Technical Committee Meeting is concerned with the problem of homogenization of heterogeneous assemblies. Six papers will be presented on the theory of homogenization and on practical procedures for deriving homogenized group cross sections and diffusion coefficients. The problem of finding so-called ''equivalent'' diffusion theory parameters for use in global reactor calculations is of great practical importance. In spite of this, it is fair to say that the present state of the theory of this second homogenization is far from satisfactory. In fact, there is not even a uniquely accepted approach to the problem of deriving equivalent group diffusion parameters. Common agreement exists only about the fact that the conventional flux-weighting technique provides only a first approximation, which might lead to acceptable results in certain cases, but certainly does not guarantee the basic requirement of conservation of reaction rates

  16. A second stage homogenization method

    International Nuclear Information System (INIS)

    Makai, M.

    1981-01-01

    A second homogenization is needed before the diffusion calculation of the core of a large reactor. Such a second-stage homogenization is outlined here. Our starting point is the Floquet theorem, which states that the diffusion equation for a periodic core always has a particular solution of the form e^(jBx) u(x). It is pointed out that the perturbation series expansion of the function u can be derived by solving eigenvalue problems, and the eigenvalues serve to define homogenized cross sections. With the help of these eigenvalues a homogenized diffusion equation can be derived, whose solution is cos(Bx), the macroflux. It is shown that the flux can be expressed as a series in the buckling. The leading term in this series is the well-known Wigner-Seitz formula. Finally, three examples are given: periodic absorption, a cell with an absorber pin in the cell centre, and a cell of three regions. (orig.)

  17. Fluid structure interaction in LMFBR cores modelling by a homogenization method

    International Nuclear Information System (INIS)

    Brochard, D.

    1988-01-01

    The upper plenum of the internals of a PWR, the steam generator bundle, and the nuclear reactor core may be schematically represented as a beam bundle immersed in a fluid. The dynamical study of such a system needs to take fluid-structure interaction into account. A refined model at the scale of the tubes can be used, but it leads to a problem that is very difficult to solve even on the largest computers. The homogenization method provides an approximation of the fluid-structure interaction for the global behaviour of the bundle. It consists of replacing the heterogeneous physical medium (tubes and fluid) by an equivalent homogeneous medium whose characteristics are determined from the resolution of a set of problems on the elementary cell. The aim of this paper is to present the main steps of the determination of this equivalent medium in the case of small displacements (acoustic behaviour of the fluid). An application to the LMFBR core geometry is then presented, which shows the lowering effect of the fluid on the eigenfrequencies. Some comparisons with test results are presented. 6 refs, 7 figs, 2 tabs

  18. Tidal Dissipation in a Homogeneous Spherical Body. 1. Methods

    Science.gov (United States)

    2014-11-01

    ...mantle (with χ = χ_lmpq ≡ |ω_lmpq| being the physical forcing frequency). The dependence J̄(χ) follows from the rheological model. ... current paper. Key words: planets and satellites: dynamical evolution and stability; planets and satellites: formation; planets and satellites: general. ... modeling the body with a homogeneous sphere of a certain rheology. However, the simplistic nature of the approach limits the precision of the ensuing...

  19. Frequency-dependent homogenized properties of composites using a spectral analysis method

    International Nuclear Information System (INIS)

    Ben Amor, M; Ben Ghozlen, M H; Lanceleur, P

    2010-01-01

    An inverse procedure is proposed to determine the material constants of multilayered composites using a spectral analysis homogenization method. A recursive process gives the interfacial displacement perpendicular to the layers as a function of depth. A fast Fourier transform (FFT) procedure is used to extract the wave numbers propagating in the multilayer, and the upper frequency bound of the homogenization domain is estimated. Inside the homogenization domain, at most three plane waves can propagate in the medium. A consistent algorithm is adopted to develop an inverse procedure for the determination of the material constants of a multidirectional composite. The extracted wave numbers are used as the inputs of the procedure; the outputs are the elastic constants of the multidirectional composite. Using this method, the frequency-dependent effective elastic constants are obtained, and an example for [0/90] composites is given.
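    The wave-number extraction step can be illustrated with a toy FFT sketch. The signal, sampling, and function name below are hypothetical, and the paper's recursive interfacial-displacement computation and inverse procedure are not reproduced:

    ```python
    import numpy as np

    def dominant_wavenumbers(u, dz, n_modes=3):
        """Return the `n_modes` dominant spatial wavenumbers (rad per unit
        length) of a displacement profile u(z) sampled every `dz`,
        identified from the largest FFT magnitudes."""
        spec = np.fft.rfft(u - np.mean(u))              # drop the DC component
        k = 2.0 * np.pi * np.fft.rfftfreq(len(u), d=dz)
        peaks = np.argsort(np.abs(spec))[::-1][:n_modes]
        return np.sort(k[peaks])

    # Synthetic profile with two propagating modes at k = pi and k = 2.2*pi,
    # both periodic over the 10-unit sampling window (no spectral leakage).
    z = np.arange(0, 10, 0.01)
    u = np.cos(np.pi * z) + 0.5 * np.cos(2.2 * np.pi * z)
    print(dominant_wavenumbers(u, dz=0.01, n_modes=2))
    ```

    In the paper the extracted wave numbers feed the inverse procedure for the elastic constants; here they are simply read off the spectrum.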

  20. Homogeneous SiGe crystal growth in microgravity by the travelling liquidus-zone method

    International Nuclear Information System (INIS)

    Kinoshita, K; Arai, Y; Inatomi, Y; Sakata, K; Takayanagi, M; Yoda, S; Miyata, H; Tanaka, R; Sone, T; Yoshikawa, J; Kihara, T; Shibayama, H; Kubota, Y; Shimaoka, T; Warashina, Y

    2011-01-01

    Homogeneous SiGe crystal growth experiments will be performed on board the ISS 'Kibo' using a gradient heating furnace (GHF). A new crystal growth method invented for growing homogeneous mixed crystals, named the 'travelling liquidus-zone (TLZ) method', is evaluated by the growth of Si0.5Ge0.5 crystals in space. We have already succeeded in growing homogeneous 2 mm diameter Si0.5Ge0.5 crystals on the ground, but large-diameter homogeneous crystals are difficult to grow due to convection in the melt. In microgravity, larger-diameter crystals can be grown while suppressing convection. Radial as well as axial concentration profiles in the microgravity-grown crystals will be measured and compared with our two-dimensional TLZ growth model equation, and the compositional variation will be analyzed. The results will be beneficial for growing large-diameter mixed crystals by the TLZ method on the ground. Here, we report on the principle of the TLZ method for homogeneous crystal growth, the results of preparatory experiments on the ground, and the plan for the microgravity experiments.

  1. Neutron thermalization in absorbing infinite homogeneous media: theoretical methods; Methodes theoriques pour l'etude de la thermalisation des neutrons dans les milieux absorbants infinis et homogenes

    Energy Technology Data Exchange (ETDEWEB)

    Cadilhac, M [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1963-11-15

    After a general survey of the theory of neutron thermalization in homogeneous media, a simplified model is introduced, through a proper formulation, that generalizes both the Horowitz model (the generalized heavy free gas approximation) and the proton gas model. When this model is used, the calculation of spectra is reduced to the solution of linear second-order differential equations. Since it depends on two arbitrary functions, the model gives a good approximation of any usual moderator for reactor physics purposes. The choice of these functions is discussed from a theoretical point of view; a method based on the consideration of the first two moments of the scattering law is investigated. Finally, the possibility of discriminating between models by using experimental information is considered. (author)

  2. At-tank Low-Activity Feed Homogeneity Analysis Verification

    International Nuclear Information System (INIS)

    DOUGLAS, J.G.

    2000-01-01

    This report evaluates the merit of selecting sodium, aluminum, and cesium-137 as analytes to indicate homogeneity of soluble species in low-activity waste (LAW) feed and recommends possible analytes and physical properties that could serve as rapid screening indicators for LAW feed homogeneity. The three analytes are adequate as screening indicators of soluble species homogeneity for tank waste when a mixing pump is used to thoroughly mix the waste in the waste feed staging tank and when all dissolved species are present at concentrations well below their solubility limits. If either of these conditions is violated, then the three indicators may not be sufficiently chemically representative of other waste constituents to reliably indicate homogeneity in the feed supernatant. Additional homogeneity indicators that should be considered are anions such as fluoride, sulfate, and phosphate, total organic carbon/total inorganic carbon, and total alpha to estimate the transuranic species. Physical property measurements such as gamma profiling, conductivity, specific gravity, and total suspended solids are recommended as possible at-tank methods for indicating homogeneity. Indicators of LAW feed homogeneity are needed to reduce the U.S. Department of Energy, Office of River Protection (ORP) Program's contractual risk by assuring that the waste feed is within the contractual composition and can be supplied to the waste treatment plant within the schedule requirements

  3. Method to study the effect of blend flowability on the homogeneity of acetaminophen.

    Science.gov (United States)

    Llusá, Marcos; Pingali, Kalyana; Muzzio, Fernando J

    2013-02-01

    Excipient selection is key to product development because it affects the processability and physical properties of pharmaceutical products, which ultimately affect their quality attributes. The objective was to study how the flowability of lubricated formulations affects acetaminophen (APAP) homogeneity. The formulations studied here contain one of two types of cellulose (Avicel 102 or Ceollus KG-802), one of three grades of Mallinckrodt APAP (fine, semi-fine, or micronized), lactose (Fast-Flo), and magnesium stearate. These components are mixed in a 300-liter bin blender. Blend flowability is assessed with the Gravitational Displacement Rheometer. APAP homogeneity is assessed with off-line NIR. Excluding blends dominated by segregation, there is a trend between APAP homogeneity and blend flow index. Blend flowability is affected by the type of microcrystalline cellulose and by the APAP grade. The preliminary results suggest that the methodology used in this paper is adequate to study the effect of blend flow index on APAP homogeneity.

  4. A review of the physics methods for advanced gas-cooled reactors

    International Nuclear Information System (INIS)

    Buckler, A.N.

    1982-01-01

    A review is given of steady-state reactor physics methods and associated codes used in AGR design and operation. These range from the basic lattice codes (ARGOSY, WIMS), through homogeneous-diffusion theory fuel management codes (ODYSSEUS, MOPSY) to a fully heterogeneous code (HET). The current state of development of the methods is discussed, together with illustrative examples of their application. (author)

  5. Numerical method for solution of transient, homogeneous, equilibrium, two-phase flows in one space dimension

    International Nuclear Information System (INIS)

    Shin, Y.W.; Wiedermann, A.H.

    1979-10-01

    A solution method is presented for transient, homogeneous, equilibrium, two-phase flows of a single-component fluid in one space dimension. The method combines a direct finite-difference procedure and the method of characteristics. The finite-difference procedure solves the interior points of the computing domain; the boundary information is provided by a separate procedure based on the characteristics theory. The solution procedure for boundary points requires information in addition to the physical boundary conditions. This additional information is obtained by a new procedure involving integration of characteristics in the hodograph plane. Sample problems involving various combinations of basic boundary types are calculated for two-phase water/steam mixtures and single-phase nitrogen gas, and compared with independent method-of-characteristics solutions using very fine characteristic mesh. In all cases, excellent agreement is demonstrated

  6. A homogenization method for ductile-brittle composite laminates at large deformations

    DEFF Research Database (Denmark)

    Poulios, Konstantinos; Niordson, Christian Frithiof

    2018-01-01

    This paper presents a high fidelity homogenization method for periodically layered composite structures that accounts for plasticity in the matrix material and quasi-brittle damage in the reinforcing layers, combined with strong geometrical nonlinearities. A set of deliberately chosen internal... ...elastic behavior in the reinforcement as well as for the bending stiffness of the reinforcement layers. Additionally to previously proposed models, the present method includes Lemaitre type damage for the reinforcement, making it applicable to a wider range of engineering applications. The capability... ...of the proposed method in representing the combined effect of plasticity, damage and buckling at microlevel within a homogenized setting is demonstrated by means of direct comparisons to a reference discrete model...

  7. Multigrid Finite Element Method in Calculation of 3D Homogeneous and Composite Solids

    Directory of Open Access Journals (Sweden)

    A.D. Matveev

    2016-12-01

    Full Text Available In the present paper, a method of multigrid finite elements for calculating elastic three-dimensional homogeneous and composite solids under static loading is suggested. The method is developed on the basis of finite element method algorithms using homogeneous and composite three-dimensional multigrid finite elements (MFE). Procedures for the construction of MFE of both rectangular parallelepiped and complex shapes are shown. The advantages of MFE are that they take into account, following the rules of the microapproach, the heterogeneous and microhomogeneous structure of the bodies; describe the three-dimensional stress-strain state (without any simplifying hypotheses) in homogeneous and composite solids; and generate discrete models of small dimension and numerical solutions of high accuracy.

  8. Unified treatment of microscopic boundary conditions and efficient algorithms for estimating tangent operators of the homogenized behavior in the computational homogenization method

    Science.gov (United States)

    Nguyen, Van-Dung; Wu, Ling; Noels, Ludovic

    2017-03-01

    This work provides a unified treatment of arbitrary kinds of microscopic boundary conditions usually considered in the multi-scale computational homogenization method for nonlinear multi-physics problems. An efficient procedure is developed to enforce the multi-point linear constraints arising from the microscopic boundary condition, either by direct constraint elimination or by Lagrange multiplier elimination. The macroscopic tangent operators are computed in an efficient way from a linear system with multiple right-hand sides, whose left-hand-side matrix is the stiffness matrix of the microscopic linearized system at the converged solution. The number of right-hand-side vectors is equal to the number of macroscopic kinematic variables used to formulate the microscopic boundary condition. As the resolution of the microscopic linearized system often follows a direct factorization procedure, the macroscopic tangent operators can then be computed using this factorized matrix at a reduced computational cost.
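    The factorize-once, multiple-right-hand-sides idea can be sketched with a stand-in microscopic system. The matrices, sizes, and the condensed operator C = F^T K^{-1} F below are illustrative assumptions, not the paper's exact formulation:

    ```python
    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    rng = np.random.default_rng(1)
    n, m = 50, 6                      # micro dofs, macro kinematic variables
    A = rng.standard_normal((n, n))
    K = A @ A.T + n * np.eye(n)       # stand-in for the converged micro stiffness
    F = rng.standard_normal((n, m))   # one right-hand side per macro variable

    lu, piv = lu_factor(K)            # factorize the stiffness matrix once
    X = lu_solve((lu, piv), F)        # all m solves reuse the factorization
    C = F.T @ X                       # m x m condensed (tangent-like) operator
    print(C.shape)
    ```

    The point of the paper's procedure is visible even in this sketch: the O(n^3) factorization is paid once, and each of the m right-hand sides costs only a back-substitution.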

  9. Effect of lipid viscosity and high-pressure homogenization on the physical stability of "Vitamin E" enriched emulsion.

    Science.gov (United States)

    Alayoubi, Alaadin; Abu-Fayyad, Ahmed; Rawas-Qalaji, Mutasem M; Sylvester, Paul W; Nazzal, Sami

    2015-01-01

    Recently there has been a growing interest in vitamin E for its potential use in cancer therapy. The objective of this work was therefore to formulate a physically stable parenteral lipid emulsion to deliver higher doses of vitamin E than commonly used in commercial products. Specifically, the objectives were to study the effects of homogenization pressure, number of homogenizing cycles, viscosity of the oil phase, and oil content on the physical stability of emulsions fortified with high doses of vitamin E (up to 20% by weight). This was done by the use of a 27-run, 4-factor, 3-level Box-Behnken statistical design. Viscosity, homogenization pressure, and number of cycles were found to have a significant effect on particle size, which ranged from 213 to 633 nm, and on the percentage of vitamin E remaining emulsified after storage, which ranged from 17 to 100%. Increasing oil content from 10 to 20% had insignificant effect on the responses. Based on the results it was concluded that stable vitamin E rich emulsions could be prepared by repeated homogenization at higher pressures and by lowering the viscosity of the oil phase, which could be adjusted by blending the viscous vitamin E with medium-chain triglycerides (MCT).

  10. Coarse mesh finite element method for boiling water reactor physics analysis

    International Nuclear Information System (INIS)

    Ellison, P.G.

    1983-01-01

    A coarse mesh method is formulated for the solution of Boiling Water Reactor physics problems using two-group diffusion theory. No fuel assembly cross-section homogenization is required; water gaps, control blades, and fuel pins of varying enrichments are treated explicitly. The method combines constrained finite element discretization with infinite-lattice supercell trial functions to obtain coarse mesh solutions for which the only approximations are along the boundaries between fuel assemblies. The method is applied to benchmark Boiling Water Reactor problems to obtain both the eigenvalue and detailed flux distributions. The solutions to these problems indicate the method is useful in predicting detailed power distributions and eigenvalues for Boiling Water Reactor physics problems

  11. Lambda-Cyhalothrin Nanosuspension Prepared by the Melt Emulsification-High Pressure Homogenization Method

    OpenAIRE

    Pan, Zhenzhong; Cui, Bo; Zeng, Zhanghua; Feng, Lei; Liu, Guoqiang; Cui, Haixin; Pan, Hongyu

    2015-01-01

    The nanosuspension of 5% lambda-cyhalothrin with 0.2% surfactants was prepared by the melt emulsification-high pressure homogenization method. The surfactant composition, content, and homogenization process were optimized. The anionic surfactant (1-dodecanesulfonic acid sodium salt) and polymeric surfactant (maleic rosin-polyoxypropylene-polyoxyethylene ether sulfonate), screened from 12 types of commercially common surfactants, were used to prepare a lambda-cyhalothrin nanosuspension with ...

  12. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of); Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    A scanning electron microscope (SEM) is an instrument for inspecting the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification; various chemical analyses can therefore be performed from the SEM images, and the technique is widely used for material inspection, chemical characterization, and biological analysis. In the nuclear criticality analysis field, the homogeneity of a compound material is an important parameter to check before using it in a nuclear system. In our previous study, we attempted to use the SEM for homogeneity analysis of materials. In this study, a quantitative homogeneity analysis method for SEM images is proposed for material inspections. The method is based on a stochastic analysis of the grayscale information of the SEM images.

  13. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho; Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo

    2015-01-01

    A scanning electron microscope (SEM) is an instrument for inspecting the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification, and various chemical analyses can be performed from the SEM images; it is therefore widely used for material inspection, chemical characterization, and biological analysis. In the nuclear criticality analysis field, the homogeneity of a compound material is an important parameter to verify before the material is used in a nuclear system. In our previous study, SEM was applied to the homogeneity analysis of such materials. In this study, a quantitative homogeneity analysis method for SEM images is proposed for material inspections. The method is based on stochastic analysis of the grayscale information of the SEM images.

  14. Fat properties during homogenization, spray-drying, and storage affect the physical properties of dairy powders.

    Science.gov (United States)

    Vignolles, M L; Lopez, C; Madec, M N; Ehrhardt, J J; Méjean, S; Schuck, P; Jeantet, R

    2009-01-01

    Changes in fat properties were studied before, during, and after the drying process (including during storage) to determine the consequences on powder physical properties. Several methods were combined to characterize changes in fat structure and thermal properties as well as the physical properties of powders. Emulsion droplet size and droplet aggregation depended on the homogenizing pressures and were also affected by spray atomization. Aggregation was usually greater after spray atomization, resulting in greater viscosities. These processes did not have the same consequences on the stability of fat in the powders. The quantification of free fat is a pertinent indicator of fat instability in the powders. Confocal laser scanning microscopy permitted the characterization of the structure of fat in situ in the powders. Powders from unhomogenized emulsions showed greater free fat content. Surface fat was always overrepresented, regardless of the composition and process parameters. Differential scanning calorimetry melting experiments showed that fat was partially crystallized in situ in the powders stored at 20 degrees C, and that it was unstable on a molecular scale. Thermal profiles were also related to the supramolecular structure of fat in the powder particle matrix. Powder physical properties depended on both composition and process conditions. The free fat content seemed to have a greater influence than surface fat on powder physical properties, except for wettability. This study clearly showed that an understanding of fat behavior is essential for controlling and improving the physical properties of fat-filled dairy powders and their overall quality.

  15. Homogenization of Periodic Masonry Using Self-Consistent Scheme and Finite Element Method

    Science.gov (United States)

    Kumar, Nitin; Lambadi, Harish; Pandey, Manoj; Rajagopal, Amirtham

    2016-01-01

    Masonry is a heterogeneous anisotropic continuum, made up of brick and mortar arranged in a periodic manner. Obtaining the effective elastic stiffness of masonry structures has been a challenging task. In this study, the homogenization theory for periodic media is implemented in a generic manner to derive the anisotropic global behavior of masonry, through rigorous application of the homogenization theory in one step over the full three-dimensional behavior. We have considered the periodic Eshelby self-consistent method and the finite element method. Two representative unit cells that represent the microstructure of the masonry wall exactly are considered for calibration and numerical application of the theory.

  16. A New and Simple Method for Crosstalk Estimation in Homogeneous Trench-Assisted Multi-Core Fibers

    DEFF Research Database (Denmark)

    Ye, Feihong; Tu, Jiajing; Saitoh, Kunimasa

    2014-01-01

    A new and simple method for inter-core crosstalk estimation in homogeneous trench-assisted multi-core fibers is presented. The crosstalk calculated by this method agrees well with experimental measurement data for two kinds of fabricated 12-core fibers.

  17. Homogenized parameters of light water fuel elements computed by a perturbation method

    International Nuclear Information System (INIS)

    Koide, Maria da Conceicao Michiyo

    2000-01-01

    A new analytic formulation for homogenizing the material parameters of the two-dimensional, two-energy-group diffusion model has been successfully used as a fast computational tool for recovering the detailed group fluxes in full reactor cores. The proposed homogenization method does not require solving the diffusion problem by a numerical method. Since it is generally recognized that currents at assembly boundaries must be computed accurately, a simple numerical procedure designed to improve the current values obtained by nodal calculations is also presented. (author)

  18. A new concept of equivalent homogenization method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Pogoskekyan, Leonid; Kim, Young Il; Ju, Hyung Kook; Chang, Moon Hee [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-07-01

    A new concept of equivalent homogenization is proposed. The concept employs a new set of homogenized parameters: homogenized cross sections (XS) and an interface matrix (IM), which relates partial currents at the cell interfaces. The idea of the interface matrix generalizes that of the discontinuity factors (DFs) proposed and developed by K. Koebke and K. Smith. The offered concept covers both of theirs; each can be simulated within the framework of the new concept. It also covers the Siemens KWU approach for baffle/reflector simulation, where the equivalent homogenized reflector XS are derived from the conservation of the response matrix at the interface in 1D semi-infinite slab geometry. The IM and XS of the new concept satisfy the same assumption about response matrix conservation in 1D semi-infinite slab geometry. The new concept is expected to provide a more accurate approximation of the heterogeneous cell, especially in the case of steep flux gradients at the cell interfaces. Its attractive features are improved accuracy, simplicity of incorporation into existing codes, and numerical expense equal to that of K. Smith's approach. The new concept is useful for: (a) explicit reflector/baffle simulation; (b) control blade simulation; (c) mixed UO{sub 2}/MOX core simulation. The offered model has been incorporated in a finite difference code and in the nodal code PANDOX. The numerical results show good accuracy of core calculations and insensitivity of the homogenized parameters with respect to in-core conditions. 9 figs., 7 refs. (Author).
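
    The partial-current relation behind the interface matrix can be written schematically as follows; the notation (Lambda for the IM, J for the vectors of face partial currents) is illustrative and not taken from the paper:

    ```latex
    % Outgoing partial currents at the cell faces expressed through the
    % incoming ones via the interface matrix (notation illustrative):
    \[
      \mathbf{J}^{\mathrm{out}} = \Lambda \, \mathbf{J}^{\mathrm{in}}
    \]
    % Discontinuity-factor schemes roughly correspond to the special case
    % in which each face couples only to itself (diagonal coupling).
    ```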

  19. Toward whole-core neutron transport without spatial homogenization

    International Nuclear Information System (INIS)

    Lewis, E. E.

    2009-01-01

    Full text of publication follows: A long-term goal of computational reactor physics is the deterministic analysis of power reactor core neutronics without incurring significant discretization errors in the energy, spatial or angular variables. In principle, given large enough parallel configurations with unlimited CPU time and memory, this goal could be achieved using existing three-dimensional neutron transport codes. In practice, however, solving the Boltzmann equation for neutrons over the six-dimensional phase space is made intractable by the nature of neutron cross sections and the complexity and size of power reactor cores. Tens of thousands of energy groups would be required for faithful cross section representation. Likewise, the numerous material interfaces present in power reactor lattices require exceedingly fine spatial mesh structures; these ubiquitous interfaces preclude effective implementation of adaptive-grid, mesh-less methods and related techniques that have been applied so successfully in other areas of engineering science. These challenges notwithstanding, substantial progress continues in the pursuit of more robust deterministic methods for whole-core neutronics analysis. This paper examines the progress over roughly the last decade, emphasizing the space-angle variables and the quest to eliminate errors attributable to spatial homogenization. As prologue we briefly assess the 1990s methods used in light water reactor analysis and review the lessons learned from the C5G7 benchmark exercises, which originated in 1999 to appraise the ability of transport codes to perform core calculations without homogenization. We proceed by examining progress over the last decade, much of which falls into three areas. These may be broadly characterized as reduced homogenization, dynamic homogenization and planar-axial synthesis. In the first, homogenization in three-dimensional calculations is reduced from the fuel assembly to the pin-cell level. In the second ...

  20. Homogenization of Mammalian Cells.

    Science.gov (United States)

    de Araújo, Mariana E G; Lamberti, Giorgia; Huber, Lukas A

    2015-11-02

    Homogenization is the name given to the methodological steps necessary for releasing organelles and other cellular constituents as a free suspension of intact individual components. Most homogenization procedures used for mammalian cells (e.g., cavitation pump and Dounce homogenizer) rely on mechanical force to break the plasma membrane and may be supplemented with osmotic or temperature alterations to facilitate membrane disruption. In this protocol, we describe a syringe-based homogenization method that does not require specialized equipment, is easy to handle, and gives reproducible results. The method may be adapted for cells that require hypotonic shock before homogenization. We routinely use it as part of our workflow to isolate endocytic organelles from mammalian cells. © 2015 Cold Spring Harbor Laboratory Press.

  1. The extensions of space-time. Physics in the 8-dimensional homogeneous space D = SU(2,2)/K

    International Nuclear Information System (INIS)

    Barut, A.O.

    1993-07-01

    The Minkowski space-time is only a boundary of a bigger homogeneous space of the conformal group. The conformal group is the symmetry group of our most fundamental massless wave equations. These extended groups and spaces have many remarkable properties and physical implications. (author). 36 refs

  2. Mechanical Homogenization Increases Bacterial Homogeneity in Sputum

    Science.gov (United States)

    Stokell, Joshua R.; Khan, Ammad

    2014-01-01

    Sputum obtained from patients with cystic fibrosis (CF) is highly viscous and often heterogeneous in bacterial distribution. Adding dithiothreitol (DTT) is the standard method for liquefaction prior to processing sputum for molecular detection assays. To determine if DTT treatment homogenizes the bacterial distribution within sputum, we measured the difference in mean total bacterial abundance and abundance of Burkholderia multivorans between aliquots of DTT-treated sputum samples with and without a mechanical homogenization (MH) step using a high-speed dispersing element. Additionally, we measured the effect of MH on bacterial abundance. We found a significant difference between the mean bacterial abundances in aliquots that were subjected to only DTT treatment and those of the aliquots which included an MH step (all bacteria, P = 0.04; B. multivorans, P = 0.05). There was no significant effect of MH on bacterial abundance in sputum. Although our results are from a single CF patient, they indicate that mechanical homogenization increases the homogeneity of bacteria in sputum. PMID:24759710

  3. Mitoxantrone removal by electrochemical method: A comparison of homogenous and heterogenous catalytic reactions

    Directory of Open Access Journals (Sweden)

    Abbas Jafarizad

    2017-08-01

    Full Text Available Background: Mitoxantrone (MXT) is a drug for cancer therapy and a pharmaceutical hazardous to the environment, which must be removed from contaminated waste streams. In this work, the removal of MXT by the electro-Fenton process over heterogeneous and homogeneous catalysts is reported. Methods: The effects of the operational conditions (reaction medium pH, catalyst concentration, and applied current intensity) were studied. The electrodes used were carbon cloth (CC) without any processing (homogeneous process), graphene oxide (GO)-coated carbon cloth (GO/CC; homogeneous process), and Fe3O4@GO nanocomposite-coated carbon cloth (Fe3O4@GO/CC; heterogeneous process). The characteristic properties of the electrodes were determined by atomic force microscopy (AFM), field emission scanning electron microscopy (FE-SEM), and cathode polarization. MXT concentrations were determined using an ultraviolet-visible (UV-Vis) spectrophotometer. Results: In the homogeneous reaction, a high concentration of the Fe catalyst (>0.2 mM) decreased the MXT degradation rate. The results showed that the Fe3O4@GO/CC electrode offered the largest contact surface. The optimum operational conditions were pH 3.0 and a current intensity of 450 mA, which yielded the highest removal efficiency (96.9%) over the Fe3O4@GO/CC electrode in the heterogeneous process, compared with the other two electrodes in the homogeneous process. The kinetics of the MXT degradation followed a pseudo-first-order reaction. Conclusion: The results confirmed the high potential of the developed method to purify wastewaters contaminated by MXT.
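
    Fitting a pseudo-first-order rate constant of the kind reported above is a one-line regression. The sketch below uses synthetic concentration-time data (the numbers are illustrative, not from the paper): under the model ln(C0/C) = k t, the slope of a linear fit estimates k.

    ```python
    import numpy as np

    # Hypothetical concentration-time data (values illustrative only).
    t = np.array([0.0, 10.0, 20.0, 30.0, 40.0])   # reaction time (min)
    C = 20.0 * np.exp(-0.05 * t)                  # MXT concentration (mg/L)

    # Pseudo-first-order model: ln(C0 / C) = k * t, so the slope of a
    # linear fit of ln(C0 / C) against t estimates the rate constant k.
    y = np.log(C[0] / C)
    k = np.polyfit(t, y, 1)[0]
    print(f"k = {k:.3f} 1/min")   # k = 0.050
    ```
    
    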

  4. Description of comparison method for evaluating spatial or temporal homogeneity of environmental monitoring data

    International Nuclear Information System (INIS)

    Mecozzi, M.; Cicero, A.M.

    1995-01-01

    In this paper a comparison method to verify the homogeneity/inhomogeneity of environmental monitoring data is described. The comparison method is based on the simultaneous application of three statistical tests: one-way ANOVA, Kruskal-Wallis, and one-way IANOVA. Robust tests such as IANOVA and Kruskal-Wallis can be more efficient than the usual ANOVA methods because they are resistant to the presence of outliers and to divergences from the normal distribution of the data. The evidence of the study is that the result about the presence/absence of homogeneity in the data set is validated when it is confirmed by at least two of the tests.
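
    The consensus idea can be sketched with the two of the three tests that SciPy provides; the robust one-way IANOVA has no off-the-shelf SciPy equivalent and is omitted here. The data and the agreement rule below are illustrative assumptions, not the paper's protocol.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Three synthetic monitoring campaigns drawn from the same distribution.
    groups = [rng.normal(10.0, 1.0, 30) for _ in range(3)]

    p_anova = stats.f_oneway(*groups).pvalue   # one-way ANOVA
    p_kw = stats.kruskal(*groups).pvalue       # Kruskal-Wallis

    # Consensus rule sketched from the abstract: declare the data
    # homogeneous only when the applied tests agree (here, both fail
    # to reject the null hypothesis at alpha = 0.05).
    alpha = 0.05
    homogeneous = bool(p_anova > alpha and p_kw > alpha)
    print(p_anova > alpha, p_kw > alpha, homogeneous)
    ```
    
    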

  5. Cluster-cell calculation using the method of generalized homogenization

    International Nuclear Information System (INIS)

    Laletin, N.I.; Boyarinov, V.F.

    1988-01-01

    The generalized-homogenization method (GHM), used for solving the neutron transport equation, was applied to calculating the neutron distribution in a cluster cell containing a series of cylindrical cells with coaxial cylindrical zones. Single-group calculations of the technological channel of the cell of an RBMK reactor were performed using GHM. The technological channel was understood to be the reactor channel comprised of the zirconium rod, the water or steam-water mixture, the uranium dioxide fuel element, and the zirconium tube, together with the adjacent graphite layer. Calculations were performed for channels with no internal sources and unit incoming current at the external boundary, as well as for channels with internal sources and zero current at the external boundary. The PRAKTINETs program was used to calculate the symmetric neutron distributions in the microcell and in channels with homogenized annular zones. The ORAR-TsM program was used to calculate the antisymmetric distribution in the microcell. The accuracy of the calculations was compared for the two channel versions.

  6. Essential Mathematics for the Physical Sciences; Volume I: Homogeneous boundary value problems, Fourier methods, and special functions

    Science.gov (United States)

    Borden, Brett; Luscombe, James

    2017-10-01

    Physics is expressed in the language of mathematics; this is deeply ingrained in how physics is taught and how it is practiced. A study of the mathematics used in science is thus a sound intellectual investment for training as scientists and engineers. This first volume of two is centered on methods of solving partial differential equations and the special functions they introduce. The text is based on a course offered at the Naval Postgraduate School (NPS); while produced for NPS needs, it will serve other universities well.

  7. Formulae and Bounds connected to Optimal Design and Homogenization of Partial Differential Operators and Integral Functionals

    Energy Technology Data Exchange (ETDEWEB)

    Lukkassen, D.

    1996-12-31

    When partial differential equations are set up to model physical processes in strongly heterogeneous materials, effective parameters for heat transfer, electric conductivity etc. are usually required. Averaging methods often lead to convergence problems and in homogenization theory one is therefore led to study how certain integral functionals behave asymptotically. This mathematical doctoral thesis discusses (1) means and bounds connected to homogenization of integral functionals, (2) reiterated homogenization of integral functionals, (3) bounds and homogenization of some particular partial differential operators, (4) applications and further results. 154 refs., 11 figs., 8 tabs.

  8. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis; Mouhot, Clément

    2011-01-01

    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.

  9. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis

    2011-04-01

    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.

  10. Feasibility Study of Aseptic Homogenization: Affecting Homogenization Steps on Quality of Sterilized Coconut Milk

    Directory of Open Access Journals (Sweden)

    Phungamngoen Chanthima

    2016-01-01

    Full Text Available Coconut milk is one of the most important protein-rich food sources available today. Separation of the emulsion into an aqueous phase and a cream phase commonly occurs, and this causes an unacceptable physical defect in both fresh and processed coconut milk. Since homogenization steps are known to affect the stability of coconut milk, this work aimed to study their effect on coconut milk quality. The samples were subjected to high-speed homogenization in the range of 5000-15000 rpm and to sterilization temperatures of 120-140 °C for 15 min. The results showed that emulsion stability increases with increasing homogenization speed: smaller fat particles were generated that dispersed easily in the continuous phase, leading to high stability. On the other hand, when a high sterilization temperature was applied, the stability of the coconut milk decreased, the fat globules grew, the L value decreased, and the b value increased. Homogenization after heating led to higher stability than homogenization before heating, because it reduced the particle size of coconut milk that had aggregated during the sterilization process. The results imply that homogenization after the sterilization process may play an important role in the quality of sterilized coconut milk.

  11. Homogenization approach in engineering

    International Nuclear Information System (INIS)

    Babuska, I.

    1975-10-01

    Homogenization is an approach which studies the macrobehavior of a medium by its microproperties. Problems with a microstructure play an essential role in such fields as mechanics, chemistry, physics, and reactor engineering. Attention is concentrated on a simple specific model problem to illustrate results and problems typical of the homogenization approach. Only the diffusion problem is treated here, but some statements are made about the elasticity of composite materials. The differential equation is solved for linear cases with and without boundaries and for the nonlinear case. 3 figures, 1 table

  12. SuPer-Homogenization (SPH) Corrected Cross Section Generation for High Temperature Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sen, Ramazan Sonat [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hummel, Andrew John [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hiruta, Hikaru [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-03-01

    Deterministic full-core simulators require homogenized group constants covering the operating and transient conditions over the entire core lifetime. Traditionally, the homogenized group constants are generated using a lattice physics code over an assembly or, in the case of prismatic high temperature reactors (HTRs), a block. Strong absorbers that cause deep local depressions in the flux profile require special techniques during homogenization over a large volume; fuel blocks with burnable poisons and control rod blocks are examples of such cases. Over the past several decades, a tremendous number of studies have been performed to improve the accuracy of full-core calculations through the homogenization procedure. However, those studies were mostly performed for light water reactor (LWR) analyses and thus may not be directly applicable to advanced thermal reactors such as HTRs. This report presents the application of the SuPer-Homogenization correction method to a hypothetical HTR core.

  13. Resonance interference method in lattice physics code stream

    International Nuclear Information System (INIS)

    Choi, Sooyoung; Khassenov, Azamat; Lee, Deokjung

    2015-01-01

    A newly developed resonance interference model has been implemented in the lattice physics code STREAM, and the model shows a significant improvement in computing accurate eigenvalues. Equivalence theory is widely used in production calculations to generate the effective multigroup (MG) cross sections (XS) for commercial reactors. Although many methods have been developed to enhance the accuracy of computed effective XSs, current resonance treatment methods still lack a clear resonance interference model. The conventional resonance interference model simply adds the absorption XSs of the resonance isotopes to the background XS. However, the conventional models show non-negligible errors in computing effective XSs and eigenvalues. In this paper, a resonance interference factor (RIF) library method is proposed. This method interpolates the RIFs in a pre-generated RIF library and corrects the effective XS, rather than solving the time-consuming slowing-down calculation. The RIF library method is verified for homogeneous and heterogeneous problems. The verification results using the proposed method show significant improvements in accuracy in treating the interference effect. (author)
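
    The table-lookup step the abstract describes can be sketched in a few lines. The library values and the tabulation variable below are hypothetical (a real RIF library is multi-dimensional, indexed by isotope mixture, temperature, etc.); the point is only the interpolate-then-correct pattern.

    ```python
    import numpy as np

    # Hypothetical pre-generated RIF library: resonance interference
    # factors tabulated against background cross section (barns).
    sigma_b_grid = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])
    rif_grid     = np.array([0.82, 0.90, 0.94, 0.985, 0.995])

    def corrected_xs(sigma_eff_iso, sigma_b):
        """Correct an isolated-isotope effective XS with an interpolated
        RIF instead of re-running a slowing-down calculation."""
        rif = np.interp(sigma_b, sigma_b_grid, rif_grid)
        return sigma_eff_iso * rif

    print(corrected_xs(40.0, 100.0))   # 40.0 * 0.94 = 37.6
    ```
    
    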

  14. Assembly homogenization techniques for light water reactor analysis

    International Nuclear Information System (INIS)

    Smith, K.S.

    1986-01-01

    Recent progress in development and application of advanced assembly homogenization methods for light water reactor analysis is reviewed. Practical difficulties arising from conventional flux-weighting approximations are discussed and numerical examples given. The mathematical foundations for homogenization methods are outlined. Two methods, Equivalence Theory and Generalized Equivalence Theory which are theoretically capable of eliminating homogenization error are reviewed. Practical means of obtaining approximate homogenized parameters are presented and numerical examples are used to contrast the two methods. Applications of these techniques to PWR baffle/reflector homogenization and BWR bundle homogenization are discussed. Nodal solutions to realistic reactor problems are compared to fine-mesh PDQ calculations, and the accuracy of the advanced homogenization methods is established. Remaining problem areas are investigated, and directions for future research are suggested. (author)

  15. Application of the Monte Carlo method in calculation of energy-time distribution from a pulsed photon source in homogeneous air environment

    International Nuclear Information System (INIS)

    Ilic, R.D.; Vojvodic, V.I.; Orlic, M.P.

    1981-01-01

    The stochastic nature of photon interactions with matter and the characteristics of photon transport through real materials are very well suited to applications of the Monte Carlo method in calculations of the energy-space distribution of photons. Starting from the general principles of the Monte Carlo method, a physical-mathematical model of photon transport from a pulsed source is given for a homogeneous air environment. Based on that model, a computer program is written and applied to calculate the delay spectra of scattered photons and the changes of the photon energy spectrum. The obtained results provide an estimate of the time-space function of the electromagnetic field generated by photons from a pulsed source. (author)
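
    The idea of a scattered-photon delay spectrum can be illustrated with a deliberately crude Monte Carlo sketch: exponential free paths, forward-hemisphere scattering, and the delay defined as the extra path length divided by the speed of light. The mean free path and geometry below are illustrative assumptions, not values from the paper.

    ```python
    import random

    random.seed(1)
    C = 3.0e8      # speed of light (m/s)
    MFP = 100.0    # assumed photon mean free path in air (m); illustrative
    D = 300.0      # distance from the pulsed source to the detector plane (m)

    def photon_delay():
        """Extra flight time (s) of one photon reaching the plane z = D,
        relative to an unscattered photon, in a crude 1-D-projection model."""
        z = path = 0.0
        mu = 1.0                              # direction cosine; collimated start
        while z < D:
            step = random.expovariate(1.0 / MFP)
            z += mu * step
            path += step
            mu = random.uniform(0.0, 1.0)     # simplistic forward scattering
        return (path - D) / C

    delays = sorted(photon_delay() for _ in range(10000))
    print(f"median delay: {delays[5000]:.3e} s")
    ```

    A real transport code would sample scattering angles from physical cross sections and track energy loss per collision; this sketch only shows how a delay histogram arises from sampled path lengths.
    
    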

  16. Analytical solutions of time-fractional models for homogeneous Gardner equation and non-homogeneous differential equations

    Directory of Open Access Journals (Sweden)

    Olaniyi Samuel Iyiola

    2014-09-01

    Full Text Available In this paper, we obtain analytical solutions of the homogeneous time-fractional Gardner equation and of non-homogeneous time-fractional models (including the Buckmaster equation) using the q-Homotopy Analysis Method (q-HAM). Our work displays the elegant nature of q-HAM, which solves not only homogeneous nonlinear fractional differential equations but also non-homogeneous ones. The presence of the auxiliary parameter h helps in an effective way to obtain better approximations comparable to exact solutions. The fraction-factor in this method gives it an edge over other existing analytical methods for nonlinear differential equations. Comparisons are made where exact solutions to these models exist. The analysis shows that our analytical solutions converge very rapidly to the exact solutions.

  17. An infrared small target detection method based on multiscale local homogeneity measure

    Science.gov (United States)

    Nie, Jinyan; Qu, Shaocheng; Wei, Yantao; Zhang, Liming; Deng, Lizhen

    2018-05-01

    Infrared (IR) small target detection plays an important role in the field of image detection owing to its intrinsic characteristics. This paper presents a multiscale local homogeneity measure (MLHM) for infrared small target detection, which can enhance the performance of an IR small target detection system. Firstly, the intra-patch homogeneity of the target itself and the inter-patch heterogeneity between the target and the local background regions are combined to enhance the significance of small targets. Secondly, a multiscale measure based on local regions is proposed to obtain the most appropriate response. Finally, an adaptive threshold method is applied for small target segmentation. Experimental results on three different scenarios indicate that the MLHM performs well under the interference of strong noise.
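
    A toy version of the centre-versus-surround, multiscale idea can be sketched as follows. This is loosely inspired by the abstract, not the authors' exact measure: each pixel's intensity is contrasted against the largest mean of the 8 surrounding patches, and the response is maximised over patch scales.

    ```python
    import numpy as np

    def mlhm_response(img, scales=(1, 2)):
        """Toy multiscale local-contrast map (illustrative only)."""
        h, w = img.shape
        out = np.full((h, w), -np.inf)
        for s in scales:
            d = 2 * s + 1                     # patch edge length and offset
            for y in range(d + s, h - d - s):
                for x in range(d + s, w - d - s):
                    nbrs = [img[y + dy * d - s:y + dy * d + s + 1,
                                x + dx * d - s:x + dx * d + s + 1].mean()
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                            if (dy, dx) != (0, 0)]
                    out[y, x] = max(out[y, x], img[y, x] - max(nbrs))
        return out

    img = np.zeros((17, 17))
    img[8, 8] = 1.0                           # single bright "target" pixel
    resp = mlhm_response(img)
    peak = np.unravel_index(np.argmax(resp), resp.shape)
    print(tuple(int(i) for i in peak))        # (8, 8)
    ```

    A bright isolated pixel maximises the response because its own intensity is high while all surrounding patch means stay low; extended bright clutter suppresses itself through the surround term. An adaptive threshold (e.g. mean plus a multiple of the standard deviation of the response map) would follow for segmentation.
    
    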

  18. Investigations into homogenization of electromagnetic metamaterials

    DEFF Research Database (Denmark)

    Clausen, Niels Christian Jerichau

    This dissertation encompasses homogenization methods, with a special interest in their application to metamaterial homogenization. The first method studied is the Floquet-Bloch method, which is based on the assumption of an infinite periodic material. Its field can then be expanded in term ...

  19. The necessity of improvement for the current LWR fuel assembly homogenization method

    International Nuclear Information System (INIS)

    Tang Chuntao; Huang Hao; Zhang Shaohong

    2007-01-01

    When modern LWR core analysis methods are used for core nuclear design and in-core fuel management calculations, accurately obtaining the fuel assembly homogenized parameters is a crucial issue. In this paper, taking the NEA C5G7-MOX benchmark problem as a severe test problem, which involves low-enriched uranium assemblies interspersed with MOX assemblies, we have re-examined the applicability of the two major assumptions of the modern equivalence theory for fuel assembly homogenization, i.e., the isolated-assembly spatial spectrum assumption and the condensed two-group representation assumption. Numerical results have demonstrated that for LWR cores with strong spectrum interaction, both assumptions are no longer applicable and improvement of the homogenization method is necessary: the current two-group representation should be replaced by a multigroup representation, and the current reflective assembly boundary condition by the 'real' assembly boundary condition. This is a research project supported by the National Natural Science Foundation of China (10605016). (authors)

  20. Use of statistical methods for determining homogeneous layers of volcanic soils at a site in the slopes of Volcan Irazu, Cartago, Costa Rica

    International Nuclear Information System (INIS)

    Mora, Rolando

    2013-01-01

    A statistical method was used to delineate homogeneous layers of volcanic soils at two sites where dynamic penetration soundings had been performed. The study includes two soundings (DPL 1A and DPL 1B) with a light dynamic penetrometer (DPL), carried out in the canton of La Union, Cartago. The data on the number of blows as a function of depth in the DPL soundings were used to calculate the intraclass correlation coefficient (IR) and clearly determine the limits of homogeneous layers in the volcanic soils. The physical and mechanical properties of each identified layer were established with the help of computer programs, as well as the variation of its allowable bearing capacity with depth. With the obtained results it has been possible to determine the most suitable site to establish the foundation of a potable water storage tank.
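
    Delineating a layer boundary from a blow-count profile can be sketched with a simplified variance-ratio statistic standing in for the intraclass correlation coefficient. The blow counts below are synthetic (a soft layer over a stiff one), not data from the study.

    ```python
    import numpy as np

    # Hypothetical DPL blow counts per depth increment (illustrative only).
    blows = np.array([4, 5, 4, 6, 5, 5, 14, 15, 16, 14, 15, 16])

    def layer_score(data, k):
        """Between-layer share of the total variance for a split before
        index k (a simplified stand-in for the IR coefficient)."""
        a, b = data[:k], data[k:]
        within = (a.var() * len(a) + b.var() * len(b)) / len(data)
        return 1.0 - within / data.var()

    # The boundary that maximises the score separates the two layers.
    best = max(range(2, len(blows) - 1), key=lambda k: layer_score(blows, k))
    print(best)   # 6 -> boundary between the soft and the stiff layer
    ```
    
    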

  1. Sample preparation methods for scanning electron microscopy of homogenized Al-Mg-Si billets: A comparative study

    International Nuclear Information System (INIS)

    Österreicher, Johannes Albert; Kumar, Manoj; Schiffl, Andreas; Schwarz, Sabine; Hillebrand, Daniel; Bourret, Gilles Remi

    2016-01-01

    Characterization of Mg-Si precipitates is crucial for optimizing the homogenization heat treatment of Al-Mg-Si alloys. Although sample preparation is key for high quality scanning electron microscopy imaging, most common methods lead to dealloying of Mg-Si precipitates. In this article we systematically evaluate different sample preparation methods: mechanical polishing, etching with various reagents, and electropolishing using different electrolytes. We demonstrate that the use of a nitric acid and methanol electrolyte for electropolishing a homogenized Al-Mg-Si alloy prevents the dissolution of Mg-Si precipitates, resulting in micrographs of higher quality. This preparation method is investigated in depth and the obtained scanning electron microscopy images are compared with transmission electron micrographs: the shape and size of Mg-Si precipitates appear very similar in either method. The scanning electron micrographs allow proper identification and measurement of the Mg-Si phases including needles with lengths of roughly 200 nm. These needles are β″ precipitates as confirmed by high resolution transmission electron microscopy. - Highlights: •Secondary precipitation in homogenized 6xxx Al alloys is crucial for extrudability. •Existing sample preparation methods for SEM are improvable. •Electropolishing with nitric acid/methanol yields superior quality in SEM. •The obtained micrographs are compared to TEM micrographs.

  2. Sample preparation methods for scanning electron microscopy of homogenized Al-Mg-Si billets: A comparative study

    Energy Technology Data Exchange (ETDEWEB)

    Österreicher, Johannes Albert; Kumar, Manoj [LKR Light Metals Technologies Ranshofen, Austrian Institute of Technology, Postfach 26, 5282 Ranshofen (Austria); Schiffl, Andreas [Hammerer Aluminium Industries Extrusion GmbH, Lamprechtshausener Straße 69, 5282 Ranshofen (Austria); Schwarz, Sabine [University Service Centre for Transmission Electron Microscopy, Vienna University of Technology, Wiedner Hauptstr. 8-10, 1040 Wien (Austria); Hillebrand, Daniel [Hammerer Aluminium Industries Extrusion GmbH, Lamprechtshausener Straße 69, 5282 Ranshofen (Austria); Bourret, Gilles Remi, E-mail: gilles.bourret@sbg.ac.at [Department of Materials Science and Physics, University of Salzburg, Hellbrunner Straße 34, 5020 Salzburg (Austria)

    2016-12-15

    Characterization of Mg-Si precipitates is crucial for optimizing the homogenization heat treatment of Al-Mg-Si alloys. Although sample preparation is key for high quality scanning electron microscopy imaging, most common methods lead to dealloying of Mg-Si precipitates. In this article we systematically evaluate different sample preparation methods: mechanical polishing, etching with various reagents, and electropolishing using different electrolytes. We demonstrate that the use of a nitric acid and methanol electrolyte for electropolishing a homogenized Al-Mg-Si alloy prevents the dissolution of Mg-Si precipitates, resulting in micrographs of higher quality. This preparation method is investigated in depth and the obtained scanning electron microscopy images are compared with transmission electron micrographs: the shape and size of Mg-Si precipitates appear very similar in either method. The scanning electron micrographs allow proper identification and measurement of the Mg-Si phases including needles with lengths of roughly 200 nm. These needles are β″ precipitates as confirmed by high resolution transmission electron microscopy. - Highlights: •Secondary precipitation in homogenized 6xxx Al alloys is crucial for extrudability. •Existing sample preparation methods for SEM are improvable. •Electropolishing with nitric acid/methanol yields superior quality in SEM. •The obtained micrographs are compared to TEM micrographs.

  3. Synthesis and Characterization of Anatase TiO_2 Powder using a Homogeneous Precipitation Method

    International Nuclear Information System (INIS)

    Choi, Soon Ok; Cho, Jee Hee; Lim, Sung Hwan; Chung, Eun Young

    2011-01-01

    This paper studies an experimental method that uses homogeneous precipitation to prepare mica flakes coated with anatase-type titania pearlescent pigment, with urea as the precipitant. The optimum technology parameters, the chemical composition, the microstructure, and the color property of the resulting pigments are discussed. The coating principle of mica coated with titania of various coating thicknesses is analyzed by X-ray diffraction (XRD), scanning electron microscopy (SEM), and transmission electron microscopy (TEM), and tested by spectrophotometer analysis. The colored nanocrystalline pigments with different morphologies and coating thicknesses of 45-170 nm were prepared by homogeneous precipitation treatment of TiOSO_4 (titanium oxysulfate) aqueous solutions. Characterization of the pigments shows that their pearlescent effects depend mainly on mica size, thickness of the metal oxide deposit, its chemical composition, and crystal structure.

  4. Synergy between experimental and theoretical methods in the exploration of homogeneous transition metal catalysis

    DEFF Research Database (Denmark)

    Lupp, Daniel; Christensen, Niels Johan; Fristrup, Peter

    2014-01-01

    In this Perspective, we will focus on the use of both experimental and theoretical methods in the exploration of reaction mechanisms in homogeneous transition metal catalysis. We briefly introduce the use of Hammett studies and kinetic isotope effects (KIE). Both of these techniques can be complemented by computational chemistry – in particular in cases where interpretation of the experimental results is not straightforward. The good correspondence between experiment and theory is only possible due to recent advances within the applied theoretical framework. We therefore also highlight...

  5. Mechanized syringe homogenization of human and animal tissues.

    Science.gov (United States)

    Kurien, Biji T; Porter, Andrew C; Patel, Nisha C; Kurono, Sadamu; Matsumoto, Hiroyuki; Scofield, R Hal

    2004-06-01

    Tissue homogenization is a prerequisite to any fractionation schedule. A plethora of hands-on methods are available to homogenize tissues. Here we report a mechanized method for homogenizing animal and human tissues rapidly and easily. The Bio-Mixer 1200 (manufactured by Innovative Products, Inc., Oklahoma City, OK) utilizes the back-and-forth movement of two motor-driven disposable syringes, connected to each other through a three-way stopcock, to homogenize animal or human tissue. Using this method, we were able to homogenize human or mouse tissues (brain, liver, heart, and salivary glands) in 5 min. From sodium dodecyl sulfate-polyacrylamide gel electrophoresis analysis and a matrix-assisted laser desorption/ionization time-of-flight mass spectrometric enzyme assay for prolidase, we have found that the homogenates obtained were as good as or even better than those obtained using a manual glass-on-Teflon (DuPont, Wilmington, DE) homogenization protocol (all-glass tube and Teflon pestle). Use of the Bio-Mixer 1200 to homogenize animal or human tissue precludes the need to stay in the cold room, as is the case with the other hands-on homogenization methods available, in addition to freeing up time for other experiments.

  6. Electro-magnetostatic homogenization of bianisotropic metamaterials

    OpenAIRE

    Fietz, Chris

    2012-01-01

    We apply the method of asymptotic homogenization to metamaterials with microscopically bianisotropic inclusions to calculate a full set of constitutive parameters in the long wavelength limit. Two different implementations of electromagnetic asymptotic homogenization are presented. We test the homogenization procedure on two different metamaterial examples. Finally, the analytical solution for long wavelength homogenization of a one dimensional metamaterial with microscopically bi-isotropic i...

  7. Position-dependency of Fuel Pin Homogenization in a Pressurized Water Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Woong; Kim, Yonghee [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2016-05-15

    By considering the multi-physics effects more comprehensively, it is possible to acquire precise local parameters which can result in a more accurate core design and safety assessment. A conventional approach to the multi-physics neutronics calculation for the pressurized water reactor (PWR) is to apply nodal methods. Since the nodal methods are basically based on the use of assembly-wise homogenized parameters, additional pin power reconstruction processes are necessary to obtain local power information. In the past, pin-by-pin core calculation was impractical due to the limited computational hardware capability. With the rapid advancement of computer technology, it is now perhaps quite practical to perform the direct pin-by-pin core calculation. As such, fully heterogeneous transport solvers based on both stochastic and deterministic methods have been developed for the acquisition of exact local parameters. However, 3-D transport reactor analysis is still challenging because of the very high computational requirement. Position-dependency of the fuel pin homogenized cross sections in a small PWR core has been quantified via comparison of infinite FA and 2-D whole core calculations with the use of high-fidelity MC simulations. It is found that the pin environment effect is especially obvious in FAs bordering the baffle reflector regions. It is also noted that the downscattering cross section is rather sensitive to the spectrum changes of the pins. It is expected that the pinwise homogenized cross sections need to be corrected somehow for accurate pin-by-pin core calculations in the peripheral region of the reactor core.

  8. Simple method for the generation of multiple homogeneous field volumes inside the bore of superconducting magnets.

    Science.gov (United States)

    Chou, Ching-Yu; Ferrage, Fabien; Aubert, Guy; Sakellariou, Dimitris

    2015-07-17

    Standard Magnetic Resonance magnets produce a single homogeneous field volume, where the analysis is performed. Nonetheless, several modern applications could benefit from the generation of multiple homogeneous field volumes along the axis and inside the bore of the magnet. In this communication, we propose a straightforward method using a combination of ring structures of permanent magnets in order to cancel the gradient of the stray field in a series of distinct volumes. These concepts were demonstrated numerically on an experimentally measured magnetic field profile. We discuss advantages and limitations of our method and present the key steps required for an experimental validation.
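The cancellation idea can be illustrated numerically with a current-loop model: choose ring positions, then solve a small linear system for the ring strengths that null the axial stray-field gradient at the target spots. Everything below (loop radii, positions, the background gradient value) is an assumed toy configuration, not the paper's permanent-magnet design:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def loop_dBdz(current, radius, z_loop, z):
    """Axial-field gradient dBz/dz of a circular current loop, on its axis."""
    zp = z - z_loop
    return -3.0 * MU0 * current * radius**2 * zp / (2.0 * (radius**2 + zp**2) ** 2.5)

# Background stray-field gradient (assumed constant in this toy model): 0.5 T/m
# at both target spots z1 and z2, to be cancelled by two compensation rings.
g1 = g2 = 0.5
z1, z2 = 0.10, 0.20          # target homogeneous volumes, m
R = 0.08                     # ring radius, m
za, zb = 0.05, 0.25          # ring positions, m

# Solve the 2x2 system A @ [Ia, Ib] = [-g1, -g2] by Cramer's rule, where
# A[i][j] is the gradient produced by a unit-current ring j at target i.
a11, a12 = loop_dBdz(1, R, za, z1), loop_dBdz(1, R, zb, z1)
a21, a22 = loop_dBdz(1, R, za, z2), loop_dBdz(1, R, zb, z2)
det = a11 * a22 - a12 * a21
Ia = ((-g1) * a22 - a12 * (-g2)) / det
Ib = (a11 * (-g2) - (-g1) * a21) / det

# Residual gradient at the targets with the compensating rings energized:
r1 = g1 + loop_dBdz(Ia, R, za, z1) + loop_dBdz(Ib, R, zb, z1)
r2 = g2 + loop_dBdz(Ia, R, za, z2) + loop_dBdz(Ib, R, zb, z2)
print(r1, r2)   # both ~0: gradient cancelled at the two spots
```

The same linear-system structure scales to more rings and more target volumes; the paper's contribution is doing this with passive permanent-magnet ring structures rather than driven coils.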

  9. Control and homogenization of oxygen distribution in Si crystals by the novel technique: electromagnetic Czochralski method (EMCZ)

    Science.gov (United States)

    Watanabe, Masahito; Eguchi, Minoru; Hibiya, Taketoshi

    1999-07-01

    A novel method for controlling and homogenizing the oxygen distribution in silicon crystals by using an electromagnetic force (EMF) to rotate the melt without crucible rotation has been developed. We call it the electromagnetic Czochralski method. An EMF in the azimuthal direction is generated in the melt by the interaction between an electric current through the melt in the radial direction and a vertical magnetic field (B). The rotation rate (ωm) of the silicon melt is continuously changed from 0 to over 105 rpm under I = 0 to 8 A and B = 0 to 0.1 T. Thirty-mm-diameter silicon single crystals free of dislocations could be grown under several conditions. The oxygen concentration in the crystals was continuously changed from 1 × 10¹⁷ to 1 × 10¹⁸ atoms/cm³ with increasing melt rotation driven by the electromagnetic force. Homogeneous oxygen distributions in the radial direction were achieved. The continuous change of oxygen concentration and the homogenization of the oxygen distribution along the radial direction are attributed to control of the diffusion boundary layer at both the melt/crucible and crystal/melt interfaces by forced flow due to the EMF. This new method should be useful for growth of large-diameter silicon crystals with a homogeneous distribution of oxygen.

  10. Comparison of Three Different Methods for Pile Integrity Testing on a Cylindrical Homogeneous Polyamide Specimen

    Science.gov (United States)

    Lugovtsova, Y. D.; Soldatov, A. I.

    2016-01-01

    Three different methods for pile integrity testing are compared on a cylindrical homogeneous polyamide specimen. The methods are low strain pile integrity testing, multichannel pile integrity testing and testing with a shaker system. Since low strain pile integrity testing is a well-established and standardized method, its results are used as a reference for the other two methods.

  11. A non-asymptotic homogenization theory for periodic electromagnetic structures.

    Science.gov (United States)

    Tsukerman, Igor; Markel, Vadim A

    2014-08-08

    Homogenization of electromagnetic periodic composites is treated as a two-scale problem and solved by approximating the fields on both scales with eigenmodes that satisfy Maxwell's equations and boundary conditions as accurately as possible. Built into this homogenization methodology is an error indicator whose value characterizes the accuracy of homogenization. The proposed theory allows one to define not only bulk, but also position-dependent material parameters (e.g. in proximity to a physical boundary) and to quantify the trade-off between the accuracy of homogenization and its range of applicability to various illumination conditions.

  12. Fluid-structure interaction in tube bundles: homogenization methods, physical analysis

    International Nuclear Information System (INIS)

    Broc, D.; Sigrist, J.F.

    2009-01-01

    It is well known that the movements of a structure may be strongly influenced by fluid. This topic, called 'Fluid Structure Interaction', is important in many industrial applications. Tube bundles immersed in fluid are found in many cases, especially in the nuclear industry (reactor cores, steam generators, ...). The fluid leads to 'inertial effects' (with a decrease of the vibration frequencies) and 'dissipative effects' (with higher damping). The paper first presents the methods used for the simulation of the dynamic behaviour of tube bundles immersed in a fluid, with industrial examples. The methods used are based on the Euler equations for the fluid (perfect fluid), which allow the inertial effects to be taken into account. It is possible to take dissipative effects into account as well, by using Rayleigh damping. The conclusion focuses on improvements to the methods, in order to account more accurately for the influence of the fluid, mainly the dissipative effects, which may be very important, especially in the case of a global fluid flow. (authors)
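The inertial effect mentioned above is the classic added-mass phenomenon: a tube vibrating in fluid behaves as if it carried extra mass, which lowers its natural frequency. The sketch below is purely illustrative, with assumed numbers and the potential-flow added-mass coefficient Cm = 1.0 for an isolated cylinder (the paper's homogenization methods treat full bundles, not a single tube):

```python
import math

# Illustrative added-mass estimate for a single tube vibrating in water.
# All values are assumptions for the sketch, not from the paper.
k = 5.0e4        # modal stiffness, N/m
m = 2.0          # structural modal mass, kg
rho = 1000.0     # water density, kg/m^3
d = 0.02         # tube outer diameter, m
L = 1.0          # tube length, m
Cm = 1.0         # added-mass coefficient, isolated cylinder (potential flow)

m_added = Cm * rho * math.pi * (d / 2) ** 2 * L   # mass of displaced fluid
f_air = math.sqrt(k / m) / (2 * math.pi)
f_fluid = math.sqrt(k / (m + m_added)) / (2 * math.pi)

print(f"{f_air:.1f} Hz in air, {f_fluid:.1f} Hz in fluid")
```

In a confined bundle Cm rises well above 1 because neighbouring tubes and walls constrain the flow, so the frequency drop is stronger than this isolated-cylinder estimate suggests.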

  13. Systematic assembly homogenization and local flux reconstruction for nodal method calculations of fast reactor power distributions

    International Nuclear Information System (INIS)

    Dorning, J.J.

    1991-01-01

    A simultaneous pin lattice cell and fuel bundle homogenization theory has been developed for use with nodal diffusion calculations of practical reactors. The theoretical development of the homogenization theory, which is based on multiple-scales asymptotic expansion methods carried out through fourth order in a small parameter, starts from the transport equation and systematically yields: a cell-homogenized bundle diffusion equation with self-consistent expressions for the cell-homogenized cross sections and diffusion tensor elements; and a bundle-homogenized global reactor diffusion equation with self-consistent expressions for the bundle-homogenized cross sections and diffusion tensor elements. The continuity of the angular flux at cell and bundle interfaces also systematically yields jump conditions for the scalar flux, or so-called flux discontinuity factors, on the cell and bundle interfaces in terms of the two adjacent cell or bundle eigenfunctions. The expressions required for the reconstruction of the angular flux, or the 'de-homogenization' theory, were obtained as an integral part of the development; hence the leading order transport theory angular flux is easily reconstructed throughout the reactor, including the regions in the interior of the fuel bundles or computational nodes and in the interiors of the pin lattice cells. The theoretical development shows that the exact transport theory angular flux, obtained to first order from the whole-reactor nodal diffusion calculations done using the homogenized nuclear data and discontinuity factors, is a product of three computed quantities: a ''cell shape function''; a ''bundle shape function''; and a ''global shape function''. 10 refs

  14. Homogenization of neutronic diffusion models

    International Nuclear Information System (INIS)

    Capdebosq, Y.

    1999-09-01

    In order to study and simulate nuclear reactor cores, one needs access to the neutron distribution in the core. In practice, the description of this density of neutrons is given by a system of diffusion equations, coupled by non-differential exchange terms. The strong heterogeneity of the medium constitutes a major obstacle to the numerical computation of these models at reasonable cost, so homogenization is compulsory. Heuristic methods have been developed since the origin by nuclear physicists, under a periodicity assumption on the coefficients. They consist in doing a fine computation on a single periodicity cell, solving the system on the whole domain with homogeneous coefficients, and reconstructing the neutron density by multiplying the solutions of the two computations. The objectives of this work are to provide a mathematically rigorous basis for this factorization method, to obtain exact formulas for the homogenized coefficients, and to make a start on geometries where two periodic media are placed side by side. The first result of this thesis concerns eigenvalue problem models, which are used to characterize the state of criticality of the reactor, under a symmetry assumption on the coefficients. The convergence of the homogenization process is proved, and formulas for the homogenized coefficients are given. We then show that without symmetry assumptions, a drift phenomenon appears. It is characterized by means of a real Bloch wave method, which gives the homogenized limit in the general case. These results for the critical problem are then adapted to the evolution model. Finally, the homogenization of the critical problem in the case of two side-by-side periodic media is studied on a one-dimensional one-equation model. (authors)
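For a 1D model problem the cell computation has a closed-form answer: the homogenized diffusion coefficient is the harmonic mean of the periodic coefficient over one cell. The sketch below (a classical illustration, not the thesis's multigroup setting) checks this numerically for the toy coefficient D(y) = 2 + sin(2πy), whose harmonic mean is exactly √3:

```python
import math

# 1D periodic homogenization: the effective coefficient is the harmonic mean
# over one periodicity cell, D* = 1 / (integral over [0,1] of dy / D(y)).
# Toy coefficient (assumed for illustration): D(y) = 2 + sin(2*pi*y),
# for which the integral is known analytically, giving D* = sqrt(2^2 - 1^2).

def D(y):
    return 2.0 + math.sin(2.0 * math.pi * y)

def homogenized_coefficient(n=100_000):
    # Rectangle rule for the cell integral of 1/D; for a smooth periodic
    # integrand this converges extremely fast.
    h = 1.0 / n
    integral = sum(1.0 / D(i * h) for i in range(n)) * h
    return 1.0 / integral

D_star = homogenized_coefficient()
print(D_star)   # close to sqrt(3) ~ 1.7320508
```

Note that the naive arithmetic average of D(y) would give 2.0, noticeably overestimating the effective coefficient; this is exactly why homogenized coefficients must come from a cell problem rather than a plain average.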

  15. Probabilistic homogenization of random composite with ellipsoidal particle reinforcement by the iterative stochastic finite element method

    Science.gov (United States)

    Sokołowski, Damian; Kamiński, Marcin

    2018-01-01

    This study proposes a framework for the determination of basic probabilistic characteristics of the orthotropic homogenized elastic properties of a periodic composite reinforced with ellipsoidal particles and having a high stiffness contrast between the reinforcement and the matrix. The homogenization problem, solved by the Iterative Stochastic Finite Element Method (ISFEM), is implemented according to the stochastic perturbation, Monte Carlo simulation and semi-analytical techniques with the use of a cubic Representative Volume Element (RVE) of this composite containing a single particle. The given input Gaussian random variable is the Young's modulus of the matrix, while the 3D homogenization scheme is based on numerical determination of the strain energy of the RVE under uniform unit stretches carried out in the FEM system ABAQUS. The entire series of deterministic solutions with varying Young's modulus of the matrix serves for the Weighted Least Squares Method (WLSM) recovery of polynomial response functions finally used in the stochastic Taylor expansions inherent to the ISFEM. A numerical example consists of High Density Polyurethane (HDPU) reinforced with a Carbon Black particle. It is numerically investigated (1) whether the resulting homogenized characteristics are also Gaussian and (2) how the uncertainty in the matrix Young's modulus affects the effective stiffness tensor components and their PDF (Probability Density Function).
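The response-function step can be sketched compactly: sweep the random input (matrix Young's modulus) deterministically, fit a polynomial response by least squares, then propagate a Gaussian input through the fitted response, here with plain Monte Carlo rather than the full ISFEM machinery. The effective-modulus model below is a rule-of-mixtures placeholder standing in for the FEM strain-energy computation (an assumption, as are all numeric values):

```python
import random

# Toy "homogenization solver": a rule-of-mixtures placeholder (assumption)
# for the FEM strain-energy step of the actual method.
VF = 0.15            # particle volume fraction (assumed)
E_PARTICLE = 80.0    # particle Young's modulus, GPa (assumed)

def effective_modulus(e_matrix):
    return (1 - VF) * e_matrix + VF * E_PARTICLE

# 1) Deterministic sweep of the random input, as in the series of
#    deterministic solutions used for the least-squares recovery.
xs = [2.0 + 0.1 * i for i in range(21)]          # matrix modulus 2.0..4.0 GPa
ys = [effective_modulus(x) for x in xs]

# 2) Least-squares fit y = a + b*x (the toy response is linear, so degree 1
#    suffices; the actual method recovers higher-order polynomials).
n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
a = (sy - b * sx) / n

# 3) Propagate a Gaussian input through the fitted response (Monte Carlo).
random.seed(0)
mu, sigma = 3.0, 0.2                             # input mean/std (assumed)
samples = [a + b * random.gauss(mu, sigma) for _ in range(50_000)]
mean = sum(samples) / len(samples)
std = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5
print(mean, std)
```

Because the fitted response here is linear, the output stays Gaussian with mean a + b·μ and standard deviation |b|·σ; with higher-order response polynomials the output distribution generally departs from Gaussian, which is precisely question (1) investigated in the paper.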

  16. Synthesis of CuO-NiO core-shell nanoparticles by homogeneous precipitation method

    International Nuclear Information System (INIS)

    Bayal, Nisha; Jeevanandam, P.

    2012-01-01

    Highlights: ► CuO-NiO core-shell nanoparticles have been synthesized using a simple homogeneous precipitation method for the first time. ► The mechanism of formation of the core-shell nanoparticles has been investigated. ► The synthesis route may be extended to other mixed metal oxide core-shell nanoparticles. - Abstract: Core-shell CuO–NiO mixed metal oxide nanoparticles, in which CuO is the core and NiO is the shell, have been successfully synthesized using a homogeneous precipitation method. This simple synthetic method first produces a layered double hydroxide precursor with core-shell morphology, which on calcination at 350 °C yields the mixed metal oxide nanoparticles with retention of the core-shell morphology. The CuO–NiO mixed metal oxide precursor and the core-shell nanoparticles were characterized by powder X-ray diffraction, FT-IR spectroscopy, thermal gravimetric analysis, elemental analysis, scanning electron microscopy, transmission electron microscopy, and diffuse reflectance spectroscopy. The chemical reactivity of the core-shell nanoparticles was tested using the catalytic reduction of 4-nitrophenol with NaBH₄. The possible growth mechanism of the particles with core-shell morphology has also been investigated.

  17. Optimal design of a 7 T highly homogeneous superconducting magnet for a Penning trap

    International Nuclear Information System (INIS)

    Wu Wei; He Yuan; Ma Lizhen; Huang Wenxue; Xia Jiawen

    2010-01-01

    A Penning trap system called the Lanzhou Penning Trap (LPT) is now being developed for precise mass measurements at the Institute of Modern Physics (IMP). One of the key components is a 7 T actively shielded superconducting magnet with a clear warm bore of 156 mm. The required field homogeneity is 3 × 10⁻⁷ over two 1 cubic centimeter volumes lying 220 mm apart along the magnet axis. We introduce a two-step method which combines linear programming and a nonlinear optimization algorithm for designing the multi-section superconducting magnet. This method is fast and flexible for handling arbitrarily shaped homogeneous volumes and coils. With the help of this method an optimal design for the LPT superconducting magnet has been obtained. (authors)

  18. Determination of perfluorinated compounds in fish fillet homogenates: Method validation and application to fillet homogenates from the Mississippi River

    International Nuclear Information System (INIS)

    Malinsky, Michelle Duval; Jacoby, Cliffton B.; Reagen, William K.

    2011-01-01

    We report herein a simple protein precipitation extraction-liquid chromatography tandem mass spectrometry (LC/MS/MS) method, validation, and application for the analysis of perfluorinated carboxylic acids (C7-C12), perfluorinated sulfonic acids (C4, C6, and C8), and perfluorooctane sulfonamide (FOSA) in fish fillet tissue. The method combines a rapid homogenization and protein precipitation tissue extraction procedure using stable-isotope internal standard (IS) calibration. Method validation in bluegill (Lepomis macrochirus) fillet tissue evaluated the following: (1) method accuracy and precision in both extracted matrix-matched calibration and solvent (unextracted) calibration, (2) quantitation of mixed branched and linear isomers of perfluorooctanoate (PFOA) and perfluorooctanesulfonate (PFOS) with linear isomer calibration, (3) quantitation of low level (ppb) perfluorinated compounds (PFCs) in the presence of high level (ppm) PFOS, and (4) specificity from matrix interferences. Both calibration techniques produced method accuracy of at least 100 ± 13% with a precision (%RSD) ≤18% for all target analytes. Method accuracy and precision results for fillet samples from nine different fish species taken from the Mississippi River in 2008 and 2009 are also presented.

  19. Determination of perfluorinated compounds in fish fillet homogenates: Method validation and application to fillet homogenates from the Mississippi River

    Energy Technology Data Exchange (ETDEWEB)

    Malinsky, Michelle Duval, E-mail: mmalinsky@mmm.com [3M Environmental Laboratory, 3M Center, Building 0260-05-N-17, St. Paul, MN 55144-1000 (United States); Jacoby, Cliffton B.; Reagen, William K. [3M Environmental Laboratory, 3M Center, Building 0260-05-N-17, St. Paul, MN 55144-1000 (United States)

    2011-01-10

    We report herein a simple protein precipitation extraction-liquid chromatography tandem mass spectrometry (LC/MS/MS) method, validation, and application for the analysis of perfluorinated carboxylic acids (C7-C12), perfluorinated sulfonic acids (C4, C6, and C8), and perfluorooctane sulfonamide (FOSA) in fish fillet tissue. The method combines a rapid homogenization and protein precipitation tissue extraction procedure using stable-isotope internal standard (IS) calibration. Method validation in bluegill (Lepomis macrochirus) fillet tissue evaluated the following: (1) method accuracy and precision in both extracted matrix-matched calibration and solvent (unextracted) calibration, (2) quantitation of mixed branched and linear isomers of perfluorooctanoate (PFOA) and perfluorooctanesulfonate (PFOS) with linear isomer calibration, (3) quantitation of low level (ppb) perfluorinated compounds (PFCs) in the presence of high level (ppm) PFOS, and (4) specificity from matrix interferences. Both calibration techniques produced method accuracy of at least 100 ± 13% with a precision (%RSD) ≤18% for all target analytes. Method accuracy and precision results for fillet samples from nine different fish species taken from the Mississippi River in 2008 and 2009 are also presented.

  20. A simple method to evaluate linac beam homogeneity

    International Nuclear Information System (INIS)

    Monti, A.F.; Ostinelli, A.; Gelosa, S.; Frigerio, M.

    1995-01-01

    Quality Control (QC) tests in Radiotherapy represent a basic requirement to assess treatment unit performance and treatment quality. Since they are generally time consuming, it is worthwhile to introduce procedures and methods which can be carried out more easily and quickly. Since 1994, the Radiotherapy Department of S. Anna Hospital has employed a commercially available solid phantom (PRECITRON) with a 10-diode array to investigate beam homogeneity (symmetry and flatness). In particular, a global symmetry percentage index was defined which considers pairs of corresponding points along each axis (x and y) and compares the readings of the respective diodes, following the formula: I_gs = [((X_d + X_-d) - (Y_d + Y_-d)) / ((X_d + X_-d) + (Y_d + Y_-d))] * 200, where X_d and X_-d are points 8 or 10 cm equally spaced from the beam centre along the x axis, and similarly Y_d and Y_-d along the y axis. Even if it does not meet international protocol requirements as a whole, this parameter gives important information about beam homogeneity when only a few points of measurement are available in a plane, and it can be determined daily, thus fulfilling the aim of immediately highlighting any situation capable of compromising treatment accuracy and effectiveness. In this poster we report the results concerning this parameter for a linear accelerator (Varian Clinac 1800), from September 1994 to September 1995.
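The index reduces to a one-line computation from four diode readings; the sketch below implements the formula from the abstract (the readings themselves are made-up numbers for illustration):

```python
# Global symmetry percentage index from four diode readings, following the
# formula in the abstract. The example readings are hypothetical.

def global_symmetry_index(x_d, x_md, y_d, y_md):
    """I_gs = ((Xd + X-d) - (Yd + Y-d)) / ((Xd + X-d) + (Yd + Y-d)) * 200."""
    sx = x_d + x_md   # summed readings of the x-axis diode pair
    sy = y_d + y_md   # summed readings of the y-axis diode pair
    return (sx - sy) / (sx + sy) * 200.0

print(global_symmetry_index(1.00, 1.00, 1.00, 1.00))  # symmetric beam -> 0.0
print(global_symmetry_index(1.05, 1.05, 0.95, 0.95))  # axis imbalance -> 10.0
```

A perfectly symmetric beam gives 0; the factor of 200 expresses the x/y imbalance as a percentage of the mean pair sum, so a 5% shift of the pair sums in opposite directions reads as 10.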

  1. Homogeneous turbulence dynamics

    CERN Document Server

    Sagaut, Pierre

    2018-01-01

    This book provides state-of-the-art results and theories in homogeneous turbulence, including anisotropy and compressibility effects with extension to quantum turbulence, magneto-hydrodynamic turbulence and turbulence in non-newtonian fluids. Each chapter is devoted to a given type of interaction (strain, rotation, shear, etc.), and presents and compares experimental data, numerical results, analysis of the Reynolds stress budget equations and advanced multipoint spectral theories. The role of both linear and non-linear mechanisms is emphasized. The link between the statistical properties and the dynamics of coherent structures is also addressed. Despite its restriction to homogeneous turbulence, the book is of interest to all people working in turbulence, since the basic physical mechanisms which are present in all turbulent flows are explained. The reader will find a unified presentation of the results and a clear presentation of existing controversies. Special attention is given to bridge the results obta...

  2. Some variance reduction methods for numerical stochastic homogenization.

    Science.gov (United States)

    Blanc, X; Le Bris, C; Legoll, F

    2016-04-28

    We give an overview of a series of recent studies devoted to variance reduction techniques for numerical stochastic homogenization. Numerical homogenization requires that a set of problems is solved at the microscale, the so-called corrector problems. In a random environment, these problems are stochastic and therefore need to be repeatedly solved, for several configurations of the medium considered. An empirical average over all configurations is then performed using the Monte Carlo approach, so as to approximate the effective coefficients necessary to determine the macroscopic behaviour. Variance severely affects the accuracy and the cost of such computations. Variance reduction approaches, borrowed from other contexts in the engineering sciences, can be useful. Some of these variance reduction techniques are presented, studied and tested here. © 2016 The Author(s).
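One of the simplest such techniques, antithetic variables, can be demonstrated on a toy 1D random medium whose effective coefficient is the harmonic mean of iid cell conductivities. The medium and all parameters below are assumptions chosen for illustration, not the setting of the paper:

```python
import random
import statistics

# Variance reduction for stochastic homogenization, sketched with antithetic
# variables: a 1D medium of N cells with iid conductivities a_i = 1 + 9*U_i,
# U_i ~ Uniform(0,1); its effective coefficient is the harmonic mean.

N_CELLS = 64

def effective(us):
    return N_CELLS / sum(1.0 / (1.0 + 9.0 * u) for u in us)

def plain_estimate(rng):
    # Average over two independent realizations of the random medium.
    u = [rng.random() for _ in range(N_CELLS)]
    v = [rng.random() for _ in range(N_CELLS)]
    return 0.5 * (effective(u) + effective(v))

def antithetic_estimate(rng):
    # Pair each realization with its mirrored medium (U -> 1 - U).
    u = [rng.random() for _ in range(N_CELLS)]
    return 0.5 * (effective(u) + effective([1.0 - x for x in u]))

rng = random.Random(0)
plain = [plain_estimate(rng) for _ in range(400)]
anti = [antithetic_estimate(rng) for _ in range(400)]
print(statistics.stdev(plain), statistics.stdev(anti))
```

Both estimators have the same cost (two corrector-like solves per sample) and the same mean, but the antithetic one has a markedly smaller spread, because the effective coefficient is monotone in each U_i and the mirrored pair is therefore negatively correlated.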

  3. Radiotracer investigation of cement raw meal homogenizers. Pt. 2

    International Nuclear Information System (INIS)

    Baranyai, L.

    1983-01-01

    Based on radioisotopic tracer technique, a method has been worked out to study the homogenization and segregation processes of cement-industrial raw meal homogenizers. On-site measurements were carried out by this method in some Hungarian cement works to determine the optimal homogenization parameters of operating homogenizers. The motion and distribution of different raw meal fractions traced with the ¹⁹⁸Au radioisotope was studied in homogenization processes proceeding with different parameters. In the first part of the publication, the change of charge homogeneity in time was discussed, which had been measured as the resultant of mixing and separating processes. In the second part, the parameters and types of homogenizers influencing the efficiency of homogenization have been detailed. (orig.)

  4. Radiotracer investigation of cement raw meal homogenizers. Pt. 2

    Energy Technology Data Exchange (ETDEWEB)

    Baranyai, L

    1983-12-01

    Based on radioisotopic tracer technique, a method has been worked out to study the homogenization and segregation processes of cement-industrial raw meal homogenizers. On-site measurements were carried out by this method in some Hungarian cement works to determine the optimal homogenization parameters of operating homogenizers. The motion and distribution of different raw meal fractions traced with the ¹⁹⁸Au radioisotope was studied in homogenization processes proceeding with different parameters. In the first part of the publication, the change of charge homogeneity in time was discussed, which had been measured as the resultant of mixing and separating processes. In the second part, the parameters and types of homogenizers influencing the efficiency of homogenization have been detailed.

  5. Characterization and evaluation in vivo of baicalin-nanocrystals prepared by an ultrasonic-homogenization-fluid bed drying method.

    Science.gov (United States)

    Shi-Ying, Jin; Jin, Han; Shi-Xiao, Jin; Qing-Yuan, Lv; Jin-Xia, Bai; Chen, Hong-Ge; Rui-Sheng, Li; Wei, Wu; Hai-Long, Yuan

    2014-01-01

    To improve the absorption and bioavailability of baicalin using a nanocrystal (or nanosuspension) drug delivery system, a tandem ultrasonic-homogenization-fluid bed drying technology was applied to prepare baicalin-nanocrystal dried powders, and the physicochemical properties of the baicalin-nanocrystals were characterized by scanning electron microscopy, photon correlation spectroscopy, powder X-ray diffraction, physical stability, and solubility experiments. Furthermore, in situ single-pass intestinal perfusion experiments and pharmacokinetic studies in rats were performed to compare the absorption properties and in vivo bioavailability of the nanocrystals with those of baicalin-microcrystals and pure baicalin. The mean particle size of the baicalin-nanocrystals was 236 nm, with a polydispersity index of 0.173 and a zeta potential of -34.8 mV, which ensured the stability of the reconstituted nanosuspension. X-ray diffraction results indicated that the crystallinity of baicalin was decreased through the ultrasonic-homogenization process. Physical stability experiments showed that the prepared baicalin-nanocrystals were sufficiently stable. The solubility of baicalin in the form of nanocrystals, at 495 μg·mL(-1), was much higher than that of the baicalin-microcrystals and the physical mixture (135 and 86.4 μg·mL(-1), respectively). In situ intestinal perfusion experiments demonstrated a clear advantage in dissolution and absorption characteristics for the baicalin-nanocrystals compared to the other formulations. In addition, after oral administration to rats, the decrease in particle size from the micron to the nanometer range yielded much higher in vivo bioavailability (AUC(0-t) values of 206.96 ± 21.23 versus 127.95 ± 14.41 mg·L(-1)·h(-1)). Thus, the nanocrystal drug delivery system prepared by an ultrasonic-homogenization-fluid bed drying process is able to improve the absorption and in vivo bioavailability of baicalin compared with pure baicalin.

  6. High-throughput method for optimum solubility screening for homogeneity and crystallization of proteins

    Science.gov (United States)

    Kim, Sung-Hou [Moraga, CA; Kim, Rosalind [Moraga, CA; Jancarik, Jamila [Walnut Creek, CA

    2012-01-31

    An optimum solubility screen in which a panel of buffers and many additives is provided in order to obtain the most homogeneous and monodisperse protein condition for protein crystallization. The present methods are useful for proteins that aggregate and cannot be concentrated prior to setting up crystallization screens. A high-throughput method using the hanging-drop method, vapor diffusion equilibrium, and a panel of twenty-four buffers is further provided. Using the present methods, 14 poorly behaving proteins were screened; 11 showed highly improved dynamic light scattering results, allowing the proteins to be concentrated, and 9 were crystallized.

  7. Determination of perfluorinated compounds in fish fillet homogenates: method validation and application to fillet homogenates from the Mississippi River.

    Science.gov (United States)

    Malinsky, Michelle Duval; Jacoby, Cliffton B; Reagen, William K

    2011-01-10

    We report herein a simple protein precipitation extraction-liquid chromatography tandem mass spectrometry (LC/MS/MS) method, validation, and application for the analysis of perfluorinated carboxylic acids (C7-C12), perfluorinated sulfonic acids (C4, C6, and C8), and perfluorooctane sulfonamide (FOSA) in fish fillet tissue. The method combines a rapid homogenization and protein precipitation tissue extraction procedure using stable-isotope internal standard (IS) calibration. Method validation in bluegill (Lepomis macrochirus) fillet tissue evaluated the following: (1) method accuracy and precision in both extracted matrix-matched calibration and solvent (unextracted) calibration, (2) quantitation of mixed branched and linear isomers of perfluorooctanoate (PFOA) and perfluorooctanesulfonate (PFOS) with linear isomer calibration, (3) quantitation of low level (ppb) perfluorinated compounds (PFCs) in the presence of high level (ppm) PFOS, and (4) specificity from matrix interferences. Both calibration techniques produced method accuracy of at least 100±13% with a precision (%RSD) ≤18% for all target analytes. Method accuracy and precision results for fillet samples from nine different fish species taken from the Mississippi River in 2008 and 2009 are also presented. Copyright © 2010 Elsevier B.V. All rights reserved.
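The stable-isotope internal standard (IS) calibration underlying such LC/MS/MS quantitation can be sketched as follows; the concentrations, area ratios and the `quantify` helper are hypothetical, for illustration only:

```python
import numpy as np

# Hypothetical calibration points: spiked concentrations (ng/g) and the
# measured analyte/internal-standard peak-area ratios.
conc = np.array([0.5, 1.0, 5.0, 10.0, 50.0])
ratio = np.array([0.11, 0.21, 1.02, 2.05, 10.1])

# Linear IS calibration: ratio = m * conc + b, fitted by least squares.
m, b = np.polyfit(conc, ratio, 1)

def quantify(sample_ratio):
    """Back-calculate a concentration (ng/g) from an analyte/IS area ratio."""
    return (sample_ratio - b) / m

print(quantify(1.5))  # concentration corresponding to a measured ratio of 1.5
```

Normalizing analyte response to a co-eluting stable-isotope IS is what makes the calibration robust to matrix effects and extraction losses, which is why both matrix-matched and solvent calibration performed comparably in the validation.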

  8. Two-dimensional arbitrarily shaped acoustic cloaks composed of homogeneous parts

    Science.gov (United States)

    Li, Qi; Vipperman, Jeffrey S.

    2017-10-01

    Acoustic cloaking is an important application of acoustic metamaterials. Although the topic has received much attention, there are a number of areas where contributions are needed. In this paper, a design method for producing acoustic cloaks with arbitrary shapes that are composed of homogeneous parts is presented. The cloak is divided into sections, each of which, in turn, is further divided into two parts, followed by the application of transformation acoustics to derive the required properties for cloaking. With the proposed mapping relations, the properties of each part of the cloak are anisotropic but homogeneous, which can be realized using two alternating layers of homogeneous and isotropic materials. A hexagonal and an irregular cloak are presented as design examples. The full wave simulations using COMSOL Multiphysics finite element software show that the cloaks function well at reducing reflections and shadows. The variation of the cloak properties is investigated as a function of three important geometric parameters used in the transformations. A balance can be found between cloaking performance and materials properties that are physically realizable.

  9. Cosmic homogeneity: a spectroscopic and model-independent measurement

    Science.gov (United States)

    Gonçalves, R. S.; Carvalho, G. C.; Bengaly, C. A. P., Jr.; Carvalho, J. C.; Bernui, A.; Alcaniz, J. S.; Maartens, R.

    2018-03-01

    Cosmology relies on the Cosmological Principle, i.e. the hypothesis that the Universe is homogeneous and isotropic on large scales. This implies in particular that the counts of galaxies should approach a homogeneous scaling with volume at sufficiently large scales. Testing homogeneity is crucial to obtain a correct interpretation of the physical assumptions underlying the current cosmic acceleration and structure formation of the Universe. In this letter, we use the Baryon Oscillation Spectroscopic Survey to make the first spectroscopic and model-independent measurements of the angular homogeneity scale θh. Applying four statistical estimators, we show that the angular distribution of galaxies in the range 0.46 < z < 0.62 is consistent with homogeneity at large scales, and that θh varies with redshift, indicating a smoother Universe in the past. These results are in agreement with the foundations of the standard cosmological paradigm.
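The scaling test described above can be illustrated with a toy counts-in-spheres estimator (synthetic Poisson points, not survey data): for a homogeneous distribution the counts N(<r) grow as r³, so the local slope D2 approaches the ambient dimension 3.

```python
import numpy as np

rng = np.random.default_rng(1)
pts = rng.random((20000, 3))          # Poisson (statistically homogeneous) points

center = np.array([0.5, 0.5, 0.5])
d = np.linalg.norm(pts - center, axis=1)

radii = np.array([0.1, 0.2, 0.3, 0.4])
counts = np.array([(d < r).sum() for r in radii])

# Local slope D2 = d ln N(<r) / d ln r: approaches 3 on scales where the
# distribution is homogeneous (a clustered, fractal-like set would give D2 < 3).
D2 = np.diff(np.log(counts)) / np.diff(np.log(radii))
print(D2)
```

Survey analyses apply the same idea to angular counts and average over many centers; the homogeneity scale is then the scale beyond which the estimated slope stays within some tolerance of the homogeneous value.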

  10. ASCOT-1, Thermohydraulics of Axisymmetric PWR Core with Homogeneous Flow During LOCA

    International Nuclear Information System (INIS)

    1978-01-01

    1 - Nature of the physical problem solved: ASCOT-1 is used to analyze the thermo-hydraulic behaviour in a PWR core during a loss-of-coolant accident. 2 - Method of solution: The core is assumed to be axisymmetric and two-dimensional, and the conservation laws are solved by the method of characteristics. For the temperature response of fuel in the annular regions into which the core is divided, the heat conduction equations are solved by an explicit method with averaged flow conditions. 3 - Restrictions on the complexity of the problem: Axisymmetric two-dimensional homogeneous flows.

  11. Burn-up function of fuel management code for aqueous homogeneous reactors and its validation

    International Nuclear Information System (INIS)

    Wang Liangzi; Yao Dong; Wang Kan

    2011-01-01

    Fuel Management Code for Aqueous Homogeneous Reactors (FMCAHR) is developed based on the Monte Carlo transport method to analyze the physics characteristics of aqueous homogeneous reactors. FMCAHR has the ability to perform resonance treatment, search for critical rod heights, calculate thermal-hydraulic parameters, model radiolytic-gas bubbles, and carry out burn-up calculations. This paper introduces the theoretical model and scheme of its burn-up function, and then compares its calculation results with benchmarks and with DRAGON's burn-up results, which confirms its burn-up computing precision and its applicability to burn-up calculation and analysis for aqueous solution reactors. (authors)

  12. Homogeneity characterisation of (U,Gd)O2 sintered pellets by X-ray diffraction powder analysis applying Rietveld method

    International Nuclear Information System (INIS)

    Leyva, Ana G.; Vega, Daniel R.; Trimarco, Veronica G.; Marchi, Daniel E.

    1999-01-01

    The (U,Gd)O2 sintered pellets are fabricated by different methods. Homogeneity characterisation of the Gd content is necessary as a production control to qualify the process and the final product. The micrographic technique is the most common method used to analyse the homogeneity of these samples; this method requires time and expertise to obtain good results. In this paper, we propose an analysis of the X-ray diffraction powder patterns through the Rietveld method, in which the differences between the experimental data and data calculated from a proposed crystal structure model are evaluated. This result allows determination of the cell parameters, which can be correlated with the Gd concentration, and of the existence of other phases with different Gd ratios. (author)

  13. Hybrid design method for air-core solenoid with axial homogeneity

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Li; Lee, Sang Jin [Uiduk University, Gyeongju (Korea, Republic of); Choi, Suk Jin [Institute for Basic Science, Daejeon (Korea, Republic of)

    2016-03-15

    In this paper, a hybrid method is proposed to design an air-core superconducting solenoid system producing a 6 T axially uniform magnetic field using niobium-titanium (NbTi) superconducting wire. In order to minimize the volume of conductor, a hybrid optimization method combining linear programming and nonlinear programming was adopted. The feasible space of the solenoid is divided into several grids, and the magnetic field at the target point is approximated by the sum of the magnetic fields generated by an ideal current loop at the center of each grid. Using linear programming, a globally optimal current distribution in the feasible space is indicated by the non-zero current grids. Furthermore, the clusters of non-zero current grids also provide information about probable solenoids in the feasible space, such as their number, shape, and so on. Applying these probable solenoids as the initial model, the final practical configuration of solenoids with integer numbers of layers can be obtained by nonlinear programming. The design result illustrates the efficiency and flexibility of the hybrid method. This method can also be used for magnet designs that require high homogeneity, within several parts per million (ppm).
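The first stage of such a method, finding a current distribution over a grid of ideal loops that reproduces a uniform axial field, can be sketched with an ordinary least-squares fit in place of the paper's bounded linear program; the loop radius, grid and 1 T target below are illustrative, not the authors' design data.

```python
import numpy as np

mu0 = 4e-7 * np.pi

def loop_field(z, z0, R):
    """On-axis field (T per ampere) of an ideal circular loop at z0, radius R."""
    return mu0 * R**2 / (2.0 * ((z - z0)**2 + R**2) ** 1.5)

# Grid of candidate loop positions (the "feasible space"), all of radius 0.2 m
z0s = np.linspace(-0.3, 0.3, 13)
z_targets = np.linspace(-0.05, 0.05, 21)   # region where homogeneity is required

A = np.array([[loop_field(z, z0, 0.2) for z0 in z0s] for z in z_targets])
b = np.full(z_targets.size, 1.0)           # target: 1 T, uniform on axis

currents, *_ = np.linalg.lstsq(A, b, rcond=None)   # loop currents (A)

B = A @ currents
ppm = (B.max() - B.min()) / B.mean() * 1e6
print(ppm)   # residual field inhomogeneity over the target region, in ppm
```

In the paper the analogous linear stage minimizes conductor volume subject to field constraints; the clusters of loops carrying significant current then seed the nonlinear refinement into practical coil blocks.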

  14. Method of the characteristics for calculation of VVER without homogenization

    Energy Technology Data Exchange (ETDEWEB)

    Suslov, I.R.; Komlev, O.G.; Novikova, N.N.; Zemskov, E.A.; Tormyshev, I.V.; Melnikov, K.G.; Sidorov, E.B. [Institute of Physics and Power Engineering, Obninsk (Russian Federation)

    2005-07-01

    The first stage of the development of the characteristics code MCCG3D, for calculation of VVER-type reactors without homogenization, is presented. A parallel version of the code for MPI was developed and tested on a PC cluster under LINUX. Further development of the MCCG3D code for design-level calculations with full-scale space-distributed feedbacks is discussed. For validation of the MCCG3D code we use the critical assembly VENUS-2. Geometrical models with and without homogenization have been used. With both models the MCCG3D results agree well with the experimental power distribution and with results generated by other codes, but the model without homogenization provides better results. The perturbation theory for the MCCG3D code is developed and implemented in the module KEFSFGG. The calculations with KEFSFGG are in good agreement with direct calculations. (authors)

  15. FEMB, 2-D Homogeneous Neutron Diffusion in X-Y Geometry with Keff Calculation, Dyadic Fission Matrix

    International Nuclear Information System (INIS)

    Misfeldt, I.B.

    1987-01-01

    1 - Nature of physical problem solved: The two-dimensional neutron diffusion equation (x-y geometry) is solved in the homogeneous form (Keff calculation). The boundary conditions specify each group current as a linear homogeneous function of the group fluxes (gamma matrix concept). For each material, the fission matrix is assumed to be dyadic. 2 - Method of solution: Finite element formulation with Lagrange-type elements. Solution technique: SOR with extrapolation. 3 - Restrictions on the complexity of the problem: The maximum order of the Lagrange elements is 6.

  16. A generalized model for homogenized reflectors

    International Nuclear Information System (INIS)

    Pogosbekyan, Leonid; Kim, Yeong Il; Kim, Young Jin; Joo, Hyung Kook

    1996-01-01

    A new concept of equivalent homogenization is proposed. The concept employs a new set of homogenized parameters: homogenized cross sections (XS) and an interface matrix (IM), which relates partial currents at the cell interfaces. The idea of the interface matrix generalizes the idea of discontinuity factors (DFs), proposed and developed by K. Koebke and K. Smith. The method of K. Smith can be simulated within the framework of the new method, while the new method approximates the heterogeneous cell better in the case of steep flux gradients at the cell interfaces. The attractive features of the new concept are: improved accuracy, simplicity of incorporation into existing codes, and numerical expense equal to that of K. Smith's approach. The new concept is useful for: (a) explicit reflector/baffle simulation; (b) control blade simulation; (c) mixed UO2/MOX core simulation. The offered model has been incorporated into a finite difference code and into the nodal code PANBOX. The numerical results show good accuracy of core calculations and insensitivity of the homogenized parameters with respect to in-core conditions.
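The baseline ingredient of any such equivalence procedure, reaction-rate-preserving flux-volume weighting together with a surface discontinuity factor, can be sketched on a two-region cell; all cross sections and fluxes below are illustrative, and the interface-matrix formalism of the paper is more general than this scalar example.

```python
import numpy as np

# Fine-mesh (heterogeneous) solution for one cell: region volumes,
# scalar fluxes and absorption cross sections (illustrative numbers).
V     = np.array([0.5, 0.5])
phi   = np.array([1.2, 0.8])
sigma = np.array([0.010, 0.030])

# Flux-volume weighting preserves the cell's total reaction rate.
sigma_hom = np.sum(sigma * phi * V) / np.sum(phi * V)

# Discontinuity factor: heterogeneous surface flux divided by the
# homogeneous (cell-average) flux, as in generalized equivalence theory.
phi_surface = phi[0]                       # flux at the left interface
df_left = phi_surface / np.average(phi, weights=V)

print(sigma_hom, df_left)
```

The interface-matrix approach replaces the single scalar factor per face by a matrix acting on partial currents, which is what improves accuracy for steep interface gradients.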

  17. A literature review on biotic homogenization

    OpenAIRE

    Guangmei Wang; Jingcheng Yang; Chuangdao Jiang; Hongtao Zhao; Zhidong Zhang

    2009-01-01

    Biotic homogenization is the process whereby the genetic, taxonomic and functional similarity of two or more biotas increases over time. As a new research agenda for conservation biogeography, biotic homogenization has become a rapidly emerging topic of interest in ecology and evolution over the past decade. However, research on this topic is rare in China. Herein, we introduce the development of the concept of biotic homogenization, and then discuss methods to quantify its three components (...

  18. TVEDIM, 2-D Homogeneous and Inhomogeneous Neutron Diffusion for X-Y, R-Z, R-Theta Geometry

    International Nuclear Information System (INIS)

    Kristiansen, G.K.

    1987-01-01

    1 - Nature of physical problem solved: The two-dimensional neutron diffusion equation (x-y, r-z, or r-theta geometry) is solved, either in the inhomogeneous form (source calculation) or the homogeneous form (Keff calculation or absorber adjustment). The boundary conditions specify each group current as a linear homogeneous function of the group fluxes (gamma matrix concept). For each material, the fission matrix is assumed to be dyadic. 2 - Method of solution: A finite difference formulation (5-point scheme, mesh corner variant) is used. Solution technique: multi-line SOR. Eigenvalue estimate by neutron balance.
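The homogeneous (Keff) mode of such a code amounts to a power iteration wrapped around a finite-difference diffusion solve. Below is a minimal one-group sketch on a bare homogeneous square with zero-flux boundaries, using plain Jacobi sweeps instead of TVEDIM's multi-line SOR; all cross sections are illustrative, and the result is compared with the analytic buckling formula.

```python
import numpy as np

# One-group diffusion, homogeneous square core, zero-flux boundaries:
# 5-point finite differences with power iteration for Keff.
D, sig_a, nu_sigf = 1.0, 0.06, 0.07   # cm, 1/cm: diffusion, absorption, production
L, n = 100.0, 40                      # core size (cm), interior mesh points per side
h = L / (n + 1)

phi = np.ones((n, n))
k = 1.0
for _ in range(400):                  # outer (power) iterations
    src = nu_sigf * phi / k
    for _ in range(200):              # inner Jacobi sweeps for -D*lap(phi) + sig_a*phi = src
        nb = np.zeros_like(phi)
        nb[1:, :] += phi[:-1, :]
        nb[:-1, :] += phi[1:, :]
        nb[:, 1:] += phi[:, :-1]
        nb[:, :-1] += phi[:, 1:]
        phi = (src + D / h**2 * nb) / (4 * D / h**2 + sig_a)
    k_new = np.sum(nu_sigf * phi) / np.sum(src)   # neutron-balance eigenvalue estimate
    if abs(k_new - k) < 1e-8:
        k = k_new
        break
    k = k_new

B2 = 2 * (np.pi / L) ** 2             # analytic buckling of the bare square
print(k, nu_sigf / (sig_a + D * B2))  # numerical vs analytic Keff
```

The "eigenvalue estimate by neutron balance" in the record corresponds to the `k_new` update: the ratio of the production rate of the new flux iterate to the source it was computed from.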

  19. Commensurability effects in holographic homogeneous lattices

    International Nuclear Information System (INIS)

    Andrade, Tomas; Krikun, Alexander

    2016-01-01

    An interesting application of the gauge/gravity duality to condensed matter physics is the description of a lattice via breaking translational invariance on the gravity side. By making use of global symmetries, it is possible to do so without sacrificing homogeneity of the pertinent bulk solutions, which we thus term "homogeneous holographic lattices." Due to their technical simplicity, these configurations have received a great deal of attention in the last few years and have been shown to correctly describe momentum relaxation and hence (finite) DC conductivities. However, it is not clear whether they are able to capture other lattice effects which are of interest in condensed matter. In this paper we investigate this question, focusing our attention on the phenomenon of commensurability, which arises when the lattice scale is tuned to be equal to (an integer multiple of) another momentum scale in the system. We do so by studying the formation of spatially modulated phases in various models of homogeneous holographic lattices. Our results indicate that the onset of the instability is controlled by the near-horizon geometry, which for insulating solutions does carry information about the lattice. However, we observe no sharp connection between the characteristic momentum of the broken phase and the lattice pitch, which calls into question the applicability of these models to the physics of commensurability.

  20. Iterative and variational homogenization methods for filled elastomers

    Science.gov (United States)

    Goudarzi, Taha

    Elastomeric composites have increasingly proved invaluable in commercial technological applications due to their unique mechanical properties, especially their ability to undergo large reversible deformation in response to a variety of stimuli (e.g., mechanical forces, electric and magnetic fields, changes in temperature). Modern advances in organic materials science have revealed that elastomeric composites hold also tremendous potential to enable new high-end technologies, especially as the next generation of sensors and actuators featured by their low cost together with their biocompatibility, and processability into arbitrary shapes. This potential calls for an in-depth investigation of the macroscopic mechanical/physical behavior of elastomeric composites directly in terms of their microscopic behavior with the objective of creating the knowledge base needed to guide their bottom-up design. The purpose of this thesis is to generate a mathematical framework to describe, explain, and predict the macroscopic nonlinear elastic behavior of filled elastomers, arguably the most prominent class of elastomeric composites, directly in terms of the behavior of their constituents --- i.e., the elastomeric matrix and the filler particles --- and their microstructure --- i.e., the content, size, shape, and spatial distribution of the filler particles. This will be accomplished via a combination of novel iterative and variational homogenization techniques capable of accounting for interphasial phenomena and finite deformations. Exact and approximate analytical solutions for the fundamental nonlinear elastic response of dilute suspensions of rigid spherical particles (either firmly bonded or bonded through finite size interphases) in Gaussian rubber are first generated. These results are in turn utilized to construct approximate solutions for the nonlinear elastic response of non-Gaussian elastomers filled with a random distribution of rigid particles (again, either firmly bonded or bonded through finite size interphases).

  1. Spinor structures on homogeneous spaces

    International Nuclear Information System (INIS)

    Lyakhovskii, V.D.; Mudrov, A.I.

    1993-01-01

    For multidimensional models of the interaction of elementary particles, the problem of constructing and classifying spinor fields on homogeneous spaces is exceptionally important. An algebraic criterion for the existence of spinor structures on homogeneous spaces used in multidimensional models is developed. A method of explicit construction of spinor structures is proposed, and its effectiveness is demonstrated in examples. The results are of particular importance for harmonic decomposition of spinor fields

  2. Benchmarking homogenization algorithms for monthly data

    Science.gov (United States)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratiannil, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.; Willett, K.

    2013-09-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies. The algorithms were validated against a realistic benchmark dataset. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including i) the centered root mean square error relative to the true homogeneous values at various averaging scales, ii) the error in linear trend estimates and iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.
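Two of the performance metrics named above are easy to state concretely. In the synthetic sketch below (the series, break size and dates are invented, not HOME benchmark data), an undetected break of 0.8 °C produces a centered RMSE equal to the demeaned step amplitude and a spurious linear trend:

```python
import numpy as np

rng = np.random.default_rng(2)

years = np.arange(1950, 2010)
truth = 0.01 * (years - years[0]) + rng.normal(0, 0.3, years.size)  # homogeneous series
inhom = truth.copy()
inhom[30:] += 0.8                     # an undetected break of 0.8 degC in 1980

def centered_rmse(x, ref):
    """RMSE between the two series after removing each series' mean."""
    return np.sqrt(np.mean(((x - x.mean()) - (ref - ref.mean())) ** 2))

def trend_error(x, ref):
    """Difference in least-squares linear trends (degC per year)."""
    return np.polyfit(years, x, 1)[0] - np.polyfit(years, ref, 1)[0]

print(centered_rmse(inhom, truth), trend_error(inhom, truth))
```

A mid-series break of 0.8 °C over 60 years biases the fitted trend by about 0.02 °C per year, which is why trend error is tracked separately from the RMSE-type scores: an algorithm can reduce scatter while still leaving climatologically important trend biases.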

  3. Diamond-shaped electromagnetic transparent devices with homogeneous material parameters

    International Nuclear Information System (INIS)

    Li Tinghua; Huang Ming; Yang Jingjing; Yu Jiang; Lan Yaozhong

    2011-01-01

    Based on the linear coordinate transformation method, two-dimensional and three-dimensional electromagnetic transparent devices with diamond shape composed of homogeneous and non-singular materials are proposed in this paper. The permittivity and permeability tensors of the transparent devices are derived. The performance and scattering properties of the transparent devices are confirmed by a full-wave simulation. It can physically protect electric devices such as an antenna and a radar station inside, without sacrificing their performance. This work represents important progress towards the practical realization of metamaterial-assisted transparent devices and expands the application of transformation optics.

  4. Methods of experimental physics

    CERN Document Server

    Williams, Dudley

    1962-01-01

    Methods of Experimental Physics, Volume 3: Molecular Physics focuses on molecular theory, spectroscopy, resonance, molecular beams, and electric and thermodynamic properties. The manuscript first considers the origins of molecular theory, molecular physics, and molecular spectroscopy, as well as microwave spectroscopy, electronic spectra, and Raman effect. The text then ponders on diffraction methods of molecular structure determination and resonance studies. Topics include techniques of electron, neutron, and x-ray diffraction and nuclear magnetic, nuclear quadropole, and electron spin reson

  5. Benchmarking monthly homogenization algorithms

    Science.gov (United States)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data

  6. Matrix-dependent multigrid-homogenization for diffusion problems

    Energy Technology Data Exchange (ETDEWEB)

    Knapek, S. [Institut fuer Informatik tu Muenchen (Germany)

    1996-12-31

    We present a method to approximately determine the effective diffusion coefficient on the coarse-scale level of problems with strongly varying or discontinuous diffusion coefficients. It is based on techniques also used in multigrid, like Dendy's matrix-dependent prolongations and the construction of coarse grid operators by means of the Galerkin approximation. In numerical experiments, we compare our multigrid-homogenization method with homogenization, renormalization and averaging approaches.
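The Galerkin construction of a coarse-grid operator can be sketched in 1-D for a discontinuous coefficient, using an operator-dependent interpolation in the spirit of (but much simpler than) Dendy's matrix-dependent prolongations; the grid size and coefficient values are illustrative.

```python
import numpy as np

# Fine-grid 1-D operator for -d/dx(a(x) du/dx) with a jumping coefficient:
# the setting where plain averaging of a(x) fails on the coarse level.
n = 7
a = np.where(np.arange(n + 1) < (n + 1) // 2, 1.0, 100.0)  # coefficient on cell edges

A = np.zeros((n, n))
for i in range(n):
    A[i, i] = a[i] + a[i + 1]
    if i > 0: A[i, i - 1] = -a[i]
    if i < n - 1: A[i, i + 1] = -a[i + 1]

# Operator-dependent prolongation: each fine-only point is interpolated so
# that its fine-grid equation is satisfied locally by its coarse neighbors.
coarse = [1, 3, 5]                     # coarse points as fine-grid indices
P = np.zeros((n, len(coarse)))
for j, f in enumerate(coarse):
    P[f, j] = 1.0
for f in range(0, n, 2):               # fine-only points
    for j, c in enumerate(coarse):
        if abs(c - f) == 1:
            P[f, j] = -A[f, c] / A[f, f]

Ac = P.T @ A @ P                       # Galerkin coarse-grid operator
print(Ac)
```

The coarse operator `Ac` inherits effective (harmonic-type) couplings across the coefficient jump, which is exactly the homogenization effect the abstract exploits; in 2-D, Dendy's prolongations play the role of the local interpolation weights used here.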

  7. A Riccati Based Homogeneous and Self-Dual Interior-Point Method for Linear Economic Model Predictive Control

    DEFF Research Database (Denmark)

    Sokoler, Leo Emil; Frison, Gianluca; Edlund, Kristian

    2013-01-01

    In this paper, we develop an efficient interior-point method (IPM) for the linear programs arising in economic model predictive control of linear systems. The novelty of our algorithm is that it combines a homogeneous and self-dual model, and a specialized Riccati iteration procedure. We test...

  8. A method to improve the B0 homogeneity of the heart in vivo.

    Science.gov (United States)

    Jaffer, F A; Wen, H; Balaban, R S; Wolff, S D

    1996-09-01

    A homogeneous static (B0) magnetic field is required for many NMR experiments such as echo planar imaging, localized spectroscopy, and spiral scan imaging. Although semi-automated techniques have been described to improve B0 field homogeneity, none had been applied to the in vivo heart. The acquisition of cardiac field maps is complicated by motion, blood flow, and chemical shift artifact from epicardial fat. To overcome these problems, an ungated three-dimensional (3D) chemical shift image (CSI) was collected to generate a time- and motion-averaged B0 field map. B0 heterogeneity in the heart was minimized by using a previously described algorithm that solves for the optimal shim coil currents for an input field map, using up to third-order current-bounded shims (1). The method improved the B0 homogeneity of the heart in all 11 normal volunteers studied. After application of the algorithm to the unshimmed cardiac field maps, the standard deviation of proton frequency decreased by 43%, the magnitude 1H spectral linewidth decreased by 24%, and the peak-peak gradient decreased by 35%. Simulations of the high-order (second- and third-order) shims in B0 field correction of the heart show that high-order shims are important, accounting for nearly half of the improvement in homogeneity for several subjects. The T2* of the left ventricular anterior wall before and after field correction was determined at 4.0 Tesla. Finally, the results show that cardiac shimming is of benefit in cardiac 31P NMR spectroscopy and cardiac echo planar imaging.
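The shim-optimization step, solving for coil currents that best cancel a measured field map over a region of interest, can be sketched as a linear least-squares problem with current clipping; the basis functions, field coefficients and noise level below are synthetic stand-ins, not the authors' data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Voxel coordinates (m) inside a schematic cardiac region of interest.
x, y, z = rng.uniform(-0.05, 0.05, (3, 500))

# Per-unit-current shim fields: first- and second-order spherical-harmonic-like
# terms, a simplified stand-in for measured shim coil field profiles.
basis = np.stack([x, y, z, x * y, z * x,
                  x**2 - y**2, 2 * z**2 - x**2 - y**2], axis=1)

# Synthetic off-resonance field map (Hz): linear and second-order terms + noise.
b0 = 40.0 * x - 25.0 * z + 3000.0 * x * y + rng.normal(0.0, 1.0, x.size)

currents, *_ = np.linalg.lstsq(basis, b0, rcond=None)   # optimal shim settings
currents = np.clip(currents, -5000.0, 5000.0)           # enforce current bounds

residual = b0 - basis @ currents
print(b0.std(), residual.std())   # frequency spread before vs after shimming
```

Applying the negated fitted fields through the shim coils removes everything the basis can represent, leaving only the noise-like residual; the paper's bounded solver handles the current limits inside the optimization rather than by clipping afterwards.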

  9. Synthesis of homogeneous Ca0.5Sr0.5FeO2.5+δ compound using a mirror furnace method

    International Nuclear Information System (INIS)

    Mahboub, M.S.; Zeroual, S.; Boudjada, A.

    2012-01-01

    Graphical abstract: X-ray diffraction pattern indexing of a Ca0.5Sr0.5FeO2.5+δ powder sample obtained by the mirror furnace method after thermal treatment. Highlights: ► A homogeneous compound, Ca0.5Sr0.5FeO2.5+δ, has been synthesized for the first time by a mirror furnace method. ► The Ca0.5Sr0.5FeO2.5+δ powder sample is perfectly homogeneous, as confirmed by X-ray diffraction, Raman spectroscopy and the EDS technique. ► Thermal treatment of the Ca0.5Sr0.5FeO2.5+δ powder sample can increase its average grain size. -- Abstract: A new synthesis method, using the melting-zone technique in a double mirror furnace at around 1600 °C, is used to obtain homogeneous brownmillerite compounds Ca1−xSrxFeO2.5+δ in the range 0.3 ≤ x ≤ 0.7. These compounds play an important role in understanding oxygen diffusion in perovskite-related oxides. We have successfully solved the miscibility gap problem by synthesizing good-quality homogeneous powder samples of the Ca0.5Sr0.5FeO2.5+δ compound. Our result was confirmed by X-ray diffraction, Raman spectroscopy and energy dispersive spectroscopy analysis. Thermal treatment was also applied up to 800 °C under vacuum to further confirm the homogeneity of the powder samples, improve their quality, and show that no decomposition or phase separation into Ca- and Sr-enriched microdomains takes place.

  10. A Review of the Scattering-Parameter Extraction Method with Clarification of Ambiguity Issues in Relation to Metamaterial Homogenization

    DEFF Research Database (Denmark)

    Arslanagic, Samel; Hansen, Troels Vejle; Mortensen, N. Asger

    2013-01-01

    The scattering-parameter extraction method of metamaterial homogenization is reviewed to show that the only ambiguity is that related to the choice of the branch of the complex logarithmic function (or the complex inverse cosine function). It is shown that the method has no ambiguity for the sign...

  11. From creep damage mechanics to homogenization methods a liber amicorum to celebrate the birthday of Nobutada Ohno

    CERN Document Server

    Matsuda, Tetsuya; Okumura, Dai

    2015-01-01

    This volume presents a collection of contributions on materials modeling, which were written to celebrate the 65th birthday of Prof. Nobutada Ohno. The book follows Prof. Ohno’s scientific topics, starting with creep damage problems and ending with homogenization methods.

  12. The relationship between continuum homogeneity and statistical homogeneity in cosmology

    International Nuclear Information System (INIS)

    Stoeger, W.R.; Ellis, G.F.R.; Hellaby, C.

    1987-01-01

    Although the standard Friedmann-Lemaitre-Robertson-Walker (FLRW) Universe models are based on the concept that the Universe is spatially homogeneous, up to the present time no definition of this concept has been proposed that could in principle be tested by observation. Such a definition is here proposed, based on a simple spatial averaging procedure, which relates observable properties of the Universe to the continuum homogeneity idea that underlies the FLRW models. It turns out that the statistical homogeneity often used to describe the distribution of matter on a large scale does not imply spatial homogeneity according to this definition, and so cannot be simply related to a FLRW Universe model. Values are proposed for the homogeneity parameter and length scale of homogeneity of the Universe. (author)

  13. Colloidal properties of sodium caseinate-stabilized nanoemulsions prepared by a combination of a high-energy homogenization and evaporative ripening methods.

    Science.gov (United States)

    Montes de Oca-Ávalos, J M; Candal, R J; Herrera, M L

    2017-10-01

    Nanoemulsions stabilized by sodium caseinate (NaCas) were prepared using a combination of high-energy homogenization and evaporative ripening methods. The effects of protein concentration and sucrose addition on physical properties were analyzed by dynamic light scattering (DLS), Turbiscan analysis, confocal laser scanning microscopy (CLSM) and small angle X-ray scattering (SAXS). Droplet sizes were smaller (~100 nm in diameter) than those obtained by other methods (200 to 2000 nm in diameter), and the stability behavior was also different: these emulsions were not destabilized by creaming. Because the droplets were so small, gravitational forces were negligible; when destabilization did occur, the main mechanism was flocculation. Stability of the nanoemulsions increased with increasing protein concentration. Nanoemulsions with 3 or 4 wt% NaCas were slightly turbid systems that remained stable for at least two months. According to SAXS and Turbiscan results, aggregates remained in the nano range, showing little tendency to aggregate further; in those systems, interactive forces were weak due to the small diameter of the flocs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. On the use, by Einstein, of the principle of dimensional homogeneity, in three problems of the physics of solids

    Directory of Open Access Journals (Sweden)

    FERNANDO L. LOBO B. CARNEIRO

    2000-12-01

    Full Text Available Einstein, in 1911, published an article on the application of the principle of dimensional homogeneity to three problems of the physics of solids: the characteristic frequency of the atomic lattices of crystalline solids as a function of their moduli of compressibility or of their melting points, and the thermal conductivity of crystalline insulators. Recognizing that the physical dimensions of temperature are not the same as those of energy and heat, Einstein resorted to the artifice of replacing that physical parameter by its product with the Boltzmann constant, so obtaining correct results. Nowadays, with the new base quantities "thermodynamic temperature theta (unit: kelvin)", "electric current I (unit: ampere)" and "amount of substance (unit: mole)" incorporated into the SI International System of Units in 1960 and 1971, the same results are obtained in a more direct and coherent way. At the time of Einstein's article only three base physical quantities were considered: length L, mass M, and time T. He did not use the pi theorem of dimensional analysis, popularized by Buckingham three years later, and obtained the "pi numbers" by trial and error. The present paper revisits Einstein's article using the modern methodology of dimensional analysis and the theory of physical similitude.
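The pi-theorem procedure discussed in this record can be mechanized: dimensionless groups are exponent vectors in the null space of the dimension matrix. A minimal sketch, using the textbook pendulum variables t, l, g as an illustrative example (these variables and the numpy-based null-space computation are assumptions, not taken from the abstract):

```python
import numpy as np

# Buckingham pi via the null space of the dimension matrix.
# Columns: exponents of (L, M, T) for the variables (t, l, g) of a pendulum.
dim = np.array([
    [0.0, 1.0, 1.0],   # length exponents of t, l, g
    [0.0, 0.0, 0.0],   # mass exponents
    [1.0, 0.0, -2.0],  # time exponents
])
_, _, vt = np.linalg.svd(dim)
null = vt[-1]                    # basis vector of the (here 1-D) null space
pi_exponents = null / null[-1]   # normalize the exponent of g to 1
print(pi_exponents)              # proportional to (2, -1, 1): pi = t^2 g / l
```

The single null-space vector (2, -1, 1) recovers the familiar dimensionless group t²g/l without any trial and error.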

  15. Environment-based pin-power reconstruction method for homogeneous core calculations

    International Nuclear Information System (INIS)

    Leroyer, H.; Brosselard, C.; Girardi, E.

    2012-01-01

    Core calculation schemes are usually based on a classical two-step approach associated with assembly and core calculations. During the first step, infinite-lattice assembly calculations relying on a fundamental mode approach are used to generate cross-section libraries for PWR core calculations. This fundamental mode hypothesis may be questioned when dealing with loading patterns involving several types of assemblies (UOX, MOX), burnable poisons, control rods and burn-up gradients. This paper proposes a calculation method able to take into account the heterogeneous environment of the assemblies when using homogeneous core calculations and an appropriate pin-power reconstruction. This methodology is applied to MOX assemblies computed within an environment of UOX assemblies. The new environment-based pin-power reconstruction is then used on various clusters of 3x3 assemblies showing burn-up gradients and UOX/MOX interfaces, and compared to reference calculations performed with APOLLO-2. The results show that UOX/MOX interfaces are calculated much better with the environment-based calculation scheme than with the usual pin-power reconstruction method. The power peak is always better located and calculated with the environment-based pin-power reconstruction method in every cluster configuration studied. This study shows that taking the environment into account in transport calculations can significantly improve the pin-power reconstruction insofar as it is consistent with the core loading pattern. (authors)
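The basic idea of pin-power reconstruction in the two-step scheme above is to modulate the homogenized nodal power with heterogeneous form factors taken from the lattice calculation. A toy sketch of that standard step (the function name, the 2x2 assembly, and the unit-mean normalization are illustrative assumptions, not the APOLLO-2 or environment-based implementation):

```python
import numpy as np

def reconstruct_pin_powers(node_power, form_factors):
    # pin power = homogenized nodal power x heterogeneous form factor,
    # with the form factors normalized to unit mean over the assembly
    f = np.asarray(form_factors, dtype=float)
    f = f / f.mean()
    return node_power * f

# a 2x2 toy assembly: nodal power 100, form factors from a lattice calculation
print(reconstruct_pin_powers(100.0, [1.2, 0.8, 1.0, 1.0]))
```

The environment-based method of the record improves on this by replacing the infinite-lattice form factors with ones consistent with the assembly's actual neighbors.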

  16. Multilevel Monte Carlo Approaches for Numerical Homogenization

    KAUST Repository

    Efendiev, Yalchin R.

    2015-10-01

    In this article, we study the application of multilevel Monte Carlo (MLMC) approaches to numerical random homogenization. Our objective is to compute the expectation of some functionals of the homogenized coefficients, or of the homogenized solutions. This is accomplished within MLMC by considering different sizes of representative volumes (RVEs). Many inexpensive computations with the smallest RVE size are combined with fewer expensive computations performed on larger RVEs. Likewise, when it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison to a standard Monte Carlo method. Numerical results are presented for both one-dimensional and two-dimensional test-cases that illustrate the efficiency of the approach.
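The level structure described above (many cheap small-RVE samples, few expensive large-RVE samples, coupled by a telescoping sum) can be sketched for a 1-D toy problem where the homogenized coefficient of a random layered medium is the harmonic mean over the RVE. Everything below (lognormal cells, the level sizes, the sample counts) is an illustrative assumption, not the authors' setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def homogenized_coeff(n):
    # 1-D toy RVE: harmonic mean of n i.i.d. lognormal cell coefficients
    a = np.exp(rng.standard_normal(n))
    return 1.0 / np.mean(1.0 / a)

def mlmc_estimate(levels, samples_per_level):
    # E[A_L] = E[A_0] + sum_l E[A_l - A_{l-1}]; cheap levels get many samples
    est = 0.0
    for l, (n, m) in enumerate(zip(levels, samples_per_level)):
        if l == 0:
            est += np.mean([homogenized_coeff(n) for _ in range(m)])
        else:
            # couple levels by computing the coarse value on a sub-window
            # of the same realization, so the difference has small variance
            diffs = []
            for _ in range(m):
                a = np.exp(rng.standard_normal(n))
                fine = 1.0 / np.mean(1.0 / a)
                coarse = 1.0 / np.mean(1.0 / a[: levels[l - 1]])
                diffs.append(fine - coarse)
            est += np.mean(diffs)
    return est

print(mlmc_estimate([16, 64, 256], [4000, 400, 40]))
```

For this toy medium the exact homogenized coefficient tends to exp(-1/2) ≈ 0.61 as the RVE grows, so the estimate can be sanity-checked against a known limit.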

  17. Cellular uptake of beta-carotene from protein stabilized solid lipid nano-particles prepared by homogenization-evaporation method

    Science.gov (United States)

    Using a homogenization-evaporation method, beta-carotene (BC) loaded nano-particles were prepared with different ratios of food-grade sodium caseinate (SC), whey protein isolate (WPI), or soy protein isolate (SPI) to BC and evaluated for their physiochemical stability, in vitro cytotoxicity, and cel...

  18. Preparation and Optimization of 10-Hydroxycamptothecin Nanocolloidal Particles Using Antisolvent Method Combined with High Pressure Homogenization

    Directory of Open Access Journals (Sweden)

    Bolin Lian

    2017-01-01

    Full Text Available The aim of this study was to prepare 10-hydroxycamptothecin nanocolloidal particles (HCPTNPs) to increase the drug's solubility, reduce its toxicity, and improve its stability. HCPTNPs were prepared by the antisolvent precipitation (AP) method combined with high pressure homogenization (HPH), followed by lyophilization. The main parameters of the antisolvent process, including the volume ratio of dimethyl sulfoxide (DMSO) to H2O and the dripping speed, were optimized, and their effects on the mean particle size (MPS) and yield of HCPT primary particles were investigated. In the high pressure homogenization procedure, the type and amount of surfactant and the homogenization pressure (HP) were optimized, and their influences on MPS, zeta potential (ZP), and morphology were analyzed. The optimum conditions for HCPTNPs were as follows: 0.2 mg/mL HCPT aqueous suspension, 1% of ASS, 1000 bar of HP, and 20 passes. Finally, the HCPTNPs obtained via lyophilization using glucose as lyoprotectant under optimum conditions had an MPS of 179.6 nm and a ZP of 28.79 ± 1.97 mV. A short-term stability study of the HCPTNPs indicated that the MPS varied only over a small range.

  19. Multivariate analysis methods in physics

    International Nuclear Information System (INIS)

    Wolter, M.

    2007-01-01

    A review of multivariate methods based on statistical training is given. Several multivariate methods useful in high-energy physics analysis are discussed. Selected examples from current research in particle physics are presented, covering both on-line trigger selection and off-line analysis. Statistical training methods are also presented and some new applications are suggested.

  20. Precipitation-lyophilization-homogenization (PLH) for preparation of clarithromycin nanocrystals: influencing factors on physicochemical properties and stability.

    Science.gov (United States)

    Morakul, Boontida; Suksiriworapong, Jiraphong; Leanpolchareanchai, Jiraporn; Junyaprasert, Varaporn Buraphacheep

    2013-11-30

    Nanocrystal technology is one of the effective approaches used to improve the solubility and dissolution behavior of poorly soluble drugs. Clarithromycin is classified in BCS class II, having low bioavailability due to very low dissolution. The main purpose of this study was to investigate the efficiency of clarithromycin nanocrystal preparation by the precipitation-lyophilization-homogenization (PLH) combination method in comparison with the high pressure homogenization (HPH) method. The factors influencing particle size reduction and physical stability were assessed. The results showed that the PLH technique provided an effective and rapid reduction of the particle size of the nanocrystals to 460 ± 10 nm with a homogeneous size distribution after only the fifth cycle of homogenization, whereas the same size was attained after 30 cycles by the HPH method. The smallest nanocrystals were achieved by using the combination of poloxamer 407 (2%, w/v) and SLS (0.1%, w/v) as stabilizers. This combination could prevent particle aggregation over 3-month storage at 4 °C. The results from SEM showed that the clarithromycin nanocrystals were cubic-shaped, similar to the initial particle morphology. The DSC thermogram and X-ray diffraction pattern of the nanocrystals did not differ from those of the original drug except for peak intensity, indicating that the nanocrystals were present in the crystalline state and/or a partially amorphous form. In addition, the dissolution of the clarithromycin nanocrystals was dramatically increased as compared to the coarse clarithromycin. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Homogeneous Nucleation Rate Measurements in Supersaturated Water Vapor

    Czech Academy of Sciences Publication Activity Database

    Brus, David; Ždímal, Vladimír; Smolík, Jiří

    2008-01-01

    Roč. 129, č. 17 (2008), 174501-1-174501-8. ISSN 0021-9606. R&D Projects: GA ČR GA101/05/2214. Institutional research plan: CEZ:AV0Z40720504. Keywords: homogeneous nucleation * water * diffusion chamber. Subject RIV: CF - Physical; Theoretical Chemistry. Impact factor: 3.149, year: 2008

  2. Analysis of the permeability characteristics along rough-walled fractures using a homogenization method

    International Nuclear Information System (INIS)

    Chae, Byung Gon; Choi, Jung Hae; Ichikawa, Yasuaki; Seo, Yong Seok

    2012-01-01

    To compute a permeability coefficient along a rough fracture that takes the fracture geometry into account, this study performed detailed measurements of fracture roughness using a confocal laser scanning microscope, a quantitative analysis of roughness using spectral analysis, and a homogenization analysis to calculate the permeability coefficient on the micro- and macro-scale. The homogenization analysis is a type of perturbation theory that characterizes the behavior of a microscopically inhomogeneous material with a periodic boundary condition in the microstructure. Therefore, it is possible to analyze accurate permeability characteristics that reflect the local effect of the fracture geometry. The permeability coefficients calculated using the homogenization analysis for each rough fracture model exhibit an irregular distribution and do not follow the relationship of the cubic law. This distribution suggests that the permeability characteristics strongly depend on the geometric conditions of the fractures, such as the roughness and the aperture variation. The homogenization analysis may allow us to produce more accurate results than are possible with the preexisting equations for calculating permeability.
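For context, the cubic law mentioned above states that the transmissivity of a parallel-plate fracture scales as the cube of its aperture (permeability k = b²/12), and a rough fracture deviates from it because narrow segments dominate the series resistance along the flow path. A hedged numerical sketch, not the paper's homogenization analysis: the aperture profile and the harmonic-mean combination of local cubic-law segments are illustrative assumptions.

```python
import numpy as np

def cubic_law_k(aperture):
    # parallel-plate (cubic law) permeability for a uniform aperture b: k = b^2/12
    return aperture**2 / 12.0

def effective_k_rough(apertures):
    # series combination of local cubic-law segments along the flow path:
    # flux continuity gives a harmonic mean of b^3, normalized by the mean aperture
    b = np.asarray(apertures, dtype=float)
    t_eff = 1.0 / np.mean(1.0 / b**3)     # effective transmissivity ~ b_h^3
    return t_eff / (12.0 * np.mean(b))

b = [1.0, 0.5, 2.0, 1.5]  # rough aperture profile (arbitrary units)
print(cubic_law_k(np.mean(b)), effective_k_rough(b))
```

The effective value is well below the cubic-law value for the mean aperture, which is the qualitative deviation the abstract reports.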

  3. A Homogeneous Time-Resolved Fluorescence Immunoassay Method for the Measurement of Compound W.

    Science.gov (United States)

    Huang, Biao; Yu, Huixin; Bao, Jiandong; Zhang, Manda; Green, William L; Wu, Sing-Yung

    2018-01-01

    Using compound W (a 3,3'-diiodothyronine sulfate [T2S] immuno-crossreactive material)-specific polyclonal antibodies and a homogeneous time-resolved fluorescence immunoassay technique (AlphaLISA), we established an indirect competitive compound W (ICW) quantitative detection method. Photosensitive particles (donor beads) coated with compound W or T2S and rabbit anti-W antibody were incubated with biotinylated goat anti-rabbit antibody, constituting a detection system with streptavidin-coated acceptor particles. We optimized the test conditions and evaluated the detection performance. The sensitivity of the method was 5 pg/mL, and the detection range was 5 to 10 000 pg/mL. The intra-assay coefficient of variation was determined, and the method was applied to measure W levels in extracts of maternal serum samples. This may have clinical application in screening for congenital hypothyroidism in utero.

  4. An Improved Surface Simplification Method for Facial Expression Animation Based on Homogeneous Coordinate Transformation Matrix and Maximum Shape Operator

    Directory of Open Access Journals (Sweden)

    Juin-Ling Tseng

    2016-01-01

    Full Text Available Facial animation is one of the most popular 3D animation topics researched in recent years. However, when using facial animation, a 3D facial animation model has to be stored, and this model requires many triangles to accurately describe and demonstrate facial expression animation because the face often presents a number of different expressions. Consequently, the costs associated with facial animation have increased rapidly. In an effort to reduce storage costs, researchers have sought to simplify 3D animation models using techniques such as Deformation Sensitive Decimation and Feature Edge Quadric. The studies conducted have examined the problems in the homogeneity of the local coordinate system between different expression models and in the retention of simplified model characteristics. This paper proposes a method that applies a Homogeneous Coordinate Transformation Matrix to solve the problem of homogeneity of the local coordinate system, and a Maximum Shape Operator to detect shape changes in facial animation so as to properly preserve the features of facial expressions. Further, root mean square error and perceived quality error are used to compare the errors generated by different simplification methods in experiments. Experimental results show that, compared with Deformation Sensitive Decimation and Feature Edge Quadric, our method can not only reduce the errors caused by simplification of facial animation, but also retain more facial features.
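A homogeneous coordinate transformation matrix, as used in the method above, represents rotation, scaling and translation of 3D vertices as a single 4x4 linear map. A minimal sketch of applying such a matrix (the translation example is illustrative, not the paper's simplification algorithm):

```python
import numpy as np

def transform(vertices, matrix):
    # apply a 4x4 homogeneous transform to an (n, 3) vertex array:
    # append w = 1, multiply, then divide by w to return to 3D
    v = np.hstack([vertices, np.ones((len(vertices), 1))])
    w = v @ matrix.T
    return w[:, :3] / w[:, 3:4]

# translation by (1, 2, 3) expressed as a homogeneous matrix
M = np.eye(4)
M[:3, 3] = [1.0, 2.0, 3.0]
print(transform(np.array([[0.0, 0.0, 0.0]]), M))
```

Because translation becomes a matrix product, different local coordinate systems can be brought into a common frame by a single matrix per expression model.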

  5. Comparative analysis of storage conditions and homogenization methods for tick and flea species for identification by MALDI-TOF MS.

    Science.gov (United States)

    Nebbak, A; El Hamzaoui, B; Berenger, J-M; Bitam, I; Raoult, D; Almeras, L; Parola, P

    2017-12-01

    Ticks and fleas are vectors for numerous human and animal pathogens. Controlling them, which is important in combating such diseases, requires accurate identification, to distinguish between vector and non-vector species. Recently, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) was applied to the rapid identification of arthropods. The growth of this promising tool, however, requires guidelines to be established. To this end, standardization protocols were applied to species of Rhipicephalus sanguineus (Ixodida: Ixodidae) Latreille and Ctenocephalides felis felis (Siphonaptera: Pulicidae) Bouché, including the automation of sample homogenization using two homogenizer devices, and varied sample preservation modes for a period of 1-6 months. The MS spectra were then compared with those obtained from manual pestle grinding, the standard homogenization method. Both automated methods generated intense, reproducible MS spectra from fresh specimens. Frozen storage methods appeared to represent the best preservation mode, for up to 6 months, while storage in ethanol is also possible, with some caveats for tick specimens. Carnoy's buffer, however, was shown to be less compatible with MS analysis for the purpose of identifying ticks or fleas. These standard protocols for MALDI-TOF MS arthropod identification should be complemented by additional MS spectrum quality controls, to generalize their use in monitoring arthropods of medical interest. © 2017 The Royal Entomological Society.

  6. Measurements of the mirror surface homogeneity in the CBM-RICH

    Energy Technology Data Exchange (ETDEWEB)

    Lebedeva, Elena; Hoehne, Claudia [II. Physikalisches Institut, JLU Giessen (Germany); Collaboration: CBM-Collaboration

    2016-07-01

    The Compressed Baryonic Matter (CBM) experiment at the future FAIR (Facility for Antiproton and Ion Research) complex will investigate the phase diagram of strongly interacting matter at high baryon densities and moderate temperatures in A+A collisions from 2-11 AGeV (SIS100) beam energy. One of the key detector components required for the CBM physics program is the RICH (Ring Imaging CHerenkov) detector, which is developed for efficient and clean electron identification and pion suppression. The CBM-RICH detector is being planned with a gaseous radiator and a standard projective geometry with focusing mirror elements and photon detector planes. One of the important criteria for the selection of appropriate mirrors is their optical surface quality (surface homogeneity): it defines the imaging quality of the projected Cherenkov rings and directly affects the ring finding and fitting performance. The global homogeneity has been tested with the D0 measurement. Local deformations, e.g. by the mirror holding structure, can be investigated with the Ronchi test and the Shack-Hartmann method, from which first results are discussed in this contribution.

  7. Nuclear physics methods in materials research

    International Nuclear Information System (INIS)

    Bethge, K.; Baumann, H.; Jex, H.; Rauch, F.

    1980-01-01

    Proceedings of the seventh divisional conference of the Nuclear Physics Division held at Darmstadt, Germany, from 23rd through 26th of September, 1980. The scope of this conference was defined as follows: i) to inform solid state physicists and materials scientists about the application of nuclear physics methods; ii) to show to nuclear physicists open questions and problems in solid state physics and materials science to which their methods can be applied. According to the intentions of the conference, the various nuclear physics methods utilized in solid state physics and materials science and especially new developments were reviewed by invited speakers. Detailed aspects of the methods and typical examples extending over a wide range of applications were presented as contributions in poster sessions. The Proceedings contain all the invited papers and about 90% of the contributed papers. (orig./RW)

  8. Advances in methods of commercial FBR core characteristics analyses. Investigations of a treatment of the double-heterogeneity and a method to calculate homogenized control rod cross sections

    Energy Technology Data Exchange (ETDEWEB)

    Sugino, Kazuteru [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center; Iwai, Takehiko

    1998-07-01

    A standard database for FBR core nuclear design is under development in order to improve the accuracy of FBR design calculations. As part of the development, we investigated an improved treatment of double-heterogeneity and a method to calculate homogenized control rod cross sections in a commercial reactor geometry, to improve the analytical accuracy of commercial FBR core characteristics. As an improvement in the treatment of double-heterogeneity, we derived a new method (the direct method) and compared both this and the conventional method with continuous-energy Monte Carlo calculations. In addition, we investigated the applicability of the reaction rate ratio preservation method as an advanced method to calculate homogenized control rod cross sections. The present studies gave the following information: (1) An improved treatment of double-heterogeneity: for criticality, the conventional method showed good agreement with the Monte Carlo result within one standard deviation; the direct method was consistent with the conventional one. Preliminary evaluation of effects on core characteristics other than criticality showed that the effect of the double-heterogeneity on sodium void reactivity (coolant reactivity) was large. (2) An advanced method to calculate homogenized control rod cross sections: the control rod worths obtained by the reaction rate ratio preservation method agreed with those produced by calculations with the control rod heterogeneity included in the core geometry; in the Monju control rod worth analysis, the present method overestimated control rod worths by 1 to 2% compared with the conventional method, but these differences were caused by the more accurate model in the present method, and it is considered that this method is more reliable than the conventional one. These two methods investigated in this study can be directly applied to core characteristics other than criticality or control rod worth. Thus it is concluded that these methods will
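The conventional homogenization this record compares against is flux-volume weighting of region cross sections. A one-line sketch of that standard weighting (the numbers are illustrative; the reaction rate ratio preservation method additionally constrains the homogenized model to reproduce reference reaction-rate ratios, which is not shown here):

```python
import numpy as np

def flux_weighted_xs(sigma, phi, vol):
    # standard flux-volume weighting: Sigma_hom = sum(Sigma*phi*V) / sum(phi*V)
    sigma, phi, vol = map(np.asarray, (sigma, phi, vol))
    return float(np.sum(sigma * phi * vol) / np.sum(phi * vol))

# two regions (absorber pin + follower): cross sections, fluxes, volumes
print(flux_weighted_xs([0.2, 0.5], [1.0, 0.5], [1.0, 1.0]))  # -> 0.3
```

Plain flux weighting preserves region reaction rates only for the reference flux; when the rod sits in a different core environment, the preserved-ratio correction of the record becomes relevant.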

  9. Diffusion piecewise homogenization via flux discontinuity ratios

    International Nuclear Information System (INIS)

    Sanchez, Richard; Dante, Giorgio; Zmijarevic, Igor

    2013-01-01

    We analyze piecewise homogenization with flux-weighted cross sections and preservation of averaged currents at the boundary of the homogenized domain. Introduction of a set of flux discontinuity ratios (FDR) that preserve reference interface currents leads to preservation of averaged region reaction rates and fluxes. We consider the class of numerical discretizations with one degree of freedom per volume and per surface and prove that when the homogenization and computing meshes are equal there is a unique solution for the FDRs which exactly preserves interface currents. For diffusion sub-meshing we introduce a Jacobian-free Newton-Krylov method and for all cases considered obtain an 'exact' numerical solution (eight digits for the interface currents). The homogenization is completed by extending the familiar full-assembly homogenization via flux discontinuity factors to the sides of regions lying on the boundary of the piecewise homogenized domain. Finally, for the familiar nodal discretization we numerically find that the FDRs obtained with no sub-mesh (nearly at no cost) can be effectively used for whole-core diffusion calculations with sub-mesh. This is not the case, however, for cell-centered finite differences. (authors)

  10. Peripheral nerve magnetic stimulation: influence of tissue non-homogeneity

    Directory of Open Access Journals (Sweden)

    Papazov Sava P

    2003-12-01

    Full Text Available Background: Peripheral nerves are situated in a highly non-homogeneous environment, including muscles, bones, blood vessels, etc. Time-varying magnetic field stimulation of the median and ulnar nerves in the carpal region is studied, with special consideration of the influence of non-homogeneities. Methods: A detailed three-dimensional finite element model (FEM) of the anatomy of the wrist region was built to assess the induced current distribution under external magnetic stimulation. The electromagnetic field distribution in the non-homogeneous domain was defined as an internal Dirichlet problem using the finite element method. The boundary conditions were obtained by analysis of the vector potential field excited by external current-driven coils. Results: The results include evaluation and graphical representation of the induced current field distribution at various stimulation coil positions. A comparative study of the real non-homogeneous structure with anisotropic tissue conductivities and a mock homogeneous medium is also presented. The possibility of achieving selective stimulation of either of the two nerves is assessed. Conclusion: The model developed could be useful in theoretical prediction of the current distribution in the nerves during diagnostic stimulation and therapeutic procedures involving electromagnetic excitation. The errors introduced by modeling a homogeneous domain rather than the real non-homogeneous biological structure are demonstrated. The practical implications of the applied approach are valid for any arbitrary weakly conductive medium.

  11. Testing the ISP method with the PARIO device: Accuracy of results and influence of homogenization technique

    Science.gov (United States)

    Durner, Wolfgang; Huber, Magdalena; Yangxu, Li; Steins, Andi; Pertassek, Thomas; Göttlein, Axel; Iden, Sascha C.; von Unold, Georg

    2017-04-01

    The particle-size distribution (PSD) is one of the main properties of soils. To determine the proportions of the fine fractions, silt and clay, sedimentation experiments are used; most common are the pipette and hydrometer methods. Both need manual sampling at specific times, and both are thus time-demanding and rely on experienced operators. Durner et al. (Durner, W., S.C. Iden, and G. von Unold (2017): The integral suspension pressure method (ISP) for precise particle-size analysis by gravitational sedimentation, Water Resources Research, doi:10.1002/2016WR019830) recently developed the integral suspension pressure (ISP) method, which is implemented in the METER Group device PARIO™. This new method estimates continuous PSDs from sedimentation experiments by recording the temporal evolution of the suspension pressure at a certain measurement depth in a sedimentation cylinder. It requires no manual interaction after start and thus no specialized training of the lab personnel. The aim of this study was to test the precision and accuracy of the new method with a variety of materials, to answer the following research questions: (1) Are the results obtained by PARIO reliable and stable? (2) Are the results affected by the initial mixing technique used to homogenize the suspension, or by the presence of sand in the experiment? (3) Are the results identical to those obtained with the pipette method as reference method? The experiments were performed with a pure quartz silt material and four real soil materials. PARIO measurements were done repetitively on the same samples in a temperature-controlled lab to characterize the repeatability of the measurements. Subsequently, the samples were investigated by the pipette method to validate the results. We found that the statistical error for the silt fraction from replicate and repetitive measurements was in the range of 1% for the quartz material to 3% for the soil materials.
Since the sand fractions, as in any sedimentation method, must
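The sampling times in pipette, hydrometer, and ISP sedimentation all derive from Stokes' law for the terminal settling velocity of a sphere. A hedged sketch with typical assumed parameter values (quartz density 2650 kg/m³, water at roughly 20 °C; none of these numbers are from the abstract):

```python
import math

def stokes_settling_time(depth_m, diameter_m, rho_s=2650.0, rho_f=1000.0,
                         mu=1.0e-3, g=9.81):
    # Stokes' law: terminal velocity v = g (rho_s - rho_f) d^2 / (18 mu),
    # the basis of all gravitational sedimentation PSD methods
    v = g * (rho_s - rho_f) * diameter_m**2 / (18.0 * mu)
    return depth_m / v

# time for a 2 um clay-sized particle to settle 10 cm (a classical pipette depth)
print(stokes_settling_time(0.10, 2e-6) / 3600.0, "hours")
```

The hours-long settling time for clay-sized particles is exactly why manual pipette sampling is so time-demanding, and why an unattended pressure recording is attractive.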

  12. Homogenization of aligned “fuzzy fiber” composites

    KAUST Repository

    Chatzigeorgiou, George

    2011-09-01

    The aim of this work is to study composites in which carbon fibers coated with radially aligned carbon nanotubes are embedded in a matrix. The effective properties of these composites are identified using the asymptotic expansion homogenization method in two steps. Homogenization is performed in different coordinate systems, the cylindrical and the Cartesian, and a numerical example is presented. © 2011 Elsevier Ltd. All rights reserved.

  13. OECD/NEA benchmark for time-dependent neutron transport calculations without spatial homogenization

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Jason, E-mail: jason.hou@ncsu.edu [Department of Nuclear Engineering, North Carolina State University, Raleigh, NC 27695 (United States); Ivanov, Kostadin N. [Department of Nuclear Engineering, North Carolina State University, Raleigh, NC 27695 (United States); Boyarinov, Victor F.; Fomichenko, Peter A. [National Research Centre “Kurchatov Institute”, Kurchatov Sq. 1, Moscow (Russian Federation)

    2017-06-15

    Highlights: • A time-dependent homogenization-free neutron transport benchmark was created. • The first phase, known as the kinetics phase, was described in this work. • Preliminary results for selected 2-D transient exercises were presented. - Abstract: A Nuclear Energy Agency (NEA), Organization for Economic Co-operation and Development (OECD) benchmark for the time-dependent neutron transport calculations without spatial homogenization has been established in order to facilitate the development and assessment of numerical methods for solving the space-time neutron kinetics equations. The benchmark has been named the OECD/NEA C5G7-TD benchmark, and later extended with three consecutive phases each corresponding to one modelling stage of the multi-physics transient analysis of the nuclear reactor core. This paper provides a detailed introduction of the benchmark specification of Phase I, known as the “kinetics phase”, including the geometry description, supporting neutron transport data, transient scenarios in both two-dimensional (2-D) and three-dimensional (3-D) configurations, as well as the expected output parameters from the participants. Also presented are the preliminary results for the initial state 2-D core and selected transient exercises that have been obtained using the Monte Carlo method and the Surface Harmonic Method (SHM), respectively.

  14. The General Theory of Homogenization A Personalized Introduction

    CERN Document Server

    Tartar, Luc

    2010-01-01

    Homogenization is not about periodicity, or Gamma-convergence, but about understanding which effective equations to use at the macroscopic level, knowing which partial differential equations govern mesoscopic levels, without using probabilities (which destroy physical reality); instead, one uses various topologies of weak type: the G-convergence of Sergio Spagnolo, the H-convergence of François Murat and the author, and others responsible for the appearance of nonlocal effects, which many theories in continuum mechanics or physics guessed wrongly. For a better understanding of 20th century science,

  15. Monte Carlo Methods in Physics

    International Nuclear Information System (INIS)

    Santoso, B.

    1997-01-01

    The method of Monte Carlo integration is reviewed briefly and some of its applications in physics are explained. A numerical experiment on the random number generators used in Monte Carlo techniques is carried out to show the randomness behavior of the various generation methods. To account for the weight function involved in Monte Carlo calculations, the Metropolis method is used. The results of the experiment show no regular pattern in the numbers generated, indicating that the program generators are reasonably good, while the experimental results follow the expected statistical distribution law. Further applications of the Monte Carlo method in physics are then given. The physical problems are chosen such that the models have available solutions, either exact or approximate, with which the Monte Carlo calculations can be compared. The comparisons show that, for the models considered, good agreement has been obtained.
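
    As the abstract describes, Monte Carlo estimates are best checked against models with known solutions. The following is a minimal illustrative sketch of plain Monte Carlo integration (not the paper's code; the function name and fixed seed are my own choices), compared against an integral with a known exact value:

```python
import random

def mc_integrate(f, a, b, n=100_000, seed=1):
    """Plain Monte Carlo estimate of the integral of f over [a, b]."""
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

# Comparison against a model with a known solution:
# the integral of x^2 over [0, 1] is exactly 1/3.
estimate = mc_integrate(lambda x: x * x, 0.0, 1.0)
error = abs(estimate - 1.0 / 3.0)  # statistical error shrinks like 1/sqrt(n)
```

    The Metropolis method mentioned in the abstract replaces the uniform sampler above with a Markov chain that draws points according to a weight function; the uniform version shown here is the simplest special case.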

  16. Benchmarking homogenization algorithms for monthly data

    Directory of Open Access Journals (Sweden)

    V. K. C. Venema

    2012-01-01

    Full Text Available The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added.

    Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve
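
    The first two performance metrics named above can be sketched in a few lines. This is a minimal illustration, not HOME's evaluation code; the function names and the per-time-step trend convention are assumptions of the sketch:

```python
import statistics

def centered_rmse(homogenized, truth):
    """Centered RMSE: each series' mean is removed, so only shape errors count."""
    hm = statistics.fmean(homogenized)
    tm = statistics.fmean(truth)
    sq = [((h - hm) - (t - tm)) ** 2 for h, t in zip(homogenized, truth)]
    return statistics.fmean(sq) ** 0.5

def trend_error(homogenized, truth):
    """Difference of ordinary least-squares linear trends (per time step)."""
    def slope(y):
        n = len(y)
        x = range(n)
        xm, ym = (n - 1) / 2, statistics.fmean(y)
        num = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
        den = sum((xi - xm) ** 2 for xi in x)
        return num / den
    return slope(homogenized) - slope(truth)
```

    Centering makes the RMSE insensitive to a constant offset between the homogenized and true series, which is why it isolates the shape of the inserted breaks from any overall bias.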

  17. Hypersurface-homogeneous Universe filled with perfect fluid in f ( R ...

    Indian Academy of Sciences (India)

    homogeneous Universe filled with perfect fluid in the framework of f ( R , T ) theory of gravity (Harko et al, \\emph{Phys. Rev.} D 84, 024020 (2011)) is derived. The physical behaviour of the cosmological model is studied.

  18. Spontaneous compactification to homogeneous spaces

    International Nuclear Information System (INIS)

    Mourao, J.M.

    1988-01-01

    The spontaneous compactification of extra dimensions to compact homogeneous spaces is studied. The methods developed within the framework of the coset space dimensional reduction scheme and the most general form of invariant metrics are used to find solutions of the spontaneous compactification equations

  19. The coherent state on SUq(2) homogeneous space

    International Nuclear Information System (INIS)

    Aizawa, N; Chakrabarti, R

    2009-01-01

    The generalized coherent states for quantum groups introduced by Jurco and Stovicek are studied for the simplest example SUq(2) in full detail. It is shown that the normalized SUq(2) coherent states enjoy the property of completeness, and allow a resolution of the unity. This feature is expected to play a key role in the application of these coherent states in physical models. The homogeneous space of SUq(2), i.e. the q-sphere of Podles, is reproduced in complex coordinates by using the coherent states. Differential calculus in the complex form on the homogeneous space is developed. The high spin limit of the SUq(2) coherent states is also discussed.

  20. A Method for Ferulic Acid Production from Rice Bran Oil Soapstock Using a Homogenous System

    Directory of Open Access Journals (Sweden)

    Hoa Thi Truong

    2017-08-01

    Full Text Available Ferulic acid (FA) is widely used as an antioxidant, e.g., as an ultraviolet (UV) protectant in cosmetics and in various medical applications. It has been produced by the hydrolysis of γ-oryzanol found in rice bran oil soapstock. In this study, the base-catalyzed, homogeneous hydrolysis of γ-oryzanol was conducted using various ratios of potassium hydroxide (KOH) to γ-oryzanol, initial concentrations of γ-oryzanol in the reaction mixture, and ratios of ethanol (EtOH, as cosolvent) to the ethyl acetate (EtOAc) solution of γ-oryzanol. Acceleration of the reaction using a planar-type ultrasound sonicator (78 and 130 kHz) at different reaction temperatures was explored. Using a heating method, an 80% yield of FA was attained at 75 °C in 4 h under homogeneous conditions (initial concentration of γ-oryzanol 12 mg/mL, KOH/γ-oryzanol ratio (wt/wt) of 10/1, and EtOH/EtOAc ratio (v/v) of 5/1). With the assistance of 78 and 130 kHz irradiation, the yields reached 90%. The heating method was applied to the γ-oryzanol-containing extract prepared from rice bran oil soapstock. From the soapstock, a 74.3% yield of FA was obtained, but 20% of the trans-FA in the reaction mixture was transformed into the cis-form within one month.

  1. Towards a real time computation of the dose in a phantom segmented into homogeneous meshes

    International Nuclear Information System (INIS)

    Blanpain, B.

    2009-10-01

    Automatic radiation therapy treatment planning requires a very fast computation of the dose delivered to the patient. We propose to compute the dose by segmenting the patient's phantom into homogeneous meshes, and by associating with the meshes projections to dose distributions pre-computed in homogeneous phantoms, along with weights managing heterogeneities. The dose computation is divided into two steps. The first step concerns the meshes: projections and weights are set according to physical and geometrical criteria. The second step concerns the voxels: the dose is computed by evaluating the functions previously associated with their mesh. This method is very fast, in particular when there are few points of interest (several hundred). In this case, results are obtained in less than one second. With such performance, automatic treatment planning becomes practically feasible. (author)

  2. Preparation of the new certified reference material Virginia Tobacco Leaves (CTA-VTL-2) and development of suitable methods for checking the homogeneity

    International Nuclear Information System (INIS)

    Dybczynski, R.; Polkowska-Motrenko, H.; Samczynski, Z.; Szopa, Z.

    1994-01-01

    The long-run aim of this project has been the preparation of a new biological reference material, Tobacco Leaves of the 'Virginia' type, and its certification for the content of as large a number of trace elements as possible. Further aims have been the development of suitable methods for checking the homogeneity, with special emphasis on the homogeneity of small samples, and a critical analysis of the performance of various analytical techniques

  3. Nonlinear vibration of a traveling belt with non-homogeneous boundaries

    Science.gov (United States)

    Ding, Hu; Lim, C. W.; Chen, Li-Qun

    2018-06-01

    Free and forced nonlinear vibrations of a traveling belt with non-homogeneous boundary conditions are studied. Axially moving materials in operation are always externally excited and produce strong vibrations. Moving materials with homogeneous boundary conditions are usually considered. In this paper, non-homogeneous boundaries are introduced by the support wheels. Equilibrium deformation of the belt is produced by the non-homogeneous boundaries. In order to solve for the equilibrium deformation, the differential and integral quadrature methods (DIQMs) are utilized to develop an iterative scheme. The influence of the equilibrium deformation on free and forced nonlinear vibrations of the belt is explored. The DIQMs are applied to solve for the natural frequencies and forced resonance responses of transverse vibration around the equilibrium deformation. The Galerkin truncation method (GTM) is utilized to confirm the DIQMs' results. The numerical results demonstrate that the non-homogeneous boundary conditions cause the transverse vibration to deviate from the straight equilibrium, increase the natural frequencies, and lead to the coexistence of square and cubic nonlinear terms. Moreover, the influence of non-homogeneous boundaries can be exacerbated by the axial speed. Therefore, non-homogeneous boundary conditions of axially moving materials should especially be taken into account.

  4. An analysis of the Rose's shim method for improvement of magnetic field homogeneity

    International Nuclear Information System (INIS)

    Ban, Etsuo

    1981-01-01

    The well-known Rose method has been applied to magnets requiring high homogeneity (e.g. for magnetic resonance). The analysis of the Rose shim is based on conformal mapping, and it is applicable to poles of any form obtained by the combination of polygons. It provides rims for magnetic poles with 90° edges. In this paper, the solution is determined in terms of elliptic functions giving the magnetic field at any point in space, by directly integrating the Schwarz-Christoffel transformation instead of using the approximate numerical integration employed by Rose, and is compared with an example in which it is applied to a cylindrical pole. For the conditions of Rose's optimum correction, the exact solution is given as the case in which the parameter of Jacobi's elliptic function of the third kind equals one half of the complete elliptic integral of the first kind. Since Rose depended on approximate numerical integration, Rose's diagram showed slightly insufficient correction. It was found that a pole shape giving an excess correction of about 10⁻⁴ produced a good result for a cylindrical magnetic pole with a ratio of pole diameter to gap length of 2.5. In order to obtain a correction for which the change in homogeneity remains small up to considerably intense fields, the pole edges are required to be curved surfaces. (Wakatsuki, Y.)

  5. Jaws calibration method to get a homogeneous distribution of dose in the junction of hemi fields

    International Nuclear Information System (INIS)

    Cenizo de Castro, E.; Garcia Pareja, S.; Moreno Saiz, C.; Hernandez Rodriguez, R.; Bodineau Gil, C.; Martin-Viera Cueto, J. A.

    2011-01-01

    Hemi-field treatments are widely used in radiotherapy. Because the tolerance established for the positioning of each jaw is 1 mm, there may be cases of overlap or separation of up to 2 mm. This implies dose heterogeneities of up to 40% in the junction area. This paper presents an accurate method for calibrating the jaws so as to obtain homogeneous dose distributions when using this type of treatment. (Author)

  6. Large-scale Homogenization of Bulk Materials in Mammoth Silos

    NARCIS (Netherlands)

    Schott, D.L.

    2004-01-01

    This doctoral thesis concerns the large-scale homogenization of bulk materials in mammoth silos. The objective of this research was to determine the best stacking and reclaiming method for homogenization in mammoth silos. For this purpose a simulation program was developed to estimate the

  7. Statistical methods for physical science

    CERN Document Server

    Stanford, John L

    1994-01-01

    This volume of Methods of Experimental Physics provides an extensive introduction to probability and statistics in many areas of the physical sciences, with an emphasis on the emerging area of spatial statistics. The scope of topics covered is wide-ranging: the text discusses a variety of the most commonly used classical methods and addresses newer methods that are applicable or potentially important. The chapter authors motivate readers with their insightful discussions, augmenting their material with Key Features: * Examines basic probability, including coverage of standard distributions, time s

  8. Evolving self-assembly in autonomous homogeneous robots: experiments with two physical robots.

    Science.gov (United States)

    Ampatzis, Christos; Tuci, Elio; Trianni, Vito; Christensen, Anders Lyhne; Dorigo, Marco

    2009-01-01

    This research work illustrates an approach to the design of controllers for self-assembling robots in which the self-assembly is initiated and regulated by perceptual cues that are brought forth by the physical robots through their dynamical interactions. More specifically, we present a homogeneous control system that can achieve assembly between two modules (two fully autonomous robots) of a mobile self-reconfigurable system without a priori introduced behavioral or morphological heterogeneities. The controllers are dynamic neural networks evolved in simulation that directly control all the actuators of the two robots. The neurocontrollers cause the dynamic specialization of the robots by allocating roles between them based solely on their interaction. We show that the best evolved controller proves to be successful when tested on a real hardware platform, the swarm-bot. The performance achieved is similar to the one achieved by existing modular or behavior-based approaches, also due to the effect of an emergent recovery mechanism that was neither explicitly rewarded by the fitness function, nor observed during the evolutionary simulation. Our results suggest that direct access to the orientations or intentions of the other agents is not a necessary condition for robot coordination: Our robots coordinate without direct or explicit communication, contrary to what is assumed by most research works in collective robotics. This work also contributes to strengthening the evidence that evolutionary robotics is a design methodology that can tackle real-world tasks demanding fine sensory-motor coordination.

  9. Homogeneous Biosensing Based on Magnetic Particle Labels

    KAUST Repository

    Schrittwieser, Stefan

    2016-06-06

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation.

  10. Homogeneous Biosensing Based on Magnetic Particle Labels

    Science.gov (United States)

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J.; Lentijo-Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation. PMID:27275824

  11. Homogeneous Biosensing Based on Magnetic Particle Labels

    KAUST Repository

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang; Lentijo Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation.

  12. Formulation and Physical Test of Ethanolic Extract Sambiloto Leaves (Andrographis paniculata Ointment

    Directory of Open Access Journals (Sweden)

    Indri Kusuma Dewi

    2016-01-01

    Full Text Available Introduction: Andrographis paniculata is known to contain active substances such as andrographolide, essential oil, flavonoids, tannins, alkaloids, and saponins, which function as antibacterial, antitoxic, analgesic, and antipyretic agents. Based on its antibacterial properties, an ointment form of Andrographis paniculata leaf was formulated for practical usage. An ointment is a semi-solid formulation for topical use on the skin or mucosal membranes. Objectives: to determine the results of physical tests of an ethanolic extract Andrographis paniculata leaf ointment. Methods: the physical tests comprised organoleptic, pH, homogeneity, adhesion and spreadability tests. Results: the ointment was semi-solid and blackish green, with the characteristic smell of Andrographis paniculata; the pH was 6; the homogeneity test gave homogeneous results; the adhesion time was 82 seconds and the spreadability was 5.6 cm. Conclusion: the results of the physical tests of the ethanolic Andrographis paniculata leaf extract ointment met the quality standards.

  13. Evolving controllers for a homogeneous system of physical robots: structured cooperation with minimal sensors.

    Science.gov (United States)

    Quinn, Matt; Smith, Lincoln; Mayley, Giles; Husbands, Phil

    2003-10-15

    We report on recent work in which we employed artificial evolution to design neural network controllers for small, homogeneous teams of mobile autonomous robots. The robots were evolved to perform a formation-movement task from random starting positions, equipped only with infrared sensors. The dual constraints of homogeneity and minimal sensors make this a non-trivial task. We describe the behaviour of a successful system in which robots adopt and maintain functionally distinct roles in order to achieve the task. We believe this to be the first example of the use of artificial evolution to design coordinated, cooperative behaviour for real robots.

  14. Numerical analysis for Darcy-Forchheimer flow in presence of homogeneous-heterogeneous reactions

    Directory of Open Access Journals (Sweden)

    Muhammad Ijaz Khan

    Full Text Available A mathematical study is presented to investigate the influences of homogeneous and heterogeneous reactions in locally similar flow caused by a stretching sheet with a non-linear velocity and variable thickness. Porous-medium effects are characterized using the Darcy-Forchheimer model. A simple isothermal model of homogeneous-heterogeneous reactions is used. The multiphysical boundary value problem is governed by ten thermophysical parameters: the ratio of mass diffusion coefficients, Prandtl number, local inertia coefficient, inverse Darcy number, shape parameter, surface thickness parameter, Hartman number, homogeneous heat reaction parameter, strength of homogeneous-heterogeneous reactions and Schmidt number. The resulting systems are computed by the Runge-Kutta-Fehlberg method. Different velocity profiles are observed for n > 1 and n < 1. Keywords: Homogeneous-heterogeneous reactions, Non-Darcy porous medium, Variable sheet thickness, Homogeneous heat reaction with stoichiometric coefficient, Runge-Kutta-Fehlberg method
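
    The abstract names the Runge-Kutta-Fehlberg method but not the similarity equations themselves. As a generic, hedged sketch of the RKF45 scheme (classic Fehlberg tableau; a fixed step is used here for brevity, although in practice the built-in error estimate drives adaptive step-size control), tested on the scalar problem y' = y:

```python
import math

def rkf45_step(f, t, y, h):
    """One classic Fehlberg (RKF45) step; returns the 5th-order solution and
    the local error estimate (difference of the 4th- and 5th-order results)."""
    k1 = f(t, y)
    k2 = f(t + h/4, y + h*k1/4)
    k3 = f(t + 3*h/8, y + h*(3*k1 + 9*k2)/32)
    k4 = f(t + 12*h/13, y + h*(1932*k1 - 7200*k2 + 7296*k3)/2197)
    k5 = f(t + h, y + h*(439*k1/216 - 8*k2 + 3680*k3/513 - 845*k4/4104))
    k6 = f(t + h/2, y + h*(-8*k1/27 + 2*k2 - 3544*k3/2565 + 1859*k4/4104 - 11*k5/40))
    y4 = y + h*(25*k1/216 + 1408*k3/2565 + 2197*k4/4104 - k5/5)
    y5 = y + h*(16*k1/135 + 6656*k3/12825 + 28561*k4/56430 - 9*k5/50 + 2*k6/55)
    return y5, abs(y5 - y4)

def integrate(f, t0, y0, t1, h=0.1):
    """March a scalar ODE from t0 to t1 with fixed-size RKF45 steps."""
    t, y = t0, y0
    while t < t1 - 1e-12:
        h_eff = min(h, t1 - t)       # shorten the last step to land on t1
        y, _err = rkf45_step(f, t, y, h_eff)
        t += h_eff
    return y

# Sanity check on y' = y, y(0) = 1: the exact solution at t = 1 is e.
value = integrate(lambda t, y: y, 0.0, 1.0, 1.0)
```

    For the boundary value problem of the abstract, such a marcher would sit inside a shooting iteration that adjusts the unknown initial slopes until the far-field conditions are met.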

  15. Preparation of homogeneous isotopic targets with rotating substrate

    International Nuclear Information System (INIS)

    Xu, G.J.; Zhao, Z.G.

    1993-01-01

    Isotopically enriched accelerator targets were prepared using the evaporation-condensation method from a resistance-heated crucible. For high collection efficiency and good homogeneity, the substrate was rotated at a vertical distance of 1.3 to 2.5 cm from the evaporation source. Measured collection efficiencies were 13 to 51 μg cm⁻² mg⁻¹, and homogeneity tests showed values close to those calculated theoretically for a point source. Targets, self-supporting or on backings, could be fabricated with this method for elements and some compounds with evaporation temperatures up to 2300 K. (orig.)

  16. Transonic flow of steam with non-equilibrium and homogeneous condensation

    Science.gov (United States)

    Virk, Akashdeep Singh; Rusak, Zvi

    2017-11-01

    A small-disturbance model for studying the physical behavior of a steady transonic flow of steam with non-equilibrium and homogeneous condensation around a thin airfoil is derived. The steam thermodynamic behavior is described by van der Waals equation of state. The water condensation rate is calculated according to classical nucleation and droplet growth models. The current study is based on an asymptotic analysis of the fluid flow and condensation equations and boundary conditions in terms of the small thickness of the airfoil, small angle of attack, closeness of upstream flow Mach number to unity and small amount of condensate. The asymptotic analysis gives the similarity parameters that govern the problem. The flow field may be described by a non-homogeneous transonic small-disturbance equation coupled with a set of four ordinary differential equations for the calculation of the condensate mass fraction. An iterative numerical scheme which combines Murman & Cole's (1971) method with Simpson's integration rule is applied to solve the coupled system of equations. The model is used to study the effects of energy release from condensation on the aerodynamic performance of airfoils operating at high pressures and temperatures and near the vapor-liquid saturation conditions.

  17. Reactor physics methods development at Westinghouse

    International Nuclear Information System (INIS)

    Mueller, E.; Mayhue, L.; Zhang, B.

    2007-01-01

    The current state of reactor physics methods development at Westinghouse is discussed. The focus is on the methods that have been or are under development within the NEXUS project which was launched a few years ago. The aim of this project is to merge and modernize the methods employed in the PWR and BWR steady-state reactor physics codes of Westinghouse. (author)

  18. Relationship between physical development and physical readiness among skilled wrestlers

    Directory of Open Access Journals (Sweden)

    Yura Tropin

    2018-02-01

    Full Text Available Purpose: to determine the relationship between physical development and physical readiness among qualified wrestlers. Material & Methods: the study involved thirty qualified wrestlers, aged 19–22 years. Pedagogical testing was used to analyze indicators of physical development and physical preparedness. Results: the results of the study testify to the homogeneity of the physical development indices of the athletes studied; the coefficient of variation lies in the range from 2.43% to 10.93%. The physical readiness indices of qualified wrestlers are characterized mainly by small variation in the testing of speed-strength qualities, coordination abilities, and general and strength endurance, and by average variation in the results of special endurance. Conclusion: the most informative indicator of physical development is the wrestler's body weight, which is related to 15 physical preparedness tests, followed by the vital index (12 statistically reliable relationships) and the strength index (11 interrelations).

  19. Homogenization models for thin rigid structured surfaces and films.

    Science.gov (United States)

    Marigo, Jean-Jacques; Maurel, Agnès

    2016-07-01

    A homogenization method for thin microstructured surfaces and films is presented. In both cases, sound-hard materials are considered, associated with Neumann boundary conditions, and the wave equation in the time domain is examined. For a structured surface, a boundary condition is obtained on an equivalent flat wall, which links the acoustic velocity to its normal and tangential derivatives (of the Myers type). For a structured film, jump conditions are obtained for the acoustic pressure and the normal velocity across an equivalent interface (of the Ventcel type). This interface homogenization is based on a matched asymptotic expansion technique, and differs slightly from classical homogenization, which is known to fail for small structuration thicknesses. In order to gain insight into what causes this failure, a two-step homogenization is proposed, mixing classical homogenization and matched asymptotic expansion. Results of the two homogenizations are analyzed in light of the associated elementary problems, which correspond to problems of fluid mechanics, namely, potential flows around rigid obstacles.

  20. Numerical simulation of homogenization time measurement by probes with different volume size

    International Nuclear Information System (INIS)

    Thyn, J.; Novy, M.; Zitny, R.; Mostek, M.; Jahoda, M.

    2004-01-01

    Results of continuous homogenization time measurements of a liquid in a stirred tank depend on the scale of scrutiny. Experimental techniques use a probe situated either inside the tank, as with the conductivity method, or outside it, as with gamma-radiotracer methods. The expected value of the homogenization time evaluated for a given degree of homogenization is higher when using the conductivity method, because the conductivity probe measures a relatively small volume, in contrast to radiotracer applications, where the volume is much greater. Measurement through the wall of the tank is a great advantage of radiotracer applications, but comparing the results with another method requires determining the measured volume, which is not easy. Simulation of the measurement by a CFD code can help to solve the problem. A methodology for CFD simulation of radiotracer experiments was suggested. Commercial software was used for the simulation of liquid homogenization in a vessel stirred by a Rushton turbine. Numerical CFD simulation of the liquid homogenization time for different values of the detected volume was confronted with measurements of the homogenization time with a conductivity probe and with the radioisotopes ¹⁹⁸Au, ⁸²Br and ²⁴Na. The detected tank volume was affected by the different energies of the radioisotopes used. (author)
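
    A minimal sketch of how a homogenization time might be read off a probe's tracer response curve, assuming the common convention that mixing is complete once the signal stays within a fixed band around its final value (the function name and the 95% degree of homogeneity are illustrative choices, not from the paper):

```python
def homogenization_time(times, signal, degree=0.95):
    """First time after which the probe signal stays within the homogeneity band.

    degree=0.95 means the signal must remain within +/-5% of its final value."""
    final = signal[-1]
    band = (1.0 - degree) * abs(final)
    t_reach = None
    for t, s in zip(times, signal):
        if abs(s - final) > band:
            t_reach = None      # signal left the band; not yet homogeneous
        elif t_reach is None:
            t_reach = t         # entered the band; candidate mixing time
    return t_reach
```

    A probe averaging over a larger volume smooths the response curve, which illustrates the abstract's point that the detected volume changes the homogenization time evaluated for the same degree of homogenization.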

  1. 7 CFR 58.920 - Homogenization.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Homogenization. 58.920 Section 58.920 Agriculture... Procedures § 58.920 Homogenization. Where applicable concentrated products shall be homogenized for the... homogenization and the pressure at which homogenization is accomplished will be that which accomplishes the most...

  2. Regionalization Study of Satellite based Hydrological Model (SHM) in Hydrologically Homogeneous River Basins of India

    Science.gov (United States)

    Kumari, Babita; Paul, Pranesh Kumar; Singh, Rajendra; Mishra, Ashok; Gupta, Praveen Kumar; Singh, Raghvendra P.

    2017-04-01

    A new semi-distributed conceptual hydrological model, namely the Satellite based Hydrological Model (SHM), has been developed under the 'PRACRITI-2' program of the Space Application Centre (SAC), Ahmedabad for sustainable water resources management of India using data from Indian Remote Sensing satellites. Entire India is divided into 5 km x 5 km grid cells, and properties at the center of each cell are assumed to represent the property of the cell. SHM contains five modules, namely surface water, forest, snow, groundwater and routing. Two empirical equations (SCS-CN and Hargreaves) and the water balance method are used in the surface water module; the forest module is based on calculations of the water balance and subsurface dynamics. The 2-D Boussinesq equation, solved using an implicit finite-difference scheme, is used for groundwater modelling. The routing module follows a distributed routing approach, which requires the flow path and network, with travel time estimation as the key point. The aim of this study is to evaluate the performance of SHM using a regionalization technique, which also checks the usefulness of a model under data-scarce conditions or for ungauged basins. However, homogeneity analysis is a pre-requisite to regionalization. A similarity index (Φ) and hierarchical agglomerative cluster analysis are adopted to test the homogeneity in terms of the physical attributes of three basins, namely Brahmani (39,033 km²), Baitarani (10,982 km²) and Kangsabati (9,660 km²), with respect to the Subarnarekha (29,196 km²) basin. The results of both homogeneity analyses show that the Brahmani basin is the most homogeneous with respect to the Subarnarekha river basin in terms of physical characteristics (land use land cover classes, soil type and elevation). The calibration and validation of the model parameters of the Brahmani basin are in progress; these are to be transferred into the SHM set up of the Subarnarekha basin and the results compared with the results of calibrated and validated
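
    The SCS-CN step of the surface-water module can be illustrated with the standard curve-number runoff relation. This is a textbook sketch, not SHM code; S is the potential maximum retention in mm, and the initial abstraction Ia = 0.2·S follows the usual convention:

```python
def scs_cn_runoff(p_mm, cn, ia_ratio=0.2):
    """Direct runoff depth (mm) from a rainfall depth p_mm via the SCS-CN method.

    cn is the dimensionless curve number (0 < cn <= 100)."""
    s = 25400.0 / cn - 254.0      # potential maximum retention, mm
    ia = ia_ratio * s             # initial abstraction, mm
    if p_mm <= ia:
        return 0.0                # all rainfall absorbed before runoff begins
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

    With CN = 100 (fully impervious surface) the retention S is zero and all rainfall becomes runoff; lower curve numbers retain progressively more of the event.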

  3. Radiotracer application in determining changes in cement mix homogeneity

    International Nuclear Information System (INIS)

    Breda, M.

    1979-01-01

    A small amount of cement labelled with 24Na is added to the concrete mix and the relative activity of the mix is measured using a scintillation detector at preset points at different time intervals of the mixing process. The detector picks up information from a volume of 10 to 15 litres. The values characterize the degree of homogeneity of the cement component in the mix. Methods of mathematical statistics are used to assess mixing or homogeneity changes. The technique is quick and simple and is used to advantage in determining the effect of the duration and method of transport of the cement mix on its homogeneity, and in monitoring the mixing process and determining the minimum mixing time for all types of concrete mix. (M.S.)
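    The record does not state which statistic was used to assess homogeneity; a common choice for mixing studies is the coefficient of variation of the tracer counts across the preset points, which tends toward zero as the cement becomes uniformly distributed. A minimal sketch with invented count data:

```python
import statistics

def mixing_cv(counts):
    """Coefficient of variation of relative tracer activity readings.

    Assumption: the paper's 'mathematical statistics methods' are not
    specified; CV is shown as a generic homogeneity index (CV -> 0 for
    a perfectly homogeneous cement distribution).
    """
    return statistics.stdev(counts) / statistics.fmean(counts)

# Hypothetical detector counts at five preset points, early vs. late mixing
early = [880, 1220, 640, 1510, 990]
late = [1040, 1010, 995, 1025, 980]
assert mixing_cv(late) < mixing_cv(early)   # mix becomes more homogeneous
```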

  4. Homogenization technique for strongly heterogeneous zones in research reactors

    International Nuclear Information System (INIS)

    Lee, J.T.; Lee, B.H.; Cho, N.Z.; Oh, S.K.

    1991-01-01

    This paper reports on an iterative homogenization method using transport theory in a one-dimensional cylindrical cell model, developed to improve the homogenized cross sections for strongly heterogeneous zones in research reactors. The flux-weighted homogenized cross sections are modified by a correction factor, the cell flux ratio under an albedo boundary condition. The albedo at the cell boundary is iteratively determined to reflect the geometry effects and the material properties of the adjacent cells. This method has been tested with a simplified core model of the Korea Multipurpose Research Reactor. The results demonstrate that the reaction rates of an off-center control shroud cell, the multiplication factor, and the power distribution of the reactor core are close to those of the fine-mesh heterogeneous transport model.
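    The starting point of the method, flux-volume weighting of region-wise cross sections, can be written in a few lines. The sketch below treats the paper's correction factor (the cell flux ratio under an albedo boundary condition) as a given per-region input, since the albedo iteration itself is not reproduced here; all numbers are illustrative.

```python
import numpy as np

def homogenize(sigma, flux, volume, corr=None):
    """Flux-volume weighted homogenized cross section for one cell.

    sigma, flux, volume: per-region arrays; corr: optional per-region
    correction factors (the paper's iteratively determined cell flux
    ratio; taken as 1 when omitted).
    """
    sigma, flux, volume = map(np.asarray, (sigma, flux, volume))
    w = flux * volume if corr is None else flux * np.asarray(corr) * volume
    return float(np.sum(sigma * w) / np.sum(w))

# Two-region cell: absorber (high sigma, depressed flux) plus moderator
sig_hom = homogenize(sigma=[0.5, 2.0], flux=[1.0, 0.2], volume=[3.0, 1.0])
```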

  5. Bounded energy states in homogeneous turbulent shear flow: An alternative view

    Science.gov (United States)

    Bernard, Peter S.; Speziale, Charles G.

    1990-01-01

    The equilibrium structure of homogeneous turbulent shear flow is investigated from a theoretical standpoint. Existing turbulence models, in apparent agreement with physical and numerical experiments, predict an unbounded exponential time growth of the turbulent kinetic energy and dissipation rate; only the anisotropy tensor and turbulent time scale reach a structural equilibrium. It is shown that if vortex stretching is accounted for in the dissipation rate transport equation, then there can exist equilibrium solutions, with bounded energy states, where the turbulence production is balanced by its dissipation. Illustrative calculations are presented for a k-epsilon model modified to account for vortex stretching. The calculations indicate an initial exponential time growth of the turbulent kinetic energy and dissipation rate for elapsed times that are as large as those considered in any of the previously conducted physical or numerical experiments on homogeneous shear flow. However, vortex stretching eventually takes over and forces a production-equals-dissipation equilibrium with bounded energy states. The validity of this result is further supported by an independent theoretical argument. It is concluded that the generally accepted structural equilibrium for homogeneous shear flow with unbounded component energies is in need of re-examination.
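    The unbounded growth that the abstract attributes to existing models is easy to reproduce. The sketch below integrates the standard, unmodified k-epsilon equations for homogeneous shear with the usual textbook constants; the paper's vortex-stretching correction to the dissipation equation is deliberately not included, so k and epsilon grow without bound while only the production-to-dissipation ratio equilibrates.

```python
# Standard (unmodified) k-epsilon model for homogeneous shear, shear rate S:
#   dk/dt   = P - eps,               P = C_mu * k**2 / eps * S**2
#   deps/dt = (eps / k) * (C_e1 * P - C_e2 * eps)
# Only P/eps reaches a structural equilibrium, (C_e2 - 1)/(C_e1 - 1) ~ 2.09,
# while k and eps keep growing exponentially.
C_mu, C_e1, C_e2, S = 0.09, 1.44, 1.92, 1.0
k, eps, dt = 1.0, 1.0, 1e-4
for _ in range(300_000):                      # explicit Euler to t = 30 / S
    P = C_mu * k**2 / eps * S**2
    k, eps = k + dt * (P - eps), eps + dt * (eps / k) * (C_e1 * P - C_e2 * eps)

ratio = (C_mu * k**2 / eps * S**2) / eps      # production / dissipation
print(k, ratio)   # k has grown past its initial value; ratio is near 2.09
```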

  6. Lattice Boltzmann model for three-dimensional decaying homogeneous isotropic turbulence

    International Nuclear Information System (INIS)

    Xu Hui; Tao Wenquan; Zhang Yan

    2009-01-01

    We implement a lattice Boltzmann method (LBM) for decaying homogeneous isotropic turbulence based on an analogous Galerkin filter and focus on the fundamental statistical isotropy property. This regularized method is constructed on an orthogonal Hermite polynomial space. For decaying homogeneous isotropic turbulence, this regularized method reproduces the isotropy property very well. Numerical studies demonstrate that the novel regularized LBM is a promising approximation of turbulent fluid flows, which paves the way for coupling various turbulence models with LBM.

  7. Fabrication of ITO particles using a combination of a homogeneous precipitation method and a seeding technique and their electrical conductivity

    Directory of Open Access Journals (Sweden)

    Yoshio Kobayashi

    2015-09-01

    Full Text Available The present work proposes a method to fabricate indium tin oxide (ITO) particles using precursor particles synthesized with a combination of a homogeneous precipitation method and a seeding technique, and it also describes their electrical conductivity properties. Seed nanoparticles were produced using a co-precipitation method with aqueous solutions of indium(III) chloride, tin(IV) chloride and sodium hydroxide. Three types of ITO nanoparticles were fabricated. The first type was fabricated using the co-precipitation method (c-ITO). The second and third types were fabricated using a homogeneous precipitation method with the seed nanoparticles (s-ITO) and without seeds (n-ITO). The as-prepared precursor particles were annealed in air at 500 °C, and their crystal structure was cubic ITO. The c-ITO nanoparticles formed irregular-shaped agglomerates. The n-ITO nanoparticles had a rectangular-parallelepiped or quasi-cubic structure. Most s-ITO nanoparticles had a quasi-cubic structure, and their size was larger than that of the n-ITO particles. The volume resistivities of the c-ITO, n-ITO and s-ITO powders decreased in that order because the regular-shaped particles were brought into stronger contact with each other.

  8. Homogenization of High-Contrast Brinkman Flows

    KAUST Repository

    Brown, Donald L.; Efendiev, Yalchin R.; Li, Guanglian; Savatorova, Viktoria

    2015-01-01

    , Homogenization: Methods and Applications, Transl. Math. Monogr. 234, American Mathematical Society, Providence, RI, 2007, G. Allaire, SIAM J. Math. Anal., 23 (1992), pp. 1482--1518], although a powerful tool, are not applicable here. Our second point

  9. Modeling the homogenization kinetics of as-cast U-10wt% Mo alloys

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zhijie, E-mail: zhijie.xu@pnnl.gov [Computational Mathematics Group, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Joshi, Vineet [Energy Processes & Materials Division, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Hu, Shenyang [Reactor Materials & Mechanical Design, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Paxton, Dean [Nuclear Engineering and Analysis Group, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Lavender, Curt [Energy Processes & Materials Division, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Burkes, Douglas [Nuclear Engineering and Analysis Group, Pacific Northwest National Laboratory, Richland, WA 99352 (United States)

    2016-04-01

    Low-enriched U-22at% Mo (U–10Mo) alloy has been considered as an alternative material to replace the highly enriched fuels in research reactors. For U–10Mo to work effectively and replace the existing fuel material, a thorough understanding of the microstructure development from as-cast to the final formed structure is required. The as-cast microstructure is typically inhomogeneous, with molybdenum-rich and molybdenum-lean regions, which may affect the processing and possibly the in-reactor performance. This as-cast structure must be homogenized by thermal treatment to produce a uniform Mo distribution. The development of a modeling capability will improve the understanding of the effect of initial microstructures on the Mo homogenization kinetics. In the current work, we investigated the effect of as-cast microstructure on the homogenization kinetics. The kinetics of the homogenization was modeled based on a rigorous algorithm that relates the line scan data of Mo concentration to the gray scale in energy dispersive spectroscopy images, which was used to generate a reconstructed Mo concentration map. The map was then used as realistic microstructure input for physics-based homogenization models, where the entire homogenization kinetics can be simulated and validated against the available experiment data at different homogenization times and temperatures.
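    The core idea, a segregated as-cast concentration field relaxing by diffusion during annealing, can be sketched in one dimension. The gray-scale calibration and real 2-D microstructure input of the paper are replaced here by a synthetic banded Mo profile, and the diffusivity is an illustrative number, not a measured value for U-10Mo.

```python
import numpy as np

# Synthetic 1-D "as-cast" Mo profile on a periodic domain; residual
# segregation (standard deviation of the field) decays with annealing time.
L, n = 100e-6, 200                                  # domain (m), grid points
x = np.linspace(0.0, L, n, endpoint=False)
c = 10.0 + 3.0 * np.sin(2 * np.pi * x / (L / 4))    # wt% Mo, banded cast
D = 1e-14                                           # m^2/s, illustrative
dx = L / n
dt = 0.4 * dx**2 / D                                # explicit limit is 0.5
sigma0 = c.std()                                    # initial segregation
for _ in range(5000):                               # ~14 h of annealing
    c += dt * D * (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2

print(sigma0, c.std())   # segregation decays; mean Mo content is conserved
```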

  10. Metallographic Index-Based Quantification of the Homogenization State in Extrudable Aluminum Alloys

    Directory of Open Access Journals (Sweden)

    Panagiota I. Sarafoglou

    2016-05-01

    Full Text Available Extrudability of aluminum alloys of the 6xxx series is highly dependent on the microstructure of the homogenized billets. It is therefore very important to characterize quantitatively the state of homogenization of the as-cast billets. The quantification of the homogenization state was based on the measurement of specific microstructural indices, which describe the size and shape of the intermetallics and indicate the state of homogenization. The indices evaluated were the following: the aspect ratio (AR), which is the ratio of the maximum to the minimum diameter of a particle; the Feret diameter (F), which is the maximum caliper length; and the circularity (C), which is a measure of how closely a particle resembles a circle in a 2D metallographic section. The method included extensive metallographic work and the measurement of a large number of particles, including a statistical analysis, in order to investigate the effect of homogenization time. Among the indices examined, the circularity index exhibited the most consistent variation with homogenization time. The lowest value of the circularity index coincided with the metallographic observation of necklace formation. Shorter homogenization times resulted in intermediate homogenization stages involving rounding of edges or particle pinching. The results indicated that the index-based quantification of the homogenization state could provide a credible method for the selection of homogenization process parameters towards enhanced extrudability.
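    Two of the indices have standard image-analysis definitions that match the descriptions in the abstract; a minimal sketch (the particle geometries below are invented examples, not measurements from the paper):

```python
import math

def circularity(area: float, perimeter: float) -> float:
    """C = 4*pi*A / P^2; equals 1 for a circle, < 1 for any other shape."""
    return 4.0 * math.pi * area / perimeter**2

def aspect_ratio(d_max: float, d_min: float) -> float:
    """AR = maximum / minimum particle diameter (>= 1)."""
    return d_max / d_min

# A unit circle versus a 4 x 1 rectangle (area 4, perimeter 10)
assert abs(circularity(math.pi, 2 * math.pi) - 1.0) < 1e-12
rect_c = circularity(4.0, 10.0)        # well below 1 for the elongated shape
```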

  11. Matrix shaped pulsed laser deposition: New approach to large area and homogeneous deposition

    Energy Technology Data Exchange (ETDEWEB)

    Akkan, C.K.; May, A. [INM – Leibniz Institute for New Materials, CVD/Biosurfaces Group, Campus D2 2, 66123 Saarbrücken (Germany); Hammadeh, M. [Department for Obstetrics, Gynecology and Reproductive Medicine, IVF Laboratory, Saarland University Medical Center and Faculty of Medicine, Building 9, 66421 Homburg, Saar (Germany); Abdul-Khaliq, H. [Clinic for Pediatric Cardiology, Saarland University Medical Center and Faculty of Medicine, Building 9, 66421 Homburg, Saar (Germany); Aktas, O.C., E-mail: cenk.aktas@inm-gmbh.de [INM – Leibniz Institute for New Materials, CVD/Biosurfaces Group, Campus D2 2, 66123 Saarbrücken (Germany)

    2014-05-01

    Pulsed laser deposition (PLD) is one of the well-established physical vapor deposition methods used for the synthesis of ultra-thin layers. PLD is especially suitable for the preparation of thin films of complex alloys and ceramics where conservation of the stoichiometry is critical. Besides the several advantages of PLD, inhomogeneity in thickness limits its use in some applications. There are several approaches, such as rotation of the substrate or scanning of the laser beam over the target, to achieve homogeneous layers. On the other hand, movement and transition create further complexity in the process parameters. Here we present a new approach, which we call Matrix Shaped PLD, to control the thickness and homogeneity of deposited layers precisely. This new approach is based on shaping of the incoming laser beam by a microlens array and a Fourier lens. The beam is split into a much smaller multi-beam array over the target, which leads to homogeneous plasma formation. The uniform intensity distribution over the target yields a very uniform deposit on the substrate. This approach is used to deposit carbide and oxide thin films for biomedical applications. As a case study, the coating of a stent, which has a complex geometry, is presented briefly.

  12. Nuclear methods in medical physics

    International Nuclear Information System (INIS)

    Jeraj, R.

    2003-01-01

    A common ground for both reactor and medical physics is the demand for high accuracy in particle transport calculations. In reactor physics, the safe operation of nuclear power plants has demanded highly accurate calculation methods. Similarly, dose calculation in radiation therapy for cancer has required highly accurate transport methods to ensure adequate dosimetry. Common to both problems has always been a compromise between achievable accuracy and available computer power, leading to a variety of calculation methods developed over the decades. On the other hand, the differences in subjects (nuclear reactors vs. humans) and radiation types (neutrons/photons vs. photons/electrons or ions) call for very field-specific approaches. Nevertheless, it is not uncommon to see researchers drift from one field to the other. Several examples from both fields will be given with the aim of comparing the problems, indicating their similarities and discussing their differences. As examples of reactor physics applications, both deterministic and Monte Carlo calculations will be presented for flux distributions of the VENUS and TRIGA Mark II benchmarks. These problems will be paralleled to medical physics applications in linear accelerator radiation field determination and dose distribution calculations. The applicability of adjoint/forward transport will be discussed in the light of both transport problems. Boron neutron capture therapy (BNCT) will be presented as an example of close collaboration between the fields. Finally, several other examples from medical physics, which can and cannot find corresponding problems in reactor physics, will be discussed (e.g., beam optimisation in inverse treatment planning, imaging applications). (author)

  13. An integral equation method for the homogenization of unidirectional fibre-reinforced media; antiplane elasticity and other potential problems.

    Science.gov (United States)

    Joyce, Duncan; Parnell, William J; Assier, Raphaël C; Abrahams, I David

    2017-05-01

    In Parnell & Abrahams (2008 Proc. R. Soc. A 464 , 1461-1482. (doi:10.1098/rspa.2007.0254)), a homogenization scheme was developed that gave rise to explicit forms for the effective antiplane shear moduli of a periodic unidirectional fibre-reinforced medium where fibres have non-circular cross section. The explicit expressions are rational functions in the volume fraction. In that scheme, a (non-dilute) approximation was invoked to determine leading-order expressions. Agreement with existing methods was shown to be good except at very high volume fractions. Here, the theory is extended in order to determine higher-order terms in the expansion. Explicit expressions for effective properties can be derived for fibres with non-circular cross section, without recourse to numerical methods. Terms appearing in the expressions are identified as being associated with the lattice geometry of the periodic fibre distribution, fibre cross-sectional shape and host/fibre material properties. Results are derived in the context of antiplane elasticity but the analogy with the potential problem illustrates the broad applicability of the method to, e.g. thermal, electrostatic and magnetostatic problems. The efficacy of the scheme is illustrated by comparison with the well-established method of asymptotic homogenization where for fibres of general cross section, the associated cell problem must be solved by some computational scheme.

  14. Conformally compactified homogeneous spaces (Possible Observable Consequences)

    International Nuclear Information System (INIS)

    Budinich, P.

    1995-01-01

    Some arguments based on the possible spontaneous violation of the Cosmological Principle (represented by the observed large-scale structures of galaxies), on the Cartan geometry of simple spinors and on the Fock formulation of the hydrogen-atom wave equation in momentum space are presented in favour of the hypothesis that space-time and momentum space should both be conformally compactified and represented by the two four-dimensional homogeneous spaces of the conformal group, both isomorphic to (S^3 x S^1)/Z_2 and correlated by conformal inversion. Within this framework, the possible common origin of the SO(4) symmetry underlying the geometrical structure of the Universe, of Kepler orbits and of the H-atom is discussed. One of the consequences of the proposed hypothesis could be that any quantum field theory should be naturally free from both infrared and ultraviolet divergences. But then physical spaces, defined as those where physical phenomena may be best described, could be different from those homogeneous spaces. A simple, exactly soluble toy model, valid for a two-dimensional space-time, is presented in which the conjectured conformally compactified space-time and momentum space are both isomorphic to (S^1 x S^1)/Z_2, while the physical spaces are two finite lattices which are dual, since Fourier transforms, represented by finite, discrete sums, may be well defined on them. Furthermore, a q-deformed SU_q(1,1) may be represented on them if q is a root of unity. (author). 22 refs, 3 figs

  15. Computational Methods in Plasma Physics

    CERN Document Server

    Jardin, Stephen

    2010-01-01

    Assuming no prior knowledge of plasma physics or numerical methods, Computational Methods in Plasma Physics covers the computational mathematics and techniques needed to simulate magnetically confined plasmas in modern magnetic fusion experiments and future magnetic fusion reactors. Largely self-contained, the text presents the basic concepts necessary for the numerical solution of partial differential equations. Along with discussing numerical stability and accuracy, the author explores many of the algorithms used today in enough depth so that readers can analyze their stability, efficiency,

  16. Setting up and validating a complex model for a simple homogeneous wall

    DEFF Research Database (Denmark)

    Naveros, I.; Bacher, Peder; Ruiz, D. P.

    2014-01-01

    The present paper describes modelling of the thermal dynamics of a real wall tested in dynamic outdoor weather conditions, to identify all the parameters needed for its characterisation. Specifically, the U value, absorptance and effective heat capacity are estimated for the wall using grey-box modelling based on statistical methods and known physical dynamic energy balance equations, related to the heat flux density through a simple and homogeneous wall, together with the regression averages method for estimation of the parameters which describe the thermal behaviour of the wall. Solar irradiance and long-wave radiation balance terms are added in the heat balance equation, besides modelling of the wind speed effect, to achieve a complete description of the relevant phenomena. The experimental test was carried out in a hot-temperature climate for nine months. This study aims at proposing a dynamic method improving
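    As a minimal illustration of one ingredient of such a characterisation, the sketch below recovers a U value by linear regression of heat flux density on the indoor-outdoor temperature difference. This ignores the dynamic, solar, long-wave and wind terms of the full grey-box model, and all data are synthetic with an assumed true U of 1.3 W/(m^2 K).

```python
import numpy as np

# Steady-state part of the wall heat balance: q = U * (Ti - Te).
# Regressing measured flux on the temperature difference recovers U.
rng = np.random.default_rng(0)
dT = rng.uniform(5.0, 20.0, 500)             # K, indoor-outdoor difference
U_true = 1.3                                 # W/(m^2 K), assumed for the demo
q = U_true * dT + rng.normal(0.0, 0.5, 500)  # W/m^2, noisy flux readings

U_est = np.linalg.lstsq(dT[:, None], q, rcond=None)[0][0]
print(U_est)                                 # close to the assumed 1.3
```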

  17. Homogenization Theory for the Prediction of Obstructed Solute Diffusivity in Macromolecular Solutions.

    Science.gov (United States)

    Donovan, Preston; Chehreghanianzabi, Yasaman; Rathinam, Muruhan; Zustiak, Silviya Petrova

    2016-01-01

    The study of diffusion in macromolecular solutions is important in many biomedical applications such as separations, drug delivery, and cell encapsulation, and key for many biological processes such as protein assembly and interstitial transport. Not surprisingly, multiple models for the a-priori prediction of diffusion in macromolecular environments have been proposed. However, most models include parameters that are not readily measurable, are specific to the polymer-solute-solvent system, or are fitted and do not have a physical meaning. Here, for the first time, we develop a homogenization theory framework for the prediction of effective solute diffusivity in macromolecular environments based on physical parameters that are easily measurable and not specific to the macromolecule-solute-solvent system. Homogenization theory is useful for situations where knowledge of fine-scale parameters is used to predict bulk system behavior. As a first approximation, we focus on a model where the solute is subjected to obstructed diffusion via stationary spherical obstacles. We find that the homogenization theory results agree well with computationally more expensive Monte Carlo simulations. Moreover, the homogenization theory agrees with effective diffusivities of a solute in dilute and semi-dilute polymer solutions measured using fluorescence correlation spectroscopy. Lastly, we provide a mathematical formula for the effective diffusivity in terms of a non-dimensional and easily measurable geometric system parameter.
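    The paper's own homogenization-theory expression is not reproduced in the abstract. For orientation, the classical Maxwell result for diffusion obstructed by impermeable spheres, a standard benchmark for exactly this geometry, gives the effective diffusivity as a function of a single measurable parameter, the obstacle volume fraction:

```python
def maxwell_deff(phi: float) -> float:
    """Maxwell estimate of D_eff/D0 for impermeable spherical obstacles.

    Assumption: this is the classical dilute-limit benchmark, shown in
    place of the paper's homogenization-theory formula, which the
    abstract does not state.  phi: obstacle volume fraction.
    """
    return 2.0 * (1.0 - phi) / (2.0 + phi)

# Free diffusion at phi = 0; monotonically hindered as obstacles are added
ratios = [maxwell_deff(phi) for phi in (0.0, 0.1, 0.3)]
```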

  18. Homogenization Theory for the Prediction of Obstructed Solute Diffusivity in Macromolecular Solutions.

    Directory of Open Access Journals (Sweden)

    Preston Donovan

    Full Text Available The study of diffusion in macromolecular solutions is important in many biomedical applications such as separations, drug delivery, and cell encapsulation, and key for many biological processes such as protein assembly and interstitial transport. Not surprisingly, multiple models for the a-priori prediction of diffusion in macromolecular environments have been proposed. However, most models include parameters that are not readily measurable, are specific to the polymer-solute-solvent system, or are fitted and do not have a physical meaning. Here, for the first time, we develop a homogenization theory framework for the prediction of effective solute diffusivity in macromolecular environments based on physical parameters that are easily measurable and not specific to the macromolecule-solute-solvent system. Homogenization theory is useful for situations where knowledge of fine-scale parameters is used to predict bulk system behavior. As a first approximation, we focus on a model where the solute is subjected to obstructed diffusion via stationary spherical obstacles. We find that the homogenization theory results agree well with computationally more expensive Monte Carlo simulations. Moreover, the homogenization theory agrees with effective diffusivities of a solute in dilute and semi-dilute polymer solutions measured using fluorescence correlation spectroscopy. Lastly, we provide a mathematical formula for the effective diffusivity in terms of a non-dimensional and easily measurable geometric system parameter.

  19. Homogenizing Advanced Alloys: Thermodynamic and Kinetic Simulations Followed by Experimental Results

    Science.gov (United States)

    Jablonski, Paul D.; Hawk, Jeffrey A.

    2017-01-01

    Segregation of solute elements occurs in nearly all metal alloys during solidification. The resultant elemental partitioning can severely degrade as-cast material properties and lead to difficulties during post-processing (e.g., hot shortness and incipient melting). Many cast articles are subjected to a homogenization heat treatment in order to minimize segregation and improve their performance. Traditionally, homogenization heat treatments are based upon past practice or time-consuming trial and error experiments. Through the use of thermodynamic and kinetic modeling software, NETL has designed a systematic method to optimize homogenization heat treatments. Use of the method allows engineers and researchers to homogenize casting chemistries to levels appropriate for a given application. The method also allows for the adjustment of heat treatment schedules to fit limitations on in-house equipment (capability, reliability, etc.) while maintaining clear numeric targets for segregation reduction. In this approach, the Scheil module within Thermo-Calc is used to predict the as-cast segregation present within an alloy, and then diffusion-controlled transformation modeling is used to simulate homogenization kinetics as a function of time and temperature. Examples of computationally designed heat treatments and verification of their effects on segregation and properties of real castings are presented.

  20. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
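    A concrete feature of homogenized ecological diffusion (u_t = (mu(x) u)_xx in one dimension) is that the large-scale coefficient is, to leading order, the harmonic mean of the fine-scale motility, so slow (low-mu) patches dominate large-scale dispersal. A sketch with illustrative patch values (not data from the mule deer application):

```python
import statistics

# Fine-scale motilities over equally sized habitat patches (made-up values).
# Assumption: equal patch weights; the homogenized motility is the harmonic
# mean in the 1-D leading-order result.
mu = [5.0, 0.5, 5.0, 0.5]                    # alternating fast/slow habitat
mu_bar = statistics.harmonic_mean(mu)        # homogenized motility

# The harmonic mean sits well below the arithmetic mean: slow patches
# act as bottlenecks for large-scale movement.
print(mu_bar, statistics.fmean(mu))
```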

  1. Homogeneous Field and WKB Approximation in Deformed Quantum Mechanics with Minimal Length

    Directory of Open Access Journals (Sweden)

    Jun Tao

    2015-01-01

    Full Text Available In the framework of deformed quantum mechanics with a minimal length, we consider the motion of a nonrelativistic particle in a homogeneous external field. We find the integral representation for the physically acceptable wave function in the position representation. Using the method of steepest descent, we obtain the asymptotic expansions of the wave function at large positive and negative arguments. We then employ the leading asymptotic expressions to derive the WKB connection formula, which proceeds from the classically forbidden region to the classically allowed one through a turning point. By the WKB connection formula, we prove the Bohr-Sommerfeld quantization rule up to O(β^2). We also show that if the slope of the potential at a turning point is too steep, the WKB connection formula is no longer valid around the turning point. The effects of the minimal length on the classical motions are investigated using the Hamilton-Jacobi method. We also use the Bohr-Sommerfeld quantization to study statistical physics in deformed spaces with a minimal length.
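    In the undeformed limit (β = 0) the Bohr-Sommerfeld rule the paper extends reduces to the familiar semiclassical condition that the closed-orbit action equal 2π(n + 1/2) in units with ℏ = 1. For a harmonic oscillator this is exact, which makes a compact numerical sanity check (the potential and units here are a standard textbook choice, not the paper's deformed setup):

```python
import numpy as np

# Harmonic oscillator V(x) = x^2 / 2 (m = omega = hbar = 1).
# Closed-orbit action: 2 * integral of p(x) = sqrt(2(E - V)) between the
# turning points.  Midpoint rule is used so the integrand is never
# evaluated exactly at the turning points.
def action(E: float, n_pts: int = 400_000) -> float:
    xt = np.sqrt(2.0 * E)                       # classical turning point
    h = 2.0 * xt / n_pts
    x = -xt + (np.arange(n_pts) + 0.5) * h      # midpoints
    p = np.sqrt(2.0 * (E - 0.5 * x**2))
    return 2.0 * p.sum() * h

# Bohr-Sommerfeld: action(E_n) = 2*pi*(n + 1/2), giving E_n = n + 1/2
n = 3
S_n = action(n + 0.5)
```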

  2. Distribution of Al and In impurities along homogeneous Ge-Si crystals grown by the Czochralski method using a Si feeding rod

    Science.gov (United States)

    Kyazimova, V. K.; Alekperov, A. I.; Zakhrabekova, Z. M.; Azhdarov, G. Kh.

    2014-05-01

    The distributions of Al and In impurities in Ge1-xSix crystals (0 ≤ x ≤ 0.3) grown by a modified Czochralski method (with continuous feeding of the melt using a Si rod) have been studied experimentally and theoretically. Experimental Al and In concentrations along the homogeneous crystals have been determined from Hall measurements. The problem of the Al and In impurity distribution in homogeneous Ge-Si single crystals grown in this way is solved within the Pfann approximation. A set of dependences of the Al and In concentrations on the crystal length obtained within this approximation demonstrates good correspondence between the experimental and theoretical data.
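    For orientation, the classic Pfann normal-freezing result (without the paper's continuous Si-rod feeding, whose modified solution is not given in the abstract) describes how a segregation coefficient k < 1, typical for Al and In in Ge-Si, enriches the later-grown solid:

```python
def pfann_profile(c0: float, k: float, g: float) -> float:
    """Pfann normal-freezing impurity profile C_s(g) = k*C0*(1-g)**(k-1).

    Assumption: this is the no-feeding baseline, not the paper's
    feeding-rod solution.  c0: initial melt concentration, k: equilibrium
    segregation coefficient, g: solidified fraction (0 <= g < 1).
    """
    return k * c0 * (1.0 - g) ** (k - 1.0)

# Illustrative numbers: k = 0.1, melt concentration 1e17 cm^-3.
# The first-grown solid is impurity-poor; the melt (and hence later
# solid) gets progressively enriched as growth proceeds.
profile = [pfann_profile(1e17, 0.1, g) for g in (0.0, 0.5, 0.9)]
```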

  3. Applications of high and ultra high pressure homogenization for food safety

    Directory of Open Access Journals (Sweden)

    Francesca Patrignani

    2016-08-01

    Full Text Available Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low temperature long time (LTLT) and high temperature short time (HTST) treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, several non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure (HHP), pulsed electric field (PEF), ultrasound (US) and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential to provide fresh-like products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350-400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. Accordingly, this review deals with the principal mechanisms of action of high pressure homogenization against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species in relation to the food matrix and food chemico-physical and process variables will be reviewed. The combined use of this alternative technology with other non-thermal technologies will also be considered.

  4. Qualitative methods in theoretical physics

    CERN Document Server

    Maslov, Dmitrii

    2018-01-01

    This book comprises a set of tools which allow researchers and students to arrive at a qualitatively correct answer without undertaking lengthy calculations. In general, Qualitative Methods in Theoretical Physics is about combining approximate mathematical methods with fundamental principles of physics: conservation laws and symmetries. Readers will learn how to simplify problems, how to estimate results, and how to apply symmetry arguments and conduct dimensional analysis. A comprehensive problem set is included. The book will appeal to a wide range of students and researchers.

  5. Homogenized group cross sections by Monte Carlo

    International Nuclear Information System (INIS)

    Van Der Marck, S. C.; Kuijper, J. C.; Oppe, J.

    2006-01-01

    Homogenized group cross sections play a large role in making reactor calculations efficient. Because of this significance, many codes exist that can calculate these cross sections based on certain assumptions. However, for application to the High Flux Reactor (HFR) in Petten, the Netherlands, the limitations of such codes imply that the core calculations would become less accurate when using homogenized group cross sections (HGCS). Therefore we developed a method to calculate HGCS based on a Monte Carlo program, for which we chose MCNP. The implementation involves an addition to MCNP and a set of small executables to perform suitable averaging after the MCNP run(s) have completed. Here we briefly describe the details of the method, and we report on two tests we performed to show the accuracy of the method and its implementation. By now, this method is routinely used in preparation of the cycle-to-cycle core calculations for the HFR. (authors)
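    The averaging step after the Monte Carlo run amounts to dividing accumulated reaction-rate tallies by flux tallies, per energy group, over the cells of the homogenization region. The paper's MCNP additions are not public in this record, so the sketch below uses synthetic tally numbers:

```python
import numpy as np

# Per-cell, per-group flux tallies and cross sections (synthetic values),
# plus cell volumes.  The homogenized group cross section is the
# flux-volume weighted average: sum(sigma*phi*V) / sum(phi*V) per group.
rng = np.random.default_rng(1)
flux_tally = rng.uniform(0.5, 1.5, size=(4, 2))          # (cell, group)
sigma_cell = np.array([[0.20, 1.1],
                       [0.30, 1.0],
                       [0.25, 1.2],
                       [0.22, 0.9]])                     # (cell, group)
volume = np.array([1.0, 2.0, 1.0, 0.5])                  # per cell

w = flux_tally * volume[:, None]                         # weights phi*V
sigma_hom = (sigma_cell * w).sum(axis=0) / w.sum(axis=0) # one value per group
print(sigma_hom)   # each group value lies between the cell-wise extremes
```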

  6. Computational Method for Atomistic-Continuum Homogenization

    National Research Council Canada - National Science Library

    Chung, Peter

    2002-01-01

    ...." Physical Review Letters. vol. 61, no. 25, pp. 2879-2882, 19 December 1988; Brenner, D. W. "Empirical Potential for Hydrocarbons for Use in Simulating the Chemical Vapor Deposition of Diamond Films...

  7. A convenient procedure for magnetic field homogeneity evaluation

    International Nuclear Information System (INIS)

    Teles, J; Garrido, C E; Tannus, A

    2004-01-01

    In many areas of research that utilize magnetic fields, it is important to obtain fields with a spatial distribution as homogeneous as possible. A procedure usually utilized to evaluate and to optimize field homogeneity is the expansion of the measured field in spherical harmonic components. In addition to the methods proposed in the literature, we present a more convenient procedure for evaluating field homogeneity inside a spherical volume. The procedure uses the orthogonality property of the spherical harmonics to find the field variance. It is shown that the total field variance is equal to the sum of the individual variances of each field component in the spherical harmonic expansion. Besides the advantage of the linear behaviour of the individual variances, the field variance and standard deviation are the best parameters for obtaining global information on field homogeneity.
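
    The orthogonality argument can be stated compactly. With an orthonormal spherical harmonic basis, this is a standard Parseval-type identity (written here from first principles, not quoted from the paper):

```latex
\[
B(\theta,\varphi)=\sum_{l=0}^{\infty}\sum_{m=-l}^{l} c_{lm}\,Y_{lm}(\theta,\varphi),
\qquad
\bar{B}=\frac{c_{00}}{\sqrt{4\pi}},
\]
\[
\sigma_B^{2}
=\frac{1}{4\pi}\oint \left(B-\bar{B}\right)^{2}\,d\Omega
=\frac{1}{4\pi}\sum_{l\ge 1}\sum_{m=-l}^{l} \lvert c_{lm}\rvert^{2}.
\]
```

    Each harmonic term thus contributes an independent, additive variance, which is why the total variance is linear in the individual components.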

  8. Applications of High and Ultra High Pressure Homogenization for Food Safety.

    Science.gov (United States)

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low temperature long time and high temperature short time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, several non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have a great potential to provide "fresh-like" products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350-400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. To this end, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species in relation to the food matrix and to chemico-physical and process variables will be reviewed. The combined use of this alternative technology with other non-thermal technologies will also be considered.

  9. Functionality and homogeneity.

    NARCIS (Netherlands)

    2011-01-01

    Functionality and homogeneity are two of the five Sustainable Safety principles. The functionality principle aims for roads to have but one exclusive function and distinguishes between traffic function (flow) and access function (residence). The homogeneity principle aims at differences in mass,

  10. A Review on Homogeneous Charge Compression Ignition and Low Temperature Combustion by Optical Diagnostics

    Directory of Open Access Journals (Sweden)

    Chao Jin

    2015-01-01

    Optical diagnostics is an effective method for understanding the physical and chemical reaction processes in homogeneous charge compression ignition (HCCI) and low temperature combustion (LTC) modes. Based on optical diagnostics, the true processes of mixing, combustion, and emissions formation can be observed directly. In this paper, the mixing processes of port injection and direct injection are reviewed first. Then, the combustion chemical reaction mechanism is reviewed based on chemiluminescence, natural luminosity, and laser diagnostics. Next, the evolution of pollutant emissions, measured by different laser diagnostic methods for species including NO, soot, unburned hydrocarbons (UHC), and CO, is reviewed. Finally, a summary and future directions for optical diagnostics applied to HCCI and LTC are presented.

  11. Immobilisation of homogeneous olefin polymerisation catalysts. Factors influencing activity and stability

    NARCIS (Netherlands)

    Severn, J.R.; Chadwick, J.C.

    2013-01-01

    The activity and stability of homogeneous olefin polymerisation catalysts, when immobilised on a support, are dependent on both chemical and physical effects. Chemical factors affecting catalyst activity include the ease of formation of the active species, which is strongly dependent on the

  12. Homogenization of resonant chiral metamaterials

    DEFF Research Database (Denmark)

    Andryieuski, Andrei; Menzel, C.; Rockstuhl, Carsten

    2010-01-01

    Homogenization of metamaterials is a crucial issue as it allows one to describe their optical response in terms of effective wave parameters such as, e.g., propagation constants. In this paper we consider the possible homogenization of chiral metamaterials. We show that for meta-atoms of a certain size … an analytical criterion for performing the homogenization and a tool to predict the homogenization limit. We show that strong coupling between meta-atoms of chiral metamaterials may prevent their homogenization at all…

  13. Asymmetric Rolling Process Simulations by Dynamic Explicit Crystallographic Homogenized Finite Element Method

    International Nuclear Information System (INIS)

    Ngoc Tam, Nguyen; Nakamura, Yasunori; Terao, Toshihiro; Kuramae, Hiroyuki; Nakamachi, Eiji; Sakamoto, Hidetoshi; Morimoto, Hideo

    2007-01-01

    Recently, the asymmetric rolling (ASR) has been applied to the material processing of aluminum alloy sheet to control micro-crystal structure and texture in order to improve the mechanical properties. Previously, several studies aimed at high formability sheet generation have been carried out experimentally, but finite element simulations to predict the deformation induced texture evolution of the asymmetrically rolled sheet metals have not been investigated rigorously. In this study, crystallographic homogenized finite element (FE) codes are developed and applied to analyze the asymmetrical rolling processes. The textures of sheet metals were measured by electron back scattering diffraction (EBSD), and compared with FE simulations. The results from the dynamic explicit type Crystallographic homogenization FEM code shows that this type of simulation is a comprehensive tool to predict the plastic induced texture evolution

  14. Methods of modern mathematical physics

    CERN Document Server

    Reed, Michael

    1980-01-01

    This book is the first of a multivolume series devoted to an exposition of functional analysis methods in modern mathematical physics. It describes the fundamental principles of functional analysis and is essentially self-contained, although there are occasional references to later volumes. We have included a few applications when we thought that they would provide motivation for the reader. Later volumes describe various advanced topics in functional analysis and give numerous applications in classical physics, modern physics, and partial differential equations.

  15. Hardness and microstructure homogeneity of pure copper processed by accumulative back extrusion

    International Nuclear Information System (INIS)

    Bazaz, B.; Zarei-Hanzaki, A.; Fatemi-Varzaneh, S.M.

    2013-01-01

    The present work deals with the microstructure evolution of pure copper processed by a new severe plastic deformation method. A set of pure copper (99.99%) work-pieces with coarse-grained microstructures was processed by the accumulative back extrusion (ABE) method at room temperature. Optical and scanning electron microscopy (SEM) and hardness measurements were utilized to study the microstructural evolution and hardness homogeneity. The results indicated that ABE is capable of producing a homogeneous grain-refined microstructure in pure copper. The observed grain refinement was discussed in terms of the occurrence of dynamic restoration processes. The analysis of microstructure and hardness showed outstanding homogeneity improvement throughout the work-pieces as consecutive ABE passes were applied. The homogeneity improvement was attributed to the propagation of the shear bands and also of the heavily deformed regions. A reversing route was also applied in the ABE processing to investigate its effect on the development of microstructural homogeneity. Compared with the conventional route, the reversing route was found to yield better homogeneity after fewer passes of the process.

  16. Multiscale models in mechano and tumor biology modeling, homogenization, and applications

    CERN Document Server

    Penta, Raimondo; Lang, Jens

    2017-01-01

    This book presents and discusses the state of the art and future perspectives in mathematical modeling and homogenization techniques with the focus on addressing key physiological issues in the context of multiphase healthy and malignant biological materials. The highly interdisciplinary content brings together contributions from scientists with complementary areas of expertise, such as pure and applied mathematicians, engineers, and biophysicists. The book also features the lecture notes from a half-day introductory course on asymptotic homogenization. These notes are suitable for undergraduate mathematics or physics students, while the other chapters are aimed at graduate students and researchers.

  17. Ultrasonic methods in solid state physics

    CERN Document Server

    Truell, John; Elbaum, Charles

    1969-01-01

    Ultrasonic Methods in Solid State Physics is devoted to studies of energy loss and velocity of ultrasonic waves which have a bearing on present-day problems in solid-state physics. The discussion is particularly concerned with the type of investigation that can be carried out in the megacycle range of frequencies from a few megacycles to kilomegacycles; it deals almost entirely with short-duration pulse methods rather than with standing-wave methods. The book opens with a chapter on a classical treatment of wave propagation in solids. This is followed by separate chapters on methods and techni

  18. Research Methods Identifying Correlation Between Physical Environment of Schools and Educational Paradigms

    Directory of Open Access Journals (Sweden)

    Grėtė Brukštutė

    2016-04-01

    The article analyses research already carried out to determine the correlation between the physical environment of schools and educational paradigms. The analysis focuses on studies conducted in the USA and European countries. Based on these studies, an attempt was made to identify methodological approaches to the coherence of education and spatial structures. Homogeneity and conformity between the character of education and the physical learning environment become especially important during changes of educational conceptions. The issue of how educational paradigms affect the architecture of school buildings has not yet been analysed in Lithuania; therefore, the results of this research could bring attention to the correlation between educational paradigms and the architecture of school buildings and form initial guidelines for the development of the modern physical learning environment.

  19. Electromagnetic Radiation in a Uniformly Moving, Homogeneous Medium

    DEFF Research Database (Denmark)

    Johannsen, Günther

    1972-01-01

    A new method of treating radiation problems in a uniformly moving, homogeneous medium is presented. A certain transformation technique in connection with the four-dimensional Green's function method makes it possible to elaborate the Green's functions of the governing differential equations...

  20. Nuclear physics mathematical methods

    International Nuclear Information System (INIS)

    Balian, R.; Gervois, A.; Giannoni, M.J.; Levesque, D.; Maille, M.

    1984-01-01

    The nuclear physics mathematical methods, applied to collective motion theory, to the reduction of the degrees of freedom, and to order and disorder phenomena, are investigated. Within the scope of the study, the following aspects are discussed: the entropy of an ensemble of collective variables; the interpretation of dissipation using information theory; chaos and universality; the Monte Carlo method applied to classical statistical mechanics and quantum mechanics; the finite element method; and classical ergodicity [fr

  1. A new consistent definition of the homogenized diffusion coefficient of a lattice, limitations of the homogenization concept, and discussion of previously defined coefficients

    International Nuclear Information System (INIS)

    Deniz, V.C.

    1978-01-01

    The problem concerned with the correct definition of the homogenized diffusion coefficient of a lattice, and the concurrent problem of whether or not a homogenized diffusion equation can be formally set up, is studied by a space-energy-angle-dependent treatment for a general lattice cell, using an operator notation which applies to any eigen-problem. It is shown that the diffusion coefficient should represent only leakage effects. A new definition of the diffusion coefficient is given, which combines within itself the individual merits of each of the two definitions of Benoist, and reduces to the 'uncorrected' Benoist coefficient in certain cases. The conditions under which a homogenized diffusion equation can be obtained are discussed. A comparison is made between the approach via a diffusion equation and the approach via the eigen-coefficients of Deniz. Previously defined diffusion coefficients are discussed, and it is shown that the transformed eigen-coefficients proposed by Gelbard and by Larsen are unsuitable as diffusion coefficients, and that the cell-edge normalization of the Bonalumi coefficient is not physically justifiable. (author)

  2. Methods of statistical physics

    CERN Document Server

    Akhiezer, Aleksandr I

    1981-01-01

    Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic be

  3. Homogenization of High-Contrast Brinkman Flows

    KAUST Repository

    Brown, Donald L.

    2015-04-16

    Modeling porous flow in complex media is a challenging problem. Not only is the problem inherently multiscale but, due to high contrast in permeability values, flow velocities may differ greatly throughout the medium. To avoid complicated interface conditions, the Brinkman model is often used for such flows [O. Iliev, R. Lazarov, and J. Willems, Multiscale Model. Simul., 9 (2011), pp. 1350--1372]. Instead of permeability variations and contrast being contained in the geometric media structure, this information is contained in a highly varying and high-contrast coefficient. In this work, we present two main contributions. First, we develop a novel homogenization procedure for the high-contrast Brinkman equations by constructing correctors and carefully estimating the residuals. Understanding the relationship between scales and contrast values is critical to obtaining useful estimates. Therefore, standard convergence-based homogenization techniques [G. A. Chechkin, A. L. Piatniski, and A. S. Shamev, Homogenization: Methods and Applications, Transl. Math. Monogr. 234, American Mathematical Society, Providence, RI, 2007, G. Allaire, SIAM J. Math. Anal., 23 (1992), pp. 1482--1518], although a powerful tool, are not applicable here. Our second point is that the Brinkman equations, in certain scaling regimes, are invariant under homogenization. Unlike in the case of Stokes-to-Darcy homogenization [D. Brown, P. Popov, and Y. Efendiev, GEM Int. J. Geomath., 2 (2011), pp. 281--305, E. Marusic-Paloka and A. Mikelic, Boll. Un. Mat. Ital. A (7), 10 (1996), pp. 661--671], the results presented here under certain velocity regimes yield a Brinkman-to-Brinkman upscaling that allows using a single software platform to compute on both microscales and macroscales. In this paper, we discuss the homogenized Brinkman equations. We derive auxiliary cell problems to build correctors and calculate effective coefficients for certain velocity regimes. Due to the boundary effects, we construct
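
    For reference, the Brinkman system being upscaled has the standard textbook form (written here in common notation, not quoted from the paper), with velocity u, pressure p, viscosity mu, and a highly varying, high-contrast permeability kappa(x):

```latex
\[
-\mu\,\Delta u \;+\; \mu\,\kappa^{-1}(x)\,u \;+\; \nabla p \;=\; f,
\qquad \nabla\cdot u = 0 .
\]
```

    The viscous term dominates in free-flow regions (Stokes-like behaviour), while the drag term dominates in low-permeability regions (Darcy-like behaviour); "Brinkman-to-Brinkman" upscaling means the homogenized system retains this same form with an effective coefficient in place of kappa.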

  4. Homogenization of tissues via picosecond-infrared laser (PIRL) ablation: Giving a closer view on the in-vivo composition of protein species as compared to mechanical homogenization.

    Science.gov (United States)

    Kwiatkowski, M; Wurlitzer, M; Krutilin, A; Kiani, P; Nimer, R; Omidi, M; Mannaa, A; Bussmann, T; Bartkowiak, K; Kruber, S; Uschold, S; Steffen, P; Lübberstedt, J; Küpker, N; Petersen, H; Knecht, R; Hansen, N O; Zarrine-Afsar, A; Robertson, W D; Miller, R J D; Schlüter, H

    2016-02-16

    Posttranslational modifications and proteolytic processing regulate almost all physiological processes. Dysregulation can potentially result in pathologic protein species causing diseases. Thus, tissue species proteomes of diseased individuals provide diagnostic information. Since the composition of tissue proteomes can rapidly change during tissue homogenization by the action of enzymes released from their compartments, disease-specific protein species patterns can vanish. Recently, we described a novel, ultrafast and soft method for cold vaporization of tissue via desorption by impulsive vibrational excitation (DIVE) using a picosecond infrared laser (PIRL). Given that DIVE extraction may provide improved access to the original composition of protein species in tissues, we compared the proteome composition of tissue protein homogenates after DIVE homogenization with that after conventional homogenization. A higher number of intact protein species was observed in DIVE homogenates. Owing to the ultrafast transfer of proteins from tissues via the gas phase into frozen condensates of the aerosols, intact protein species were exposed to enzymatic degradation reactions to a lesser extent than with conventional protein extraction. In addition, the total protein yield is higher in DIVE homogenates, because they are very homogeneous and contain almost no insoluble particles, allowing direct analysis with subsequent analytical methods without the need for centrifugation. Enzymatic protein modifications during tissue homogenization are responsible for changes in the in-vivo protein species composition. Cold vaporization of tissue by PIRL-DIVE is comparable to taking a snapshot, at the moment of laser irradiation, of the dynamic changes that occur continuously under in-vivo conditions. At that time point all biomolecules are transferred into an aerosol, which is immediately frozen.

  5. Elastic full waveform inversion based on the homogenization method: theoretical framework and 2-D numerical illustrations

    Science.gov (United States)

    Capdeville, Yann; Métivier, Ludovic

    2018-05-01

    Seismic imaging is an efficient tool to investigate the Earth interior. Many of the different imaging techniques currently used, including the so-called full waveform inversion (FWI), are based on limited frequency band data. Such data are not sensitive to the true earth model, but to a smooth version of it. This smooth version can be related to the true model by the homogenization technique. Homogenization for wave propagation in deterministic media with no scale separation, such as geological media, has been recently developed. With such an asymptotic theory, it is possible to compute an effective medium valid for a given frequency band such that effective waveforms and true waveforms are the same up to a controlled error. In this work we make the link between limited frequency band inversion, mainly FWI, and homogenization. We establish the relation between a true model and an FWI result model. This relation is important for a proper interpretation of FWI images. We numerically illustrate, in the 2-D case, that an FWI result is at best the homogenized version of the true model. Moreover, it appears that the homogenized FWI model is quite independent of the FWI parametrization, as long as it has enough degrees of freedom. In particular, inverting for the full elastic tensor is, in each of our tests, always a good choice. We show how the homogenization can help to understand FWI behaviour and help to improve its robustness and convergence by efficiently constraining the solution space of the inverse problem.

  6. Homogenization and structural topology optimization theory, practice and software

    CERN Document Server

    Hassani, Behrooz

    1999-01-01

    Structural topology optimization is a fast-growing field that is finding numerous applications in automotive, aerospace and mechanical design processes. Homogenization is a mathematical theory with applications in several engineering problems that are governed by partial differential equations with rapidly oscillating coefficients. Homogenization and Structural Topology Optimization brings the two concepts together and successfully bridges the previously overlooked gap between the mathematical theory and the practical implementation of the homogenization method. The book is presented in a unique self-teaching style that includes numerous illustrative examples, figures and detailed explanations of concepts. The text is divided into three parts, which maintains the book's reader-friendly appeal.

  7. Economical preparation of extremely homogeneous nuclear accelerator targets

    International Nuclear Information System (INIS)

    Maier, H.J.

    1983-01-01

    Techniques for target preparation with a minimum consumption of isotopic material are described. The rotating substrate method, which generates extremely homogeneous targets, is discussed in some detail

  8. Homogenization of food samples for gamma spectrometry using tetramethylammonium hydroxide and enzymatic digestion

    International Nuclear Information System (INIS)

    Kimi Nishikawa; Abdul Bari; Abdul Jabbar Khan; Xin Li; Traci Menia; Semkow, T.M.

    2017-01-01

    We have developed a method of food sample preparation for gamma spectrometry involving the use of tetramethylammonium hydroxide (TMAH) and/or enzymes such as α-amylase or cellulase for sample homogenization. We demonstrated the effectiveness of this method using food matrices spiked with 60Co, 131I, 134,137Cs, and 241Am radionuclides, homogenized with TMAH (mixed salad, parmesan cheese, and ground beef); enzymes (α-amylase for bread, and cellulase for baked beans); or α-amylase followed by TMAH (cheeseburgers). Procedures were developed which are best compromises between the degree of homogenization, accuracy, speed, and minimizing laboratory equipment contamination. Based on calculated sample biases and z-scores, our results suggest that homogenization using TMAH and enzymes would be a useful method of sample preparation for gamma spectrometry samples during radiological emergencies. (author)
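
    The bias and z-score figures of merit mentioned here are conventionally computed as below. A minimal sketch using generic proficiency-test formulas (illustrative, not necessarily the exact expressions used in the study):

```python
import math

def relative_bias(measured, reference):
    """Percent deviation of a measured activity from the reference value."""
    return 100.0 * (measured - reference) / reference

def z_score(measured, reference, u_measured, u_reference):
    """Standardized deviation, combining both uncertainties in quadrature;
    |z| <= 2 is conventionally rated satisfactory in proficiency testing."""
    return (measured - reference) / math.sqrt(u_measured**2 + u_reference**2)
```

    For example, a measured activity of 110 Bq against a 100 Bq reference gives a +10% bias, and with combined uncertainties of 6 and 8 Bq a z-score of 1.0, well within the satisfactory range.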

  9. Estimation Methods for Non-Homogeneous Regression - Minimum CRPS vs Maximum Likelihood

    Science.gov (United States)

    Gebetsberger, Manuel; Messner, Jakob W.; Mayr, Georg J.; Zeileis, Achim

    2017-04-01

    Non-homogeneous regression models are widely used to statistically post-process numerical weather prediction models. Such regression models correct for errors in mean and variance and are capable of forecasting a full probability distribution. In order to estimate the corresponding regression coefficients, CRPS minimization has been performed in many meteorological post-processing studies over the last decade. In contrast to maximum likelihood estimation, CRPS minimization is claimed to yield more calibrated forecasts. Theoretically, both scoring rules, when used as optimization criteria, should be able to locate a similar and unknown optimum. Discrepancies might result from a wrong distributional assumption about the observed quantity. To address this theoretical concept, this study compares maximum likelihood and minimum CRPS estimation for different distributional assumptions. First, a synthetic case study shows that, for an appropriate distributional assumption, both estimation methods yield similar regression coefficients. The log-likelihood estimator is slightly more efficient. A real-world case study for surface temperature forecasts at different sites in Europe confirms these results but shows that surface temperature does not always follow the classical assumption of a Gaussian distribution. KEYWORDS: ensemble post-processing, maximum likelihood estimation, CRPS minimization, probabilistic temperature forecasting, distributional regression models
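
    The synthetic comparison described here can be reproduced in a few lines under the classical Gaussian assumption, using the closed-form CRPS of a normal distribution. A hedged sketch (the toy data, coefficients, and parametrization are illustrative, not from the study):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical toy data: one ensemble-mean predictor x, observations y.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = 1.0 + 2.0 * x + rng.normal(scale=0.8, size=500)

def predict(params):
    a, b, log_s = params
    return a + b * x, np.exp(log_s)          # mean linear in x, constant spread

def nll(params):                             # maximum likelihood objective
    mu, s = predict(params)
    return -norm.logpdf(y, mu, s).sum()

def crps(params):                            # closed-form Gaussian CRPS objective
    mu, s = predict(params)
    z = (y - mu) / s
    return np.sum(s * (z * (2 * norm.cdf(z) - 1)
                       + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi)))

ml_fit   = minimize(nll,  x0=[0.0, 1.0, 0.0]).x
crps_fit = minimize(crps, x0=[0.0, 1.0, 0.0]).x
```

    For correctly specified Gaussian data both estimators recover nearly identical coefficients, consistent with the abstract's synthetic result; discrepancies between the two appear when the distributional assumption is wrong.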

  10. Homogeneous crystal nucleation in polymers.

    Science.gov (United States)

    Schick, C; Androsch, R; Schmelzer, J W P

    2017-11-15

    The pathway of crystal nucleation significantly influences the structure and properties of semi-crystalline polymers. Crystal nucleation is normally heterogeneous at low supercooling, and homogeneous at high supercooling, of the polymer melt. Homogeneous nucleation in bulk polymers has been, so far, hardly accessible experimentally, and was even doubted to occur at all. This topical review summarizes experimental findings on homogeneous crystal nucleation in polymers. Recently developed fast scanning calorimetry, with cooling and heating rates up to 10^6 K s^-1, allows for detailed investigations of nucleation near and even below the glass transition temperature, including analysis of nuclei stability. As for other materials, the maximum homogeneous nucleation rate for polymers is located close to the glass transition temperature. In the experiments discussed here, it is shown that polymer nucleation is homogeneous at such temperatures. Homogeneous nucleation in polymers is discussed in the framework of the classical nucleation theory. The majority of our observations are consistent with the theory. The discrepancies may guide further research, particularly experiments to progress theoretical development. Progress in the understanding of homogeneous nucleation is much needed, since most of the modelling approaches dealing with polymer crystallization exclusively consider homogeneous nucleation. This is also the basis for advancing theoretical approaches to the much more complex phenomena governing heterogeneous nucleation.

  11. Converting homogeneous to heterogeneous in electrophilic catalysis using monodisperse metal nanoparticles.

    Science.gov (United States)

    Witham, Cole A; Huang, Wenyu; Tsung, Chia-Kuang; Kuhn, John N; Somorjai, Gabor A; Toste, F Dean

    2010-01-01

    A continuing goal in catalysis is to unite the advantages of homogeneous and heterogeneous catalytic processes. To this end, nanoparticles represent a new frontier in heterogeneous catalysis, where this unification can also be supplemented by the ability to obtain new or divergent reactivity and selectivity. We report a novel method for applying heterogeneous catalysts to known homogeneous catalytic reactions through the design and synthesis of electrophilic platinum nanoparticles. These nanoparticles are selectively oxidized by the hypervalent iodine species PhICl(2), and catalyse a range of π-bond activation reactions previously only catalysed through homogeneous processes. Multiple experimental methods are used to unambiguously verify the heterogeneity of the catalytic process. The discovery of treatments for nanoparticles that induce the desired homogeneous catalytic activity should lead to the further development of reactions previously inaccessible in heterogeneous catalysis. Furthermore, a size and capping agent study revealed that Pt PAMAM dendrimer-capped nanoparticles demonstrate superior activity and recyclability compared with larger, polymer-capped analogues.

  12. Converting Homogeneous to Heterogeneous in Electrophilic Catalysis using Monodisperse Metal Nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Witham, Cole A.; Huang, Wenyu; Tsung, Chia-Kuang; Kuhn, John N.; Somorjai, Gabor A.; Toste, F. Dean

    2009-10-15

    A continuing goal in catalysis is the transformation of processes from homogeneous to heterogeneous. To this end, nanoparticles represent a new frontier in heterogeneous catalysis, where this conversion is supplemented by the ability to obtain new or divergent reactivity and selectivity. We report a novel method for applying heterogeneous catalysts to known homogeneous catalytic reactions through the design and synthesis of electrophilic platinum nanoparticles. These nanoparticles are selectively oxidized by the hypervalent iodine species PhICl2, and catalyze a range of π-bond activation reactions previously only homogeneously catalyzed. Multiple experimental methods are utilized to unambiguously verify the heterogeneity of the catalytic process. The discovery of treatments for nanoparticles that induce the desired homogeneous catalytic activity should lead to the further development of reactions previously inaccessible in heterogeneous catalysis. Furthermore, our size and capping agent study revealed that Pt PAMAM dendrimer-capped nanoparticles demonstrate superior activity and recyclability compared to larger, polymer-capped analogues.

  13. Homogeneous Photodynamical Analysis of Kepler's Multiply-Transiting Systems

    Science.gov (United States)

    Ragozzine, Darin

    To search for planets more like our own, NASA's Kepler Space Telescope (Kepler) discovered thousands of exoplanet candidates that cross in front of ("transit") their parent stars (e.g., Twicken et al. 2016). The Kepler exoplanet data represent an incredible observational leap forward, as evidenced by hundreds of papers with thousands of citations. In particular, systems with multiple transiting planets combine the determination of physical properties of exoplanets (e.g., radii), the context provided by the system architecture, and insights from orbital dynamics. Such systems are the most information-rich exoplanetary systems (Ragozzine & Holman 2010). Thanks to Kepler's revolutionary dataset, understanding these Multi-Transiting Systems (MTSs) enables a wide variety of major science questions. Existing analyses of MTSs are incomplete and suboptimal; our efficient and timely proposal will provide significant scientific gains (100 new mass measurements and 100 updated mass measurements). Furthermore, our homogeneous analysis enables future statistical analyses, including those necessary to characterize the small-planet mass-radius relation, with implications for understanding the formation, evolution, and habitability of planets. The overarching goal of this proposal is a complete homogeneous investigation of Kepler MTSs to provide detailed measurements of (or constraints on) exoplanetary physical and orbital properties. Current investigations do not exploit the full power of the Kepler data; here we propose to use better data (Short Cadence observations), better methods (photodynamical modeling), and a better statistical method (Bayesian Differential Evolution Markov Chain Monte Carlo) in a homogeneous analysis of all 700 Kepler MTSs. These techniques are particularly valuable for understanding small terrestrial planets. We propose to extract the near-maximum amount of information from these systems through a series of three research objectives

  14. Ability Group Configuration for the High School Physics Classroom

    Science.gov (United States)

    Zitnik, Scott

    This research project looks to investigate the effectiveness of different ability grouping arrangements for the high school physics classroom. Students were first organized based on their academic aptitude in physics into three general groups of high, medium, and low achieving students. They were then divided into both groups of four and dyads that were constructed in one of four arrangements, namely: random, homogeneous, heterogeneous, or student choice. Data was collected based on their academic performance as well as survey responses regarding the group and dyad performance. Students worked in a rotation of these groups and dyads for a unit to measure student preference and introduce collaborative work formally to the classes. At this point it was evident that students preferred the student choice arrangement based on survey responses, yet the student choice survey responses also resulted in the lowest level of reliability when compared to all other grouping methods. For the next unit students were kept in either the random, homogeneous, or heterogeneous grouping arrangement for the entirety of the unit. At the conclusion of the second unit student achievement as well as survey responses were analyzed. As a result of this research there appears to be a slight student preference as well as academic benefit to homogeneous group and dyad arrangements for each of the three ability groups of students in the high school physics classroom when compared to random and heterogeneous grouping methods of academic group arrangement.

  15. Exact Polynomial Eigenmodes for Homogeneous Spherical 3-Manifolds

    OpenAIRE

    Weeks, Jeffrey R.

    2005-01-01

    Observational data hints at a finite universe, with spherical manifolds such as the Poincare dodecahedral space tentatively providing the best fit. Simulating the physics of a model universe requires knowing the eigenmodes of the Laplace operator on the space. The present article provides explicit polynomial eigenmodes for all globally homogeneous 3-manifolds: the Poincare dodecahedral space S3/I*, the binary octahedral space S3/O*, the binary tetrahedral space S3/T*, the prism manifolds S3/D...

  16. Stochastic model of milk homogenization process using Markov's chain

    Directory of Open Access Journals (Sweden)

    A. A. Khvostov

    2016-01-01

    Full Text Available The development of a mathematical model of the homogenization of dairy products is considered in this work. The theory of Markov chains was used to develop the model: a Markov chain with discrete states and a continuous parameter, for which the homogenization pressure is taken, forms the basis of the model structure. The model is implemented in the structural modeling environment MathWorks Simulink™. Identification of the model parameters was carried out by minimizing the standard deviation calculated from the experimental data for each fraction of the fat phase of the dairy product. The experimental data set consisted of the results of processing micrographic images of the fat-globule distribution in whole milk samples that had been homogenized at different pressures. The Pattern Search method, with the Latin Hypercube search algorithm from the Global Optimization Toolbox library, was used for optimization. The error of calculation, averaged over all fractions, was 0.88% (relative share of units); the maximum relative error was 3.7% at a homogenization pressure of 30 MPa, which may be due to the very abrupt change in the particle size distribution relative to the original milk at the beginning of the homogenization process, and to the lack of experimental data at homogenization pressures below this value. The proposed mathematical model makes it possible to calculate the profile of the volume and mass distribution of the fat phase (fat globules) in the product as a function of homogenization pressure, and can be used in laboratory research on dairy product composition, as well as in the calculation, design and modeling of process equipment for dairy industry enterprises.
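
    The model structure described (discrete size-fraction states, homogenization pressure as the chain's continuous parameter) can be sketched with a generator matrix and a matrix exponential. The three fractions and the rate values below are invented for illustration; the paper identifies its actual rates from the fat-globule micrographs:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical states: coarse, medium, fine fat-globule fractions.
# Generator matrix Q (rows sum to zero); off-diagonal entries are breakup
# rates per MPa of homogenization pressure, the chain's continuous parameter.
Q = np.array([
    [-0.20,  0.15,  0.05],   # coarse -> medium, coarse -> fine
    [ 0.00, -0.10,  0.10],   # medium -> fine
    [ 0.00,  0.00,  0.00],   # fine is absorbing (no further breakup)
])

def fraction_profile(p0, Q, pressure_mpa):
    """Distribution of fat over size fractions after homogenization at the
    given pressure: p(P) = p0 @ exp(Q * P)."""
    return p0 @ expm(Q * pressure_mpa)

p0 = np.array([1.0, 0.0, 0.0])        # raw milk: all fat in the coarse fraction
p30 = fraction_profile(p0, Q, 30.0)   # fraction profile at 30 MPa
```

    Because each row of Q sums to zero, the resulting matrix exponential is stochastic, so the fat mass is conserved across fractions at every pressure.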

  17. 7 CFR 58.636 - Homogenization.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Homogenization. 58.636 Section 58.636 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards... Procedures § 58.636 Homogenization. Homogenization of the pasteurized mix shall be accomplished to...

  18. Steady- and transient-state analysis of fully ceramic microencapsulated fuel with randomly dispersed tristructural isotropic particles via two-temperature homogenized model-I: Theory and method

    International Nuclear Information System (INIS)

    Lee, Yoon Hee; Cho, Bum Hee; Cho, Nam Zin

    2016-01-01

    As a type of accident-tolerant fuel, fully ceramic microencapsulated (FCM) fuel was proposed after the Fukushima accident in Japan. The FCM fuel consists of tristructural isotropic particles randomly dispersed in a silicon carbide (SiC) matrix. For a fuel element with such high heterogeneity, we have proposed a two-temperature homogenized model using the particle transport Monte Carlo method for the heat conduction problem. This model distinguishes between fuel-kernel and SiC matrix temperatures. Moreover, the obtained temperature profiles are more realistic than those of other models. In Part I of the paper, homogenized parameters for the FCM fuel in which tristructural isotropic particles are randomly dispersed in the fine lattice stochastic structure are obtained by (1) matching steady-state analytic solutions of the model with the results of particle transport Monte Carlo method for heat conduction problems, and (2) preserving total enthalpies in fuel kernels and SiC matrix. The homogenized parameters have two desirable properties: (1) they are insensitive to boundary conditions such as coolant bulk temperatures and thickness of cladding, and (2) they are independent of operating power density. By performing the Monte Carlo calculations with the temperature-dependent thermal properties of the constituent materials of the FCM fuel, temperature-dependent homogenized parameters are obtained

  19. Homogenization of neutronic diffusion models; Homogeneisation des modeles de diffusion en neutronique

    Energy Technology Data Exchange (ETDEWEB)

    Capdebosq, Y

    1999-09-01

    In order to study and simulate nuclear reactor cores, one needs access to the neutron distribution in the core. In practice, this neutron density is described by a system of diffusion equations coupled by non-differential exchange terms. The strong heterogeneity of the medium is a major obstacle to computing these models numerically at reasonable cost, so homogenization is indispensable. Heuristic methods have been developed from the beginning by nuclear physicists under a periodicity assumption on the coefficients. They consist in performing a fine computation on a single periodicity cell, solving the system on the whole domain with homogeneous coefficients, and reconstructing the neutron density by multiplying the solutions of the two computations. The objectives of this work are to provide a mathematically rigorous basis for this factorization method, to obtain exact formulas for the homogenized coefficients, and to begin the study of geometries in which two periodic media are placed side by side. The first result of this thesis concerns the eigenvalue problem models used to characterize the criticality of the reactor, under a symmetry assumption on the coefficients. The convergence of the homogenization process is proved, and formulas for the homogenized coefficients are given. We then show that without the symmetry assumption a drift phenomenon appears; it is characterized by means of a real Bloch wave method, which gives the homogenized limit in the general case. These results for the critical problem are then adapted to the evolution model. Finally, the homogenization of the critical problem in the case of two side-by-side periodic media is studied on a one-dimensional one-equation model. (authors)

  20. Homogenization of Stokes and Navier-Stokes equations

    International Nuclear Information System (INIS)

    Allaire, G.

    1990-04-01

    This thesis is devoted to the homogenization of Stokes and Navier-Stokes equations with a Dirichlet boundary condition in a domain containing many tiny obstacles. Typically the obstacles are distributed at the nodes of a periodic lattice with the same small period in each axis direction, and their size is always asymptotically smaller than the lattice step. With the help of the energy method, and thanks to a suitable extension of the pressure, we prove the convergence of the homogenization process when the lattice step tends to zero (and thus the number of obstacles tends to infinity). For a so-called critical size of the obstacles, the homogenized problem turns out to be a Brinkman law (i.e., the Stokes or Navier-Stokes equations plus a linear zero-order term for the velocity in the momentum equation). For obstacles smaller than the critical size, the limit problem reduces to the initial Stokes or Navier-Stokes equations, while for larger sizes the homogenized problem becomes a Darcy law. Furthermore, these results have been extended to the case of obstacles contained in a hyperplane, and we establish a simple model of fluid flow through grids, based on a special form of the Brinkman law [fr]

  1. Value distribution of meromorphic solutions of homogeneous and non-homogeneous complex linear differential-difference equations

    Directory of Open Access Journals (Sweden)

    Luo Li-Qin

    2016-01-01

    Full Text Available In this paper, we investigate the value distribution of meromorphic solutions of homogeneous and non-homogeneous complex linear differential-difference equations, and obtain results on the relations between the order of the solutions and the convergence exponents of the zeros, poles, a-points and small-function value points of the solutions. These results show that the relations in the non-homogeneous case are sharper than those in the homogeneous case.

  2. A method to eliminate wetting during the homogenization of HgCdTe

    Science.gov (United States)

    Su, Ching-Hua; Lehoczky, S. L.; Szofran, F. R.

    1986-01-01

    Adhesion of HgCdTe samples to fused silica ampoule walls, or 'wetting', during the homogenization process was eliminated by adopting a slower heating rate. The idea is to decrease Cd activity in the sample so as to reduce the rate of reaction between Cd and the silica wall.

  3. Improvement of the field homogeneity with a permanent magnet assembly for MRI

    International Nuclear Information System (INIS)

    Sakurai, H.; Aoki, M.; Miyamoto, T.

    1990-01-01

    In the last few years, MRI (Magnetic Resonance Imaging) has become one of the most important radiological diagnostic methods. For this application, a strong and uniform magnetic field is required in the area where the patient is examined. The requirement for a high order of homogeneity is increasing with the rapid progress of tomographic technology. On the other hand, cost reduction of the magnet is also strongly demanded. As reported in a previous paper, we developed and mass-produced a permanent-magnet type using high-energy Nd-Fe-B material. This paper presents a newly developed 15-plane measuring method, replacing the earlier 7-plane method, to evaluate the homogeneous field precisely. Using this analytical method together with linear programming, a pole piece with a new shape has been developed. As a consequence, homogeneity was improved by a factor of two and the magnet weight was reduced by 10% compared with the formerly developed pole piece. (author)

  4. Advanced analysis methods in particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  5. Simulation of ferric ion transfer in Fricke-Xylenol-Gel dosimeters in non-homogeneous media

    International Nuclear Information System (INIS)

    Milani, Caio J.; Bevilacqua, Joyce da Silva; Cavinato, Christianne C.; Rodrigues Junior, Orlando; Campos, Leticia L.

    2013-01-01

    Dosimetry in three dimensions using Fricke-Xylenol-Gel (FXG) dosimeters allows the confirmation and a better understanding of a radiotherapy treatment. The technique involves assessing the irradiated volumes by magnetic resonance imaging (MRI) or optical CT. In both cases, the time elapsed between irradiation and measurement is an important factor in the quality of the results. Image quality can be compromised by the mobility of the ferric ions (Fe3+) formed during the interaction of radiation with matter, which increases the uncertainty in determining the isodoses in the volume. In this work, the diffusion of the ferric ions out of an irradiated region is simulated in a two-dimensional domain. The dynamics of Fe3+ in Fricke gel is modeled by a parabolic partial differential equation and solved by the ADI Peaceman-Rachford algorithm. Stability and consistency of the method guarantee convergence of the numerical solution to within a pre-defined error magnitude, based on the chosen time and space discretization steps. Homogeneous and non-homogeneous cases are presented; the non-homogeneous case considers an irradiated region together with a physical barrier that prevents the movement of the ions. Graphical visualizations of the phenomenon are presented for better understanding of the process. (author)
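
    The Peaceman-Rachford ADI scheme named above splits each time step of the 2-D diffusion equation into two half-steps, each implicit in one coordinate direction only, so every half-step reduces to independent tridiagonal solves. A minimal sketch for u_t = D(u_xx + u_yy) with zero Dirichlet boundaries (grid size and diffusivity are illustrative, not the dosimeter parameters):

```python
import numpy as np
from scipy.linalg import solve_banded

def adi_step(u, D, dt, h):
    """One Peaceman-Rachford ADI step for u_t = D*(u_xx + u_yy) on the
    interior of a square grid with zero Dirichlet boundary values."""
    n = u.shape[0]
    r = D * dt / (2.0 * h * h)
    # Banded storage of the tridiagonal implicit operator (I - r*delta2)
    ab = np.zeros((3, n))
    ab[0, 1:] = -r          # superdiagonal
    ab[1, :] = 1.0 + 2.0 * r
    ab[2, :-1] = -r         # subdiagonal

    def explicit(v):
        """(I + r*delta2) applied along axis 0 (zero values outside the grid)."""
        w = (1.0 - 2.0 * r) * v
        w[1:] += r * v[:-1]
        w[:-1] += r * v[1:]
        return w

    u = solve_banded((1, 1), ab, explicit(u.T).T)  # implicit in x, explicit in y
    u = solve_banded((1, 1), ab, explicit(u).T).T  # implicit in y, explicit in x
    return u

# Demo: the eigenmode sin(pi x)*sin(pi y) decays as exp(-2*D*pi^2*t)
n, D, dt = 32, 1.0, 1e-3
h = 1.0 / (n + 1)
x = np.arange(1, n + 1) * h
u = np.outer(np.sin(np.pi * x), np.sin(np.pi * x))
u0_max = u.max()
for _ in range(10):
    u = adi_step(u, D, dt, h)
decay = u.max() / u0_max    # close to exp(-2*pi^2*0.01)
```

    Unconditional stability is what makes ADI attractive here: the time step is not limited by the fine spatial grid needed to resolve the isodose boundaries.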

  6. Fourier-Accelerated Nodal Solvers (FANS) for homogenization problems

    Science.gov (United States)

    Leuschner, Matthias; Fritzen, Felix

    2017-11-01

    Fourier-based homogenization schemes are useful to analyze heterogeneous microstructures represented by 2D or 3D image data. These iterative schemes involve discrete periodic convolutions with global ansatz functions (mostly fundamental solutions). The convolutions are efficiently computed using the fast Fourier transform. FANS operates on nodal variables on regular grids and converges to finite element solutions. Compared to established Fourier-based methods, the number of convolutions is reduced by FANS. Additionally, fast iterations are possible by assembling the stiffness matrix. Due to the related memory requirement, the method is best suited for medium-sized problems. A comparative study involving established Fourier-based homogenization schemes is conducted for a thermal benchmark problem with a closed-form solution. Detailed technical and algorithmic descriptions are given for all methods considered in the comparison. Furthermore, many numerical examples focusing on convergence properties for both thermal and mechanical problems, including also plasticity, are presented.
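
    For context, the classical fixed-point scheme of Moulinec and Suquet, representative of the established Fourier-based methods FANS is compared against, can be written in a few lines for a scalar thermal cell problem. This is a generic sketch of that basic scheme (not the FANS algorithm itself), with an illustrative two-phase laminate as the test microstructure:

```python
import numpy as np

def effective_flux(k, E=(1.0, 0.0), tol=1e-8, maxit=500):
    """Basic Moulinec-Suquet fixed-point scheme for the periodic thermal cell
    problem on a 2-D pixel image of conductivities k(x).
    Returns the mean flux <q> = K_eff @ E under the mean gradient E."""
    n1, n2 = k.shape
    k0 = 0.5 * (k.min() + k.max())            # reference conductivity
    xi1 = np.fft.fftfreq(n1)[:, None] + np.zeros((1, n2))
    xi2 = np.zeros((n1, 1)) + np.fft.fftfreq(n2)[None, :]
    xi_sq = xi1**2 + xi2**2
    xi_sq[0, 0] = 1.0                          # zero mode handled separately
    e1 = np.full(k.shape, E[0])                # temperature-gradient field
    e2 = np.full(k.shape, E[1])
    for _ in range(maxit):
        q1h, q2h = np.fft.fft2(k * e1), np.fft.fft2(k * e2)
        proj = (xi1 * q1h + xi2 * q2h) / (k0 * xi_sq)  # Green operator Gamma0*q
        e1h = np.fft.fft2(e1) - xi1 * proj
        e2h = np.fft.fft2(e2) - xi2 * proj
        e1h[0, 0], e2h[0, 0] = E[0] * n1 * n2, E[1] * n1 * n2  # enforce mean E
        e1n, e2n = np.fft.ifft2(e1h).real, np.fft.ifft2(e2h).real
        err = max(np.abs(e1n - e1).max(), np.abs(e2n - e2).max())
        e1, e2 = e1n, e2n
        if err < tol:
            break
    return (k * e1).mean(), (k * e2).mean()

# Two-phase laminate (layers normal to x): the exact effective conductivity
# along x is the harmonic mean, here 2/(1/1 + 1/4) = 1.6
k = np.ones((32, 32)); k[16:, :] = 4.0
qx, qy = effective_flux(k)
```

    The two FFTs per iteration are the "discrete periodic convolutions with global ansatz functions" the abstract refers to; FANS replaces the global Green operator with nodal finite-element-consistent quantities.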

  7. Comparative Evaluation of Three Homogenization Methods for Isolating Middle East Respiratory Syndrome Coronavirus Nucleic Acids From Sputum Samples for Real-Time Reverse Transcription PCR.

    Science.gov (United States)

    Sung, Heungsup; Yong, Dongeun; Ki, Chang Seok; Kim, Jae Seok; Seong, Moon Woo; Lee, Hyukmin; Kim, Mi Na

    2016-09-01

    Real-time reverse transcription PCR (rRT-PCR) of sputum samples is commonly used to diagnose Middle East respiratory syndrome coronavirus (MERS-CoV) infection. Owing to the difficulty of extracting RNA from sputum containing mucus, sputum homogenization is desirable prior to nucleic acid isolation. We determined optimal homogenization methods for isolating viral nucleic acids from sputum. We evaluated the following three sputum-homogenization methods: proteinase K and DNase I (PK-DNase) treatment, phosphate-buffered saline (PBS) treatment, and N-acetyl-L-cysteine and sodium citrate (NALC) treatment. Sputum samples were spiked with inactivated MERS-CoV culture isolates. RNA was extracted from pretreated, spiked samples using the easyMAG system (bioMérieux, France). Extracted RNAs were then subjected to rRT-PCR for MERS-CoV diagnosis (DiaPlex Q MERS-coronavirus, SolGent, Korea). When analyzing 15 spiked sputum samples prepared in technical duplicate, false-negative results were obtained with five (16.7%) and four samples (13.3%), respectively, by using the PBS and NALC methods. The range of threshold cycle (Ct) values observed when detecting upE in sputum samples was 31.1-35.4 with the PK-DNase method, 34.7-39.0 with the PBS method, and 33.9-38.6 with the NALC method. Compared with the control, which was prepared by adding a one-tenth volume of 1:1,000-diluted viral culture to PBS solution, the ranges of Ct values obtained by the PBS and NALC methods differed significantly from the mean control Ct of 33.2 (both P values significant). In conclusion, the PK-DNase method is suitable for homogenizing sputum samples prior to RNA extraction.

  8. Characterization of herbal powder blends homogeneity using near-infrared spectroscopy

    Directory of Open Access Journals (Sweden)

    Wenlong Li

    2014-11-01

    Full Text Available Homogeneity of a powder blend is essential to obtain uniform contents in tablets and capsules. Near-infrared (NIR) spectroscopy with a fiber-optic probe was used as an on-line technique for monitoring the homogeneity of a pharmaceutical blend during the blending process, instead of traditional techniques such as the high performance liquid chromatography (HPLC) method. In this paper NIRS with a SabIR diffuse reflectance fiber-optic probe was used to monitor the blending of coptis powder and lactose (excipient) at different contents, and qualitative metrics such as similarity, moving block standard deviation and mean square were calculated from the collected spectra after pretreatment by multiplicative signal correction (MSC) and second derivative. The correlation spectrum was used for wavelength selection. Four different coptis samples were blended with lactose separately to validate the proposed method, and the blending process of the "liu wei di huang" pill was also simulated in bottles to verify the method on multiple herbal blends. The overall results suggest that NIRS is a simple, effective and noninvasive technique that can be successfully applied to the determination of homogeneity in herbal blends.
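
    The moving block standard deviation (MBSD) criterion mentioned above is simple to state: for each window of consecutive spectra, compute the standard deviation at every wavelength and average over wavelengths; the blend is considered homogeneous once the value settles near zero. A small sketch with synthetic spectra (the array values and block length are illustrative):

```python
import numpy as np

def mbsd(spectra, block=3):
    """Moving block standard deviation over time-ordered spectra (rows).
    Returns one value per window: the wavelength-wise standard deviation over
    the window, averaged across wavelengths. Near-zero values indicate that
    consecutive spectra no longer change, i.e. the blend is homogeneous."""
    spectra = np.asarray(spectra, dtype=float)
    n = spectra.shape[0]
    return np.array([
        spectra[i:i + block].std(axis=0, ddof=1).mean()
        for i in range(n - block + 1)
    ])

# Synthetic blending run: spectra fluctuate early, then stabilize.
spectra = np.array([
    [0.10, 0.50, 0.90],
    [0.30, 0.40, 0.70],
    [0.20, 0.45, 0.80],
    [0.22, 0.44, 0.79],
    [0.22, 0.44, 0.79],
    [0.22, 0.44, 0.79],   # last three scans identical: blend homogeneous
])
profile = mbsd(spectra, block=3)
```

    In practice an end-point threshold on the MBSD profile (rather than exact zero) is chosen to account for instrument noise.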

  9. Rheological properties and physical stability of ecological emulsions stabilized by a surfactant derived from cocoa oil and high pressure homogenization

    Directory of Open Access Journals (Sweden)

    Trujillo-Cayado, L. A.

    2015-09-01

    Full Text Available The goal of this work was to investigate the influence of the emulsification method on the rheological properties, droplet size distribution and physical stability of O/W green emulsions formulated with an eco-friendly surfactant derived from cocoa oil. The methodology used can be applied to other emulsions. Polyoxyethylene glycerol esters are non-ionic surfactants obtained from a renewable source which fulfill the environmental and toxicological requirements for use as eco-friendly emulsifying agents. Likewise, N,N-Dimethyloctanamide and α-Pinene (the solvents used as the oil phase) can be considered green solvents. Emulsions with submicron mean diameters and slightly shear-thinning behavior were obtained regardless of the homogenizer, pressure or number of passes used. All emulsions exhibited destabilization by creaming and a further coalescence process, the latter affecting the coarse emulsion prepared with a rotor-stator homogenizer. The emulsion obtained with high-pressure homogenization at 15000 psi and one pass was the most stable.

  10. Geostatistical Analysis Methods for Estimation of Environmental Data Homogeneity

    Directory of Open Access Journals (Sweden)

    Aleksandr Danilov

    2018-01-01

    Full Text Available A methodology for assessing the spatial homogeneity of ecosystems, with the possibility of subsequent zoning of territories by degree of environmental disturbance, is considered in this study. The degree of pollution of a water body was reconstructed on the basis of hydrochemical monitoring data and information on the level of technogenic load in one year. As a result, the zones of greatest environmental stress were isolated, and the correctness of zoning using geostatistical analysis techniques was demonstrated. The computational algorithm was implemented in the object-oriented language C#. The resulting software application allows a quick assessment of the scale and spatial localization of pollution during the initial analysis of an environmental situation.
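
    One of the basic geostatistical tools behind such zoning is the experimental semivariogram, which measures how quickly a spatial field decorrelates with distance; a spatially homogeneous field produces a flat, low variogram at all lags. A generic sketch (the lag bins and grid data are illustrative, not the study's monitoring data):

```python
import numpy as np

def experimental_variogram(coords, values, bin_edges):
    """Isotropic experimental semivariogram: for each lag bin, the mean of
    0.5*(z_i - z_j)^2 over all point pairs whose separation falls in the bin."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    i, j = np.triu_indices(len(values), k=1)      # all unordered point pairs
    dist = np.linalg.norm(coords[i] - coords[j], axis=1)
    half_sq = 0.5 * (values[i] - values[j]) ** 2
    which = np.digitize(dist, bin_edges)          # lag-bin index for each pair
    return np.array([
        half_sq[which == b].mean() if np.any(which == b) else np.nan
        for b in range(1, len(bin_edges))
    ])

# Perfectly homogeneous field on a 5x5 grid: the variogram is zero at all lags
xx, yy = np.meshgrid(np.arange(5.0), np.arange(5.0))
coords = np.column_stack([xx.ravel(), yy.ravel()])
gamma = experimental_variogram(coords, np.ones(25), bin_edges=[0.0, 1.5, 3.0, 6.0])
```

    A rising variogram with a clear range, by contrast, indicates structured heterogeneity and motivates zoning by kriging.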

  11. Homogeneity study of fixed-point continuous marine environmental and meteorological data: a review

    Science.gov (United States)

    Yang, Jinkun; Yang, Yang; Miao, Qingsheng; Dong, Mingmei; Wan, Fangfang

    2018-02-01

    The principle of inhomogeneity and the classification of homogeneity test methods are briefly described, and several common inhomogeneity detection methods and their relative merits are discussed in detail. Based on applications of the different homogeneity methods to surface meteorological data and marine environmental data, the present status and progress are then reviewed. At present, homogeneity research on radiosonde and surface meteorological data is mature both in China and abroad, and research and application to marine environmental data should also receive full attention. Carrying out a variety of test and correction methods, combined with the use of a multi-mode test system, will make the results more reasonable and scientific, and can provide accurate first-hand information for coastal climate change research.
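
    Among the break-detection tests such reviews typically cover, the Standard Normal Homogeneity Test (SNHT, Alexandersson 1986) is one of the most widely used: it scans all candidate break positions of a standardized series and keeps the one maximizing a two-segment statistic. A compact sketch (the critical values needed for a significance decision are tabulated in the literature and omitted here):

```python
import numpy as np

def snht(x):
    """SNHT statistic T(k) = k*z1^2 + (n-k)*z2^2 over all split points k,
    where z1 and z2 are the means of the standardized series before and
    after the split. Returns (T_max, k_break); a large T_max, compared with
    tabulated critical values, suggests a break after sample k_break."""
    x = np.asarray(x, dtype=float)
    n = x.size
    z = (x - x.mean()) / x.std(ddof=1)
    T = np.array([
        k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
        for k in range(1, n)
    ])
    k_break = int(np.argmax(T)) + 1
    return T[k_break - 1], k_break

# Synthetic station series with a mean shift after sample 50
series = np.concatenate([np.zeros(50), np.ones(50)])
t_max, k_break = snht(series)
```

    Applied to a candidate-minus-reference difference series, the same statistic locates instrument changes or station relocations in long marine or meteorological records.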

  12. Homogeneity spoil spectroscopy

    International Nuclear Information System (INIS)

    Hennig, J.; Boesch, C.; Martin, E.; Grutter, R.

    1987-01-01

    One of the problems of in vivo MR spectroscopy of P-31 is spectra localization. Surface coil spectroscopy, which is the method of choice for clinical applications, suffers from the high-intensity signal from subcutaneous muscle tissue, which masks the spectrum of interest from deeper structures. In order to suppress this signal while maintaining the simplicity of surface coil spectroscopy, the authors introduced a small sheet of ferromagnetically dotted plastic between the surface coil and the body. This sheet destroys locally the field homogeneity and therefore all signal from structures around the coil. The very high reproducibility of the simple experimental procedure allows long-term studies important for monitoring tumor therapy

  13. A numerical homogenization method for heterogeneous, anisotropic elastic media based on multiscale theory

    KAUST Repository

    Gao, Kai; Chung, Eric T.; Gibson, Richard L.; Fu, Shubin; Efendiev, Yalchin R.

    2015-01-01

    The development of reliable methods for upscaling fine-scale models of elastic media has long been an important topic for rock physics and applied seismology. Several effective medium theories have been developed to provide elastic parameters

  14. Geometric Methods in Physics XXXV

    CERN Document Server

    Odzijewicz, Anatol; Previato, Emma

    2018-01-01

    This book features a selection of articles based on the XXXV Białowieża Workshop on Geometric Methods in Physics, 2016. The series of Białowieża workshops, attended by a community of experts at the crossroads of mathematics and physics, is a major annual event in the field. The works in this book, based on presentations given at the workshop, are previously unpublished, at the cutting edge of current research, typically grounded in geometry and analysis, and with applications to classical and quantum physics. In 2016 the special session "Integrability and Geometry" in particular attracted pioneers and leading specialists in the field. Traditionally, the Białowieża Workshop is followed by a School on Geometry and Physics, for advanced graduate students and early-career researchers, and the book also includes extended abstracts of the lecture series.

  15. FEATURES OF METHODS OF FUTURE PHYSICAL CULTURE TEACHERS’ TRAINING FOR PHYSICAL EDUCATION OF HIGH SCHOOL STUDENTS

    Directory of Open Access Journals (Sweden)

    Петро Джуринський

    2014-12-01

    Full Text Available The paper presents methodical approaches and recommendations on implementing methods of training future Physical Culture teachers for the physical education of high school students into the study process at a higher educational institution. The role of the approbated study discipline "Theory and methods of physical education at high school" is determined in this research. It is also shown that future Physical Culture teachers' training for the physical education of high school students is a system of organizational and educational measures ensuring the formation of the future teacher's professional knowledge and skills. The article defines the tasks, criteria, tools, forms, pedagogical conditions and stages of students' training for teaching Physical Education classes to high school students. Approbation of the methodical approaches to future Physical Culture teachers' training for the physical education of high school students demonstrated their efficacy.

  16. The nonlinear Galerkin method: A multi-scale method applied to the simulation of homogeneous turbulent flows

    Science.gov (United States)

    Debussche, A.; Dubois, T.; Temam, R.

    1993-01-01

    Using results of Direct Numerical Simulation (DNS) in the case of two-dimensional homogeneous isotropic flows, the behavior of the small and large scales of Kolmogorov-like flows at moderate Reynolds numbers is first analyzed in detail. Several estimates of the time variations of the small eddies and of the nonlinear interaction terms are derived; those terms play the role of the Reynolds stress tensor in the case of LES. Since the time step of a numerical scheme is determined as a function of the energy-containing eddies of the flow, the variations of the small scales and of the nonlinear interaction terms over one iteration can become negligible by comparison with the accuracy of the computation. Based on this remark, a multilevel scheme which treats the small and the large eddies differently is proposed. Using mathematical developments, estimates are derived for all the parameters involved in the algorithm, which then becomes a completely self-adaptive procedure. Finally, realistic simulations of (Kolmogorov-like) flows over several eddy-turnover times are performed. The results are analyzed in detail and a parametric study of the nonlinear Galerkin method is carried out.

  17. The Homogeneous Interior-Point Algorithm: Nonsymmetric Cones, Warmstarting, and Applications

    DEFF Research Database (Denmark)

    Skajaa, Anders

    algorithms for these problems is still limited. The goal of this thesis is to investigate and shed light on two computational aspects of homogeneous interior-point algorithms for convex conic optimization: The first part studies the possibility of devising a homogeneous interior-point method aimed at solving problems involving constraints that require nonsymmetric cones in their formulation. The second part studies the possibility of warmstarting the homogeneous interior-point algorithm for conic problems. The main outcome of the first part is the introduction of a completely new homogeneous interior-point algorithm designed to solve nonsymmetric convex conic optimization problems. The algorithm is presented in detail and then analyzed. We prove its convergence and complexity. From a theoretical viewpoint, it is fully competitive with other algorithms and from a practical viewpoint, we show that it holds lots...

  18. Synthesis of focused beam with controllable arbitrary homogeneous polarization using engineered vectorial optical fields.

    Science.gov (United States)

    Rui, Guanghao; Chen, Jian; Wang, Xiaoyan; Gu, Bing; Cui, Yiping; Zhan, Qiwen

    2016-10-17

    The propagation and focusing properties of light beams continue to attract research interest owing to their promising applications in physics, chemistry and the biological sciences. One of the main challenges in these applications is the control of the polarization distribution within the focal volume. In this work, we propose and experimentally demonstrate a method for generating a focused beam with arbitrary homogeneous polarization at any transverse plane. The required input field at the pupil plane of a high numerical aperture objective lens can be found analytically by solving an inverse problem with the Richards-Wolf vectorial diffraction method, and can be experimentally created with a vectorial optical field generator. Focused fields with various polarizations are successfully generated and verified using Stokes parameter measurements, demonstrating the capability and versatility of the proposed technique.

  19. Reflector homogenization

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, R.; Ragusa, J.; Santandrea, S. [Commissariat a l' Energie Atomique, Direction de l' Energie Nucleaire, Service d' Etudes de Reacteurs et de Modelisation Avancee, CEA de Saclay, DM2S/SERMA 91 191 Gif-sur-Yvette cedex (France)]. e-mail: richard.sanchez@cea.fr

    2004-07-01

    The problem of the determination of a homogeneous reflector that preserves a set of prescribed albedo is considered. Duality is used for a direct estimation of the derivatives needed in the iterative calculation of the optimal homogeneous cross sections. The calculation is based on the preservation of collapsed multigroup albedo obtained from detailed reference calculations and depends on the low-order operator used for core calculations. In this work we analyze diffusion and transport as low-order operators and argue that the P{sub 0} transfers are the best choice for the unknown cross sections to be adjusted. Numerical results illustrate the new approach for SP{sub N} core calculations. (Author)

  20. Reflector homogenization

    International Nuclear Information System (INIS)

    Sanchez, R.; Ragusa, J.; Santandrea, S.

    2004-01-01

    The problem of the determination of a homogeneous reflector that preserves a set of prescribed albedo is considered. Duality is used for a direct estimation of the derivatives needed in the iterative calculation of the optimal homogeneous cross sections. The calculation is based on the preservation of collapsed multigroup albedo obtained from detailed reference calculations and depends on the low-order operator used for core calculations. In this work we analyze diffusion and transport as low-order operators and argue that the P 0 transfers are the best choice for the unknown cross sections to be adjusted. Numerical results illustrate the new approach for SP N core calculations. (Author)

  1. Homogeneous CdTe quantum dots-carbon nanotubes heterostructures

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Kayo Oliveira [Grupo de Pesquisa em Química de Materiais – (GPQM), Departamento de Ciências Naturais, Universidade Federal de São João del-Rei, Campus Dom Bosco, Praça Dom Helvécio, 74, CEP 36301-160, São João del-Rei, MG (Brazil); Bettini, Jefferson [Laboratório Nacional de Nanotecnologia, Centro Nacional de Pesquisa em Energia e Materiais, CEP 13083-970, Campinas, SP (Brazil); Ferrari, Jefferson Luis [Grupo de Pesquisa em Química de Materiais – (GPQM), Departamento de Ciências Naturais, Universidade Federal de São João del-Rei, Campus Dom Bosco, Praça Dom Helvécio, 74, CEP 36301-160, São João del-Rei, MG (Brazil); Schiavon, Marco Antonio, E-mail: schiavon@ufsj.edu.br [Grupo de Pesquisa em Química de Materiais – (GPQM), Departamento de Ciências Naturais, Universidade Federal de São João del-Rei, Campus Dom Bosco, Praça Dom Helvécio, 74, CEP 36301-160, São João del-Rei, MG (Brazil)

    2015-01-15

    The development of homogeneous CdTe quantum dots-carbon nanotubes heterostructures based on electrostatic interactions has been investigated. We report a simple and reproducible non-covalent functionalization route that can be accomplished at room temperature, to prepare colloidal composites consisting of CdTe nanocrystals deposited onto multi-walled carbon nanotubes (MWCNTs) functionalized with a thin layer of polyelectrolytes by the layer-by-layer technique. Specifically, physical adsorption of polyelectrolytes such as poly(4-styrene sulfonate) and poly(diallyldimethylammonium chloride) was used to deagglomerate and disperse MWCNTs, onto which we deposited CdTe quantum dots coated with mercaptopropionic acid (MPA) as surface ligand, via electrostatic interactions. Confirmation of the CdTe quantum dots/carbon nanotubes heterostructures was obtained by transmission and scanning electron microscopies (TEM and SEM) and dynamic light scattering (DLS), together with absorption, emission, Raman and infrared spectroscopies (UV-vis, PL, Raman and FT-IR). Almost complete quenching of the PL band of the CdTe quantum dots was observed after adsorption on the MWCNTs, presumably through an efficient energy transfer process from photoexcited CdTe to MWCNTs. - Highlights: • Highly homogeneous CdTe-carbon nanotubes heterostructures were prepared. • Simple and reproducible non-covalent functionalization route. • CdTe nanocrystals homogeneously deposited onto multi-walled carbon nanotubes. • Efficient energy transfer process from photoexcited CdTe to MWCNTs.

  2. Steady- and Transient-State Analyses of Fully Ceramic Microencapsulated Fuel with Randomly Dispersed Tristructural Isotropic Particles via Two-Temperature Homogenized Model—I: Theory and Method

    Directory of Open Access Journals (Sweden)

    Yoonhee Lee

    2016-06-01

    Full Text Available As a type of accident-tolerant fuel, fully ceramic microencapsulated (FCM) fuel was proposed after the Fukushima accident in Japan. The FCM fuel consists of tristructural isotropic particles randomly dispersed in a silicon carbide (SiC) matrix. For a fuel element with such high heterogeneity, we have proposed a two-temperature homogenized model using the particle transport Monte Carlo method for the heat conduction problem. This model distinguishes between fuel-kernel and SiC matrix temperatures. Moreover, the obtained temperature profiles are more realistic than those of other models. In Part I of the paper, homogenized parameters for the FCM fuel in which tristructural isotropic particles are randomly dispersed in the fine lattice stochastic structure are obtained by (1) matching steady-state analytic solutions of the model with the results of the particle transport Monte Carlo method for heat conduction problems, and (2) preserving total enthalpies in the fuel kernels and SiC matrix. The homogenized parameters have two desirable properties: (1) they are insensitive to boundary conditions such as coolant bulk temperatures and thickness of cladding, and (2) they are independent of operating power density. By performing the Monte Carlo calculations with the temperature-dependent thermal properties of the constituent materials of the FCM fuel, temperature-dependent homogenized parameters are obtained.

  3. Comparison of different homogenization approaches for elastic–viscoplastic materials

    International Nuclear Information System (INIS)

    Mercier, S; Molinari, A; Berbenni, S; Berveiller, M

    2012-01-01

    Homogenization of linear viscoelastic and non-linear viscoplastic composite materials is considered in this paper. First, we compare two homogenization schemes based on the Mori–Tanaka method coupled with the additive interaction (AI) law proposed by Molinari et al (1997 Mech. Mater. 26 43–62) or coupled with a concentration law based on translated fields (TF) originally proposed for the self-consistent scheme by Paquin et al (1999 Arch. Appl. Mech. 69 14–35). These methods are also evaluated against (i) full-field calculations of the literature based on the finite element method and on fast Fourier transform, (ii) available analytical exact solutions obtained in linear viscoelasticity and (iii) homogenization methods based on variational approaches. Developments of the AI model are obtained for linear and non-linear material responses while results for the TF method are shown for the linear case. Various configurations are considered: spherical inclusions, aligned fibers, hard and soft inclusions, large material contrasts between phases, volume-preserving versus dilatant anelastic flow, non-monotonic loading. The agreement between the AI and TF methods is excellent and the correlation with full field calculations is in general of quite good quality (with some exceptions for non-linear composites with a large volume fraction of very soft inclusions for which a discrepancy of about 15% was found for macroscopic stress). Description of the material behavior with internal variables can be accounted for with the AI and TF approaches and therefore complex loadings can be easily handled in contrast with most hereditary approaches. (paper)
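
    For isotropic phases and spherical inclusions, the Mori-Tanaka scheme referenced above has a classical closed form for the effective elastic moduli (the elastic limit of the composites considered; the moduli values below are illustrative):

```python
def mori_tanaka_spherical(K0, G0, K1, G1, f):
    """Mori-Tanaka effective bulk and shear moduli for a matrix (phase 0)
    containing spherical inclusions (phase 1) at volume fraction f.

    Classical isotropic closed form using the spherical Eshelby factors.
    """
    alpha = 3.0 * K0 / (3.0 * K0 + 4.0 * G0)
    beta = 6.0 * (K0 + 2.0 * G0) / (5.0 * (3.0 * K0 + 4.0 * G0))
    K = K0 + f * (K1 - K0) / (1.0 + (1.0 - f) * alpha * (K1 - K0) / K0)
    G = G0 + f * (G1 - G0) / (1.0 + (1.0 - f) * beta * (G1 - G0) / G0)
    return K, G

# Soft matrix with 30% stiff spherical inclusions (illustrative GPa values)
K_eff, G_eff = mori_tanaka_spherical(K0=75.0, G0=26.0, K1=220.0, G1=120.0, f=0.3)
# The estimate always lies between the two phase moduli
```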

  4. Unfolding methods in high-energy physics experiments

    International Nuclear Information System (INIS)

    Blobel, V.

    1985-01-01

    Distributions measured in high-energy physics experiments are often distorted or transformed by limited acceptance and finite resolution of the detectors. The unfolding of measured distributions is an important, but due to inherent instabilities a very difficult problem. Methods for unfolding, applicable for the analysis of high-energy physics experiments, and their properties are discussed. An introduction is given to the method of regularization. (orig.)
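
    The inherent instability of unfolding, and the effect of regularization, can be demonstrated with a small Tikhonov example: a Gaussian response matrix smears a smooth truth, naive inversion amplifies the measurement noise, and a curvature penalty stabilizes the solution. A self-contained sketch (all numbers illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Response matrix: each true bin smears into neighbours (finite resolution)
n = 40
R = np.zeros((n, n))
for j in range(n):
    for i in range(n):
        R[i, j] = np.exp(-0.5 * ((i - j) / 2.0) ** 2)
R /= R.sum(axis=0)  # normalize columns

x_true = np.exp(-0.5 * ((np.arange(n) - 15) / 5.0) ** 2)  # smooth truth
y = R @ x_true + 0.01 * rng.standard_normal(n)            # noisy measurement

# Naive inversion amplifies noise; Tikhonov regularization instead solves
#   minimize ||R x - y||^2 + tau * ||L x||^2,  L = 2nd-difference matrix
L = np.diff(np.eye(n), n=2, axis=0)
tau = 1e-3
x_reg = np.linalg.solve(R.T @ R + tau * (L.T @ L), R.T @ y)
x_naive = np.linalg.solve(R, y)
# x_reg tracks the truth; x_naive oscillates wildly
```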

  5. Unfolding methods in high-energy physics experiments

    International Nuclear Information System (INIS)

    Blobel, V.

    1984-12-01

    Distributions measured in high-energy physics experiments are often distorted or transformed by limited acceptance and finite resolution of the detectors. The unfolding of measured distributions is an important, but due to inherent instabilities a very difficult problem. Methods for unfolding, applicable for the analysis of high-energy physics experiments, and their properties are discussed. An introduction is given to the method of regularization. (orig.)

  6. Industrial applications of neutron physics methods

    International Nuclear Information System (INIS)

    Gozani, T.

    1994-01-01

    Three areas where nuclear-based techniques have significant applications are briefly described: nuclear material control and non-proliferation, on-line elemental analysis of coal and minerals, and non-intrusive detection of explosives and other contraband. The underlying nuclear physics and the role of reactor physics methods are highlighted. (author). 5 refs., 10 figs., 5 tabs

  7. Homogenization of linearly anisotropic scattering cross sections in a consistent B1 heterogeneous leakage model

    International Nuclear Information System (INIS)

    Marleau, G.; Debos, E.

    1998-01-01

    One of the main problems encountered in cell calculations is that of spatial homogenization, where a homogeneous set of cross sections is associated with a heterogeneous cell. The homogenization process is in fact trivial when a totally reflected cell without leakage is fully homogenized, since it involves only a flux-volume weighting of the isotropic cross sections. When anisotropic leakage models are considered, the anisotropic scattering cross section must be homogenized in addition to the isotropic cross sections. The simple option, which consists of using the same homogenization procedure for both the isotropic and anisotropic components of the scattering cross section, leads to inconsistencies between the homogeneous and homogenized transport equations. Here we present a method for homogenizing the anisotropic scattering cross sections that resolves these inconsistencies. (author)

  8. Monte Carlo dose calculations in homogeneous media and at interfaces: a comparison between GEPTS, EGSnrc, MCNP, and measurements.

    Science.gov (United States)

    Chibani, Omar; Li, X Allen

    2002-05-01

    Three Monte Carlo photon/electron transport codes (GEPTS, EGSnrc, and MCNP) are benchmarked against dose measurements in homogeneous (both low- and high-Z) media as well as at interfaces. A brief overview of the physical models used by each code for photon and electron (positron) transport is given. Absolute calorimetric dose measurements for 0.5 and 1 MeV electron beams incident on homogeneous and multilayer media are compared with the predictions of the three codes. Comparison with dose measurements in two-layer media exposed to a 60Co gamma source is also performed. In addition, comparisons between the codes (including the EGS4 code) are done for (a) 0.05 to 10 MeV electron beams and positron point sources in lead, (b) high-energy photons (10 and 20 MeV) irradiating a multilayer phantom (water/steel/air), and (c) simulation of a 90Sr/90Y brachytherapy source. Good agreement is observed between the calorimetric electron dose measurements and the predictions of GEPTS and EGSnrc in both homogeneous and multilayer media. MCNP outputs are found to be dependent on the energy-indexing method (Default/ITS style). This dependence is significant in homogeneous media as well as at interfaces. MCNP(ITS) fits the experimental data more closely than MCNP(DEF), except in the case of Be. At low energy (0.05 and 0.1 MeV), MCNP(ITS) dose distributions in lead show higher maxima in comparison with GEPTS and EGSnrc. EGS4 produces too-penetrating electron-dose distributions in high-Z media, especially at low energy. MCNP results depend significantly on the electron energy-indexing method.

  9. Constitutive modeling of two phase materials using the Mean Field method for homogenization

    NARCIS (Netherlands)

    Perdahcioglu, Emin Semih; Geijselaers, Hubertus J.M.

    2010-01-01

    A Mean-Field homogenization framework for constitutive modeling of materials involving two distinct elastic-plastic phases is presented. With this approach it is possible to compute the macroscopic mechanical behavior of this type of materials based on the constitutive models of the constituent phases.

  10. Neutron transport equation - indications on homogenization and neutron diffusion

    International Nuclear Information System (INIS)

    Argaud, J.P.

    1992-06-01

    In PWR nuclear reactors, the practical study of the neutrons in the core uses the diffusion equation to describe the problem. On the other hand, the most correct description of these neutrons is the Boltzmann equation, or neutron transport equation. In this paper, we give some theoretical indications on how to obtain a diffusion equation from the general transport equation under some simplifying hypotheses. The work is organised as follows: (a) the most general formulations of the transport equation are presented: the integro-differential equation and the integral equation; (b) the theoretical approximation of this Boltzmann equation by a diffusion equation is introduced by way of asymptotic developments; (c) practical homogenization methods for the transport equation are then presented. In particular, the relationships with some general and useful methods in neutronics are shown, and some homogenization methods in energy and space are indicated. Many other points of view and complements are detailed in the text and the remarks.
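
    Two of the practical ingredients mentioned above, flux-volume weighting of cross sections and the transport-corrected diffusion coefficient, are compact enough to sketch directly (the one-group data are illustrative):

```python
def flux_volume_homogenize(sigma, phi, vol):
    """Flux-volume weighted homogenized cross section over regions:
    sigma_hom = sum(sigma_i * phi_i * V_i) / sum(phi_i * V_i)."""
    num = sum(s * p * v for s, p, v in zip(sigma, phi, vol))
    den = sum(p * v for p, v in zip(phi, vol))
    return num / den

def diffusion_coefficient(sigma_t, sigma_s, mu_bar):
    """D = 1 / (3 * Sigma_tr), with the transport correction
    Sigma_tr = Sigma_t - mu_bar * Sigma_s."""
    return 1.0 / (3.0 * (sigma_t - mu_bar * sigma_s))

# Two regions (fuel, moderator) with illustrative one-group data (cm^-1, cm^3)
sigma_a = flux_volume_homogenize(sigma=[0.30, 0.01], phi=[0.8, 1.0], vol=[1.0, 2.0])
D = diffusion_coefficient(sigma_t=0.50, sigma_s=0.45, mu_bar=0.05)
```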

  11. Physics and my method

    CERN Document Server

    Feldenkrais, Moshé

    1981-01-01

    Moshe Feldenkrais is known from the textbooks as a collaborator of Joliot-Curie, Langevin, and Kowarski participating in the first nuclear fission experiments. During the war he went to Great Britain and worked on the development of submarine detection devices. From experimental physics, following finally a suggestion of Lew Kowarski, he turned his interest to neurophysiology and neuropsychology. He studied the cybernetical organisation between human body dynamics and the mind. He developed his method known as "Functional integration" and "Awareness through movement". It has been applied with surprising results to post-traumatic rehabilitation, psychotherapy, re-education of the mentally or physically handicapped, and improvement of performance in sports. It can be used by everybody who wants to discover his natural grace of movement.

  12. A Correlated Random Effects Model for Non-homogeneous Markov Processes with Nonignorable Missingness.

    Science.gov (United States)

    Chen, Baojiang; Zhou, Xiao-Hua

    2013-05-01

    Life history data arising in clusters with prespecified assessment time points for patients often feature incomplete data, since patients may choose to visit the clinic based on their needs. Markov process models provide a useful tool for describing disease progression with life history data. The literature mainly focuses on time-homogeneous processes. In this paper we develop methods for non-homogeneous Markov processes with incomplete clustered life history data. A correlated random effects model is developed to deal with the nonignorable missingness, and a time transformation is employed to address the non-homogeneity in the transition model. Maximum likelihood estimation based on the Monte Carlo EM algorithm is advocated for parameter estimation. Simulation studies demonstrate that the proposed method works well in many situations. We also apply this method to an Alzheimer's disease study.
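
    The time-transformation device for handling non-homogeneity can be sketched for a known generator: with a monotone transformation h(t), the transition probabilities follow from the matrix exponential of the intensity matrix scaled by h(t). The generator and the power-law transformation below are hypothetical, not estimates from the paper:

```python
import numpy as np
from scipy.linalg import expm

# Illustrative 3-state intensity (generator) matrix: rows sum to zero,
# state 3 absorbing (e.g. a terminal disease stage)
Q = np.array([[-0.30, 0.25, 0.05],
              [0.05, -0.20, 0.15],
              [0.00, 0.00, 0.00]])

def transition_matrix(t, alpha=1.3):
    """Non-homogeneous process via the time transformation h(t) = t**alpha:
    P(0, t) = expm(Q * h(t)). alpha is a hypothetical transformation
    parameter; alpha = 1 recovers the time-homogeneous case."""
    return expm(Q * t ** alpha)

P = transition_matrix(2.0)
# Each row of P is a probability distribution over the three states
```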

  13. Bilipschitz embedding of homogeneous fractals

    OpenAIRE

    Lü, Fan; Lou, Man-Li; Wen, Zhi-Ying; Xi, Li-Feng

    2014-01-01

    In this paper, we introduce a class of fractals named homogeneous sets based on some measure versions of homogeneity, uniform perfectness and doubling. This fractal class includes all Ahlfors-David regular sets, but most of them are irregular in the sense that they may have different Hausdorff dimensions and packing dimensions. Using Moran sets as main tool, we study the dimensions, bilipschitz embedding and quasi-Lipschitz equivalence of homogeneous fractals.

  14. Homogeneous anisotropic solutions of topologically massive gravity with a cosmological constant and their homogeneous deformations

    International Nuclear Information System (INIS)

    Moutsopoulos, George

    2013-01-01

    We solve the equations of topologically massive gravity (TMG) with a potentially non-vanishing cosmological constant for homogeneous metrics without isotropy. We only reproduce known solutions. We also discuss their homogeneous deformations, possibly with isotropy. We show that de Sitter space and hyperbolic space cannot be infinitesimally homogeneously deformed in TMG. We clarify some of their Segre–Petrov types and discuss the warped de Sitter spacetime. (paper)

  15. Homogeneous anisotropic solutions of topologically massive gravity with a cosmological constant and their homogeneous deformations

    Science.gov (United States)

    Moutsopoulos, George

    2013-06-01

    We solve the equations of topologically massive gravity (TMG) with a potentially non-vanishing cosmological constant for homogeneous metrics without isotropy. We only reproduce known solutions. We also discuss their homogeneous deformations, possibly with isotropy. We show that de Sitter space and hyperbolic space cannot be infinitesimally homogeneously deformed in TMG. We clarify some of their Segre-Petrov types and discuss the warped de Sitter spacetime.

  16. Homogenate extraction of crude polysaccharides from Lentinus edodes and evaluation of the antioxidant activity

    Directory of Open Access Journals (Sweden)

    Leqin KE

    2016-01-01

    Full Text Available Crude polysaccharides of Lentinus edodes were extracted using the homogenate method. Factors affecting the yield of crude polysaccharides were investigated and optimized by response surface methodology. The homogenate extraction method was compared with the traditional heating extraction method, and the antioxidant activity of crude polysaccharides from Lentinus edodes was evaluated. Results showed that the optimal conditions of homogenate extraction were as follows: solvent pH, 10; liquid-solid ratio, 30:1 (mL:g); extraction time, 66 s; number of extractions, 1. Under these conditions, the yield of crude polysaccharides was (13.2 ± 0.9)%, which was 29.82% higher than that of traditional heating extraction. Crude polysaccharides of Lentinus edodes had good DPPH scavenging activity. Compared with traditional heating extraction, homogenate extraction had notable advantages, including good extraction yield, short extraction time and low extraction temperature. It is an efficient way to extract crude polysaccharides from Lentinus edodes.

  17. Novel method to ascertain chromatin accessibility at specific genomic loci from frozen brain homogenates and laser capture microdissected defined cells.

    Science.gov (United States)

    Delvaux, Elaine; Mastroeni, Diego; Nolz, Jennifer; Coleman, Paul D

    2016-06-01

    We describe a novel method for assessing the "open" or "closed" state of chromatin at selected locations within the genome. This method combines the use of Benzonase, which can digest DNA in the presence of actin, with qPCR to define digested regions. We demonstrate the application of this method in brain homogenates and laser captured cells. We also demonstrate application to selected sites within more than one gene and multiple sites within one gene. We demonstrate the validity of the method by treating cells with valproate, known to render chromatin more permissive, and by comparison with classical digestion with DNase I in an in vitro preparation. Although we demonstrate the use of this method in brain tissue we also recognize its applicability to other tissue types.

  18. Evaluation of methods to assess physical activity

    Science.gov (United States)

    Leenders, Nicole Y. J. M.

    Epidemiological evidence has accumulated demonstrating that the amount of physical activity-related energy expenditure during a week reduces the incidence of cardiovascular disease, diabetes, obesity, and all-cause mortality. To further understand the amount of daily physical activity and related energy expenditure necessary to maintain or improve functional health status and quality of life, instruments that estimate total (TDEE) and physical activity-related energy expenditure (PAEE) under free-living conditions should be shown to be valid and reliable. Without evaluation of the various methods that estimate TDEE and PAEE against the doubly labeled water (DLW) method in females, there will be significant limitations on assessing the efficacy of physical activity interventions on health status in this population. A triaxial accelerometer (Tritrac-R3D (TT)), a uniaxial activity monitor (Computer Science and Applications Inc. (CSA)), a Yamax Digiwalker-500 (YX-stepcounter), heart rate responses (HR method) and a 7-d Physical Activity Recall questionnaire (7-d PAR) were compared with the criterion method of DLW during a 7-d period in female adults. DLW-TDEE was underestimated by on average 9%, 11% and 15% using the 7-d PAR, HR method and TT, respectively. The underestimation of DLW-PAEE by the 7-d PAR was 21%, compared with 47% and 67% for TT and YX-stepcounter. Approximately 56% of the variance in DLW-PAEE·kg⁻¹ is explained by the registration of body movement with accelerometry. A larger proportion of the variance in DLW-PAEE·kg⁻¹ was explained by jointly incorporating information from the vertical and horizontal movement measured with the CSA and Tritrac-R3D (r² = 0.87). Although only a small amount of variance in DLW-PAEE·kg⁻¹ is explained by the number of steps taken per day, because of its low cost and ease of use the Yamax stepcounter is useful in studies promoting daily walking. Thus, studies involving the…

  19. Design of SC solenoid with high homogeneity

    International Nuclear Information System (INIS)

    Yang Xiaoliang; Liu Zhong; Luo Min; Luo Guangyao; Kang Qiang; Tan Jie; Wu Wei

    2014-01-01

    A novel kind of SC (superconducting) solenoid coil is designed to satisfy the homogeneity requirement on the magnetic field. In this paper, we first calculate the current density distribution over the solenoid coil section by the linear programming method. Then a traditional solenoid and a nonrectangular-section solenoid are designed to produce a central field of up to 7 T with the highest attainable homogeneity. After comparing the two designed solenoid coils in terms of magnetic field quality, fabrication cost and other aspects, we conclude that the new nonrectangular-section coil design can be realized by improving the techniques of framework fabrication and winding. Finally, the outlook for and an error analysis of this kind of SC magnet coil are briefly discussed. (authors)

  20. Shape optimization in biomimetics by homogenization modelling

    International Nuclear Information System (INIS)

    Hoppe, Ronald H.W.; Petrova, Svetozara I.

    2003-08-01

    Optimal shape design of microstructured materials has recently attracted a great deal of attention in material science. The shape and the topology of the microstructure have a significant impact on the macroscopic properties. The present work is devoted to the shape optimization of new biomorphic microcellular ceramics produced from natural wood by biotemplating. We are interested in finding the best material-and-shape combination in order to achieve the optimal prespecified performance of the composite material. The computation of the effective material properties is carried out using the homogenization method. Adaptive mesh-refinement technique based on the computation of recovered stresses is applied in the microstructure to find the homogenized elasticity coefficients. Numerical results show the reliability of the implemented a posteriori error estimator. (author)

  1. On the infimum of the energy-momentum spectrum of a homogeneous Bose gas

    DEFF Research Database (Denmark)

    Cornean, Horia; Derezinski, J.; Zin, P.

    2009-01-01

    We consider second-quantized homogeneous Bose gas in a large cubic box with periodic boundary conditions at zero temperature. We discuss the energy-momentum spectrum of the Bose gas and its physical significance. We review various rigorous and heuristic results as well as open conjectures about its...

  2. Homogeneous Nucleation Rates of n-Propanol Measured in the Laminar Flow Diffusion Chamber at Different Total Pressures.

    Czech Academy of Sciences Publication Activity Database

    Neitola, K.; Hyvärinen, A.-P.; Lihavainen, H.; Wölk, J.; Strey, R.; Brus, David

    2014-01-01

    Vol. 140, No. 17 (2014), Article No. 174301. ISSN 0021-9606 Grant - others: AFCE(FI) 1118615 Institutional support: RVO:67985858 Keywords: carbon-chain length * equilibrium vapor * homogeneous nucleation Subject RIV: CF - Physical; Theoretical Chemistry OECD field: Physical chemistry Impact factor: 2.952, year: 2014

  3. How to determine composite material properties using numerical homogenization

    DEFF Research Database (Denmark)

    Andreassen, Erik; Andreasen, Casper Schousboe

    2014-01-01

    Numerical homogenization is an efficient way to determine effective macroscopic properties, such as the elasticity tensor, of a periodic composite material. In this paper an educational description of the method is provided, based on a short, self-contained Matlab implementation. It is shown how the basic code, which computes the effective elasticity tensor of a two-material composite (where one material could be void), is easily extended to include more materials. Furthermore, extensions to homogenization of conductivity, thermal expansion, and fluid permeability are described in detail. The unit...
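
    In one dimension the homogenized conductivity has a closed form, the volume-weighted harmonic mean, which coincides with the Reuss bound; this makes a convenient sanity check for any numerical homogenization code. A small sketch in Python (the paper's own implementation is in Matlab):

```python
import numpy as np

def keff_1d(k, frac):
    """1D periodic homogenization: the effective conductivity is the
    harmonic mean of the phase conductivities weighted by volume fraction."""
    k, frac = np.asarray(k, float), np.asarray(frac, float)
    return 1.0 / np.sum(frac / k)

def voigt_reuss_bounds(k, frac):
    """Arithmetic (Voigt, upper) and harmonic (Reuss, lower) bounds."""
    k, frac = np.asarray(k, float), np.asarray(frac, float)
    return np.sum(frac * k), 1.0 / np.sum(frac / k)

k, frac = [1.0, 10.0], [0.5, 0.5]
upper, lower = voigt_reuss_bounds(k, frac)
# A 1D laminate loaded across the layers attains the Reuss bound: 20/11
```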

  4. Geometric Methods in Physics : XXXIII Workshop

    CERN Document Server

    Bieliavsky, Pierre; Odzijewicz, Anatol; Schlichenmaier, Martin; Voronov, Theodore

    2015-01-01

    This book presents a selection of papers based on the XXXIII Białowieża Workshop on Geometric Methods in Physics, 2014. The Białowieża Workshops are among the most important meetings in the field and attract researchers from both mathematics and physics. The articles gathered here are mathematically rigorous and have important physical implications, addressing the application of geometry in classical and quantum physics. Despite their long tradition, the workshops remain at the cutting edge of ongoing research. For the last several years, each Białowieża Workshop has been followed by a School on Geometry and Physics, where advanced lectures for graduate students and young researchers are presented; some of the lectures are reproduced here. The unique atmosphere of the workshop and school is enhanced by its venue, framed by the natural beauty of the Białowieża forest in eastern Poland. The volume will be of interest to researchers and graduate students in mathematical physics, theoretical physics and m...

  5. Type of homogenization and fat loss during continuous infusion of human milk.

    Science.gov (United States)

    García-Lara, Nadia Raquel; Escuder-Vieco, Diana; Alonso Díaz, Clara; Vázquez Román, Sara; De la Cruz-Bértolo, Javier; Pallás-Alonso, Carmen Rosa

    2014-11-01

    Substantial fat loss may occur during continuous feeding of human milk (HM). A decrease of fat loss has been described following homogenization. Well-established methods of homogenization of HM for routine use in the neonatal intensive care unit (NICU) would be desirable. We compared the loss of fat based on the use of 3 different methods for homogenizing thawed HM during continuous feeding. Sixteen frozen donor HM samples were thawed, homogenized with ultrasound and separated into 3 aliquots ("baseline agitation," "hourly agitation," and "ultrasound"), and then frozen for 48 hours. Aliquots were thawed again and a baseline agitation was applied. Subsequently, the baseline agitation and hourly agitation aliquots were drawn into syringes, while ultrasound was applied to the ultrasound aliquot before it was drawn into a syringe. The syringes were loaded into a pump (2 mL/h; 4 hours). At hourly intervals the hourly agitation infusion was stopped, and the syringe was disconnected and gently shaken. During infusion, samples from the 3 groups were collected hourly for analysis of fat and caloric content. The 3 groups showed similar fat content at the beginning of the infusion. For fat, mean (SD) hourly changes of -0.03 (0.01), -0.09 (0.01), and -0.09 (0.01) g/dL were observed for the hourly agitation, baseline agitation, and ultrasound groups, respectively. The decrease was smaller for the hourly agitation group; fat loss during continuous infusion of HM is thus reduced when hourly homogenization is used. © The Author(s) 2014.

  6. Use of focused acoustics for cell disruption to provide ultra scale-down insights of microbial homogenization and its bioprocess impact--recovery of antibody fragments from rec E. coli.

    Science.gov (United States)

    Li, Qiang; Aucamp, Jean P; Tang, Alison; Chatel, Alex; Hoare, Mike

    2012-08-01

    An ultra scale-down (USD) device that provides insight of how industrial homogenization impacts bioprocess performance is desirable in the biopharmaceutical industry, especially at the early stage of process development where only a small quantity of material is available. In this work, we assess the effectiveness of focused acoustics as the basis of an USD cell disruption method to mimic and study high-pressure, step-wise homogenization of rec Escherichia coli cells for the recovery of an intracellular protein, antibody fragment (Fab'). The release of both Fab' and of overall protein follows first-order reaction kinetics with respect to time of exposure to focused acoustics. The rate constant is directly proportional to applied electrical power input per unit volume. For nearly total protein or Fab' release (>99%), the key physical properties of the disruptate produced by focused acoustics, such as cell debris particle size distribution and apparent viscosity show good agreement with those for homogenates produced by high-pressure homogenization operated to give the same fractional release. The only key difference is observed for partial disruption of cells where focused acoustics yields a disruptate of lower viscosity than homogenization, evidently due to a greater extent of polynucleic acids degradation. Verification of this USD approach to cell disruption by high-pressure homogenization is achieved using USD centrifugation to demonstrate the same sedimentation characteristics of disruptates prepared using both the scaled-down focused acoustic and the pilot-scale homogenization methods for the same fraction of protein release. Copyright © 2012 Wiley Periodicals, Inc.
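
    The first-order release kinetics described above, with a rate constant proportional to electrical power per unit volume, invert conveniently to give required exposure times. A sketch with a hypothetical proportionality constant (the paper reports the proportionality, not this value):

```python
import math

def release_fraction(t, P_over_V, c):
    """First-order disruption kinetics: R(t) = 1 - exp(-k t), with the
    rate constant proportional to power per unit volume, k = c * (P/V).
    c is an empirical, device-specific constant (hypothetical here)."""
    k = c * P_over_V
    return 1.0 - math.exp(-k * t)

def time_for_release(target, P_over_V, c):
    """Invert the kinetics: exposure time to reach a target release."""
    k = c * P_over_V
    return -math.log(1.0 - target) / k

# Doubling the power per unit volume halves the time to 99% release
t1 = time_for_release(0.99, P_over_V=10.0, c=0.02)
t2 = time_for_release(0.99, P_over_V=20.0, c=0.02)
```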

  7. Subspace identification of distributed clusters of homogeneous systems

    NARCIS (Netherlands)

    Yu, C.; Verhaegen, M.H.G.

    2017-01-01

    This note studies the identification of a network comprised of interconnected clusters of LTI systems. Each cluster consists of homogeneous dynamical systems, and its interconnections with the rest of the network are unmeasurable. A subspace identification method is proposed for identifying a single

  8. Determination of trace elements in dried sea-plant homogenate (SP-M-1) and in dried copepod homogenate (MA-A-1) by means of neutron activation analysis

    International Nuclear Information System (INIS)

    Tjioe, P.S.; Goeij, J.J.M. de; Bruin, M. de.

    1977-07-01

    Two marine environmental reference materials were investigated in an intercalibration study of trace elements. According to the specifications from the International Laboratory of Marine Radioactivity at Monaco two samples, a sea-plant homogenate and a copepod homogenate, were analysed by neutron activation analysis. The results of the trace-element analyses were based on dry weight. The moisture content was measured on separate aliquots. For the intercalibration study the following elements were listed as elements of primary interest: mercury, cadmium, lead, manganese, zinc, copper, chromium, silver, iron and nickel. For the 14 elements normally analysed with the routine destructive method, the element gold could not be measured in the two marine samples. This was due to the high residual bromine-82 activity in fraction 13, which contains gold that distills over. With the nondestructive method, nine elements could be assessed, of which only three coincide with the 14 elements of the destructive method. A survey of all measured (trace) elements is presented. The 20 (trace) elements measured in the sea-plant homogenate and in the copepod homogenate comprise 8 out of the 10 trace elements of primary interest, all 5 trace elements of secondary interest (arsenic, cobalt, antimony, selenium and vanadium), and 5 additional (trace) elements. The trace-element determination in both marine biological materials via the destructive procedure was carried out in twelve-fold. In the nondestructive procedure quadruple measurements were performed. For all trace-element levels analysed an average value was calculated

  9. Smooth homogeneous structures in operator theory

    CERN Document Server

    Beltita, Daniel

    2005-01-01

    Geometric ideas and techniques play an important role in operator theory and the theory of operator algebras. Smooth Homogeneous Structures in Operator Theory builds the background needed to understand this circle of ideas and reports on recent developments in this fruitful field of research. Requiring only a moderate familiarity with functional analysis and general topology, the author begins with an introduction to infinite dimensional Lie theory with emphasis on the relationship between Lie groups and Lie algebras. A detailed examination of smooth homogeneous spaces follows. This study is illustrated by familiar examples from operator theory and develops methods that allow endowing such spaces with structures of complex manifolds. The final section of the book explores equivariant monotone operators and Kähler structures. It examines certain symmetry properties of abstract reproducing kernels and arrives at a very general version of the construction of restricted Grassmann manifolds from the theory of loo...

  10. Application of cybernetic methods in physics

    Energy Technology Data Exchange (ETDEWEB)

    Fradkov, Aleksandr L [Institute of Problems of Mechanical Engineering, Russian Academy of Sciences, St.-Petersburg (Russian Federation)

    2005-02-28

    Basic aspects of the subject and methodology for a new and rapidly developing area of research that has emerged at the intersection of physics and control theory (cybernetics) and emphasizes the application of cybernetic methods to the study of physical systems are reviewed. Speed-gradient and Hamiltonian solutions for energy control problems in conservative and dissipative systems are presented. Application examples such as the Kapitza pendulum, controlled overcoming of a potential barrier, and controlling coupled oscillators and molecular systems are presented. A speed-gradient approach to modeling the dynamics of physical systems is discussed. (reviews of topical problems)

  11. Application of cybernetic methods in physics

    International Nuclear Information System (INIS)

    Fradkov, Aleksandr L

    2005-01-01

    Basic aspects of the subject and methodology for a new and rapidly developing area of research that has emerged at the intersection of physics and control theory (cybernetics) and emphasizes the application of cybernetic methods to the study of physical systems are reviewed. Speed-gradient and Hamiltonian solutions for energy control problems in conservative and dissipative systems are presented. Application examples such as the Kapitza pendulum, controlled overcoming of a potential barrier, and controlling coupled oscillators and molecular systems are presented. A speed-gradient approach to modeling the dynamics of physical systems is discussed. (reviews of topical problems)

  12. Central Andean temperature and precipitation measurements and their homogenization

    Science.gov (United States)

    Hunziker, Stefan; Gubler, Stefanie

    2015-04-01

    Observation of climatological parameters and the homogenization of these time series have a well-established history in western countries. This is not the case for many other countries, such as Bolivia and Peru, where the organization of measurements, the quality of measurement equipment, equipment maintenance, training of staff and data management are fundamentally different from the western standard. The data need special attention, because many problems are not detected by standard quality control procedures. Information about the weather stations, best obtained by station visits, is very beneficial: if the cause of a problem is known, some of the data may be corrected. In this study, cases of typical problems and measurement errors will be demonstrated. Much research on homogenization techniques (up to the subdaily scale) has been completed in recent years. However, data sets of the quality of western station networks have been used, and little is known about the performance of homogenization methods on data sets from countries such as Bolivia and Peru. HOMER (HOMogenization softwarE in R) is one of the most recent and widely used homogenization software packages. Its performance is tested on Peruvian-like data sourced from Swiss stations (similar station density and metadata availability). The Swiss station network is a suitable test bed, because climate gradients are strong and the terrain is complex, as is also found in the Central Andes. On the other hand, the Swiss station network is dense, and long time series and extensive metadata are available. By subsampling the station network and omitting the metadata, the conditions of a Peruvian test region are mimicked. Results are compared to a dataset homogenized by THOMAS (Tool for Homogenization of Monthly Data Series), the homogenization tool used by MeteoSwiss.

  13. Experimental investigation of homogeneous freezing of sulphuric acid particles in the aerosol chamber AIDA

    Directory of Open Access Journals (Sweden)

    O. Möhler

    2003-01-01

    Full Text Available The homogeneous freezing of supercooled H2SO4/H2O solution droplets was investigated in the aerosol chamber AIDA (Aerosol Interactions and Dynamics in the Atmosphere) of Forschungszentrum Karlsruhe. 24 freezing experiments were performed at temperatures between 189 and 235 K with aerosol particles in the diameter range 0.05 to 1 µm. Individual experiments started at homogeneous temperatures and ice saturation ratios between 0.9 and 0.95. Cloud cooling rates up to -2.8 K min-1 were simulated dynamically in the chamber by expansion cooling using a mechanical pump. Depending on the cooling rate and starting temperature, freezing threshold relative humidities were exceeded after expansion time periods between about 1 and 10 min. The onset of ice formation was measured with three independent methods showing good agreement among each other. Ice saturation ratios measured at the onset of ice formation increased from about 1.4 at 231 K to about 1.75 at 189 K. The experimental data set including thermodynamic parameters as well as physical and chemical aerosol analysis provides a good basis for microphysical model applications.
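The onset thresholds quoted above can be turned into a quick estimate for intermediate temperatures; a minimal sketch, assuming a simple linear trend between the two reported endpoints (the abstract does not state the actual functional form):

```python
def onset_saturation_ratio(T):
    """Linearly interpolate the ice-formation onset saturation ratio
    between the two endpoint values reported for the AIDA runs:
    about 1.4 at 231 K and about 1.75 at 189 K (illustration only)."""
    T1, S1 = 231.0, 1.40
    T2, S2 = 189.0, 1.75
    return S1 + (S2 - S1) * (T - T1) / (T2 - T1)

print(round(onset_saturation_ratio(210.0), 3))  # 1.575 at the midpoint temperature
```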

  14. Homogenization of resonant chiral metamaterials

    OpenAIRE

    Andryieuski, Andrei; Menzel, Christoph; Rockstuhl, Carsten; Malureanu, Radu; Lederer, Falk; Lavrinenko, Andrei

    2010-01-01

    Homogenization of metamaterials is a crucial issue as it allows one to describe their optical response in terms of effective wave parameters such as propagation constants. In this paper we consider the possible homogenization of chiral metamaterials. We show that for meta-atoms of a certain size a critical density exists above which increasing coupling between neighboring meta-atoms prevents a reasonable homogenization. On the contrary, a dilution in excess will induce features reminiscent to pho...

  15. Ultra-High Pressure Homogenization enhances physicochemical properties of soy protein isolate-stabilized emulsions.

    Science.gov (United States)

    Fernández-Ávila, C; Escriu, R; Trujillo, A J

    2015-09-01

    The effect of Ultra-High Pressure Homogenization (UHPH, 100-300MPa) on the physicochemical properties of oil-in-water emulsions prepared with 4.0% (w/v) of soy protein isolate (SPI) and soybean oil (10 and 20%, v/v) was studied and compared to emulsions treated by conventional homogenization (CH, 15MPa). CH emulsions were prepared with non-heated and heated (95°C for 15min) SPI dispersions. Emulsions were characterized by particle size determination with laser diffraction, by flow-curve measurements of their rheological properties using a rotational rheometer, and by transmission electron microscopy. The variation in particle size and creaming was assessed by Turbiscan® analysis, and visual observation of the emulsions was also carried out. UHPH emulsions showed much smaller d 3.2 values and greater physical stability than CH emulsions. The thermal treatment of SPI prior to the CH process did not improve physical stability. In addition, emulsions containing 20% of oil exhibited greater physical stability than emulsions containing 10% of oil. In particular, UHPH emulsions treated at 100 and 200MPa with 20% of oil were the most stable due to low particle size values (d 3.2 and Span), greater viscosity and partial protein denaturation. These results demonstrate the improvement in physical stability of protein isolate-stabilized emulsions achievable with the emerging UHPH technology. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Homogeneity and internal defect detection of infrared Se-based chalcogenide glass

    Science.gov (United States)

    Li, Zupan; Wu, Ligang; Lin, Changgui; Song, Bao'an; Wang, Xunsi; Shen, Xiang; Dai, Shixun

    2011-10-01

    Ge-Sb-Se chalcogenide glass is an excellent infrared optical material, which is environmentally friendly and widely used in infrared thermal imaging systems. However, due to the opacity of Se-based glasses in the visible spectral region, it is difficult to measure their homogeneity and internal defects as is done for common oxide glasses. In this study, a measurement method based on a near-IR imaging technique was proposed to observe the homogeneity and internal defects of these glasses, and an effective measurement system was constructed. The testing results indicated that the method gives information on the homogeneity and internal defects of infrared Se-based chalcogenide glass clearly and intuitively.
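A near-IR transmission image can be screened for inhomogeneity with a simple statistic; a sketch using the coefficient of variation of pixel intensities as a homogeneity index (this metric is an assumption for illustration, not the one used in the paper):

```python
import statistics

def homogeneity_index(pixels):
    """Coefficient of variation of transmitted near-IR intensity:
    lower values indicate a more homogeneous glass sample."""
    mean = statistics.fmean(pixels)
    return statistics.pstdev(pixels) / mean

uniform = [100.0] * 8          # perfectly even transmission
striated = [80.0, 120.0] * 4   # alternating bright/dark striae
print(homogeneity_index(uniform), round(homogeneity_index(striated), 3))  # 0.0 0.2
```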

  17. Using the Case Study Method in Teaching College Physics

    Science.gov (United States)

    Burko, Lior M.

    2016-10-01

    The case study teaching method has a long history (starting at least with Socrates) and wide current use in business schools, medical schools, law schools, and a variety of other disciplines. However, relatively little use is made of it in the physical sciences, specifically in physics or astronomy. The case study method should be considered by physics faculty as part of the effort to transition the teaching of college physics from the traditional frontal-lecture format to other formats that enhance active student participation. In this paper we endeavor to interest physics instructors in the case study method, and hope that it would also serve as a call for more instructors to produce cases that they use in their own classes and that can also be adopted by other instructors.

  18. The Effect of Homogenization Speed on the Physical and Chemical Properties of Nanoparticle Cream Made by the High Speed Homogenization (HSH) Method

    Directory of Open Access Journals (Sweden)

    Galuh Suprobo

    2015-06-01

    Full Text Available Nanoparticle cream is a development of nanotechnology in the cosmetics field intended to improve the function of creams. High speed homogenization (HSH) is one method for producing nanoparticle cream. In this research, natural materials based on palm oil derivatives, namely stearic acid, cetyl alcohol and cetyl stearyl alcohol, were chosen for nanoparticle cream production using the HSH method. Homogenization speeds of 1000 rpm, 1500 rpm, 2000 rpm and 2500 rpm were used to determine the influence of speed on the properties of the cream. The observations showed an influence on physical appearance in terms of texture, but not on homogeneity, stability or cream color. The pH of the product remained stable for all variables during two months of storage. The particle size increased at homogenization speeds of 2000 rpm and 2500 rpm. This research produced creams with particle sizes from 239.86 to 358.10 nm, which fall within the nanoparticle category of 50 nm to 1000 nm. The stability of the nanoparticle cream products was in the range of 97.20 to 98%.

  19. Physics of type Ia supernovae

    International Nuclear Information System (INIS)

    Hoeflich, Peter

    2006-01-01

    The last decade has witnessed an explosive growth of high-quality data for thermonuclear explosions of white dwarf stars, the type Ia supernovae (SNe Ia). Advances in computational methods provide new insights into the physics of the phenomenon and a direct, quantitative link between observables and explosion physics. Both trends combined have produced spectacular results, making it possible to identify specific problems and to narrow down the range of scenarios. Current topics include the relation between SNe Ia and their progenitors, the influence of metallicity and accretion on the explosion, and details of the burning front. How can we understand the apparent homogeneity, and probe for the diversity, of SNe Ia? Here we give an overview of the current status of our understanding of supernova physics in light of recent results

  20. Nanophase materials produced by physical methods

    International Nuclear Information System (INIS)

    Noda, Shoji

    1992-01-01

    A nanophase material is characterized mainly by its component size and large interface area. Some nanophase materials are briefly described. Ion implantation and oblique vapor deposition are taken as methods of producing nanophase materials, and their features are described. These physical methods are non-equilibrium material processes, and it is demonstrated that they can provide unique nanophase materials with little thermodynamic restriction. (author)

  1. Project as an education method in teaching of physics

    OpenAIRE

    ŽAHOUREK, Martin

    2011-01-01

    The diploma thesis "Project as an educational method for teaching physics" deals with the possibilities of using the project-based method for teaching physics at primary schools. Not only does it contain the theoretical background of project-based teaching, but it also deals with practical issues in the form of an implementation of a chosen project, "Physics and physical education". The aim of said project was to evaluate the efficiency of project-based teaching as far as the knowledge of pupils and ...

  2. Investigation of dynamics of discrete framed structures by a numerical wave-based method and an analytical homogenization approach

    Directory of Open Access Journals (Sweden)

    Changwei Zhou

    2017-02-01

    Full Text Available In this article, the analytical homogenization method of periodic discrete media (HPDM) and the numerical condensed wave finite element method (CWFEM) are employed to study the longitudinal and transverse vibrations of framed structures. The valid frequency range of the HPDM is re-evaluated using the wave propagation features identified by the CWFEM. The relative error of the wavenumber obtained by the HPDM compared to that obtained by the CWFEM is illustrated as a function of frequency and scale ratio. A parametric study on the thickness of the structure is carried out, in which the dispersion relation and the relative error are given for three different thicknesses. The dynamics of a finite structure, such as natural frequencies and forced response, are also investigated using the HPDM and the CWFEM.
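The wavenumber comparison described above reduces to an elementary computation; a sketch with hypothetical wavenumber values, since the paper's actual dispersion data are not given in the abstract:

```python
def relative_error(k_hpdm, k_cwfem):
    """Relative error of the analytical (HPDM) wavenumber with respect to
    the numerical (CWFEM) reference, as plotted versus frequency and
    scale ratio in the article."""
    return abs(k_hpdm - k_cwfem) / abs(k_cwfem)

# hypothetical wavenumbers at one frequency, in rad/m
print(round(relative_error(10.2, 10.0), 3))  # 0.02, i.e. a 2 % deviation
```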

  3. Homogenization of Classification Functions Measurement (HOCFUN): A Method for Measuring the Salience of Emotional Arousal in Thinking.

    Science.gov (United States)

    Tonti, Marco; Salvatore, Sergio

    2015-01-01

    The problem of the measurement of emotion is a widely debated one. In this article we propose an instrument, the Homogenization of Classification Functions Measure (HOCFUN), designed for assessing the influence of emotional arousal on a rating task consisting of the evaluation of a sequence of images. The instrument defines an indicator (κ) that measures the degree of homogenization of the ratings given over 2 rating scales (pleasant-unpleasant and relevant-irrelevant). Such a degree of homogenization is interpreted as the effect of emotional arousal on thinking and therefore lends itself to being used as a marker of emotional arousal. A preliminary validation study was implemented. The association of the κ indicator with 3 additional indicators was analyzed. Consistent with the hypotheses, the κ indicator proved to be associated, even if weakly and nonlinearly, with a marker of the homogenization of classification functions derived from a separate rating task and with 2 indirect indicators of emotional activation: the speed of performance on the HOCFUN task and an indicator of mood intensity. Taken as a whole, these results provide initial evidence supporting the construct validity of the HOCFUN.
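The exact definition of κ is not given in the abstract; as a rough stand-in, the degree to which ratings on the two scales collapse onto each other can be illustrated with an absolute Pearson correlation (a hypothetical proxy, not the published indicator):

```python
def kappa_proxy(pleasant, relevant):
    """Rough proxy for the HOCFUN homogenization indicator: the absolute
    Pearson correlation between ratings on the pleasant-unpleasant and
    relevant-irrelevant scales. The published kappa is defined
    differently; this only illustrates the idea that arousal makes the
    two classification functions track each other."""
    n = len(pleasant)
    mp = sum(pleasant) / n
    mr = sum(relevant) / n
    cov = sum((p - mp) * (r - mr) for p, r in zip(pleasant, relevant))
    sp = sum((p - mp) ** 2 for p in pleasant) ** 0.5
    sr = sum((r - mr) ** 2 for r in relevant) ** 0.5
    return abs(cov / (sp * sr))

# fully homogenized ratings give 1.0; weakly related ratings a low value
print(round(kappa_proxy([1, 2, 3, 4], [2, 4, 6, 8]), 3))  # 1.0
```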

  4. Novel method to ascertain chromatin accessibility at specific genomic loci from frozen brain homogenates and laser capture microdissected defined cells

    Directory of Open Access Journals (Sweden)

    Elaine Delvaux

    2016-06-01

    Full Text Available We describe a novel method for assessing the “open” or “closed” state of chromatin at selected locations within the genome. This method combines the use of Benzonase, which can digest DNA in the presence of actin, with quantitative polymerase chain reaction to define digested regions. We demonstrate the application of this method in brain homogenates and laser captured cells. We also demonstrate application to selected sites within more than 1 gene and multiple sites within 1 gene. We demonstrate the validity of the method by treating cells with valproate, known to render chromatin more permissive, and by comparison with classical digestion with DNase I in an in vitro preparation. Although we demonstrate the use of this method in brain tissue, we also recognize its applicability to other tissue types.
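The qPCR readout of such a digestion assay is conventionally converted to a surviving-template fraction; a sketch assuming ideal (100%-efficient) amplification, with an illustrative function name not taken from the paper:

```python
def fraction_intact(ct_digested, ct_undigested):
    """Fraction of template at a locus surviving Benzonase digestion,
    inferred from the qPCR cycle-threshold shift (assumes perfect
    doubling per cycle). Open chromatin is digested more heavily, so an
    accessible locus retains a smaller intact fraction."""
    return 2.0 ** -(ct_digested - ct_undigested)

print(fraction_intact(26.0, 24.0))  # a 2-cycle shift means 0.25 of the template survived
```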

  5. Orthogonality Measurement for Homogenous Projects-Bases

    Science.gov (United States)

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

    The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  6. Collective variables method in relativistic theory

    International Nuclear Information System (INIS)

    Shurgaya, A.V.

    1983-01-01

    The classical theory of an N-component field is considered. The method of collective variables, accurately accounting for the conservation laws that follow from invariance under the homogeneous Lorentz group, is developed within the framework of generalized Hamiltonian dynamics. Hyperboloids are invariant surfaces under the homogeneous Lorentz group. Proceeding from this, a field transformation is introduced and the surface is parametrized so that the generators of the homogeneous Lorentz group contain no components dependent on the interaction, and their effect on the field function is purely geometrical. The interaction is completely included in the expression for the energy-momentum vector of the system, which is a dynamical quantity. A gauge is chosen in which the parameters of four-dimensional translations and their canonically conjugate momenta are non-physical, so that the phase space is determined by the parameters of the homogeneous Lorentz group, the field function and their canonically conjugate momenta. In this way the conservation laws that follow from the requirement of Lorentz invariance are accurately taken into account

  7. Physical changes in bovine lens homogenate following ultraviolet irradiation and their prevention by some compounds

    International Nuclear Information System (INIS)

    Ohmori, Shinji; Nose, Hideko

    1985-01-01

    Exposure of the dialyzed supernatant of bovine lens homogenate to ultraviolet (UV) light led to increases in its turbidity, pigmentation, and viscosity. These photochemically induced alterations of lens proteins were prevented by glutathione, cysteine and N-acetylcysteine, but not by ascorbic acid, S-(1,2-dicarboxyethyl)-glutathione or dulcitol. (author)

  8. Effective inactivation of Saccharomyces cerevisiae in minimally processed Makgeolli using low-pressure homogenization-based pasteurization.

    Science.gov (United States)

    Bak, Jin Seop

    2015-01-01

    In order to address the limitations associated with the inefficient pasteurization platform used to make Makgeolli, such as the presence of turbid colloidal dispersions in suspension, commercially available Makgeolli was minimally processed using a low-pressure homogenization-based pasteurization (LHBP) process. This continuous process demonstrates that promptly reducing the exposure time to excessive heat using either large molecules or insoluble particles can dramatically improve internal quality and decrease irreversible damage. Specifically, optimal homogenization increased concomitantly with physical parameters such as colloidal stability (65.0% of maximum and below 25-μm particles) following two repetitions at 25.0 MPa. However, biochemical parameters such as microbial population, acidity, and the presence of fermentable sugars rarely affected Makgeolli quality. Remarkably, there was a 4.5-log reduction in the number of Saccharomyces cerevisiae target cells at 53.5°C for 70 sec in optimally homogenized Makgeolli. This value was higher than the 37.7% measured from traditionally pasteurized Makgeolli. In contrast to the analytical similarity among homogenized Makgeollis, our objective quality evaluation demonstrated significant differences between pasteurized (or unpasteurized) Makgeolli and LHBP-treated Makgeolli. Keywords: low-pressure homogenization-based pasteurization; Makgeolli; minimal processing-preservation; Saccharomyces cerevisiae; suspension stability.
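The 4.5-log reduction quoted above follows the standard decimal-reduction arithmetic; a minimal sketch:

```python
import math

def log_reduction(n0, n):
    """Decimal (log10) reduction of viable cells achieved by a
    pasteurization step, from initial count n0 to surviving count n,
    e.g. the 4.5-log reduction of S. cerevisiae reported here."""
    return math.log10(n0 / n)

# e.g. 10^7 CFU/mL reduced to about 316 CFU/mL is a 4.5-log reduction
print(round(log_reduction(1e7, 1e7 / 10**4.5), 2))  # 4.5
```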

  9. Physical acoustics v.8 principles and methods

    CERN Document Server

    Mason, Warren P

    1971-01-01

    Physical Acoustics: Principles and Methods, Volume VIII discusses a number of themes on physical acoustics that are divided into seven chapters. Chapter 1 describes the principles and applications of a tool for investigating phonons in dielectric crystals, the spin phonon spectrometer. The next chapter discusses the use of ultrasound in investigating Landau quantum oscillations in the presence of a magnetic field and their relation to the strain dependence of the Fermi surface of metals. The third chapter focuses on the ultrasonic measurements that are made by pulsing methods with velo

  10. Homogeneous Spaces and Equivariant Embeddings

    CERN Document Server

    Timashev, DA

    2011-01-01

    Homogeneous spaces of linear algebraic groups lie at the crossroads of algebraic geometry, theory of algebraic groups, classical projective and enumerative geometry, harmonic analysis, and representation theory. By standard reasons of algebraic geometry, in order to solve various problems on a homogeneous space it is natural and helpful to compactify it keeping track of the group action, i.e. to consider equivariant completions or, more generally, open embeddings of a given homogeneous space. Such equivariant embeddings are the subject of this book. We focus on classification of equivariant em

  11. A new formulation for the problem of fuel cell homogenization

    International Nuclear Information System (INIS)

    Chao, Y.-A.; Martinez, A.S.

    1982-01-01

    A new homogenization method for reactor cells is described. This new method consists of eliminating the NR approximation for the fuel resonance and the Wigner approximation for the resonance escape probability; the background cross section is then redefined and the problem is reanalyzed. (E.G.) [pt

  12. A consistent homogenization procedure to obtain few-group cell parameters

    International Nuclear Information System (INIS)

    Pierini, G.

    1979-01-01

    The criterion, according to which one heterogeneous and one homogeneous cell are equivalent if they have the same boundary values of both the flux and the normal components of the current, is used to radially homogenize an axially infinite cylindrical cell with azimuth-independent properties and a moderator adequately described by diffusion theory. The method, which leads to the definition of a full matrix of diffusion coefficients, provides a new and simple definition of the few-group cell parameters, which are nearly independent of the environment. (orig.) [de

  13. Homogenization of metasurfaces formed by random resonant particles in periodical lattices

    DEFF Research Database (Denmark)

    Andryieuski, Andrei; Lavrinenko, Andrei; Petrov, Mihail

    2016-01-01

    In this paper we suggest a simple analytical method for describing the electromagnetic properties of geometrically regular two-dimensional subwavelength arrays (metasurfaces) formed by particles with randomly fluctuating polarizabilities. We propose an analytical homogenization method applicable...

  14. Physical-chemical property based sequence motifs and methods regarding same

    Science.gov (United States)

    Braun, Werner [Friendswood, TX; Mathura, Venkatarajan S [Sarasota, FL; Schein, Catherine H [Friendswood, TX

    2008-09-09

    A data analysis system, program, and/or method, e.g., a data mining/data exploration method, using physical-chemical property motifs. For example, a sequence database may be searched for identifying segments thereof having physical-chemical properties similar to the physical-chemical property motifs.

  15. Homogenization on Multi-Materials’ Elements: Application to Printed Circuit Boards and Warpage Analysis

    Directory of Open Access Journals (Sweden)

    Araújo Manuel

    2016-01-01

    Full Text Available Multi-material domains are often found in industrial applications. Modelling them can be computationally very expensive due to meshing requirements. Finite elements comprising different materials are hardly accurate. In this work, a new homogenization method that simplifies the computation of the homogenized Young's modulus, Poisson's ratio and thermal expansion coefficient is proposed and applied to a composite-like material on a printed circuit board. The results show good correspondence between the properties of the homogenized domain and the simulation of the real geometry.
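The abstract does not spell out the homogenization formulas; the classical Voigt and Reuss rules of mixtures bracket any such homogenized Young's modulus, sketched here with hypothetical PCB phase data:

```python
def voigt_modulus(fractions, moduli):
    """Upper (Voigt, iso-strain) bound on the homogenized Young's
    modulus of a multi-material stack: volume-fraction-weighted mean."""
    return sum(f * e for f, e in zip(fractions, moduli))

def reuss_modulus(fractions, moduli):
    """Lower (Reuss, iso-stress) bound: harmonic volume-weighted mean."""
    return 1.0 / sum(f / e for f, e in zip(fractions, moduli))

# hypothetical two-phase laminate: 60% FR-4 (24 GPa), 40% copper (110 GPa)
f, e = [0.6, 0.4], [24.0, 110.0]
print(round(voigt_modulus(f, e), 1), round(reuss_modulus(f, e), 1))  # 58.4 34.9
```

Any consistent homogenization of the stack should land between these two bounds, which makes them a cheap sanity check on a finite-element result.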

  16. MHODE: a local-homogeneity theory for improved source-parameter estimation of potential fields

    Science.gov (United States)

    Fedi, Maurizio; Florio, Giovanni; Paoletti, Valeria

    2015-08-01

    We describe a multihomogeneity theory for source-parameter estimation of potential fields. Similar to what happens for random source models, where the monofractal scaling law has been generalized into a multifractal law, we propose to generalize the homogeneity law into a multihomogeneity law. This allows a theoretically correct approach to studying real-world potential fields, which are inhomogeneous and so do not show scale invariance, except in the asymptotic regions (very near to or very far from their sources). Since the scaling properties of inhomogeneous fields change with the scale of observation, we show that they may be better studied at a set of scales than at a single scale and that a multihomogeneous model is needed to explain their complex scaling behaviour. In order to perform this task, we first introduce fractional-degree homogeneous fields, to show that: (i) homogeneous potential fields may have fractional or integer degree; (ii) the source distributions for a fractional degree are not confined to a bounded region, similarly to some integer-degree models such as the infinite line mass; and (iii) differently from the integer-degree case, the fractional-degree source distributions are no longer uniform density functions. Using this enlarged set of homogeneous fields, real-world anomaly fields are studied at different scales, by a simple search, at any local window W, for the best homogeneous field of either integer or fractional degree, yielding a multiscale set of local homogeneity degrees and depth estimates which we call a multihomogeneous model. A new technique of source-parameter estimation (Multi-HOmogeneity Depth Estimation, MHODE) is thus defined, permitting retrieval of the source parameters of complex sources. We test the method with inhomogeneous fields of finite sources, such as faults or cylinders, and show its effectiveness also in a real-case example. These applications show the usefulness of the new concepts, multihomogeneity and
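The homogeneity degree underlying this approach satisfies f(t·r) = t**n · f(r), so n is the log-log slope of the field with distance; a sketch of that scaling estimate (an illustration of the concept, not the published MHODE algorithm):

```python
import math

def homogeneity_degree(field, r, dr=1e-6):
    """Local homogeneity degree n of a field f obeying f(t*r) = t**n * f(r),
    estimated as the log-log slope d(log f)/d(log r) at distance r."""
    return (math.log(field(r + dr)) - math.log(field(r))) / (
        math.log(r + dr) - math.log(r))

point_source = lambda r: 1.0 / r**2  # integer-degree model (point mass)
print(round(homogeneity_degree(point_source, 5.0), 3))  # close to -2
```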

  17. Internal homogenization: effective permittivity of a coated sphere.

    Science.gov (United States)

    Chettiar, Uday K; Engheta, Nader

    2012-10-08

    The concept of internal homogenization is introduced as a complementary approach to the conventional homogenization schemes, which could be termed external homogenization. The theory for the internal homogenization of the permittivity of subwavelength coated spheres is presented. The effective permittivity derived from the internal homogenization of core-shells is discussed for plasmonic and dielectric constituent materials. The effective model provided by the homogenization is a useful design tool in constructing coated particles with desired resonant properties.
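For a subwavelength coated sphere, the quasi-static equivalent permittivity has a classical closed form; a sketch of that standard result (the paper's internal-homogenization derivation may differ in detail):

```python
def coated_sphere_permittivity(eps_core, eps_shell, a, b):
    """Equivalent permittivity of a subwavelength sphere with core radius
    a (permittivity eps_core) and shell outer radius b (eps_shell), from
    the standard quasi-static coated-sphere result. Accepts complex
    permittivities, e.g. for plasmonic cores."""
    f = (a / b) ** 3  # core volume fraction
    num = (eps_core + 2 * eps_shell) + 2 * f * (eps_core - eps_shell)
    den = (eps_core + 2 * eps_shell) - f * (eps_core - eps_shell)
    return eps_shell * num / den

# sanity checks: a vanishing shell recovers the core, a vanishing core the shell
print(coated_sphere_permittivity(4.0, 2.0, 1.0, 1.0))  # 4.0
print(coated_sphere_permittivity(4.0, 2.0, 0.0, 1.0))  # 2.0
```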

  18. Fault Diagnosis of Supervision and Homogenization Distance Based on Local Linear Embedding Algorithm

    Directory of Open Access Journals (Sweden)

    Guangbin Wang

    2015-01-01

    Full Text Available In view of the problems of uneven distribution of real fault samples and the dimension-reduction effect of the locally linear embedding (LLE) algorithm, which is easily affected by neighboring points, an improved local linear embedding algorithm with homogenization distance (HLLE) is developed. The method makes the overall distribution of sample points tend toward homogenization and reduces the influence of neighboring points by using homogenization distance instead of the traditional Euclidean distance. This helps to choose effective neighboring points for constructing the weight matrix for dimension reduction. Because the fault recognition performance improvement of HLLE is limited and unstable, the paper further proposes a new local linear embedding algorithm with supervision and homogenization distance (SHLLE) by adding a supervised learning mechanism. On the basis of homogenization distance, supervised learning adds the category information of sample points so that sample points of the same category are gathered and sample points of heterogeneous categories are scattered. It effectively improves the performance of fault diagnosis while maintaining stability. A comparison of the methods mentioned above was made by a simulation experiment on rotor system fault diagnosis, and the results show that the SHLLE algorithm has superior fault recognition performance.
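The exact homogenization distance used by HLLE is not defined in the abstract; one simple stand-in that evens out local sample density when selecting LLE neighbours is rank normalization of the pairwise distances:

```python
def homogenized_distances(dists):
    """Replace raw distances from one sample to all others by their ranks,
    which evens out the influence of locally dense or sparse regions when
    picking nearest neighbours. This rank normalization is only an
    illustrative substitute for the paper's homogenization distance."""
    order = sorted(range(len(dists)), key=lambda i: dists[i])
    ranks = [0.0] * len(dists)
    for rank, i in enumerate(order):
        ranks[i] = float(rank)
    return ranks

print(homogenized_distances([0.3, 10.0, 0.31, 2.0]))  # [0.0, 3.0, 1.0, 2.0]
```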

  19. Measuring the homogeneity of Bi(2223)/Ag tapes by four-probe method and a Hall probe array

    International Nuclear Information System (INIS)

    Kovac, P.

    1999-01-01

    The nature of the BSCCO compound and application of the powder-in-tube technique usually lead to non-uniform quality across and/or along the ceramic fibres and finally to variations in the critical current and its irregular distribution in the Bi(2223)/Ag tape. Therefore, the gliding four-probe method and contactless field monitoring measurements have been used for homogeneity studies. The gliding potential contacts moved along the tape surface and a sensitive system based on an integrated Hall probe array containing 16 or 19 in-line probes supported by PC-compatible electronics with software allowed us to make a comparison of contact and contactless measurements at any elements of Bi(2223)/Ag sample. The results of both methods show very good correlation and the possibility of using a sensitive Hall probe array for monitoring the final quality of Bi(2223)/Ag tapes. (author)

  20. Homogeneous versus heterogeneous zeolite nucleation

    NARCIS (Netherlands)

    Dokter, W.H.; Garderen, van H.F.; Beelen, T.P.M.; Santen, van R.A.; Bras, W.

    1995-01-01

    Aggregates of fractal dimension were found in the intermediate gel phases that organize prior to nucleation and crystallization (shown right) of silicalite from a homogeneous reaction mixture. Small- and wide-angle X-ray scattering studies prove that for zeolites nucleation may be homogeneous or

  1. A unidirectional acoustic cloak for multilayered background media with homogeneous metamaterials

    Science.gov (United States)

    Zhu, Jian; Chen, Tianning; Liang, Qingxuan; Wang, Xiaopeng; Xiong, Jie; Jiang, Ping

    2015-08-01

    The acoustic cloak, which can make an object hard to detect acoustically in a homogeneous background, has attracted great attention from researchers in recent years. This paper considers inhomogeneous background media instead. The relative constitutive parameters are derived for acoustic cloaks working in multilayered media, and a unidirectional acoustic cloak for layered background media is proposed, designed and implemented successfully over a wide frequency range. In water and NaCl aqueous solution, the acoustic cloak is designed and realized with homogeneous metamaterials composed of steel and porous materials. The effective parameters of the unit cells of the cloak are determined using effective medium theory. Numerical results demonstrate excellent cloaking performance and show that such a device can be physically realized with natural materials, which will greatly promote real applications of invisibility cloaks in inhomogeneous backgrounds.

  2. Homogenization of long fiber reinforced composites including fiber bending effects

    DEFF Research Database (Denmark)

    Poulios, Konstantinos; Niordson, Christian Frithiof

    2016-01-01

    This paper presents a homogenization method, which accounts for intrinsic size effects related to the fiber diameter in long fiber reinforced composite materials with two independent constitutive models for the matrix and fiber materials. A new choice of internal kinematic variables allows...... of the reinforcing fibers is captured by higher order strain terms, resulting in an accurate representation of the micro-mechanical behavior of the composite. Numerical examples show that the accuracy of the proposed model is very close to a non-homogenized finite-element model with an explicit discretization...

  3. Multi-Level iterative methods in computational plasma physics

    International Nuclear Information System (INIS)

    Knoll, D.A.; Barnes, D.C.; Brackbill, J.U.; Chacon, L.; Lapenta, G.

    1999-01-01

    Plasma physics phenomena occur on a wide range of spatial scales and on a wide range of time scales. When attempting to model plasma physics problems numerically, the authors are inevitably faced with the need for both fine spatial resolution (fine grids) and implicit time integration methods. Fine grids can tax the efficiency of iterative methods, and large time steps can challenge their robustness. To meet these challenges they are developing a hybrid approach in which multigrid methods are used as preconditioners for Krylov subspace iterative methods such as conjugate gradients or GMRES. For nonlinear problems they apply multigrid preconditioning to a matrix-free Newton-GMRES method. Results are presented for the application of these multilevel iterative methods to the field solves in implicit moment-method PIC, to multidimensional nonlinear Fokker-Planck problems, and to their initial efforts in particle MHD
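The hybrid idea — a multigrid cycle used as the preconditioner inside a Krylov iteration — can be sketched on a model problem. The snippet below uses a two-grid V-cycle to precondition conjugate gradients on a 1-D Poisson matrix; it is a minimal illustration of the technique, not the authors' PIC or Fokker-Planck solvers (which use GMRES and matrix-free Newton-GMRES).

```python
import numpy as np

def poisson_1d(n):
    # Standard 1-D Poisson matrix (Dirichlet boundaries), h = 1/(n+1).
    A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    return A * (n + 1) ** 2

def two_grid(A, r, n_smooth=3, omega=2/3):
    # One two-grid V-cycle applied to the residual r, i.e. M^{-1} r:
    # damped-Jacobi smoothing plus a coarse-grid correction.
    n = len(r)
    x = np.zeros(n)
    D = np.diag(A)
    for _ in range(n_smooth):                  # pre-smoothing
        x += omega * (r - A @ x) / D
    res = r - A @ x
    nc = (n - 1) // 2                          # coarse grid, half the size
    R = np.zeros((nc, n))
    for i in range(nc):                        # full-weighting restriction
        R[i, 2 * i:2 * i + 3] = [0.25, 0.5, 0.25]
    Ac = 2 * R @ A @ R.T                       # Galerkin coarse operator R A P
    e = np.linalg.solve(Ac, R @ res)
    x += 2 * R.T @ e                           # prolongation P = 2 R^T
    for _ in range(n_smooth):                  # post-smoothing
        x += omega * (r - A @ x) / D
    return x

def pcg(A, b, precond, tol=1e-10, maxit=200):
    # Preconditioned conjugate gradients; precond(A, r) applies M^{-1}.
    x = np.zeros_like(b)
    r = b - A @ x
    z = precond(A, r)
    p = z.copy()
    its = 0
    for its in range(1, maxit + 1):
        Ap = A @ p
        alpha = (r @ z) / (p @ Ap)
        x += alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol * np.linalg.norm(b):
            break
        z_new = precond(A, r_new)
        beta = (r_new @ z_new) / (r @ z)
        p = z_new + beta * p
        r, z = r_new, z_new
    return x, its

n = 63
A = poisson_1d(n)
b = np.ones(n)
x, its = pcg(A, b, two_grid)
print(its, np.linalg.norm(A @ x - b))
```

The point of the hybrid is visible here: the smoother handles the fine-grid (high-frequency) error that taxes plain Krylov iterations, while the coarse solve removes the smooth error that large grids otherwise make slow to converge.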

  4. Homogenization-based interval analysis for structural-acoustic problem involving periodical composites and multi-scale uncertain-but-bounded parameters.

    Science.gov (United States)

    Chen, Ning; Yu, Dejie; Xia, Baizhan; Liu, Jian; Ma, Zhengdong

    2017-04-01

    This paper presents a homogenization-based interval analysis method for the prediction of coupled structural-acoustic systems involving periodical composites and multi-scale uncertain-but-bounded parameters. In the structural-acoustic system, the macro plate structure is assumed to be composed of a periodically uniform microstructure. The equivalent macro material properties of the microstructure are computed using the homogenization method. By integrating the first-order Taylor expansion interval analysis method with the homogenization-based finite element method, a homogenization-based interval finite element method (HIFEM) is developed to solve a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters. The corresponding formulations of the HIFEM are deduced. A subinterval technique is also introduced into the HIFEM for higher accuracy. Numerical examples of a hexahedral box and an automobile passenger compartment are given to demonstrate the efficiency of the presented method for a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters.
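The first-order Taylor interval step that the HIFEM builds on can be sketched in isolation. The toy below bounds the response of a two-degree-of-freedom system with uncertain-but-bounded parameters; the example system, names, and numbers are assumptions for illustration, not the paper's coupled structural-acoustic FE model.

```python
import numpy as np

def taylor_interval(f, p0, dp):
    # First-order Taylor interval bound for f(p), p_i in [p0_i - dp_i, p0_i + dp_i]:
    #   f(p) ∈ f(p0) ± sum_i |df/dp_i| * dp_i
    f0 = np.asarray(f(p0), dtype=float)
    radius = np.zeros_like(f0)
    for i in range(len(p0)):
        h = 1e-6 * max(1.0, abs(p0[i]))      # central-difference step
        e = np.zeros_like(p0)
        e[i] = h
        grad = (np.asarray(f(p0 + e)) - np.asarray(f(p0 - e))) / (2 * h)
        radius += np.abs(grad) * dp[i]
    return f0 - radius, f0 + radius

def response(p):
    # Toy response: static displacement of a 2-DOF spring chain
    # u = K(E, k)^{-1} f, with uncertain stiffnesses E and k.
    E, k = p
    K = np.array([[E + k, -k], [-k, k]])
    return np.linalg.solve(K, np.array([0.0, 1.0]))

p0 = np.array([100.0, 50.0])   # nominal parameters
dp = np.array([5.0, 2.5])      # 5 % bounded uncertainty
lo, hi = taylor_interval(response, p0, dp)
print(lo, hi)
```

The subinterval technique mentioned in the abstract would split each `[p0 - dp, p0 + dp]` into smaller boxes, apply the same bound on each, and take the union, tightening the estimate for larger uncertainties.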

  5. Fuel micro-mechanics: homogenization, cracking, granular media

    International Nuclear Information System (INIS)

    Monerie, Yann

    2010-01-01

    This work summarizes about fifteen years of research in the field of micro-mechanics of materials. Emphasis is placed on the most recent work carried out in the context of nuclear safety. Micro-mechanics finds a natural place there, aiming to predict the behavior of heterogeneous materials with an evolving microstructure. The applications concerned mainly involve the nuclear fuel and its tubular cladding. The uranium dioxide fuel is modeled, according to the scales under consideration, as a porous ceramic or a granular medium. The strongly irradiated Zircaloy claddings are identified with a composite medium with a metal matrix and a gradient of properties. The analysis of these classes of material is rich in problems of a more fundamental nature. Three main themes are discussed: 1/ Homogenization, 2/ cracking, rupture and fragmentation, 3/ discrete media and fluid-grain couplings. Homogenization: The analytical scale change methods proposed aim to estimate or limit the linear and equivalent nonlinear behaviors of isotropic porous media and anisotropic composites with a metal matrix. The porous media under consideration are saturated or drained, with a compressible or incompressible matrix, and have one or two scales of spherical or ellipsoid pores, or cracks. The composites studied have a macroscopic anisotropy related to that of the matrix, and to the shape and spatial distribution of the inclusions. Thermoelastic, elastoplastic, and viscoplastic behaviors and ductile damage of these media are examined using different techniques: extensions of classic approaches, linear in particular, variational approaches and approaches using elliptical potentials with thermally activated elementary mechanisms. The models developed are validated on numerical finite element simulations, and their functional relevance is illustrated in comparison to experimental data obtained from the literature. The significant results obtained include a plasticity criterion for Gurson matrix

  6. Endoscopy and homogeneous-heterogeneous reactions in MHD radiative peristaltic activity of Ree-Eyring fluid

    Science.gov (United States)

    Hayat, Tasawar; Akram, Javaria; Alsaedi, Ahmed; Zahir, Hina

    2018-03-01

    Endoscopic effects and homogeneous-heterogeneous reactions in MHD peristalsis of a Ree-Eyring fluid are addressed. Mathematical modeling and analysis are performed in cylindrical coordinates, with nonlinear thermal radiation present. The impact of slip boundary conditions on the temperature and velocity at the outer tube is taken into consideration, and the lubrication approach is employed. The nonlinear system is solved numerically for the velocity, temperature and concentration. Graphical results are obtained to give a physical interpretation of the various embedded parameters. It is noted that the homogeneous and heterogeneous reactions affect the concentration in opposite ways. Moreover, the Brinkman number raises the temperature and the heat transfer coefficient, whereas thermal slip lowers the temperature and the heat transfer rate.

  7. Development of triple scale finite element analyses based on crystallographic homogenization methods

    International Nuclear Information System (INIS)

    Nakamachi, Eiji

    2004-01-01

    A crystallographic homogenization procedure is implemented in a piezoelectric and elastic-crystalline plastic finite element (FE) code to assess the macro-continuum properties of piezoelectric ceramics and of BCC and FCC sheet metals. The triple-scale hierarchical structure consists of an atom cluster, a crystal aggregation and a macro-continuum. In this paper, we focus on a triple-scale numerical analysis for piezoelectric material and apply it to assess a macro-continuum material property. First, we calculate the material properties of the perovskite crystal of the piezoelectric material XYO3 (such as BaTiO3 and PbTiO3) by employing the ab-initio molecular analysis code CASTEP. Next, measured results of SEM and EBSD observations of the crystal orientation distributions, shapes and boundaries of a real material (BaTiO3) are employed to define the inhomogeneity of the crystal aggregation, which corresponds to a unit cell of the microstructure and satisfies the periodicity condition. This procedure constitutes the first scaling up, from the molecular level to the crystal aggregation. Finally, the conventional homogenization procedure is implemented in the FE code to evaluate a macro-continuum property; this constitutes the second scaling up, from the crystal aggregation (unit cell) to the macro-continuum. The triple-scale analysis is applied to the design of a piezoelectric ceramic and finds an optimum crystal orientation distribution in which the macroscopic piezoelectric constant d33 has a maximum value

  8. Sewage sludge solubilization by high-pressure homogenization.

    Science.gov (United States)

    Zhang, Yuxuan; Zhang, Panyue; Guo, Jianbin; Ma, Weifang; Fang, Wei; Ma, Boqiang; Xu, Xiangzhe

    2013-01-01

    The behavior of sludge solubilization using high-pressure homogenization (HPH) treatment was examined by investigating the sludge solid reduction and organics solubilization. The sludge volatile suspended solids (VSS) decreased from 10.58 to 6.67 g/L for the sludge sample with a total solids content (TS) of 1.49% after HPH treatment at a homogenization pressure of 80 MPa with four homogenization cycles; total suspended solids (TSS) correspondingly decreased from 14.26 to 9.91 g/L. About 86.15% of the TSS reduction was attributed to the VSS reduction. The increase of homogenization pressure from 20 to 80 MPa or homogenization cycle number from 1 to 4 was favorable to the sludge organics solubilization, and the protein and polysaccharide solubilization linearly increased with the soluble chemical oxygen demand (SCOD) solubilization. More proteins were solubilized than polysaccharides. The linear relationship between SCOD solubilization and VSS reduction had no significant change under different homogenization pressures, homogenization cycles and sludge solid contents. The SCOD of 1.65 g/L was solubilized for the VSS reduction of 1.00 g/L for the three experimental sludge samples with a TS of 1.00, 1.49 and 2.48% under all HPH operating conditions. The energy efficiency results showed that the HPH treatment at a homogenization pressure of 30 MPa with a single homogenization cycle for the sludge sample with a TS of 2.48% was the most energy efficient.

  9. Mathematical methods in physics distributions, Hilbert space operators, variational methods, and applications in quantum physics

    CERN Document Server

    Blanchard, Philippe

    2015-01-01

    The second edition of this textbook presents the basic mathematical knowledge and skills that are needed for courses on modern theoretical physics, such as those on quantum mechanics, classical and quantum field theory, and related areas.  The authors stress that learning mathematical physics is not a passive process and include numerous detailed proofs, examples, and over 200 exercises, as well as hints linking mathematical concepts and results to the relevant physical concepts and theories.  All of the material from the first edition has been updated, and five new chapters have been added on such topics as distributions, Hilbert space operators, and variational methods.   The text is divided into three main parts. Part I is a brief introduction to distribution theory, in which elements from the theories of ultradistributions and hyperfunctions are considered in addition to some deeper results for Schwartz distributions, thus providing a comprehensive introduction to the theory of generalized functions. P...

  10. Computer programs of information processing of nuclear physical methods as a demonstration material in studying nuclear physics and numerical methods

    Science.gov (United States)

    Bateev, A. B.; Filippov, V. P.

    2017-01-01

    The article shows that the computer program Univem MS for Mössbauer spectra fitting can in principle be used as demonstration material when students study disciplines such as atomic and nuclear physics and numerical methods. The program works with nuclear-physical parameters such as the isomer (or chemical) shift of the nuclear energy level, the interaction of the nuclear quadrupole moment with the electric field, and that of the magnetic moment with the surrounding magnetic field. The basic processing algorithm in such programs is the least squares method. The deviation of the experimental points of the spectra from the theoretical dependence is determined on concrete examples; in numerical methods this quantity is characterized as the mean square deviation. The shape of the theoretical lines in the program is defined by Gaussian and Lorentzian distributions. The visualization of the studied material on atomic and nuclear physics can be improved by similar programs for Mössbauer spectroscopy, X-ray fluorescence analysis or X-ray diffraction analysis.
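The processing chain the article describes — least-squares fitting of Lorentzian lines and reporting the mean square deviation — can be illustrated with a toy fitter. This sketch is not Univem MS: the single-line absorption model, the Gauss-Newton fitter, and all parameter values are assumptions for demonstration.

```python
import numpy as np

def lorentzian(v, base, depth, v0, gam):
    # Single Mössbauer absorption line: baseline minus a Lorentzian dip
    # centred at velocity v0 with half-width gam.
    return base - depth * gam**2 / ((v - v0)**2 + gam**2)

def fit_least_squares(v, y, p, n_iter=50):
    # Plain Gauss-Newton least squares with a numerical Jacobian --
    # the same Least Squares Method idea used by spectrum-fitting codes
    # (this toy fitter is not Univem MS).
    p = np.asarray(p, dtype=float)
    for _ in range(n_iter):
        r = y - lorentzian(v, *p)
        J = np.empty((len(v), len(p)))
        for j in range(len(p)):
            dp = np.zeros_like(p)
            dp[j] = 1e-6 * max(1.0, abs(p[j]))
            J[:, j] = (lorentzian(v, *(p + dp))
                       - lorentzian(v, *(p - dp))) / (2 * dp[j])
        step, *_ = np.linalg.lstsq(J, r, rcond=None)
        p = p + step
    return p

rng = np.random.default_rng(1)
v = np.linspace(-4, 4, 200)                  # source velocity, mm/s
true = (1.0, 0.3, 0.5, 0.25)                 # baseline, depth, shift, width
y = lorentzian(v, *true) + 0.002 * rng.normal(size=v.size)
p = fit_least_squares(v, y, [1.0, 0.25, 0.3, 0.3])
mse = np.mean((y - lorentzian(v, *p))**2)    # the "mean square deviation"
print(p, mse)
```

The fitted shift `p[2]` plays the role of the isomer shift; a real Mössbauer fit would superpose several Lorentzian (or Gaussian) lines and report the same mean square deviation as its quality figure.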

  11. Mathematical methods in engineering and physics

    CERN Document Server

    Felder, Gary N

    2016-01-01

    This text is intended for the undergraduate course in math methods, with an audience of physics and engineering majors. As a required course in most departments, the text relies heavily on explained examples, real-world applications and student engagement. Supporting the use of active learning, a strong focus is placed upon physical motivation combined with a versatile coverage of topics that can be used as a reference after students complete the course. Each chapter begins with an overview that includes a list of prerequisite knowledge, a list of skills that will be covered in the chapter, and an outline of the sections. Next comes the motivating exercise, which steps the students through a real-world physical problem that requires the techniques taught in each chapter.

  12. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    Science.gov (United States)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality that are related to the moisture content of the aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality management tools at all stages of the technological process. It is established that unaccounted-for moisture in the aggregates adversely affects the homogeneity of the concrete mixture and, accordingly, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity during the mixing of components are proposed, with the automatic control system of the kneading-and-mixing machinery providing operational control of homogeneity. Theoretical underpinnings of homogeneity control are presented, which relate the homogeneity to a change in the frequency of the vibrodynamic vibrations of the mixer body. The structure of the technical means of the automatic control system for regulating the water supply is determined by the change in concrete mixture homogeneity during the continuous mixing of components. The following technical means for establishing automatic control were chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To identify the quality indicator of automatic control, a block diagram with transfer functions is proposed that determines the operation of the ACS in the transient dynamic mode.

  13. Effects of fat content, pasteurization method, homogenization pressure, and storage time on the mechanical and sensory properties of bovine milk.

    Science.gov (United States)

    Li, Y; Joyner, H S; Carter, B G; Drake, M A

    2018-04-01

    Fluid milk may be pasteurized by high-temperature short-time pasteurization (HTST) or ultrapasteurization (UP). Literature suggests that UP increases milk astringency, but definitive studies have not demonstrated this effect. Thus, the objective of this study was to determine the effects of pasteurization method, fat content, homogenization pressure, and storage time on milk sensory and mechanical behaviors. Raw skim, 2%, and 5% fat milk was pasteurized in duplicate by indirect UP (140°C, 2.3 s) or by HTST pasteurization (78°C, 15 s), homogenized at 20.7 MPa, and stored at 4°C for 8 wk. Additionally, 2% fat milk was processed by indirect UP and homogenized at 13.8, 20.7, and 27.6 MPa and stored at 4°C for 8 wk. Sensory profiling, instrumental viscosity, and friction profiles of all milk were evaluated at 25°C after storage times of 1, 4, and 8 wk. Sodium dodecyl sulfate PAGE and confocal laser scanning microscopy were used to determine protein structural changes in milk at these time points. Fresh HTST milk was processed at wk 7 for wk 8 evaluations. Ultrapasteurization increased milk sensory and instrumental viscosity compared with HTST pasteurization. Increased fat content increased sensory and instrumental viscosity, and decreased astringency and friction profiles. Astringency, mixed regimen friction profiles, and sensory viscosity also increased for UP versus HTST. Increased storage time showed no effect on sensory viscosity or mechanical viscosity. However, increased storage time generally resulted in increased friction profiles and astringency. Sodium dodecyl sulfate PAGE and confocal laser scanning microscopy showed increased denatured whey protein in UP milk compared with HTST milk. The aggregates or network formed by these proteins and casein micelles likely caused the increase in viscosity and friction profiles during storage. Homogenization pressure did not significantly affect friction behaviors, mechanical viscosity, or astringency; however

  14. Color Segmentation of Homogeneous Areas on Colposcopical Images

    Directory of Open Access Journals (Sweden)

    Kosteley Yana

    2016-01-01

    Full Text Available The article provides an analysis of image processing and color segmentation applied to the problem of selecting homogeneous regions according to the parameters of a color model. Image processing methods such as the Gaussian filter, the median filter, histogram equalization and mathematical morphology are considered. A segmentation algorithm based on the color component parameters is presented, followed by isolation of the connected components of the resulting binary segmentation mask. The analysis of the methods was performed on images from colposcopic examinations.

  15. The method of sections in molecular physics

    International Nuclear Information System (INIS)

    Natarajan, P.; Zygelman, B.

    1993-01-01

    In the standard Born-Oppenheimer theory the nuclear wave-function for a bound, rotating di-atom system is described by the Wigner functions. Unlike the spherical harmonics, the Wigner functions exhibit cusp singularities at the poles of the space-fixed coordinate system. These singularities are identical to the ones encountered in the quantum mechanical treatment of a charged particle under the influence of a magnetic monopole. In the latter case the method of sections was introduced to eliminate the singularities. The method of sections has also been introduced in molecular physics. We discuss here, in detail, its properties and the advantages of using this construction in molecular physics

  16. Improving homogeneity by dynamic speed limit systems.

    NARCIS (Netherlands)

    Nes, N. van; Brandenberg, S.; Twisk, D.A.M.

    2010-01-01

    Homogeneity of driving speeds is an important variable in determining road safety; more homogeneous driving speeds increase road safety. This study investigates the effect of introducing dynamic speed limit systems on homogeneity of driving speeds. A total of 46 subjects twice drove a route along 12

  17. Modern methodic of power cardio training in students’ physical education

    Directory of Open Access Journals (Sweden)

    A.Yu. Osipov

    2016-12-01

    Full Text Available Purpose: to significantly increase students' physical condition and health level through the application of a modern power cardio training methodic. Material: 120 students (60 boys and 60 girls) participated in the research. The age of the tested students was 19 years, and the research took one year. We used a methodic of power and functional impact on the trainees' organism (HOT IRON): a system of physical exercises with weights (mini-barbells) performed to the accompaniment of specially selected music. Results: we showed the advantages of power-cardio and fitness trainings for improving students' health and countering obesity. Control tests showed that the experimental group students achieved significantly higher physical indicators: boys demonstrated an increase in physical strength and general endurance, and girls had significantly better indicators of physical strength, flexibility and general endurance. The increase in the control group students' body mass can be explained by insufficient physical activity in trainings conducted according to the traditional program. Conclusions: students' trainings by the power-cardio methodic with HOT IRON exercises facilitate the development of the following physical qualities: strength and endurance in boys, and strength, flexibility and endurance in girls. Besides, it was found that such systems of exercises facilitate the normalization of boys' body mass and the correction of girls' constitution.

  18. Homogeneity and thermodynamic identities in geometrothermodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Quevedo, Hernando [Universidad Nacional Autonoma de Mexico, Instituto de Ciencias Nucleares (Mexico); Universita di Roma 'La Sapienza', Dipartimento di Fisica, Rome (Italy); ICRANet, Rome (Italy)]; Quevedo, Maria N. [Universidad Militar Nueva Granada, Departamento de Matematicas, Facultad de Ciencias Basicas, Bogota (Colombia)]; Sanchez, Alberto [CIIDET, Departamento de Posgrado, Queretaro (Mexico)]

    2017-03-15

    We propose a classification of thermodynamic systems in terms of the homogeneity properties of their fundamental equations. Ordinary systems correspond to homogeneous functions and non-ordinary systems are given by generalized homogeneous functions. This affects the explicit form of the Gibbs-Duhem relation and Euler's identity. We show that these generalized relations can be implemented in the formalism of black hole geometrothermodynamics in order to completely fix the arbitrariness present in Legendre invariant metrics. (orig.)
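The distinction between the two classes can be written out explicitly (notation ours — a standard statement of Euler's identity and its generalization, not necessarily the paper's exact formulas):

```latex
% Ordinary system: fundamental equation homogeneous of degree \beta.
% Differentiating with respect to \lambda and setting \lambda = 1 gives
% Euler's identity:
\Phi(\lambda x^{1},\dots,\lambda x^{N}) = \lambda^{\beta}\,\Phi(x^{1},\dots,x^{N})
\;\;\Longrightarrow\;\;
\sum_{a=1}^{N} x^{a}\,\frac{\partial \Phi}{\partial x^{a}} = \beta\,\Phi .

% Non-ordinary system: generalized homogeneous function with a separate
% weight \beta_a for each variable; the same differentiation yields the
% generalized Euler identity, which in turn modifies the Gibbs-Duhem relation:
\Phi(\lambda^{\beta_{1}} x^{1},\dots,\lambda^{\beta_{N}} x^{N})
  = \lambda^{\beta_{\Phi}}\,\Phi(x^{1},\dots,x^{N})
\;\;\Longrightarrow\;\;
\sum_{a=1}^{N} \beta_{a}\, x^{a}\,\frac{\partial \Phi}{\partial x^{a}}
  = \beta_{\Phi}\,\Phi .
```

Setting all weights equal, \(\beta_a = 1\), recovers the ordinary case, which is the sense in which the second relation classifies the first as a special case.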

  19. A unidirectional acoustic cloak for multilayered background media with homogeneous metamaterials

    International Nuclear Information System (INIS)

    Zhu, Jian; Chen, Tianning; Liang, Qingxuan; Wang, Xiaopeng; Xiong, Jie; Jiang, Ping

    2015-01-01

    The acoustic cloak, which can make an object hard to detect acoustically in a homogeneous background, has attracted great attention from researchers in recent years. This paper considers inhomogeneous background media instead. The relative constitutive parameters are derived for acoustic cloaks working in multilayered media, and a unidirectional acoustic cloak for layered background media is proposed, designed and implemented successfully over a wide frequency range. In water and NaCl aqueous solution, the acoustic cloak is designed and realized with homogeneous metamaterials composed of steel and porous materials. The effective parameters of the unit cells of the cloak are determined using effective medium theory. Numerical results demonstrate excellent cloaking performance and show that such a device can be physically realized with natural materials, which will greatly promote real applications of invisibility cloaks in inhomogeneous backgrounds. (paper)

  20. Using the Case Study Method in Teaching College Physics

    Science.gov (United States)

    Burko, Lior M.

    2016-01-01

    The case study teaching method has a long history (starting at least with Socrates) and wide current use in business schools, medical schools, law schools, and a variety of other disciplines. However, relatively little use is made of it in the physical sciences, specifically in physics or astronomy. The case study method should be considered by…

  1. Sensitivity and Uncertainty Analysis of Coupled Reactor Physics Problems : Method Development for Multi-Physics in Reactors

    NARCIS (Netherlands)

    Perkó, Z.

    2015-01-01

    This thesis presents novel adjoint and spectral methods for the sensitivity and uncertainty (S&U) analysis of multi-physics problems encountered in the field of reactor physics. The first part focuses on the steady state of reactors and extends the adjoint sensitivity analysis methods well

  2. Homogeneity of Prototypical Attributes in Soccer Teams

    Directory of Open Access Journals (Sweden)

    Christian Zepp

    2015-09-01

    Full Text Available Research indicates that the homogeneous perception of prototypical attributes influences several intragroup processes. The aim of the present study was to describe the homogeneous perception of the prototype and to identify specific prototypical subcategories, which are perceived as homogeneous within sport teams. The sample consists of N = 20 soccer teams with a total of N = 278 athletes (age M = 23.5 years, SD = 5.0 years. The results reveal that subcategories describing the cohesiveness of the team and motivational attributes are mentioned homogeneously within sport teams. In addition, gender, identification, team size, and the championship ranking significantly correlate with the homogeneous perception of prototypical attributes. The results are discussed on the basis of theoretical and practical implications.

  3. Computer methods in physics 250 problems with guided solutions

    CERN Document Server

    Landau, Rubin H

    2018-01-01

    Our future scientists and professionals must be conversant in computational techniques. In order to facilitate integration of computer methods into existing physics courses, this textbook offers a large number of worked examples and problems with fully guided solutions in Python as well as other languages (Mathematica, Java, C, Fortran, and Maple). It’s also intended as a self-study guide for learning how to use computer methods in physics. The authors include an introductory chapter on numerical tools and indication of computational and physics difficulty level for each problem.

  4. Interpretation of criticality experiments on homogeneous solutions of plutonium and uranium

    Energy Technology Data Exchange (ETDEWEB)

    Ithurralde, M F; Kremser, J; Leclerc, J; Lombard, Ch; Moreau, J; Robin, C [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1964-07-01

    Criticality experiments on solutions of fissile materials have been carried out in tanks of various geometries (cylinder, isolated annular cylinder, interacting annular cylinders); the reflection conditions were also varied (no reflection, partial reflection, and total reflection by water). The range of concentrations studied is fairly wide (18.8 to 104 g/l). The interpretation of these experiments was undertaken in order to solve the problems posed by the industrial use of homogeneous plutonium and uranium solutions. Several methods with different fields of application have been used: the diffusion method, the transport method and the Monte-Carlo method. (authors)

  5. Homogenization theory in reactor lattices

    International Nuclear Information System (INIS)

    Benoist, P.

    1986-02-01

    The purpose of the theory of homogenization of reactor lattices is to determine, by means of transport theory, the constants of a homogeneous medium equivalent to a given lattice, which allows the reactor to be treated as a whole by diffusion theory. In this note, the problem is presented with emphasis laid on simplicity, as far as possible [fr

  6. Application of unsupervised learning methods in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Koevesarki, Peter; Nuncio Quiroz, Adriana Elizabeth; Brock, Ian C. [Physikalisches Institut, Universitaet Bonn, Bonn (Germany)

    2011-07-01

    High energy physics is a home for a variety of multivariate techniques, mainly due to the fundamentally probabilistic behaviour of nature. These methods generally require training based on some theory, in order to discriminate a known signal from a background. Nevertheless, new physics can show itself in ways that previously no one thought about, and in these cases conventional methods give little or no help. A possible way to discriminate between known processes (like vector bosons or top-quark production) or look for new physics is using unsupervised machine learning to extract the features of the data. A technique was developed, based on the combination of neural networks and the method of principal curves, to find a parametrisation of the non-linear correlations of the data. The feasibility of the method is shown on ATLAS data.

  7. Control rod homogenization in heterogeneous sodium-cooled fast reactors

    International Nuclear Information System (INIS)

    Andersson, Mikael

    2016-01-01

    The sodium-cooled fast reactor is one of the candidates for a sustainable nuclear reactor system. In particular, the French ASTRID project employs an axially heterogeneous design, proposed in the so-called CFV (low sodium effect) core, to enhance the inherent safety features of the reactor. This thesis focuses on the accurate modeling of the control rods through the homogenization method. The control rods in a sodium-cooled fast reactor are used for reactivity compensation during the cycle, for power shaping, and to shut down the reactor. In previous control rod homogenization procedures, only a radial description of the geometry was implemented, hence the axially heterogeneous features of the CFV core could not be taken into account. This thesis investigates the different axial variations the control rod experiences in a CFV core, to determine the impact that these axial environments have on the control rod modeling. The methodology used in this work is based on previous homogenization procedures, the so-called equivalence procedure, newly implemented in the PARIS code system in order to be able to use 3D geometries and thereby take axial effects into account. The thesis is divided into three parts. The first part investigates the impact of different neutron spectra on the homogeneous control-rod cross sections. The second part investigates the cases where the traditional radial control-rod homogenization procedure is no longer applicable in the CFV core, which was found to be 5-10 cm away from any material interface. In the third part, based on the results from the second part, a 3D model of the control rod is used to calculate homogenized control-rod cross sections. In a full core model, a study is made to investigate the impact these axial effects have on control-rod-related core parameters, such as the control rod worth, the capture rates in the control rod, and the power in the adjacent fuel assemblies. All results were compared to a Monte

  8. PHYSICAL METHODS IN AGRO-FOOD CHAIN

    Directory of Open Access Journals (Sweden)

    ANNA ALADJADJIYAN

    2009-06-01

    Full Text Available Chemical additives (fertilizers and plant protection preparations) are widely used to improve the yield of food production. Their application often causes contamination of the raw materials for food production, which can be dangerous for the health of consumers. Alternative methods are being developed and implemented to improve and ensure the safety of on-farm production. The substitution of chemical fertilizers and soil additives with alternative treatment methods, such as irradiation, ultrasound and the use of electromagnetic energy, is discussed. Successful application of physical methods in different stages of food preparation is recommended.

  9. Scattering theory in quantum mechanics. Physical principles and mathematical methods

    International Nuclear Information System (INIS)

    Amrein, W.O.; Jauch, J.M.; Sinha, K.B.

    1977-01-01

    A contemporary approach is given to the classical topics of physics. The purpose is to explain the basic physical concepts of quantum scattering theory, to develop the necessary mathematical tools for their description, to display the interrelation between the three methods (the Schroedinger equation solutions, stationary scattering theory, and time dependence), and to derive the properties of various quantities of physical interest with mathematically rigorous methods

  10. High homogeneity powder of Tl-Ba-Ca-Cu-O (2223) prepared by the freeze-drying method

    International Nuclear Information System (INIS)

    Al-Shakarchi, Emad Kh.; Toma, Ziad A.

    1999-01-01

    Full text. Homogeneous high-temperature superconductor ceramic powder of Tl-Ba-Ca-Cu-O with transition temperature [Tc=123K] has been successfully prepared from a mixture of nitrate salts [TlNO 3 , Ba(NO 3 ) 2 , Ca(NO 3 ) 2 .4H 2 O and Cu(NO 3 ) 2 .3H 2 O] by the freeze-drying method. The freeze-dryer used in this work was designed locally in our laboratory. This technique is considered a better way to obtain a fine ceramic powder, relying on the freezing of droplets in the presence of liquid nitrogen. SEM pictures showed a grain size of about [0.8 μm]. We conclude that high-temperature sintering of the powders prepared by this technique for a long time [120 hrs] will increase the interdiffusion between the grains, which decreases the density of the sample and may give better results than those obtained in previous works

  11. Enhancement of anaerobic sludge digestion by high-pressure homogenization.

    Science.gov (United States)

    Zhang, Sheng; Zhang, Panyue; Zhang, Guangming; Fan, Jie; Zhang, Yuxuan

    2012-08-01

    To improve anaerobic sludge digestion efficiency, the effects of high-pressure homogenization (HPH) conditions on the anaerobic sludge digestion were investigated. The VS and TCOD were significantly removed with the anaerobic digestion, and the VS removal and TCOD removal increased with increasing the homogenization pressure and homogenization cycle number; correspondingly, the accumulative biogas production also increased with increasing the homogenization pressure and homogenization cycle number. The optimal homogenization pressure was 50 MPa for one homogenization cycle and 40 MPa for two homogenization cycles. The SCOD of the sludge supernatant significantly increased with increasing the homogenization pressure and homogenization cycle number due to the sludge disintegration. The relationship between the biogas production and the sludge disintegration showed that the accumulative biogas and methane production were mainly enhanced by the sludge disintegration, which accelerated the anaerobic digestion process and improved the methane content in the biogas. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Determining the degree of powder homogeneity using PC-based program

    Directory of Open Access Journals (Sweden)

    Đuragić Olivera M.

    2010-01-01

    Full Text Available The mixing of powders and the quality control of the obtained mixtures are critical operations involved in the processing of granular materials in the chemical, metallurgical, food and pharmaceutical industries. Studies on mixing efficiency and the time needed for achieving homogeneity in powder mash production have significant importance. Depending on the characteristics of the materials, a number of methods have been used for homogeneity tests. Very often, the degree of mixing has been determined by analyzing images of particle arrays in the sample using microscopy, photography and/or video tools. In this paper, a new PC-based method for determining the number of particles in powder homogeneity tests has been developed. Microtracers®, red iron particles, were used as an external tracer added before mixing. Iron particles in the samples of the mixtures were separated by a rotary magnet and spread onto a filter paper. The filter paper was sprayed with a 50% solution of ethanol for color development and the particles were counted, where the number of spots represented the concentration of the added tracer. The number of spots was counted manually, as well as by the developed PC program. The program, which analyzes scanned filter papers with spots, is based on digital image analysis: the red spots are converted through a few filters into a black-and-white image and counted. Results obtained by manual and PC counting were compared. A high correlation was established between the two counting methods.
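The counting step described above (threshold to black and white, then count spots) can be sketched as connected-component labeling. The sketch below is a hypothetical pure-Python illustration of that idea, not the authors' program; the input is assumed to be an already-thresholded binary image.

```python
from collections import deque

def count_spots(binary):
    """Count connected groups of 1-pixels (4-connectivity) in a binary image.

    Each connected group stands in for one tracer-particle spot on the
    scanned filter paper; the spot count estimates tracer concentration.
    """
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    spots = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                spots += 1                      # found a new spot
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                    # flood-fill the whole spot
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return spots

# Two separate spots in a toy 4x5 thresholded image
image = [[1, 1, 0, 0, 0],
         [1, 0, 0, 0, 1],
         [0, 0, 0, 1, 1],
         [0, 0, 0, 0, 0]]
print(count_spots(image))  # → 2
```

A production counter would additionally filter components by area to reject single-pixel noise.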

  13. Layered Fiberconcrete with Non-Homogeneous Fibers Distribution

    OpenAIRE

    Lūsis, V; Krasņikovs, A

    2013-01-01

    The aim of the present research is to create a fiberconcrete construction with a non-homogeneous fiber distribution. Traditionally, fibers are homogeneously dispersed in concrete. At the same time, in many situations fiberconcretes with homogeneously dispersed fibers are not optimal (the majority of the added fibers do not participate in the load-bearing process).

  14. Plane wave interaction with a homogeneous warm plasma sphere

    International Nuclear Information System (INIS)

    Ruppin, R.

    1975-01-01

    A Mie type theory for the scattering and absorption properties of a homogeneous warm plasma sphere is developed. The theory is applied to the calculation of the extinction cross section of plasma spheres, and the effects of Landau damping and collisional damping on the spectra are discussed. The dependence of the main resonance and of the Tonks-Dattner resonances on the physical parameters characterizing the sphere and its surroundings is investigated. The spectrum is shown to be insensitive to the boundary conditions which specify the behaviour of the electrons at the surface of the sphere (author)

  15. Hypersurface Homogeneous Cosmological Model in Modified Theory of Gravitation

    Science.gov (United States)

    Katore, S. D.; Hatkar, S. P.; Baxi, R. J.

    2016-12-01

    We study a hypersurface homogeneous space-time in the framework of the f (R, T) theory of gravitation in the presence of a perfect fluid. Exact solutions of field equations are obtained for exponential and power law volumetric expansions. We also solve the field equations by assuming the proportionality relation between the shear scalar (σ ) and the expansion scalar (θ ). It is observed that in the exponential model, the universe approaches isotropy at large time (late universe). The investigated model is notably accelerating and expanding. The physical and geometrical properties of the investigated model are also discussed.

  16. Solutes transport in unsaturated double-porosity medium. Modelling by homogenization and applications

    International Nuclear Information System (INIS)

    Tran Ngoc, T.D.

    2008-07-01

    This Ph.D thesis presents the development of the solute transport models in unsaturated double-porosity medium, by using the asymptotic homogenization method. The obtained macroscopic models concern diffusion, diffusion-convection and dispersion-convection, according to the transport regime which is characterized by the non-dimensional numbers. The models consist of two coupled equations that show the local non-equilibrium of concentrations. The double-porosity transport models were numerically implemented using the code COMSOL Multiphysics (finite elements method), and compared with the solution of the same problem at the fine scale. The implementation allows solving the coupled equations in the macro- and micro-porosity domains (two-scale computations). The calculations of the dispersion tensor as a solution of the local boundary value problems, were also conducted. It was shown that the dispersivity depends on the saturation, the physical properties of the macro-porosity domain and the internal structure of the double-porosity medium. Finally, two series of experiments were performed on a physical model of double-porosity that is composed of a periodic assemblage of sintered clay spheres in Hostun sand HN38. The first experiment was a drainage experiment, which was conducted in order to validate the unsaturated flow model. The second series was a dispersion experiment in permanent unsaturated water flow condition (water content measured by gamma ray attenuation technique). A good agreement between the numerical simulations and the experimental observations allows the validation of the developed models. (author)

  17. Systematic homogenization and self-consistent flux and pin power reconstruction for nodal diffusion methods. 1: Diffusion equation-based theory

    International Nuclear Information System (INIS)

    Zhang, H.; Rizwan-uddin; Dorning, J.J.

    1995-01-01

    A diffusion equation-based systematic homogenization theory and a self-consistent dehomogenization theory for fuel assemblies have been developed for use with coarse-mesh nodal diffusion calculations of light water reactors. The theoretical development is based on a multiple-scales asymptotic expansion carried out through second order in a small parameter, the ratio of the average diffusion length to the reactor characteristic dimension. By starting from the neutron diffusion equation for a three-dimensional heterogeneous medium and introducing two spatial scales, the development systematically yields an assembly-homogenized global diffusion equation with self-consistent expressions for the assembly-homogenized diffusion tensor elements and cross sections and assembly-surface-flux discontinuity factors. The reactor eigenvalue 1/k eff is shown to be obtained to the second order in the small parameter, and the heterogeneous diffusion theory flux is shown to be obtained to leading order in that parameter. The latter of these two results provides a natural procedure for the reconstruction of the local fluxes and the determination of pin powers, even though homogenized assemblies are used in the global nodal diffusion calculation

  18. Asymptotic Expansion Homogenization for Multiscale Nuclear Fuel Analysis

    International Nuclear Information System (INIS)

    2015-01-01

    Engineering scale nuclear fuel performance simulations can benefit by utilizing high-fidelity models running at a lower length scale. Lower length-scale models provide a detailed view of the material behavior that is used to determine the average material response at the macroscale. These lower length-scale calculations may provide insight into material behavior where experimental data is sparse or nonexistent. This multiscale approach is especially useful in the nuclear field, since irradiation experiments are difficult and expensive to conduct. The lower length-scale models complement the experiments by influencing the types of experiments required and by reducing the total number of experiments needed. This multiscale modeling approach is a central motivation in the development of the BISON-MARMOT fuel performance codes at Idaho National Laboratory. These codes seek to provide more accurate and predictive solutions for nuclear fuel behavior. One critical aspect of multiscale modeling is the ability to extract the relevant information from the lower length-scale simulations. One approach, the asymptotic expansion homogenization (AEH) technique, has proven to be an effective method for determining homogenized material parameters. The AEH technique prescribes a system of equations to solve at the microscale that are used to compute homogenized material constants for use at the engineering scale. In this work, we employ AEH to explore the effect of evolving microstructural thermal conductivity and elastic constants on nuclear fuel performance. We show that the AEH approach fits cleanly into the BISON and MARMOT codes and provides a natural, multidimensional homogenization capability.

  19. A paradigm for discrete physics

    International Nuclear Information System (INIS)

    Noyes, H.P.; McGoveran, D.; Etter, T.; Manthey, M.J.; Gefwert, C.

    1987-01-01

    An example is outlined for constructing a discrete physics using as a starting point the insight from quantum physics that events are discrete, indivisible and non-local. Initial postulates are finiteness, discreteness, finite computability, absolute nonuniqueness (i.e., homogeneity in the absence of specific cause) and additivity

  20. Cosmic Ray Hit Detection with Homogeneous Structures

    Science.gov (United States)

    Smirnov, O. M.

    Cosmic ray (CR) hits can affect a significant number of pixels both on long-exposure ground-based CCD observations and on Space Telescope frames. Thus, methods of identifying the damaged pixels are an important part of data preprocessing for practically any application. The paper presents an implementation of a CR hit detection algorithm based on a homogeneous structure (also called a cellular automaton), a concept originating in artificial intelligence and discrete mathematics. Each pixel of the image is represented by a small automaton, which interacts with its neighbors and assumes a distinct state if it ``decides'' that a CR hit is present. On test data, the algorithm has shown a high detection rate (~0.7) and a low false alarm rate per frame. A homogeneous structure is extremely trainable, which can be very important for processing large batches of data obtained under similar conditions. Training and optimizing issues are discussed, as well as other possible applications of this concept to image processing.

  1. High-frequency homogenization of zero frequency stop band photonic and phononic crystals

    CERN Document Server

    Antonakakis, Tryfon; Guenneau, Sebastien

    2013-01-01

    We present an accurate methodology for representing the physics of waves, for periodic structures, through effective properties for a replacement bulk medium: This is valid even for media with zero frequency stop-bands and where high frequency phenomena dominate. Since the work of Lord Rayleigh in 1892, low frequency (or quasi-static) behaviour has been neatly encapsulated in effective anisotropic media. However such classical homogenization theories break down in the high-frequency or stop band regime. Higher frequency phenomena are of significant importance in photonics (transverse magnetic waves propagating in infinite conducting parallel fibers), phononics (anti-plane shear waves propagating in isotropic elastic materials with inclusions), and platonics (flexural waves propagating in thin-elastic plates with holes). Fortunately, the recently proposed high-frequency homogenization (HFH) theory is only constrained by the knowledge of standing waves in order to asymptotically reconstruct dispersion curves an...

  2. Geometric Methods in Physics : XXXII Workshop

    CERN Document Server

    Bieliavsky, Pierre; Odesskii, Alexander; Odzijewicz, Anatol; Schlichenmaier, Martin; Voronov, Theodore; Geometric Methods in Physics

    2014-01-01

    The Białowieża Workshops on Geometric Methods in Physics, which are hosted in the unique setting of the Białowieża natural forest in Poland, are among the most important meetings in the field. Every year some 80 to 100 participants from both the mathematics and physics world join to discuss new developments and to exchange ideas. The current volume was produced on the occasion of the 32nd meeting in 2013. It is now becoming a tradition that the Workshop is followed by a School on Geometry and Physics, which consists of advanced lectures for graduate students and young researchers. Selected speakers at the 2013 Workshop were asked to contribute to this book, and their work was supplemented by additional review articles. The selection shows that, despite its now long tradition, the workshop remains at the cutting edge of research. The 2013 Workshop also celebrated the 75th birthday of Daniel Sternheimer, and on this occasion the discussion mainly focused on his contributions to mathematical physics such as ...

  3. Regional homogeneity of electoral space: comparative analysis (on the material of 100 national cases

    Directory of Open Access Journals (Sweden)

    A. O. Avksentiev

    2015-12-01

    Full Text Available In this article the author examines the dependence of electoral behavior on territorial belonging. The categories of «regional homogeneity» and «electoral space» are conceptualized. It is argued that such regional homogeneity is a characteristic of electoral space and can be quantified. The quantitative measurement of a state's regional homogeneity is directly connected with the risk of separatism, civil conflict, or a legitimacy crisis in deviant territories. A formula is proposed for the quantitative evaluation of regional homogeneity, based on instruments of statistical analysis, in particular the coefficient of variation. Possible directions of study using this index, both for individual political subjects and for the whole political space (state, region, electoral district), are defined. Corresponding indices are calculated for the Ukrainian electoral space (results of the 1991-2015 elections) and 100 other national cases. The dynamics of Ukraine's regional homogeneity are analyzed on the basis of 1991-2015 electoral statistics.
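The homogeneity index above rests on the coefficient of variation. A minimal sketch of that statistic, with made-up regional vote shares (the exact formula and data in the article may differ):

```python
from statistics import mean, pstdev

def coefficient_of_variation(shares):
    """CV = population standard deviation / mean of regional vote shares.

    A lower CV means support is spread more evenly across regions,
    i.e. the electoral space is more regionally homogeneous.
    """
    return pstdev(shares) / mean(shares)

homogeneous = [0.48, 0.50, 0.52, 0.49, 0.51]   # nearly uniform support
polarized   = [0.10, 0.15, 0.85, 0.90, 0.50]   # strong regional split

print(round(coefficient_of_variation(homogeneous), 3))  # → 0.028
print(round(coefficient_of_variation(polarized), 3))    # → 0.672
```

Both samples have the same mean share (0.5); only the spread across regions, and hence the CV, differs.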

  4. Genetic Homogenization of Composite Materials

    Directory of Open Access Journals (Sweden)

    P. Tobola

    2009-04-01

    Full Text Available The paper focuses on numerical studies of the electromagnetic properties of composite materials used for the construction of small airplanes. Discussions concentrate on the genetic homogenization of composite layers and of composite layers with a slot. The homogenization aims to reduce the CPU-time demands of EMC computational models of electrically large airplanes. First, a methodology for creating a 3-dimensional numerical model of a composite material in CST Microwave Studio is proposed, focusing on a sufficient accuracy of the model. Second, a proper implementation of a genetic optimization in Matlab is discussed. Third, an association of the optimization script with a simplified 2-dimensional homogeneous equivalent model in Comsol Multiphysics is proposed considering EMC issues. Results of the computations are experimentally verified.

  5. Stimulus homogeneity enhances implicit learning: evidence from contextual cueing.

    Science.gov (United States)

    Feldmann-Wüstefeld, Tobias; Schubö, Anna

    2014-04-01

    Visual search for a target object is faster if the target is embedded in a repeatedly presented invariant configuration of distractors ('contextual cueing'). It has also been shown that the homogeneity of a context affects the efficiency of visual search: targets receive prioritized processing when presented in a homogeneous context compared to a heterogeneous context, presumably due to grouping processes at early stages of visual processing. The present study investigated in three experiments whether context homogeneity also affects contextual cueing. In Experiment 1, context homogeneity varied on three levels of the task-relevant dimension (orientation) and contextual cueing was most pronounced for context configurations with high orientation homogeneity. When context homogeneity varied on three levels of the task-irrelevant dimension (color) and orientation homogeneity was fixed, no modulation of contextual cueing was observed: high orientation homogeneity led to large contextual cueing effects (Experiment 2) and low orientation homogeneity led to low contextual cueing effects (Experiment 3), irrespective of color homogeneity. Enhanced contextual cueing for homogeneous context configurations suggests that grouping processes affect not only visual search but also implicit learning. We conclude that memory representations of context configurations are more easily acquired when context configurations can be processed as larger, grouped perceptual units. However, this form of implicit perceptual learning is only improved by stimulus homogeneity when stimulus homogeneity facilitates grouping processes on a dimension that is currently relevant in the task. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. A computational analysis on homogeneous-heterogeneous mechanism in Carreau fluid flow

    Science.gov (United States)

    Khan, Imad; Rehman, Khalil Ur; Malik, M. Y.; Shafquatullah

    2018-03-01

    In this article, magnetohydrodynamic Carreau fluid flow towards a stretching cylinder is considered in the presence of homogeneous-heterogeneous reaction effects. The flow model is formulated on theoretical grounds. For the numerical solution, a shooting method along with a Runge-Kutta algorithm is executed. The outcomes are provided through graphs. It is observed that the Carreau fluid concentration declines with increasing values of the homogeneous-heterogeneous reaction parameters for both the shear-thinning and shear-thickening cases. The present work is certified through comparison with already existing literature in a limiting sense.

  7. Optimal truss and frame design from projected homogenization-based topology optimization

    DEFF Research Database (Denmark)

    Larsen, S. D.; Sigmund, O.; Groen, J. P.

    2018-01-01

    In this article, we propose a novel method to obtain a near-optimal frame structure, based on the solution of a homogenization-based topology optimization model. The presented approach exploits the equivalence between Michell’s problem of least-weight trusses and a compliance minimization problem...... using optimal rank-2 laminates in the low volume fraction limit. In a fully automated procedure, a discrete structure is extracted from the homogenization-based continuum model. This near-optimal structure is post-optimized as a frame, where the bending stiffness is continuously decreased, to allow...

  8. A theoretical study on the influence of the homogeneity of heavy-ion irradiation field on the survival fraction of cells

    International Nuclear Information System (INIS)

    Wen Xiaoqiong; Li Qiang; Zhou Guangming; Li Wenjian; Wang Jufang; Wei Zengquan

    2001-01-01

    In order to provide a theoretical basis for the homogeneity requirement of the heavy-ion irradiation field, the most important design parameter of the heavy-ion radiotherapy facility planned at IMP (Institute of Modern Physics), the influence of the homogeneity of the heavy-ion irradiation field on the survival fraction of cells was investigated theoretically. A formula for the survival fraction of cells irradiated by a non-uniform heavy-ion irradiation field was deduced to estimate this influence. The results show that the survival fraction of cells irradiated by a non-uniform field is larger than that of cells irradiated by a uniform field, and the survival fraction increases as the homogeneity of the irradiation field decreases. Practically, the heavy-ion irradiation field can be treated as a uniform field when its homogeneity is better than 95%. According to these results, the design requirement for the homogeneity of the heavy-ion irradiation field should be better than 95%. The present results also show that the homogeneity of the heavy-ion irradiation fields must be checked to agree when comparing survival fraction curves obtained by different laboratories
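The qualitative claim above (a non-uniform field of the same mean dose yields higher survival than a uniform one) follows from the convexity of exponential survival curves. The sketch below uses an assumed single-hit exponential model with an illustrative alpha value; it is not the paper's deduced formula:

```python
from math import exp

def mean_survival(doses, alpha=0.8):
    """Average survival over sub-fields, S = <exp(-alpha * D_i)>.

    Illustrative single-hit exponential model (an assumption, not the
    paper's formula); alpha is in 1/Gy. By Jensen's inequality, averaging
    exp(-alpha*D) over a non-uniform dose distribution gives a higher
    survival than evaluating it at the mean dose.
    """
    return sum(exp(-alpha * d) for d in doses) / len(doses)

uniform     = [2.0, 2.0, 2.0, 2.0]        # perfectly homogeneous field
non_uniform = [1.0, 1.5, 2.5, 3.0]        # same mean dose (2.0 Gy)

s_uniform = mean_survival(uniform)
s_nonuniform = mean_survival(non_uniform)
print(s_nonuniform > s_uniform)  # → True
```

This is why the abstract requires the field homogeneity to be checked before survival curves from different laboratories are compared.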

  9. Advanced Analysis Methods in High Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Pushpalatha C. Bhat

    2001-10-03

    During the coming decade, high energy physics experiments at the Fermilab Tevatron and around the globe will use very sophisticated equipment to record unprecedented amounts of data in the hope of making major discoveries that may unravel some of Nature's deepest mysteries. The discovery of the Higgs boson and signals of new physics may be around the corner. The use of advanced analysis techniques will be crucial in achieving these goals. The author discusses some of the novel methods of analysis that could prove to be particularly valuable for finding evidence of any new physics, for improving precision measurements and for exploring parameter spaces of theoretical models.

  10. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation

    Science.gov (United States)

    Tang, Liang; Cheng, Pengle

    2017-01-01

    Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied on it. An elastic–plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis. PMID:28746390

  11. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation.

    Directory of Open Access Journals (Sweden)

    Liang Tang

    Full Text Available Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC is extracted with the periodic boundary condition applied on it. An elastic-plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis.

  12. SERPENT Monte Carlo reactor physics code

    International Nuclear Information System (INIS)

    Leppaenen, J.

    2010-01-01

    SERPENT is a three-dimensional continuous-energy Monte Carlo reactor physics burnup calculation code, developed at VTT Technical Research Centre of Finland since 2004. The code is specialized in lattice physics applications, but the universe-based geometry description allows transport simulation to be carried out in complicated three-dimensional geometries as well. The suggested applications of SERPENT include generation of homogenized multi-group constants for deterministic reactor simulator calculations, fuel cycle studies involving detailed assembly-level burnup calculations, validation of deterministic lattice transport codes, research reactor applications, educational purposes and demonstration of reactor physics phenomena. The Serpent code has been publicly distributed by the OECD/NEA Data Bank since May 2009 and RSICC in the U. S. since March 2010. The code is being used in some 35 organizations in 20 countries around the world. This paper presents an overview of the methods and capabilities of the Serpent code, with examples in the modelling of WWER-440 reactor physics. (Author)

  13. Two-dimensional Haar wavelet Collocation Method for the solution of Stationary Neutron Transport Equation in a homogeneous isotropic medium

    International Nuclear Information System (INIS)

    Patra, A.; Saha Ray, S.

    2014-01-01

    Highlights: • A stationary transport equation has been solved using the technique of the Haar wavelet Collocation Method. • This paper intends to demonstrate the great utility of Haar wavelets for a nuclear science problem. • In the present paper, two-dimensional Haar wavelets are applied. • The proposed method is mathematically very simple, easy and fast. - Abstract: This paper focuses on finding the solution of a stationary transport equation using the technique of the Haar wavelet Collocation Method (HWCM). The Haar wavelet Collocation Method is efficient and powerful for solving a wide class of linear and nonlinear differential equations. Recently, the Haar wavelet transform has gained the reputation of being a very effective tool for many practical applications. This paper intends to demonstrate the great utility of Haar wavelets for a nuclear science problem. In the present paper, two-dimensional Haar wavelets are applied to the solution of the stationary neutron transport equation in a homogeneous isotropic medium. The proposed method is mathematically very simple, easy and fast. To demonstrate the efficiency of the method, one test problem is discussed. It can be observed from the computational simulation that the numerical approximate solution is much closer to the exact solution

  14. String pair production in non homogeneous backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Bolognesi, S. [Department of Physics “E. Fermi” University of Pisa, and INFN - Sezione di Pisa,Largo Pontecorvo, 3, Ed. C, 56127 Pisa (Italy); Rabinovici, E. [Racah Institute of Physics, The Hebrew University of Jerusalem,91904 Jerusalem (Israel); Tallarita, G. [Departamento de Ciencias, Facultad de Artes Liberales,Universidad Adolfo Ibáñez, Santiago 7941169 (Chile)

    2016-04-28

    We consider string pair production in non homogeneous electric backgrounds. We study several particular configurations which can be addressed with the Euclidean world-sheet instanton technique, the analogue of the world-line instanton for particles. In the first case the string is suspended between two D-branes in flat space-time; in the second case the string lives in AdS and terminates on one D-brane (this realizes the holographic Schwinger effect). In some regions of parameter space the result is well approximated by the known analytical formulas, either the particle pair production in non-homogeneous background or the string pair production in homogeneous background. In other cases we see effects which are intrinsically stringy and related to the non-homogeneity of the background. The pair production is enhanced already for particles in time-dependent electric field backgrounds. The string nature enhances this even further. For spatially varying electric background fields the string pair production is less suppressed than the rate of particle pair production. We discuss in some detail how the critical field is affected by the non-homogeneity, for both time- and space-dependent electric field backgrounds. We also comment on what could be an interesting new prediction for the small field limit. The third case we consider is pair production in holographic confining backgrounds with homogeneous and non-homogeneous fields.

  15. String pair production in non homogeneous backgrounds

    International Nuclear Information System (INIS)

    Bolognesi, S.; Rabinovici, E.; Tallarita, G.

    2016-01-01

    We consider string pair production in non homogeneous electric backgrounds. We study several particular configurations which can be addressed with the Euclidean world-sheet instanton technique, the analogue of the world-line instanton for particles. In the first case the string is suspended between two D-branes in flat space-time; in the second case the string lives in AdS and terminates on one D-brane (this realizes the holographic Schwinger effect). In some regions of parameter space the result is well approximated by the known analytical formulas, either the particle pair production in non-homogeneous background or the string pair production in homogeneous background. In other cases we see effects which are intrinsically stringy and related to the non-homogeneity of the background. The pair production is enhanced already for particles in time-dependent electric field backgrounds. The string nature enhances this even further. For spatially varying electric background fields the string pair production is less suppressed than the rate of particle pair production. We discuss in some detail how the critical field is affected by the non-homogeneity, for both time- and space-dependent electric field backgrounds. We also comment on what could be an interesting new prediction for the small field limit. The third case we consider is pair production in holographic confining backgrounds with homogeneous and non-homogeneous fields.

  16. Homogeneous M2 duals

    International Nuclear Information System (INIS)

    Figueroa-O’Farrill, José; Ungureanu, Mara

    2016-01-01

    Motivated by the search for new gravity duals to M2 branes with N>4 supersymmetry — equivalently, M-theory backgrounds with Killing superalgebra osp(N|4) for N>4 — we classify (except for a small gap) homogeneous M-theory backgrounds with symmetry Lie algebra so(n)⊕so(3,2) for n=5,6,7. We find that there are no new backgrounds with n=6,7 but we do find a number of new (to us) backgrounds with n=5. All backgrounds are metrically products of the form AdS₄×P⁷, with P Riemannian and homogeneous under the action of SO(5), or S⁴×Q⁷ with Q Lorentzian and homogeneous under the action of SO(3,2). At least one of the new backgrounds is supersymmetric (albeit with only N=2) and we show that it can be constructed from a supersymmetric Freund-Rubin background via a Wick rotation. Two of the new backgrounds have only been approximated numerically.

  17. Homogeneous M2 duals

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa-O’Farrill, José [School of Mathematics and Maxwell Institute for Mathematical Sciences,The University of Edinburgh,James Clerk Maxwell Building, The King’s Buildings, Peter Guthrie Tait Road,Edinburgh EH9 3FD, Scotland (United Kingdom); Ungureanu, Mara [Humboldt-Universität zu Berlin, Institut für Mathematik,Unter den Linden 6, 10099 Berlin (Germany)

    2016-01-25

    Motivated by the search for new gravity duals to M2 branes with N>4 supersymmetry — equivalently, M-theory backgrounds with Killing superalgebra osp(N|4) for N>4 — we classify (except for a small gap) homogeneous M-theory backgrounds with symmetry Lie algebra so(n)⊕so(3,2) for n=5,6,7. We find that there are no new backgrounds with n=6,7 but we do find a number of new (to us) backgrounds with n=5. All backgrounds are metrically products of the form AdS₄×P⁷, with P Riemannian and homogeneous under the action of SO(5), or S⁴×Q⁷ with Q Lorentzian and homogeneous under the action of SO(3,2). At least one of the new backgrounds is supersymmetric (albeit with only N=2) and we show that it can be constructed from a supersymmetric Freund-Rubin background via a Wick rotation. Two of the new backgrounds have only been approximated numerically.

  18. Mathematical methods for physical and analytical chemistry

    CERN Document Server

    Goodson, David Z

    2011-01-01

    Mathematical Methods for Physical and Analytical Chemistry presents mathematical and statistical methods to students of chemistry at the intermediate, post-calculus level. The content includes a review of general calculus; a review of numerical techniques often omitted from calculus courses, such as cubic splines and Newton's method; a detailed treatment of statistical methods for experimental data analysis; complex numbers; extrapolation; linear algebra; and differential equations. With numerous example problems and helpful anecdotes, this text gives chemistry students the mathematical
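Newton's method, one of the numerical techniques the book reviews, takes only a few lines; the following is a generic textbook sketch, not code from the book.

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: iterate x <- x - f(x)/f'(x) until |f(x)| < tol."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x -= fx / fprime(x)
    return x

# Root of f(x) = x^2 - 2: quadratic convergence to sqrt(2)
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
print(root)  # 1.41421356...
```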

  19. A circuit design for front-end read-out electronics of beam homogeneity measurement

    International Nuclear Information System (INIS)

    She Qianshun; Su Hong; Xu Zhiguo; Ma Xiaoli; Hu Zhengguo; Mao Ruishi; Xu Hushan

    2011-01-01

    This paper introduces a front-end read-out circuit design for beam homogeneity measurement, meeting the monitoring needs of heavy ion beams. The circuit converts multichannel weak currents, from 10 pA to 100 nA, output by a large-area two-dimensional position-sensitive parallel plate avalanche counter (PPAC), into voltage signals from -2 V to -20 mV using a current-to-voltage converter (IVC) composed of T-feedback resistor networks. Combined with a data acquisition and processing system, it realizes the beam homogeneity measurement for heavy ion tumor therapy at the Institute of Modern Physics. Experiments have shown that the circuit operates with high speed and high precision. This circuit can also be used to read out beams from multiwire proportional chambers, Faraday cups and other weak current sources. (authors)

  20. Method of producing homogeneous mixed metal oxides and metal--metal oxide mixtures

    International Nuclear Information System (INIS)

    Quinby, T.C.

    1978-01-01

    Metal powders, metal oxide powders, and mixtures thereof of controlled particle size are provided by reacting an aqueous solution containing dissolved metal values with excess urea. Upon heating, urea reacts with water from the solution to leave a molten urea solution containing the metal values. The molten urea solution is heated to above about 180 °C, whereupon metal values precipitate homogeneously as a powder. The powder is reduced to metal or calcined to form oxide particles. One or more metal oxides in a mixture can be selectively reduced to produce metal particles or a mixture of metal and metal oxide particles.

  1. Non-periodic homogenization of 3-D elastic media for the seismic wave equation

    Science.gov (United States)

    Cupillard, Paul; Capdeville, Yann

    2018-05-01

    Because seismic waves have a limited frequency spectrum, the velocity structure of the Earth that can be extracted from seismic records has a limited resolution. As a consequence, one obtains smooth images from waveform inversion, although the Earth holds discontinuities and small scales of various natures. Within the last decade, the non-periodic homogenization method shed light on how seismic waves interact with small geological heterogeneities and `see' upscaled properties. This theory enables us to compute long-wave equivalent density and elastic coefficients of any medium, with no constraint on the size, the shape and the contrast of the heterogeneities. In particular, the homogenization leads to apparent, structure-induced anisotropy. In this paper, we implement this method in 3-D and show 3-D tests for the very first time. The non-periodic homogenization relies on an asymptotic expansion of the displacement and the stress involved in the elastic wave equation. Limiting ourselves to the order 0, we show that the practical computation of an upscaled elastic tensor basically requires (i) solving an elastostatic problem and (ii) low-pass filtering the strain and the stress associated with the obtained solution. The elastostatic problem consists in finding the displacements due to local unit strains acting in all directions within the medium to upscale. This is solved using a parallel, highly optimized finite-element code. As for the filtering, we rely on the finite-element quadrature to perform the convolution in the space domain. We end up with an efficient numerical tool that we apply on various 3-D models to test the accuracy and the benefit of the homogenization. In the case of a finely layered model, our method agrees with results derived from Backus averaging. In a more challenging model composed of a million small cubes, waveforms computed in the homogenized medium fit reference waveforms very well. Both direct phases and complex diffracted waves are
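For the finely layered benchmark mentioned in the abstract, the order-0 homogenized medium coincides with the classical Backus average, which is easy to sketch in 1-D; the layer moduli below are illustrative values, not those of the paper's models.

```python
import numpy as np

def backus_average(h, lam, mu):
    """Thickness-weighted Backus average of a stack of isotropic layers
    (Lame parameters lam, mu). Returns the effective VTI stiffnesses
    (C11, C13, C33, C44, C66) of the long-wave equivalent medium."""
    w = np.asarray(h, float) / np.sum(h)
    lam, mu = np.asarray(lam, float), np.asarray(mu, float)
    avg = lambda q: np.sum(w * q)       # thickness-weighted mean
    C33 = 1.0 / avg(1.0 / (lam + 2 * mu))
    C44 = 1.0 / avg(1.0 / mu)
    C66 = avg(mu)
    C13 = avg(lam / (lam + 2 * mu)) * C33
    C11 = avg(4 * mu * (lam + mu) / (lam + 2 * mu)) \
        + avg(lam / (lam + 2 * mu)) ** 2 * C33
    return C11, C13, C33, C44, C66

# two alternating layers of equal thickness (moduli in GPa, purely illustrative)
C11, C13, C33, C44, C66 = backus_average([0.5, 0.5], [30.0, 10.0], [20.0, 5.0])
print(C11, C33)  # C11 > C33: layering induces apparent anisotropy
```

For identical layers the formulas collapse to the isotropic stiffnesses, a convenient sanity check; for contrasting layers C11 exceeds C33, the structure-induced anisotropy the abstract refers to.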

  2. Two-Dimensional Homogeneous Fermi Gases

    Science.gov (United States)

    Hueck, Klaus; Luick, Niclas; Sobirey, Lennart; Siegl, Jonas; Lompe, Thomas; Moritz, Henning

    2018-02-01

    We report on the experimental realization of homogeneous two-dimensional (2D) Fermi gases trapped in a box potential. In contrast to harmonically trapped gases, these homogeneous 2D systems are ideally suited to probe local as well as nonlocal properties of strongly interacting many-body systems. As a first benchmark experiment, we use a local probe to measure the density of a noninteracting 2D Fermi gas as a function of the chemical potential and find excellent agreement with the corresponding equation of state. We then perform matter wave focusing to extract the momentum distribution of the system and directly observe Pauli blocking in a near unity occupation of momentum states. Finally, we measure the momentum distribution of an interacting homogeneous 2D gas in the crossover between attractively interacting fermions and bosonic dimers.
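The benchmark equation of state for the noninteracting 2D Fermi gas has a closed form, n·λ² = ln(1 + e^(μ/kT)) with λ² = 2πħ²/(m·kT); a quick sketch of this standard ideal-gas result (not the paper's data):

```python
import numpy as np

def density_ideal_2d(mu_over_kT):
    """Phase-space density n * lambda_dB^2 of an ideal 2D Fermi gas:
    n lambda^2 = ln(1 + exp(mu / kT))."""
    return np.log1p(np.exp(mu_over_kT))

# degenerate limit mu/kT >> 1:   n lambda^2 -> mu/kT   (n -> m mu / 2 pi hbar^2)
# Boltzmann limit  mu/kT << -1:  n lambda^2 -> exp(mu/kT)
for bm in (-4.0, 0.0, 8.0):
    print(bm, density_ideal_2d(bm))
```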

  3. Noise Estimation for Single-Slice Sinogram of Low-Dose X-Ray Computed Tomography Using Homogenous Patch

    Directory of Open Access Journals (Sweden)

    Zhiwu Liao

    2012-01-01

    We present a new method to estimate noise for a single-slice sinogram of low-dose CT based on the homogeneous patches centered at a special pixel, called the center point, which has the smallest variance among all sinogram pixels. The homogeneous patch, composed of homogeneous points, is formed by the points similar to the center point using similarity sorting, similarity decreasing searching, and variance analysis in a very large neighborhood (VLN) to avoid manual selection of a parameter for the similarity measures. Homogeneous pixels in the VLN allow us to find the largest number of samples, which have the highest similarity to the center point, for noise estimation, and the noise level can be estimated according to unbiased estimation. Experimental results show that for the simulated noisy sinograms, the method proposed in this paper obtains satisfactory noise estimation results, especially for sinograms with relatively severe noise.
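A deliberately simplified sketch of the underlying idea, locating the most homogeneous window of an image and reading the noise level off its unbiased standard deviation, is given below. The synthetic image and window size are hypothetical; the paper's actual pipeline (similarity sorting and decreasing searching in a very large neighborhood) is more elaborate.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

rng = np.random.default_rng(0)

# synthetic image: smooth signal plus a flat (homogeneous) region,
# corrupted by Gaussian noise of known sigma
sigma = 5.0
xx, yy = np.meshgrid(np.linspace(0, 1, 128), np.linspace(0, 1, 128))
clean = 100.0 * np.sin(2 * np.pi * xx) * np.cos(np.pi * yy)
clean[40:90, 40:90] = 50.0                  # homogeneous patch
noisy = clean + rng.normal(0.0, sigma, clean.shape)

def estimate_noise_std(img, win=15):
    """Noise std from the most homogeneous window: slide a win x win patch
    over the image, pick the one with the smallest variance (signal-free
    fluctuations only) and return its unbiased standard deviation."""
    windows = sliding_window_view(img, (win, win))
    var = windows.var(axis=(-1, -2), ddof=1)
    i, j = np.unravel_index(np.argmin(var), var.shape)
    return windows[i, j].std(ddof=1)

est = estimate_noise_std(noisy)
print(est)  # close to 5, slightly low because of min-selection bias
```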

  4. Physical Model Method for Seismic Study of Concrete Dams

    Directory of Open Access Journals (Sweden)

    Bogdan Roşca

    2008-01-01

    The study of the dynamic behaviour of concrete dams by means of the physical model method is very useful for understanding the failure mechanism of these structures under strong earthquake action. The physical model method consists of two main stages. First, a study model must be designed through a physical modeling process using dynamic modeling theory. The result is a system of equations for dimensioning the physical model. After the construction and instrumentation of the scale physical model, a structural analysis based on experimental means is performed. The experimental results are gathered and made available for analysis. Depending on the aim of the research, an elastic or a failure physical model may be designed. The requirements for the construction of an elastic model are easier to fulfil than those for a failure model, but the obtained results provide narrower information. In order to study the behaviour of concrete dams under strong seismic action, failure physical models are required that can accurately simulate the possible opening of joints, sliding between concrete blocks and the cracking of concrete. The design relations for both elastic and failure physical models are based on dimensional analysis and consist of similitude relations among the physical quantities involved in the phenomenon. The use of large or medium-size physical models, as well as their instrumentation, offers great advantages, but it requires substantial financial, logistical and time resources.
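The similitude relations mentioned above can be tabulated mechanically once a similitude law is fixed. The sketch below assumes Froude similitude (gravity-dominated loading, common for seismic models of dams) with an arbitrary length scale; it is a generic dimensional-analysis exercise, not the paper's design relations.

```python
def froude_scale_factors(length_scale, density_scale=1.0):
    """Model/prototype scale factors under Froude similitude (gravity and
    inertia forces preserved), derived by dimensional analysis.
    length_scale = L_model / L_prototype (< 1 for a reduced-scale model)."""
    L, rho = length_scale, density_scale
    return {
        "length": L,
        "time": L ** 0.5,            # t ~ sqrt(L/g), g unchanged
        "frequency": L ** -0.5,
        "velocity": L ** 0.5,
        "acceleration": 1.0,         # gravity cannot be scaled
        "stress": rho * L,           # elastic modulus must scale as rho * L
        "force": rho * L ** 3,
    }

# a 1:50 dam model built from material of the same density
factors = froude_scale_factors(1.0 / 50.0)
print(factors)
```

The stress entry makes the practical difficulty visible: at 1:50 the model material must be roughly 50 times softer than prototype concrete, which is why special model mixes are used.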

  5. Molecular weight enlargement : a molecular approach to continuous homogeneous catalysis

    NARCIS (Netherlands)

    Janssen, M.C.C.

    2010-01-01

    Homogeneous catalysts play an increasingly important role in organic synthesis today, because of their high activity and selectivity. Usually, precious metals are used in combination with valuable ligands and since metal prices are expected to increase further in the future, methods for their

  6. Effective Teaching Methods--Project-based Learning in Physics

    Science.gov (United States)

    Holubova, Renata

    2008-01-01

    The paper presents results of research on new effective teaching methods in physics and science. It is found that it is necessary to educate pre-service teachers in approaches stressing the importance of students' own activity, and in the competences needed to create an interdisciplinary project. Project-based physics teaching and learning…

  7. Homogeneous deuterium exchange using rhenium and platinum chloride catalysts

    International Nuclear Information System (INIS)

    Fawdry, R.M.

    1979-01-01

    Previous studies of homogeneous hydrogen isotope exchange are mostly confined to one catalyst, the tetrachloroplatinite salt. Recent reports have indicated that chloride salts of iridium and rhodium may also be homogeneous exchange catalysts similar to the tetrachloroplatinite, but with much lower activities. Exchange by these homogeneous catalysts is frequently accompanied by metal precipitation with the termination of homogeneous exchange, particularly in the case of alkane exchange. The studies presented in this thesis describe two different approaches to overcome this limitation of homogeneous hydrogen isotope exchange catalysts. The first approach was to improve the stability of an existing homogeneous catalyst and the second was to develop a new homogeneous exchange catalyst which is free of the instability limitation

  8. The homogeneous geometries of real hyperbolic space

    DEFF Research Database (Denmark)

    Castrillón López, Marco; Gadea, Pedro Martínez; Swann, Andrew Francis

    We describe the holonomy algebras of all canonical connections of homogeneous structures on real hyperbolic spaces in all dimensions. The structural results obtained then lead to a determination of the types, in the sense of Tricerri and Vanhecke, of the corresponding homogeneous tensors. We use our analysis to show that the moduli space of homogeneous structures on real hyperbolic space has two connected components.

  9. Improvement of the homogeneity of atomized particles dispersed in high uranium density research reactor fuels

    International Nuclear Information System (INIS)

    Kim, Chang-Kyu; Kim, Ki-Hwan; Park, Jong-Man; Lee, Yoon-Sang; Lee, Don-Bae; Sohn, Woong-Hee; Hong, Soon-Hyung

    1998-01-01

    A study on improving the homogeneous dispersion of atomized spherical particles in fuel meats has been performed in connection with the development of high uranium density fuel. Comparing various mixing methods, better homogeneity of the mixture was obtained in the order of the Spex mill, the V-shape tumbler mixer, and the off-axis rotating drum mixer. The Spex mill mixer required laborious work because of its small capacity per batch. Through optimizing the rotating speed parameter for the V-shape tumbler mixer, almost the same homogeneity as with the Spex mill could be obtained. The homogeneity of the fuel meats appeared to improve through extrusion. All extruded fuel meats with 50 vol.% U₃Si powder had fairly smooth surfaces. The homogeneity of fuel meats prepared with the V-shape tumbler mixer appeared fairly good on micrographs. (author)

  10. Classical and quantum treatments of the diffraction problem in the case of non-homogeneous media

    International Nuclear Information System (INIS)

    Datzeff, A.B.

    1978-02-01

    The diffraction of waves by an aperture is usually studied in the case of a homogeneous medium. In this paper, a method is proposed for the solution of the same problem in a medium of variable parameters (refractive index, external fields). It is successfully applied to the diffraction of a classical scalar wave as well as of an electromagnetic vector wave and a Schroedinger wave. Within the framework of this method, the scattering of particles may be considered as a particular case of the diffraction problem. Furthermore, the method is extended to cover the case of diffraction of dense electron beams. This has been achieved by means of a non-linear integro-differential equation, proposed by the author as a generalization of the well-known linear Schroedinger equation. A decisive experiment could be made which, besides showing whether the solution thus obtained is true, would also speak in favour of one of the two equations mentioned above. The latter point is pertinent to the discussion of the physical essence of Quantum Mechanics

  11. An entrepreneurial physics method and its experimental test

    Science.gov (United States)

    Brown, Robert

    2012-02-01

    As faculty in a master's program for entrepreneurial physics and in an applied physics PhD program, I have advised upwards of 40 master's and doctoral theses in industrial physics. I have been closely involved with four robust start-up manufacturing companies focused on physics high technology, and I have spent 30 years collaborating with industrial physicists on research and development. Thus I am in a position to reflect on many articles and advice columns centered on entrepreneurship. What about the goals, strategies, resources, skills, and the 10,000 hours needed to be an entrepreneur? What about business plans, partners, financing, patents, networking, salesmanship and regulatory affairs? What about learning new technology, how to solve problems and, in fact, learning innovation itself? At this point, I have my own method to propose to physicists in academia for incorporating entrepreneurship into their research lives. With this method, we do not start with a major invention or discovery, or even with a search for one. The method is based on the training we have, and the teaching we do (even quantum electrodynamics!), as physicists. It is based on the networking we build by 1) providing continuing-education courses for people working in industry and 2) teaching undergraduate and graduate students who go on to work in industry. In fact, if we were limited to two words to describe the method, they would be "former students." Data from the local and international medical imaging manufacturing industry are presented.

  12. Pellet pestle homogenization of agarose gel slices at 45 degrees C for deoxyribonucleic acid extraction.

    Science.gov (United States)

    Kurien, B T; Kaufman, K M; Harley, J B; Scofield, R H

    2001-09-15

    A simple method for extracting DNA from agarose gel slices is described. The extraction is rapid and does not involve harsh chemicals or sophisticated equipment. The method involves homogenization of the excised gel slice (in Tris-EDTA buffer), containing the DNA fragment of interest, at 45 degrees C in a microcentrifuge tube with a Kontes pellet pestle for 1 min. The "homogenate" is then centrifuged for 30 s and the supernatant is saved. The "homogenized" agarose is extracted one more time and the supernatant obtained is combined with the previous supernatant. The DNA extracted using this method lent itself to restriction enzyme analysis, ligation, transformation, and expression of functional protein in bacteria. This method was found to be applicable with 0.8, 1.0, and 2.0% agarose gels. DNA fragments varying from 23 to 0.4 kb were extracted using this procedure and a yield ranging from 40 to 90% was obtained. The yield was higher for fragments 2.0 kb and higher (70-90%). This range of efficiency was maintained when the starting material was kept between 10 and 300 ng. The heat step was found to be critical since homogenization at room temperature failed to yield any DNA. Extracting DNA with our method elicited an increased yield (up to twofold) compared with that extracted with a commercial kit. Also, the number of transformants obtained using the DNA extracted with our method was at least twice that obtained using the DNA extracted with the commercial kit. Copyright 2001 Academic Press.

  13. Methods of teaching the physics of climate change in undergraduate physics courses

    Science.gov (United States)

    Sadler, Michael

    2015-04-01

    Although anthropogenic climate change is generally accepted in the scientific community, there is considerable skepticism among the general population and, therefore, in undergraduate students of all majors. Students are often asked by their peers, family members, and others, whether they "believe" climate change is occurring and what should be done about it (if anything). I will present my experiences and recommendations for teaching the physics of climate change to both physics and non-science majors. For non-science majors, the basic approach is to try to develop an appreciation for the scientific method (particularly peer-reviewed research) in a course on energy and the environment. For physics majors, the pertinent material is normally covered in their undergraduate courses in modern physics and thermodynamics. Nevertheless, it helps to review the basics, e.g. introductory quantum mechanics (discrete energy levels of atomic systems), molecular spectroscopy, and blackbody radiation. I have done this in a separate elective topics course, titled "Physics of Climate Change," to help the students see how their knowledge gives them insight into a topic that is very volatile (socially and politically).
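The blackbody-radiation review can be anchored with the standard zero-dimensional energy balance for Earth's effective temperature, a computation short enough for any of the courses described (standard textbook values assumed):

```python
S = 1361.0        # solar constant, W/m^2
albedo = 0.3      # planetary albedo
sigma = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

# absorbed = emitted:  S (1 - albedo) / 4 = sigma * T^4
T_eff = (S * (1 - albedo) / (4 * sigma)) ** 0.25
print(T_eff)  # ≈ 255 K; the ~33 K gap to the observed ~288 K surface is the greenhouse effect
```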

  14. Probing cosmology with the homogeneity scale of the Universe through large scale structure surveys

    International Nuclear Information System (INIS)

    Ntelis, Pierros

    2017-01-01

    This thesis presents my contribution to the measurement of the homogeneity scale using galaxies, together with the cosmological interpretation of the results. In physics, any model is characterized by a set of principles. Most models in cosmology are based on the Cosmological Principle, which states that the universe is statistically homogeneous and isotropic on large scales. Today, this principle is considered to be true since it is respected by those cosmological models that accurately describe the observations. However, while the isotropy of the universe is now confirmed by many experiments, this is not the case for homogeneity. To study cosmic homogeneity, we propose to not only test a model but to directly test one of the postulates of modern cosmology. Since the 1998 measurements of cosmic distances using type Ia supernovae, we know that the universe is now in a phase of accelerated expansion. This phenomenon can be explained by the addition of an unknown energy component, which is called dark energy. Since dark energy is responsible for the expansion of the universe, we can study this mysterious fluid by measuring the rate of expansion of the universe. The universe has imprinted in its matter distribution a standard ruler, the baryon acoustic oscillation (BAO) scale. By measuring this scale at different times during the evolution of our universe, it is possible to measure the rate of expansion of the universe and thus characterize this dark energy. Alternatively, we can use the homogeneity scale to study this dark energy. Studying the homogeneity and the BAO scale requires the statistical study of the matter distribution of the universe on large scales, larger than tens of megaparsecs. Galaxies and quasars are formed in the vast overdensities of matter and they are very luminous: these sources trace the distribution of matter. By measuring the emission spectra of these sources using large spectroscopic surveys, such as BOSS and eBOSS, we can measure their positions
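The homogeneity-scale measurement rests on counts-in-spheres: for a statistically homogeneous distribution the counts grow as N(<r) ∝ r³, so the correlation dimension D₂ = d ln N / d ln r tends to 3. A toy sketch on a uniform mock catalogue (not survey data; sample sizes and radii are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
pts = rng.uniform(0.0, 1.0, size=(8000, 3))   # homogeneous mock catalogue

# interior centers avoid edge effects for the largest radius used
centers = pts[np.all((pts > 0.15) & (pts < 0.85), axis=1)][:500]
d2 = np.sum((centers[:, None, :] - pts[None, :, :]) ** 2, axis=-1)

radii = (0.05, 0.10)
# mean count-in-spheres; subtract 1 to remove each center's self-count
counts = [np.mean(np.sum(d2 < r * r, axis=1) - 1) for r in radii]

# correlation dimension D2 = d ln N / d ln r (tends to 3 above the homogeneity scale)
D2 = np.log(counts[1] / counts[0]) / np.log(radii[1] / radii[0])
print(D2)  # close to 3 for a homogeneous sample
```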

  15. Pulsed-laser time-resolved thermal mirror technique in low-absorbance homogeneous linear elastic materials.

    Science.gov (United States)

    Lukasievicz, Gustavo V B; Astrath, Nelson G C; Malacarne, Luis C; Herculano, Leandro S; Zanuto, Vitor S; Baesso, Mauro L; Bialkowski, Stephen E

    2013-10-01

    A theoretical model for a time-resolved photothermal mirror technique using pulsed-laser excitation was developed for low absorption samples. Analytical solutions to the temperature and thermoelastic deformation equations are found for three characteristic pulse profiles and are compared to finite element analysis methods results for finite samples. An analytical expression for the intensity of the center of a continuous probe laser at the detector plane is derived using the Fresnel diffraction theory, which allows modeling of experimental results. Experiments are performed in optical glasses, and the models are fitted to the data. The parameters of the fit are in good agreement with previous literature data for absorption, thermal diffusion, and thermal expansion of the materials tested. The combined modeling and experimental techniques are shown to be useful for quantitative determination of the physical properties of low absorption homogeneous linear elastic material samples.

  16. Homogeneity and scale testing of generalized gamma distribution

    International Nuclear Information System (INIS)

    Stehlik, Milan

    2008-01-01

    The aim of this paper is to derive the exact distributions of the likelihood ratio tests of the homogeneity and scale hypotheses when the observations are generalized gamma distributed. The special cases of exponential, Rayleigh, Weibull or gamma distributed observations are discussed exclusively. The photoemulsion experiment analysis and a scale test with missing time-to-failure observations are presented to illustrate the applications of the methods discussed.
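For the exponential special case, the likelihood-ratio statistic for scale homogeneity has a simple closed form because the scale MLE is the sample mean. The sketch below uses the asymptotic χ² reference distribution, whereas the paper derives exact distributions; sample sizes and scales are illustrative.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)

def lrt_scale_homogeneity(samples):
    """-2 ln Lambda for H0: all exponential samples share one scale.
    For exponentials the scale MLE is the sample mean, giving
        -2 ln Lambda = 2 [ N ln(pooled mean) - sum_i n_i ln(mean_i) ],
    asymptotically chi-square with (k - 1) degrees of freedom."""
    ns = np.array([len(s) for s in samples])
    means = np.array([np.mean(s) for s in samples])
    pooled = np.sum(ns * means) / np.sum(ns)
    stat = 2.0 * (np.sum(ns) * np.log(pooled) - np.sum(ns * np.log(means)))
    pval = chi2.sf(stat, df=len(samples) - 1)
    return stat, pval

same = [rng.exponential(3.0, 200) for _ in range(3)]          # common scale
diff = [rng.exponential(s, 200) for s in (1.0, 3.0, 9.0)]     # different scales
print(lrt_scale_homogeneity(same))  # p-value consistent with H0
print(lrt_scale_homogeneity(diff))  # p-value essentially zero
```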

  17. Poisson-Jacobi reduction of homogeneous tensors

    International Nuclear Information System (INIS)

    Grabowski, J; Iglesias, D; Marrero, J C; Padron, E; Urbanski, P

    2004-01-01

    The notion of homogeneous tensors is discussed. We show that there is a one-to-one correspondence between multivector fields on a manifold M, homogeneous with respect to a vector field Δ on M, and first-order polydifferential operators on a closed submanifold N of codimension 1 such that Δ is transversal to N. This correspondence relates the Schouten-Nijenhuis bracket of multivector fields on M to the Schouten-Jacobi bracket of first-order polydifferential operators on N and generalizes the Poissonization of Jacobi manifolds. Actually, it can be viewed as a super-Poissonization. This procedure of passing from a homogeneous multivector field to a first-order polydifferential operator can also be understood as a sort of reduction; in the standard case, a half of a Poisson reduction. A dual version of the above correspondence yields in particular the correspondence between Δ-homogeneous symplectic structures on M and contact structures on N.

  18. Non-homogeneous dynamic Bayesian networks for continuous data

    NARCIS (Netherlands)

    Grzegorczyk, Marco; Husmeier, Dirk

    Classical dynamic Bayesian networks (DBNs) are based on the homogeneous Markov assumption and cannot deal with non-homogeneous temporal processes. Various approaches to relax the homogeneity assumption have recently been proposed. The present paper presents a combination of a Bayesian network with

  19. Using Floquet periodicity to easily calculate dispersion curves and wave structures of homogeneous waveguides

    Science.gov (United States)

    Hakoda, Christopher; Rose, Joseph; Shokouhi, Parisa; Lissenden, Clifford

    2018-04-01

    Dispersion curves are essential to any guided-wave-related project. The Semi-Analytical Finite Element (SAFE) method has become the conventional way to compute dispersion curves for homogeneous waveguides. However, only recently has a general SAFE formulation for commercial and open-source software become available, meaning that until now SAFE analyses have been variable and more time consuming than desirable. Likewise, the Floquet boundary conditions enable analysis of waveguides with periodicity and have been an integral part of the development of metamaterials. In fact, we have found the use of Floquet boundary conditions to be an extremely powerful tool for homogeneous waveguides, too. The nuances of using periodic boundary conditions for homogeneous waveguides that do not exhibit periodicity are discussed. Comparisons between this method and SAFE are made for selected homogeneous waveguide applications. The COMSOL Multiphysics software is used for the results shown, but any standard finite element software that can implement Floquet periodicity (user-defined or built-in) should suffice. Finally, we identify a number of complex waveguides for which dispersion curves can be found with relative ease by using the periodicity inherent to the Floquet boundary conditions.
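A minimal analogue of the idea, imposing a Bloch (Floquet) phase across an artificial unit cell cut out of a perfectly homogeneous medium, can be shown with a 1-D mass-spring chain: the cell eigenproblem reproduces the known dispersion, folded into the cell's Brillouin zone. The parameters are arbitrary; this is not the paper's COMSOL workflow.

```python
import numpy as np

K, m, d = 1.0, 1.0, 1.0   # spring stiffness, mass, atom spacing
a = 2 * d                 # artificial unit cell: two identical masses

def cell_frequencies(k):
    """Bloch eigenproblem for one cell of a *homogeneous* chain.
    The 2x2 dynamical matrix with Floquet phase e^{ika} across the cell
    yields the monatomic dispersion folded into the cell's Brillouin zone."""
    p = np.exp(1j * k * a)
    D = (K / m) * np.array([[2.0, -1.0 - 1.0 / p],
                            [-1.0 - p, 2.0]])
    return np.sqrt(np.linalg.eigvalsh(D))   # D is Hermitian

k = 0.4 * np.pi / a
w_acoustic, w_optical = np.sort(cell_frequencies(k))

# unfolded, both branches lie on the monatomic dispersion
# omega(q) = 2 sqrt(K/m) |sin(q d / 2)|
w_exact = lambda q: 2.0 * np.sqrt(K / m) * abs(np.sin(q * d / 2.0))
print(w_acoustic - w_exact(k * a / (2 * d)))         # ≈ 0
print(w_optical - w_exact((np.pi - k * a / 2) / d))  # ≈ 0
```

The folding of one physical branch into two cell branches is exactly the nuance the abstract alludes to when applying periodic boundary conditions to media that are not actually periodic.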

  20. Classical entropy generation analysis in cooled homogenous and functionally graded material slabs with variation of internal heat generation with temperature, and convective–radiative boundary conditions

    International Nuclear Information System (INIS)

    Torabi, Mohsen; Zhang, Kaili

    2014-01-01

    This article investigates the classical entropy generation in cooled slabs. Two types of materials are assumed for the slab: a homogeneous material and an FGM (functionally graded material). For the homogeneous material, the thermal conductivity is assumed to be a linear function of temperature, while for the FGM slab the thermal conductivity is modeled to vary in accordance with the rule of mixtures. The boundary conditions are assumed to be convective and radiative concurrently, and the internal heat generation of the slab is a linear function of temperature. Using the DTM (differential transformation method) and the resulting temperature fields, the local and total entropy generation rates within the slabs are derived. The effects of physically applicable parameters such as the thermal conductivity parameter for the homogeneous slab, β, the thermal conductivity parameter for the FGM slab, γ, the gradient index, j, the internal heat generation parameter, Q, the Biot number at the right side, Nc₂, the conduction–radiation parameter, Nr₂, the dimensionless convection sink temperature, δ, and the dimensionless radiation sink temperature, η, on the local and total entropy generation rates are illustrated and explained. The results demonstrate that considering temperature- or coordinate-dependent thermal conductivity and radiation heat transfer at both sides of the slab has great effects on the entropy generation. - Highlights: • The paper investigates entropy generation in a slab due to heat generation and convective–radiative boundary conditions. • Both a homogeneous material and an FGM (functionally graded material) were considered. • The calculations are carried out using the differential transformation method, which is a well-tested analytical technique
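The DTM named in the highlights converts a differential equation into a recurrence for Taylor coefficients. A self-contained mini-example (a simple nonlinear ODE with a known closed form, not the slab problem itself) looks like this:

```python
import numpy as np

def dtm_coefficients(n_terms):
    """Differential transformation method for u' = -u^2, u(0) = 1.
    With u(x) = sum_k U[k] x^k, the transform of u' is (k+1) U[k+1] and
    that of u^2 is the Cauchy product, giving the recurrence
        U[k+1] = -( sum_{l=0}^{k} U[l] U[k-l] ) / (k + 1)."""
    U = np.zeros(n_terms)
    U[0] = 1.0                                   # initial condition u(0) = 1
    for k in range(n_terms - 1):
        U[k + 1] = -np.dot(U[:k + 1], U[k::-1]) / (k + 1)
    return U

U = dtm_coefficients(12)
x = 0.3
approx = np.polyval(U[::-1], x)      # truncated series evaluated at x
print(U[:5])                         # alternating ±1: Taylor series of 1/(1+x)
print(approx - 1.0 / (1.0 + x))      # truncation error only
```

The slab problem is handled the same way: the nonlinear conduction equation becomes a coefficient recurrence, and the resulting series gives the temperature field from which the entropy generation rates are evaluated.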

  1. Outcome of homogeneous and heterogeneous reactions in Darcy-Forchheimer flow with nonlinear thermal radiation and convective condition

    Directory of Open Access Journals (Sweden)

    T. Hayat

    Full Text Available The present analysis aims to report the consequences of nonlinear radiation, a convective condition and heterogeneous-homogeneous reactions in Darcy-Forchheimer flow over a nonlinear stretching sheet with variable thickness. A non-uniform magnetic field and non-uniform heat generation/absorption are accounted for. The governing boundary layer partial differential equations are converted into a system of nonlinear ordinary differential equations. The computations are carried out and the effects of physical variables such as the thickness parameter, power index, Hartman number, inertia and porous parameters, radiation parameter, Biot number, Prandtl number, ratio parameter, heat generation parameter and homogeneous-heterogeneous reaction parameters are investigated. The variations of the skin friction coefficient and Nusselt number for different variables of interest are plotted and discussed. It is noticed that the Biot number and the heat generation variable enhance the temperature distribution. The solutal boundary layer thickness decreases for a larger homogeneous-reaction variable, while the reverse trend is seen for the heterogeneous reaction. Keywords: Variable sheet thickness, Darcy-Forchheimer flow, Homogeneous-heterogeneous reactions, Power-law surface velocity, Convective condition, Heat generation/absorption, Nonlinear radiation

  2. Lipid peroxidation in liver homogenates. Effects of membrane lipid composition and irradiation

    International Nuclear Information System (INIS)

    Vaca, C.; Ringdahl, M.H.

    1984-01-01

    The rate of lipid peroxidation has been followed in whole liver homogenates from mice using the TBA-method. Liver homogenates with different membrane fatty acid composition were obtained from mice fed diets containing different sources of fat i.e. sunflower seed oil (S), coconut oil (C) and hydrogenated lard (L). The yields of the TBA-chromophore (TBA-c) were 4 times higher in the liver homogenates S compared to C and L after 4 hour incubation at 37 °C. Irradiation of the liver homogenates before incubation inhibited the formation of lipid peroxidation products in a dose dependent way. The catalytic capacity of the homogenates was investigated, followed as the autooxidation of cysteamine or modified by addition of the metal chelator EDTA. The rate of autooxidation of cysteamine, which is dependent on the presence of metal ions (Fe²⁺ or Cu²⁺), was decreased with increasing dose, thus indicating an alteration in the availability of metal catalysts in the system. The addition of Fe²⁺ to the system restored the lipid peroxidation yields in the irradiated systems and the presence of EDTA inhibited the formation of lipid peroxidation products in all three dietary groups. It is suggested that irradiation alters the catalytic activity needed in the autooxidation processes of polyunsaturated fatty acids

  3. High Shear Homogenization of Lignin to Nanolignin and Thermal Stability of Nanolignin-Polyvinyl Alcohol Blends

    Science.gov (United States)

    Sandeep S. Nair; Sudhir Sharma; Yunqiao Pu; Qining Sun; Shaobo Pan; J.Y. Zhu; Yulin Deng; Art J. Ragauskas

    2014-01-01

    A new method to prepare nanolignin using a simple high shear homogenizer is presented. The kraft lignin particles with a broad distribution ranging from large micron- to nano-sized particles were completely homogenized to nanolignin particles with sizes less than 100 nm after 4 h of mechanical shearing. The ¹³C nuclear magnetic resonance (NMR)...

  4. Multiscale iterative methods, coarse level operator construction and discrete homogenization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Griebel, M. [Technische Universitaet Muenchen (Germany)

    1996-12-31

    For problems that model locally strongly varying phenomena at the micro-scale level, the grid for numerical simulation cannot be chosen sufficiently fine, for reasons of storage requirements and numerical complexity. A typical example of such a problem is the diffusion equation with strongly varying diffusion coefficients, as it arises via Darcy's law in reservoir simulation and related problems of flow in porous media. Therefore, on the macro-scale level, it is necessary to work with averaged equations which directly describe the large-scale behavior of the problem under consideration. In the numerical simulation of reservoir performance this is achieved e.g. by renormalization or homogenization, since simpler approaches like the arithmetic, geometric or harmonic mean turn out to be invalid for systems with strong permeability variations.
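
    The inadequacy of simple averaging is easy to see in one dimension: for flow perpendicular to layers of equal thickness, the exact effective permeability is the harmonic mean, which the arithmetic and geometric means overestimate badly at strong contrast. A hedged sketch with illustrative permeability values of our own choosing:

```python
import math
import statistics

# Permeabilities of four equally thick layers with strong contrast.
layers = [1.0, 100.0, 1.0, 100.0]

arithmetic = statistics.mean(layers)
geometric = math.exp(statistics.mean([math.log(k) for k in layers]))
harmonic = len(layers) / sum(1.0 / k for k in layers)

# For flow perpendicular to the layers (resistances in series), the exact
# effective permeability is the harmonic mean; the other two overestimate it.
print(arithmetic, geometric, harmonic)  # 50.5, 10.0, ~1.98
```

    The three means disagree by more than an order of magnitude here, which is exactly why homogenization or renormalization is needed for general (non-layered) permeability fields.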

  5. Numerical methods in physical and economic sciences

    International Nuclear Information System (INIS)

    Lions, J.L.; Marchouk, G.I.

    1974-01-01

    This book is the first of a series to be published simultaneously in French and Russian. It presents results obtained in the framework of an agreement on French-Soviet scientific collaboration in the field of information processing. In the first part, iterative methods for solving linear systems are studied, and new methods are compared with already known ones. Iterative methods for the minimization of quadratic functionals are then studied. In the second part, optimization problems with one or many criteria, arising from problems in physics and economics, are considered, and splitting and decentralizing methods are systematically studied [fr]
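
    As a concrete instance of the iterative methods for linear systems discussed in the first part, here is a minimal Jacobi iteration (our own illustration, not taken from the book):

```python
def jacobi(A, b, iters=100):
    """Jacobi iteration for A x = b (assumes strict diagonal dominance)."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
print(jacobi(A, b))  # converges to the exact solution [1/11, 7/11]
```

    Each sweep updates every unknown from the previous iterate only, which is what makes the method trivially parallel and a natural baseline against which newer methods are compared.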

  6. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    Science.gov (United States)

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for the homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess the homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for the development and production of pharmaceutical suspensions in the future. Copyright © 2011 Wiley-Liss, Inc.

  7. Review of multi-physics temporal coupling methods for analysis of nuclear reactors

    International Nuclear Information System (INIS)

    Zerkak, Omar; Kozlowski, Tomasz; Gajev, Ivan

    2015-01-01

    Highlights: • Review of the numerical methods used for multi-physics temporal coupling. • Review of high-order improvements to the operator-splitting coupling method. • Analysis of the truncation error due to the temporal coupling. • Recommendations on best-practice approaches for multi-physics temporal coupling. - Abstract: The advanced numerical simulation of a realistic physical system typically involves a multi-physics problem. For example, analysis of an LWR core involves the intricate simulation of neutron production and transport, heat transfer throughout the structures of the system, and the flowing, possibly two-phase, coolant. Such analysis involves the dynamic coupling of multiple simulation codes, each one devoted to solving one of the coupled physics. Multiple temporal coupling methods exist, yet the accuracy of such coupling is generally driven by the least accurate numerical scheme. The goal of this paper is to review in detail the approaches and numerical methods that can be used for multi-physics temporal coupling, including a comprehensive discussion of the issues associated with the temporal coupling, and to define approaches that can be used to perform multi-physics analysis. The paper is not limited to any particular multi-physics process or situation, but is intended to provide a generic description of multi-physics temporal coupling schemes for any development stage of the individual (single-physics) tools and methods. This includes a wide spectrum of situations, where the individual (single-physics) solvers are based on pre-existing computation codes embedded as individual components, or a new development where the temporal coupling can be developed and implemented as part of code development. The discussed coupling methods are demonstrated in the framework of LWR core analysis
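
    The operator-splitting coupling discussed in the review can be illustrated on a linear toy problem where each "physics" has an exact sub-flow, so all the error comes from the splitting itself. A sketch under our own simplifying assumptions, showing first-order (Lie) versus second-order (Strang) convergence:

```python
import math

# Two non-commuting "physics" operators with exact sub-flows:
# A = [[0, 1], [0, 0]] and B = [[0, 0], [1, 0]], so u' = (A + B) u
# has the exact solution u(t) = [cosh t, sinh t] for u(0) = [1, 0].
def flow_A(u, dt):
    return [u[0] + dt * u[1], u[1]]

def flow_B(u, dt):
    return [u[0], u[1] + dt * u[0]]

def lie(u, dt):        # first-order (sequential) splitting
    return flow_B(flow_A(u, dt), dt)

def strang(u, dt):     # second-order (symmetric) splitting
    return flow_A(flow_B(flow_A(u, dt / 2), dt), dt / 2)

def error(step, n_steps):
    dt, u = 1.0 / n_steps, [1.0, 0.0]
    for _ in range(n_steps):
        u = step(u, dt)
    return abs(u[0] - math.cosh(1.0)) + abs(u[1] - math.sinh(1.0))

for n in (50, 100):
    print(n, error(lie, n), error(strang, n))
```

    Halving the coupling step halves the Lie error but quarters the Strang error, which is the high-order improvement the review's second highlight refers to.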

  8. Homogeneous Poisson structures

    International Nuclear Information System (INIS)

    Shafei Deh Abad, A.; Malek, F.

    1993-09-01

    We provide an algebraic definition for the Schouten product and give a decomposition for any homogeneous Poisson structure in any n-dimensional vector space. A large class of n-homogeneous Poisson structures in R^k is also characterized. (author). 4 refs

  9. Academic Training Lecture: Statistical Methods for Particle Physics

    CERN Multimedia

    PH Department

    2012-01-01

    2, 3, 4 and 5 April 2012 Academic Training Lecture  Regular Programme from 11:00 to 12:00 -  Bldg. 222-R-001 - Filtration Plant Statistical Methods for Particle Physics by Glen Cowan (Royal Holloway) The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena.  Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties.  The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.
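
    As a taste of the parameter-estimation topic in these lectures, here is a toy unbinned maximum-likelihood fit of an exponential decay lifetime (our own example, not material from the lectures):

```python
import math
import random

random.seed(1)

# Toy unbinned maximum-likelihood fit: estimate the mean lifetime tau of
# exponentially distributed decay times by scanning -2 ln L.  For this pdf
# the MLE has a closed form (the sample mean), which checks the scan.
true_tau = 2.0
data = [random.expovariate(1.0 / true_tau) for _ in range(5000)]

def nll(tau):
    """-2 ln L for the exponential pdf (1/tau) exp(-t/tau)."""
    return 2.0 * sum(math.log(tau) + t / tau for t in data)

scan = [0.5 + 0.01 * i for i in range(300)]   # tau grid over [0.5, 3.5)
tau_hat = min(scan, key=nll)
print(tau_hat, sum(data) / len(data))  # both close to true_tau
```

    Real analyses replace the grid scan with a numerical minimizer and read the parameter uncertainty off the curvature (or the Δ(-2 ln L) = 1 interval) of the same likelihood.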

  10. Two-scale homogenization to determine effective parameters of thin metallic-structured films

    Science.gov (United States)

    Marigo, Jean-Jacques

    2016-01-01

    We present a homogenization method based on matched asymptotic expansion technique to derive effective transmission conditions of thin structured films. The method leads unambiguously to effective parameters of the interface which define jump conditions or boundary conditions at an equivalent zero thickness interface. The homogenized interface model is presented in the context of electromagnetic waves for metallic inclusions associated with Neumann or Dirichlet boundary conditions for transverse electric or transverse magnetic wave polarization. By comparison with full-wave simulations, the model is shown to be valid for thin interfaces up to thicknesses close to the wavelength. We also compare our effective conditions with the two-sided impedance conditions obtained in transmission line theory and to the so-called generalized sheet transition conditions. PMID:27616916

  11. HOMPRA Europe - A gridded precipitation data set from European homogenized time series

    Science.gov (United States)

    Rustemeier, Elke; Kapala, Alice; Meyer-Christoffer, Anja; Finger, Peter; Schneider, Udo; Venema, Victor; Ziese, Markus; Simmer, Clemens; Becker, Andreas

    2017-04-01

    Reliable monitoring data are essential for robust analyses of climate variability and, in particular, long-term trends. In this regard, a gridded, homogenized data set of monthly precipitation totals - HOMPRA Europe (HOMogenized PRecipitation Analysis of European in-situ data) - is presented. The data base consists of 5373 homogenized monthly time series, a carefully selected subset held by the Global Precipitation Climatology Centre (GPCC). The chosen series cover the period 1951-2005 and contain less than 10% missing values. Due to the large number of series, an automatic algorithm had to be developed for the homogenization of these precipitation series. In principle, the algorithm is based on three steps: * Selection of overlapping station networks in the same precipitation regime, based on rank correlation and Ward's method of minimal variance. Since the underlying time series should be as homogeneous as possible, the station selection is carried out on the deterministic first derivative in order to reduce artificial influences. * The natural variability and trends are temporarily removed by means of highly correlated neighboring time series to detect artificial break-points in the annual totals. This ensures that only artificial changes can be detected. The method is based on the algorithm of Caussinus and Mestre (2004). * In the last step, the detected breaks are corrected monthly by means of a multiple linear regression (Mestre, 2003). Due to the automation of the homogenization, validation of the algorithm is essential. Therefore, the method was tested on artificial data sets. Additionally, the sensitivity of the method was tested by varying the neighborhood series. If available in digitized form, the station history was also used to search for systematic errors in the jump detection. Finally, the actual HOMPRA Europe product is produced by interpolation of the homogenized series onto a 1° grid using one of the interpolation schemes used operationally at GPCC
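
    The break-point detection step can be illustrated with a much-simplified stand-in for the cited Caussinus-Mestre procedure: find the single split of a candidate-minus-reference series that minimises the two-segment residual sum of squares. All names and data below are our own toy construction:

```python
import random

def detect_break(d):
    """Locate a single break in a candidate-minus-reference series by
    minimising the two-segment residual sum of squares."""
    n = len(d)
    best_k, best_rss = None, float("inf")
    for k in range(2, n - 2):              # keep a few points per segment
        left, right = d[:k], d[k:]
        m1, m2 = sum(left) / len(left), sum(right) / len(right)
        rss = (sum((x - m1) ** 2 for x in left)
               + sum((x - m2) ** 2 for x in right))
        if rss < best_rss:
            best_k, best_rss = k, rss
    return best_k

# Synthetic series: homogeneous noise with an artificial +5 shift at index 30,
# mimicking e.g. a station relocation seen against a stable reference.
random.seed(0)
series = ([random.gauss(0, 1) for _ in range(30)]
          + [5 + random.gauss(0, 1) for _ in range(30)])
print(detect_break(series))
```

    The operational algorithm additionally handles multiple breaks, significance testing, and monthly correction via regression, none of which this sketch attempts.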

  12. Homogenization of some radiative heat transfer models: application to gas-cooled reactor cores

    International Nuclear Information System (INIS)

    El Ganaoui, K.

    2006-09-01

    In the context of homogenization theory, we treat some heat transfer problems involving boundary conditions that are unusual from the homogenization standpoint. These problems are defined in a solid periodic perforated domain where two scales (macroscopic and microscopic) are to be taken into account, and describe heat transfer by conduction in the solid and by radiation on the wall of each hole. Two kinds of radiation are considered: radiation in an infinite medium (a non-linear problem) and radiation in a cavity with grey-diffuse walls (a non-linear and non-local problem). The derived homogenized models are conduction problems with an effective conductivity which depends on the considered radiation. We thus introduce a framework (homogenization and validation) based on mathematical justification using the two-scale convergence method and on numerical validation by simulations using the computer code CAST3M. This study, performed for gas-cooled reactor cores, can be extended to other perforated domains involving the considered heat transfer phenomena. (author)

  13. A personal view on homogenization

    International Nuclear Information System (INIS)

    Tartar, L.

    1987-02-01

    The evolution of some ideas is first described. Under the name homogenization are collected all the mathematical results that help in understanding the relations between the microstructure of a material and its macroscopic properties. Homogenization results are surveyed through a critically detailed bibliography. The mathematical models considered are systems of partial differential equations, supposed to describe some properties at a scale ε; we want to understand what happens to the solutions as ε tends to 0

  14. WHAMP - waves in homogeneous, anisotropic, multicomponent plasmas

    International Nuclear Information System (INIS)

    Roennmark, K.

    1982-06-01

    In this report, a computer program which solves the dispersion relation of waves in a magnetized plasma is described. The dielectric tensor is derived using the kinetic theory of homogeneous plasmas with Maxwellian velocity distribution. Up to six different plasma components can be included in this version of the program, and each component is specified by its density, temperature, particle mass, anisotropy and drift velocity along the magnetic field. The program is thus applicable to a very wide class of plasmas, and the method should in general be useful whenever a homogeneous magnetized plasma can be approximated by a linear combination of Maxwellian components. The general theory underlying the program is outlined. It is shown that by introducing a Padé approximant for the plasma dispersion function Z, the infinite sums of modified Bessel functions which appear in the dielectric tensor may be reduced to a summable form. The Padé approximant is derived and the accuracy of the approximation is also discussed. The subroutines making up the program are described. (Author)
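
    The Padé trick the report relies on can be shown on a simpler function than the plasma dispersion function Z: the [1/1] Padé approximant of exp(x), built from the same Taylor coefficients as the order-2 polynomial, tracks the decaying exponential on the negative axis far better than the polynomial does. This is our own illustration, not WHAMP's actual approximant:

```python
import math

def pade_exp(x):
    """[1/1] Pade approximant of exp, built from the Taylor data 1 + x + x^2/2."""
    return (1 + x / 2) / (1 - x / 2)

def taylor_exp(x):
    """Order-2 Taylor polynomial of exp: the same input information."""
    return 1 + x + x * x / 2

# On the negative axis the rational form follows the decaying exponential,
# while the polynomial of the same order grows without bound.
for x in (-1.0, -2.0, -4.0):
    exact = math.exp(x)
    print(x, abs(pade_exp(x) - exact), abs(taylor_exp(x) - exact))
```

    A rational approximant can reproduce decay and pole-like behaviour that no truncated power series captures, which is why a Padé form of Z makes the Bessel-function sums in the dielectric tensor summable.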

  15. The zeroth law in quasi-homogeneous thermodynamics and black holes

    Directory of Open Access Journals (Sweden)

    Alessandro Bravetti

    2017-11-01

    Full Text Available Motivated by black hole thermodynamics, we consider the zeroth law of thermodynamics for systems whose entropy is a quasi-homogeneous function of the extensive variables. We show that the generalized Gibbs–Duhem identity and the Maxwell construction for phase coexistence based on the standard zeroth law are incompatible in this case. We argue that the generalized Gibbs–Duhem identity suggests a revision of the zeroth law, which in turn permits us to reconsider Maxwell's construction in analogy with the standard case. The physical feasibility of our proposal is considered in the particular case of black holes.

  16. Homogenization of variational inequalities for obstacle problems

    International Nuclear Information System (INIS)

    Sandrakov, G V

    2005-01-01

    Results on the convergence of solutions of variational inequalities for obstacle problems are proved. The variational inequalities are defined by a non-linear monotone operator of the second order with periodic rapidly oscillating coefficients and a sequence of functions characterizing the obstacles. Two-scale and macroscale (homogenized) limiting variational inequalities are obtained. Derivation methods for such inequalities are presented. Connections between the limiting variational inequalities and two-scale and macroscale minimization problems are established in the case of potential operators.

  17. XXXIV Bialowieza Workshop on Geometric Methods in Physics

    CERN Document Server

    Ali, S; Bieliavsky, Pierre; Odzijewicz, Anatol; Schlichenmaier, Martin; Voronov, Theodore

    2016-01-01

    This book features a selection of articles based on the XXXIV Białowieża Workshop on Geometric Methods in Physics, 2015. The articles presented are mathematically rigorous, include important physical implications and address the application of geometry in classical and quantum physics. The session devoted to discussions of Gerard Emch's most important and lasting achievements in mathematical physics deserves special attention. The Białowieża workshops are among the most important meetings in the field and gather participants from mathematics and physics alike. Despite their long tradition, the workshops remain at the cutting edge of ongoing research. For the past several years, the Białowieża Workshop has been followed by a School on Geometry and Physics, where advanced lectures for graduate students and young researchers are presented. The unique atmosphere of the Workshop and School is enhanced by the venue, framed by the natural beauty of the Białowieża forest in eastern Poland.

  18. Stepping out of homogeneity in loop quantum cosmology

    International Nuclear Information System (INIS)

    Rovelli, Carlo; Vidotto, Francesca

    2008-01-01

    We explore the extension of quantum cosmology outside the homogeneous approximation using the formalism of loop quantum gravity. We introduce a model where some of the inhomogeneous degrees of freedom are present, providing a tool for describing general fluctuations of quantum geometry near the initial singularity. We show that the dynamical structure of the model reduces to that of loop quantum cosmology in the Born-Oppenheimer approximation. This result corroborates the assumptions that ground loop cosmology, sheds some light on the physical and mathematical relation between loop cosmology and full loop quantum gravity, and on the nature of the cosmological approximation. Finally, we show that the non-graph-changing Hamiltonian constraint considered in the context of algebraic quantum gravity provides a viable effective dynamics within this approximation

  19. Advanced homogenization strategies in material modeling of thermally sprayed TBCs

    International Nuclear Information System (INIS)

    Bobzin, K.; Lugscheider, E.; Nickel, R.; Kashko, T.

    2006-01-01

    Thermal barrier coatings (TBC), obtained by atmospheric plasma spraying (APS), have a complex microstructure (lamellar, porous, micro-cracked). Process parameters influence this microstructure. Two methods based on homogenization for periodic structures are presented in this article. The methods are used to calculate the effective material behavior of APS-TBCs made of partially yttria-stabilized zirconia (PYSZ) as a function of the microstructure. (Abstract Copyright [2006], Wiley Periodicals, Inc.)

  20. Physical acoustics principles and methods

    CERN Document Server

    Mason, Warren P

    2012-01-01

    Physical Acoustics: Principles and Methods, Volume IV, Part B: Applications to Quantum and Solid State Physics provides an introduction to the various applications of quantum mechanics to acoustics by describing several processes for which such considerations are essential. This book discusses the transmission of sound waves in molten metals. Comprised of seven chapters, this volume starts with an overview of the interactions that can happen between electrons and acoustic waves when magnetic fields are present. This text then describes acoustic and plasma waves in ionized gases wherein oscillations are subject to hydrodynamic as well as electromagnetic forces. Other chapters examine the resonances and relaxations that can take place in polymer systems. This book discusses as well the general theory of the interaction of a weak sinusoidal field with matter. The final chapter describes the sound velocities in the rocks composing the Earth. This book is a valuable resource for physicists and engineers.

  1. Particle identification methods in High Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Va' Vra, J.

    2000-01-27

    This paper deals with two major particle identification methods: dE/dx and Cherenkov detection. For the first method, the authors systematically compare existing dE/dx data with various predictions available in the literature, such as the Particle Data Group recommendation, and judge the overall consistency. To the authors' knowledge, such a comparison has not previously been published for the gaseous detectors used in high-energy physics. As for the second method, there are two major Cherenkov light detection techniques: the threshold and the ring imaging methods. The authors discuss recent trends in these techniques.

  2. Generation of exact solutions to the Einstein field equations for homogeneous space--time

    International Nuclear Information System (INIS)

    Hiromoto, R.E.

    1978-01-01

    A formalism is presented that is capable of finding all homogeneous solutions of the Einstein field equations with an arbitrary energy-stress tensor. Briefly, the method involves the classification of the four-dimensional Lie algebras over the reals into nine broad classes, using only the Lorentz group. Normally, classifying Lie algebras means finding all essentially different solutions of the Jacobi identities, i.e., sets of structure constants such that no nonsingular linear transformation carries one set into another. The approach here is to utilize the geometrical considerations of the homogeneous spacetime and of the field equations to be solved. Since the set of orthonormal basis vectors is not only endowed with a Minkowskian metric but also constitutes the vector space of our four-dimensional Lie algebras, classifying the Lie algebras against the Lorentz group restricts the admissible linear transformations and sorts the essentially different Lie algebras into nine broad classes. This classification of the four-dimensional Lie algebras unifies various methods previously introduced by others. Where those methods found only specific solutions to the Einstein field equations, systematic application of the nine classes of Lie algebras guarantees the extraction of all solutions. The methods of others are thereby extended, and a formalism is built that goes beyond the present literature of exact homogeneous solutions to the Einstein field equations.

  3. Dissolution test for homogeneity of mixed oxide fuel pellets

    International Nuclear Information System (INIS)

    Lerch, R.E.

    1979-08-01

    Experiments were performed to determine the relationship between fuel pellet homogeneity and pellet dissolubility. Although, in general, the amount of pellet residue decreased with increased homogeneity, as measured by the pellet figure of merit, the relationship was not absolute. Thus, all pellets with high figure of merit (excellent homogeneity) do not necessarily dissolve completely and all samples that dissolve completely do not necessarily have excellent homogeneity. It was therefore concluded that pellet dissolubility measurements could not be substituted for figure of merit determinations as a measurement of pellet homogeneity. 8 figures, 3 tables

  4. International Conference on Geometric and Harmonic Analysis on Homogeneous Spaces and Applications

    CERN Document Server

    Nomura, Takaaki

    2017-01-01

    This book provides the latest competing research results on non-commutative harmonic analysis on homogeneous spaces with many applications. It also includes the most recent developments in other areas of mathematics, including algebra and geometry. Lie group representation theory and harmonic analysis on Lie groups and on their homogeneous spaces form a significant and important area of mathematical research. These areas are interrelated with various other mathematical fields such as number theory, algebraic geometry, differential geometry, operator algebra, partial differential equations and mathematical physics. Keeping up with the fast development of this exciting area of research, Ali Baklouti (University of Sfax) and Takaaki Nomura (Kyushu University) launched a series of seminars on the topic, the first of which took place in November 2009 in the Kerkennah Islands, the second in Sousse in December 2011, and the third in Hammamet in December 2013. The last seminar, which took place on Dece...

  5. Synthesis and characterization of retrograded starch nanoparticles through homogenization and miniemulsion cross-linking.

    Science.gov (United States)

    Ding, Yongbo; Zheng, Jiong; Zhang, Fusheng; Kan, Jianquan

    2016-10-20

    A new and convenient route to synthesizing retrograded starch nanoparticles (RS3NPs) through homogenization combined with a water-in-oil miniemulsion cross-linking technique was developed. The RS3NPs were optimized using a Box-Behnken experimental design. Homogenization pressure (X1), oil/water ratio (X2), and surfactant (X3) were selected as independent variables, whereas particle size was considered as the dependent variable. Results indicated that homogenization pressure was the main variable contributing to particle size. The optimum values for homogenization pressure, oil/water ratio, and surfactant were 30 MPa, 9.34:1, and 2.54 g, respectively, for which the particle size was predicted to be 288.2 nm. Morphological, physical, chemical, and functional properties of the RS3NPs were then assessed. Scanning electron microscopy and dynamic light scattering showed that RS3NP granules were broken down to a size of about 222.2 nm. X-ray diffraction results revealed a disruption in crystallinity. The RS3NPs exhibited a slight decrease in To, whereas Tp and Tc increased and the Tc-To range was narrowest. The solubility and swelling power also increased. New peaks at 1594.84 and 1403.65 cm⁻¹ were observed in the FTIR spectrum. However, homogenization minimally influenced the antidigestibility of the RS3NPs. The absorption properties improved, and the adsorption kinetics described the effect of contact time on the adsorption of captopril onto the RS3NPs. An in vitro release experiment indicated that the drug was released as follows: 21% after 2 h in SGF, 42.78% at the end of 8 h (2 h in SGF and 6 h in SIF), and 92.55% after 12 h in SCF. These findings may help better utilize RS3NPs in biomedical applications as a drug delivery material. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Homogenization patterns of the world's freshwater fish faunas.

    Science.gov (United States)

    Villéger, Sébastien; Blanchet, Simon; Beauchard, Olivier; Oberdorff, Thierry; Brosse, Sébastien

    2011-11-01

    The world is currently undergoing an unprecedented decline in biodiversity, which is mainly attributable to human activities. For instance, nonnative species introduction, combined with the extirpation of native species, affects biodiversity patterns, notably by increasing the similarity among species assemblages. This biodiversity change, called taxonomic homogenization, has rarely been assessed at the world scale. Here, we fill this gap by assessing the current homogenization status of one of the most diverse vertebrate groups (i.e., freshwater fishes) at global and regional scales. We demonstrate that current homogenization of the freshwater fish faunas is still low at the world scale (0.5%) but reaches substantial levels (up to 10%) in some highly invaded river basins from the Nearctic and Palearctic realms. In these realms experiencing high changes, nonnative species introductions rather than native species extirpations drive taxonomic homogenization. Our results suggest that the "Homogocene era" is not yet the case for freshwater fish fauna at the worldwide scale. However, the distressingly high level of homogenization noted for some biogeographical realms stresses the need for further understanding of the ecological consequences of homogenization processes.
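
    The similarity increase that defines taxonomic homogenization can be sketched with Jaccard similarity on species lists. All species assemblages below are invented for illustration:

```python
def jaccard(a, b):
    """Taxonomic similarity of two assemblages (0 = disjoint, 1 = identical)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Hypothetical faunas of two basins before and after two widespread
# introductions (carp, mosquitofish) and one local extirpation (whitefish).
basin1_before = {"pike", "perch", "whitefish"}
basin2_before = {"sturgeon", "perch", "dace"}
basin1_after = {"pike", "perch", "carp", "mosquitofish"}
basin2_after = {"sturgeon", "perch", "dace", "carp", "mosquitofish"}

before = jaccard(basin1_before, basin2_before)
after = jaccard(basin1_after, basin2_after)
print(before, after)  # 0.2 -> 0.5: the two faunas became more similar
```

    A positive change in mean pairwise similarity across many basin pairs is the homogenization signal quantified (at the few-percent level globally) in the study above.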

  7. Osteoarthritic cartilage is more homogeneous than healthy cartilage

    DEFF Research Database (Denmark)

    Qazi, Arish A; Dam, Erik B; Nielsen, Mads

    2007-01-01

    it evolves as a consequence of disease and thereby can be used as a progression biomarker. MATERIALS AND METHODS: A total of 283 right and left knees from 159 subjects aged 21 to 81 years were scanned using a Turbo 3D T1 sequence on a 0.18-T MRI Esaote scanner. The medial compartment of the tibial cartilage...... sheet was segmented using a fully automatic voxel classification scheme based on supervised learning. From the segmented cartilage sheet, homogeneity was quantified by measuring entropy from the distribution of signal intensities inside the compartment. Each knee was examined by radiography...... of the region was evaluated by testing for overfitting. Three different regularization techniques were evaluated for reducing overfitting errors. RESULTS: The P values for separating the different groups based on cartilage homogeneity were 2 × 10⁻⁵ (KL 0 versus KL 1) and 1 × 10⁻⁷ (KL 0 versus KL >0). Using...
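
    The entropy-based homogeneity measure described above can be sketched as follows; the bin count and intensity values are our own choices, not those of the study:

```python
import math

def intensity_entropy(intensities, n_bins=16):
    """Shannon entropy (bits) of a signal-intensity histogram;
    lower entropy means a more homogeneous region."""
    lo, hi = min(intensities), max(intensities)
    width = (hi - lo) / n_bins or 1.0        # guard against a flat region
    counts = [0] * n_bins
    for v in intensities:
        counts[min(int((v - lo) / width), n_bins - 1)] += 1
    n = len(intensities)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

near_uniform = [100.0] * 50 + [101.0] * 50       # two tight intensity levels
broad_spread = [float(v) for v in range(100)]    # intensities spread widely
print(intensity_entropy(near_uniform), intensity_entropy(broad_spread))
```

    The finding that osteoarthritic cartilage is more homogeneous than healthy cartilage corresponds to a lower entropy of this histogram inside the segmented compartment.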

  8. Enthalpy recovery in glassy materials: Heterogeneous versus homogeneous models

    Science.gov (United States)

    Mazinani, Shobeir K. S.; Richert, Ranko

    2012-05-01

    Models of enthalpy relaxations of glasses are the basis for understanding physical aging, scanning calorimetry, and other phenomena that involve non-equilibrium and non-linear dynamics. We compare models in terms of the nature of the relaxation dynamics, heterogeneous versus homogeneous, with focus on the Kovacs-Aklonis-Hutchinson-Ramos (KAHR) and the Tool-Narayanaswamy-Moynihan (TNM) approaches. Of particular interest is identifying the situations for which experimental data are capable of discriminating the heterogeneous from the homogeneous scenario. The ad hoc assumption of a single fictive temperature, Tf, is common to many models, including KAHR and TNM. It is shown that only for such single-Tf models, enthalpy relaxation of a glass is a two-point correlation function in reduced time, implying that experimental results are not decisive regarding the underlying nature of the dynamics of enthalpy relaxation. We also find that the restriction of the common TNM model to a Kohlrausch-Williams-Watts type relaxation pattern limits the applicability of this approach, as the particular choice regarding the distribution of relaxation times is a more critical factor compared with isothermal relaxation experiments. As a result, significant improvements in fitting calorimetry data can be achieved with subtle adjustments in the underlying relaxation time distribution.
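The single fictive temperature Tf common to the KAHR and TNM models, and the KWW relaxation pattern discussed above, can be illustrated with the standard Tool-Narayanaswamy relaxation-time expression. The parameter values below are purely illustrative, not fitted to any material in the paper.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def tnm_tau(T, Tf, lnA=-130.0, x=0.5, dh=400e3):
    """Tool-Narayanaswamy relaxation time (s): the single fictive
    temperature Tf splits the activation energy dh between the actual
    temperature T and the structural state of the glass (partition x)."""
    return math.exp(lnA + x * dh / (R * T) + (1.0 - x) * dh / (R * Tf))

def kww(t, tau, beta=0.5):
    """Kohlrausch-Williams-Watts stretched-exponential decay pattern."""
    return math.exp(-((t / tau) ** beta))
```

Note how a quenched glass (Tf above the bath temperature T) relaxes faster than the equilibrated one, which is the non-linearity the abstract refers to.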

  9. Homogen Mur - et udviklingsprojekt

    DEFF Research Database (Denmark)

    Dahl, Torben; Beim, Anne; Sørensen, Peter

    1997-01-01

    Mølletorvet in Slagelse is the first building project in Denmark in which the outer wall is built of homogeneous, load-bearing and insulating clay blocks. The project demonstrates a range of the possibilities that the use of homogeneous block masonry offers with respect to structure, energy performance and architecture.

  10. Outcome of homogeneous and heterogeneous reactions in Darcy-Forchheimer flow with nonlinear thermal radiation and convective condition

    Science.gov (United States)

    Hayat, T.; Shah, Faisal; Alsaedi, A.; Hussain, Zakir

    The present analysis aims to report the consequences of nonlinear radiation, convective condition and heterogeneous-homogeneous reactions in Darcy-Forchheimer flow over a non-linear stretching sheet with variable thickness. A non-uniform magnetic field and non-uniform heat generation/absorption are accounted for. The governing boundary layer partial differential equations are converted into a system of nonlinear ordinary differential equations. The computations are carried out and the effects of physical variables such as thickness parameter, power index, Hartman number, inertia and porous parameters, radiation parameter, Biot number, Prandtl number, ratio parameter, heat generation parameter and homogeneous-heterogeneous reaction parameters are investigated. The variations of skin friction coefficient and Nusselt number for different variables of interest are plotted and discussed. It is noticed that the Biot number and the heat generation variable enhance the temperature distribution. The solutal boundary layer thickness decreases for a larger homogeneous variable, while the reverse trend is seen for the heterogeneous reaction.

  11. Artificial dispersion via high-order homogenization: magnetoelectric coupling and magnetism from dielectric layers

    Science.gov (United States)

    Liu, Yan; Guenneau, Sébastien; Gralak, Boris

    2013-01-01

    We investigate a high-order homogenization (HOH) algorithm for periodic multi-layered stacks. The mathematical tool of choice is a transfer matrix method. Expressions for effective permeability, permittivity and magnetoelectric coupling are explored by frequency power expansions. On the physical side, this HOH uncovers a magnetoelectric coupling effect (odd-order approximation) and artificial magnetism (even-order approximation) in moderate contrast photonic crystals. Comparing the effective parameters' expressions of a stack with three layers against that of a stack with two layers, we note that the magnetoelectric coupling effect vanishes while the artificial magnetism can still be achieved in a centre-symmetric periodic structure. Furthermore, we numerically check the effective parameters through the dispersion law and transmission property of a stack with two dielectric layers against that of an effective bianisotropic medium: they are in good agreement throughout the low-frequency (acoustic) band until the first stop band, where the analyticity of the logarithm function of the transfer matrix breaks down. PMID:24101891

  12. On superspinor structure of homogeneous superspace of orthosymplectic groups

    International Nuclear Information System (INIS)

    Volkov, D.V.; Soroka, V.A.; Tkach, V.I.

    1984-01-01

    Superspinor structures of homogeneous superspaces of orthosymplectic groups are considered. It is shown how superspinors can be used to describe the properties of homogeneous superspaces of the OSp(N, 2K) group, which play an important role in supersymmetry theory. An example confirming a possible relation between the canonical ratios of the Buttin bracket and conventional methods of quantization is considered.

  13. Computational Homogenization of Mechanical Properties for Laminate Composites Reinforced with Thin Film Made of Carbon Nanotubes

    Science.gov (United States)

    El Moumen, A.; Tarfaoui, M.; Lafdi, K.

    2018-06-01

    Elastic properties of laminate composites based on carbon nanotubes (CNTs), used in military applications, were estimated using homogenization techniques and compared to experimental data. The composite consists of three phases: T300 6k carbon fiber fabric with a 5HS (satin) weave, a baseline pure epoxy matrix, and CNTs added at 0.5%, 1%, 2% and 4%. A two-step homogenization method based on an RVE model was employed. The objective of this paper is to determine the elastic properties of the structure starting from the knowledge of those of its constituents (CNTs, epoxy and carbon fiber fabric). It is assumed that the composites have a geometric periodicity, so the homogenization model can be represented by a representative volume element (RVE). For the multi-scale analysis, finite element modeling of a unit cell with a two-step homogenization method is used. The first step gives the properties of the thin film made of epoxy and CNTs, and the second is used for homogenization of the laminate composite. The fabric unit cell is chosen using a set of microscopic observations and identified by its ability to enclose the characteristic periodic repeat of the fabric weave. The unit cell model of the 5-harness satin weave fabric textile composite is identified for the numerical approach, and its dimensions are chosen based on microstructural measurements. Finally, good agreement was obtained between the elastic properties predicted by the numerical homogenization approach and the experimental data.
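As a rough stand-in for the first homogenization step (epoxy matrix plus CNTs), the classical Voigt and Reuss mixture rules bound the effective modulus of a two-phase composite. These bounds are a generic sketch, not the finite element RVE scheme used in the paper, and the moduli below are hypothetical values.

```python
def voigt(E1, E2, v1):
    """Upper (iso-strain) bound on the effective Young's modulus of a
    two-phase composite; v1 is the volume fraction of phase 1."""
    return v1 * E1 + (1.0 - v1) * E2

def reuss(E1, E2, v1):
    """Lower (iso-stress) bound on the effective Young's modulus."""
    return 1.0 / (v1 / E1 + (1.0 - v1) / E2)
```

Any rigorous homogenization (such as the paper's FE unit-cell computation) must land between these two bounds, which makes them a cheap sanity check on the numerical result.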

  14. Comments on Thermal Physical Properties Testing Methods of Phase Change Materials

    Directory of Open Access Journals (Sweden)

    Jingchao Xie

    2013-01-01

    There is no standard method for testing the thermal physical properties of phase change materials (PCM). This paper reviews advancements in this field: developments and achievements in thermal physical property testing methods for PCM, including differential scanning calorimetry, T-history measurement, the water bath method, and differential thermal analysis. Testing principles, advantages and disadvantages, and important points of attention for each method are discussed, providing a foundation for standardized PCM testing methods.

  15. Homogeneity evaluation of mesenchymal stem cells based on electrotaxis analysis

    OpenAIRE

    Kim, Min Sung; Lee, Mi Hee; Kwon, Byeong-Ju; Kim, Dohyun; Koo, Min-Ah; Seon, Gyeung Mi; Park, Jong-Chul

    2017-01-01

    Stem cell therapy can restore function to damaged tissue, avoid host rejection and reduce inflammation throughout the body without the use of immunosuppressive drugs. Established methods identify and isolate specific stem cell markers by FACS or by immunomagnetic cell separation. The procedures for distinguishing populations of stem cells take time and require many preparations. Here we suggest electrotaxis analysis as a new method to evaluate the homogeneity of mesenchyma...

  16. Homogeneous Studies of Transiting Extrasolar Planets: Current Status and Future Plans

    Science.gov (United States)

    Taylor, John

    2011-09-01

    We now know of over 500 planets orbiting stars other than our Sun. The jewels in the crown are the transiting planets, for these are the only ones whose masses and radii are measurable. They are fundamental for our understanding of the formation, evolution, structure and atmospheric properties of extrasolar planets. However, their characterization is not straightforward, requiring extremely high-precision photometry and spectroscopy as well as input from theoretical stellar models. I summarize the motivation and current status of a project to measure the physical properties of all known transiting planetary systems using homogeneous techniques (Southworth 2008, 2009, 2010, 2011 in preparation). Careful attention is paid to the treatment of limb darkening, contaminating light, correlated noise, numerical integration, orbital eccentricity and orientation, systematic errors from theoretical stellar models, and empirical constraints. Complete error budgets are calculated for each system and can be used to determine which type of observation would be most useful for improving the parameter measurements. Known correlations between the orbital periods, masses, surface gravities, and equilibrium temperatures of transiting planets can be explored more safely due to the homogeneity of the properties. I give a sneak preview of Homogeneous Studies Paper 4, which includes the properties of thirty transiting planetary systems observed by the CoRoT, Kepler and Deep Impact space missions. Future opportunities are discussed, plus remaining problems with our understanding of transiting planets. I acknowledge funding from the UK STFC in the form of an Advanced Fellowship.

  17. Is it beneficial to selectively boost high-risk tumor subvolumes? A comparison of selectively boosting high-risk tumor subvolumes versus homogeneous dose escalation of the entire tumor based on equivalent EUD plans

    International Nuclear Information System (INIS)

    Kim, Yusung; Tomé, Wolfgang A.

    2008-01-01

    Purpose. To quantify and compare expected local tumor control and expected normal tissue toxicities between selective boosting IMRT and homogeneous dose escalation IMRT for the case of prostate cancer. Methods. Four different selective boosting scenarios and three different high-risk tumor subvolume geometries were designed to compare selective boosting and homogeneous dose escalation IMRT plans delivering the same equivalent uniform dose (EUD) to the entire PTV. For each scenario, differences in tumor control probability between both boosting strategies were calculated for the high-risk tumor subvolume and remaining low-risk PTV, and were visualized using voxel-based iso-TCP maps. Differences in expected rectal and bladder complications were quantified using radiobiological indices (generalized EUD (gEUD) and normal tissue complication probability (NTCP)) as well as %-volumes. Results. For all investigated scenarios and high-risk tumor subvolume geometries, selective boosting IMRT improves expected TCP compared to homogeneous dose escalation IMRT, especially when lack of control of the high-risk tumor subvolume could be the cause for tumor recurrence. Employing selective boosting IMRT, significant increases in expected TCP can be achieved for the high-risk tumor subvolumes. The three conventional selective boosting IMRT strategies, employing physical dose objectives, did not show significant improvement in rectal and bladder sparing as compared to their counterpart homogeneous dose escalation plans. However, risk-adaptive optimization, utilizing radiobiological objective functions, resulted in reduction in NTCP for the rectum when compared to its corresponding homogeneous dose escalation plan. Conclusions. Selective boosting is a more effective method than homogeneous dose escalation for achieving optimal treatment outcomes. Furthermore, risk-adaptive optimization increases the therapeutic ratio as compared to conventional selective boosting IMRT.
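The gEUD index used above has a standard closed form over the dose-volume histogram bins, (sum_i v_i * d_i^a)^(1/a); a minimal sketch, with the bin values below being hypothetical:

```python
def gEUD(dose_bins, volume_fractions, a):
    """Generalized equivalent uniform dose of a DVH. a = 1 gives the
    mean dose; large positive a weights hot spots (serial organs at
    risk); large negative a weights cold spots (tumors)."""
    assert abs(sum(volume_fractions) - 1.0) < 1e-9  # fractions must sum to 1
    return sum(v * d ** a for d, v in zip(dose_bins, volume_fractions)) ** (1.0 / a)
```

This is why a cold spot in the high-risk subvolume drags the tumor gEUD well below the mean dose, the effect selective boosting is designed to avoid.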

  18. Surface Simplification of 3D Animation Models Using Robust Homogeneous Coordinate Transformation

    Directory of Open Access Journals (Sweden)

    Juin-Ling Tseng

    2014-01-01

    The goal of 3D surface simplification is to reduce the storage cost of 3D models. A 3D animation model typically consists of several 3D models. Therefore, to ensure that animation models are realistic, numerous triangles are often required. However, animation models that have a high storage cost have a substantial computational cost. Hence, surface simplification methods are adopted to reduce the number of triangles and computational cost of 3D models. Quadric error metrics (QEM) has recently been identified as one of the most effective methods for simplifying static models. To simplify animation models by using QEM, Mohr and Gleicher summed the QEM of all frames. However, homogeneous coordinate problems cannot be considered completely by using QEM. To resolve this problem, this paper proposes a robust homogeneous coordinate transformation that improves the animation simplification method proposed by Mohr and Gleicher. In this study, the root mean square errors of the proposed method were compared with those of the method proposed by Mohr and Gleicher, and the experimental results indicated that the proposed approach can preserve more contour features than Mohr’s method can at the same simplification ratio.
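The quadric error metric (QEM) underlying this record can be sketched from its standard construction: each supporting plane contributes a rank-one quadric K_p = p p^T, quadrics are summed per vertex, and the cost of placing a vertex at v is v^T Q v in homogeneous coordinates. This illustrates plain QEM only, not the robust homogeneous coordinate transformation the paper proposes.

```python
def plane_quadric(a, b, c, d):
    """Fundamental error quadric K_p = p p^T for the plane
    a*x + b*y + c*z + d = 0 (with a^2 + b^2 + c^2 = 1): a 4x4 matrix."""
    p = (a, b, c, d)
    return [[p[i] * p[j] for j in range(4)] for i in range(4)]

def add_quadrics(Q1, Q2):
    """Quadrics of adjacent faces simply sum at a shared vertex."""
    return [[Q1[i][j] + Q2[i][j] for j in range(4)] for i in range(4)]

def quadric_error(Q, x, y, z):
    """Sum of squared distances from (x, y, z) to the planes folded into
    Q, evaluated as v^T Q v with homogeneous v = (x, y, z, 1)."""
    v = (x, y, z, 1.0)
    return sum(v[i] * Q[i][j] * v[j] for i in range(4) for j in range(4))
```

Edge collapses are then ranked by this error, and the pair with the smallest v^T (Q1 + Q2) v is contracted first.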

  19. Identification of Homogeneous Stations for Quality Monitoring Network of Mashhad Aquifer Based on Nitrate Pollution

    Directory of Open Access Journals (Sweden)

    Moslem Akbarzadeh

    2017-01-01

    Introduction: In water resources monitoring, groundwater quality is evaluated through detailed analysis of pollution data. The most fundamental step is the precise delineation of dangerous zones and the identification of stations that are homogeneous in terms of pollution. For quality evaluation, monitoring can be improved by identifying wells that are homogeneous with respect to pollution. Given the large amounts of quality data involved in aquifer monitoring and quality evaluation, a clustering method is essential for identifying the homogeneous stations of the monitoring network and grouping them by pollution. In this study, with the purpose of evaluating the quality of the Mashhad aquifer, clustering based on Euclidean distance and entropy criteria was studied. Cluster analysis is the task of grouping a set of objects in such a way that objects in the same group (called a cluster) are more similar, in some sense or another, to each other than to those in other groups (clusters). SNI, a combined entropy measure for clustering, is calculated by dividing the mutual information of two series of pollution index values by their joint entropy. These measures serve as similarity/distance criteria for clustering the monitoring stations. Materials and Methods: First, nitrate data (as the pollution index) and electrical conductivity (EC) (as a covariate) were collected from 287 wells for the statistical period 2002 to 2011. Having identified the outlying data, estimated the non-observed points by spatial-temporal kriging and standardized the data, the clustering process was carried out. A similarity distance between wells was calculated through a clustering process based on the Euclidean distance and entropy (SNI) criteria. This distance is explained by characteristics such as the location of the wells (longitude and latitude) and the pollution index (nitrate). Having obtained the similarity distance of each well to the others, the hierarchical clustering
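The Euclidean-distance hierarchical clustering step can be sketched with a minimal single-linkage agglomeration; here each well would be represented by a feature vector such as (longitude, latitude, nitrate). This is a generic sketch; the paper's SNI entropy criterion is not reproduced.

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def single_linkage(points, n_clusters):
    """Bottom-up hierarchical clustering: repeatedly merge the two
    closest clusters (closest-pair linkage) until n_clusters remain.
    Returns clusters as lists of point indices."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(euclidean(points[p], points[q])
                        for p in clusters[i] for q in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters[j]
        del clusters[j]
    return clusters
```

Swapping `euclidean` for an entropy-based dissimilarity (such as 1 minus the SNI of two nitrate series) yields the paper's second clustering variant without changing the agglomeration loop.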

  20. An elastoplastic homogenization procedure for predicting the settlement of a foundation on a soil reinforced by columns

    OpenAIRE

    ABDELKRIM, Malek; DE BUHAN, Patrick

    2007-01-01

    This paper presents an elastoplastic homogenization method applied to a soil reinforced by regularly distributed columns. According to this method, the composite reinforced soil is regarded, from a macroscopic point of view, as a homogeneous anisotropic continuous medium, the elastic as well as plastic properties of which can be obtained from the solution to an auxiliary problem attached to the reinforced soil representative cell. Based upon an approximate solution to this problem, in which p...

  1. Methods of Efficient Study Habits and Physics Learning

    Science.gov (United States)

    Zettili, Nouredine

    2010-02-01

    We want to discuss the methods of efficient study habits and how they can be used by students to help them improve learning physics. In particular, we deal with the most efficient techniques needed to help students improve their study skills. We focus on topics such as how to develop long-term memory, how to improve concentration power, how to take class notes, how to prepare for and take exams, and how to study scientific subjects such as physics. We argue that students who conscientiously use the methods of efficient study habits achieve higher results than those who do not; moreover, a student equipped with proper study skills will spend much less time learning a subject than a student who has no good study habits. The underlying issue here is not the quantity of time allocated to study efforts by the students, but the efficiency and quality of actions, so that the student can function at peak efficiency. These ideas were developed as part of Project IMPACTSEED (IMproving Physics And Chemistry Teaching in SEcondary Education), an outreach grant funded by the Alabama Commission on Higher Education. This project is motivated by a major pressing local need: a large number of high school physics teachers teach out of field.

  2. Enzymatic production of N-acetyl-d-glucosamine from crayfish shell wastes pretreated via high pressure homogenization.

    Science.gov (United States)

    Wei, Guoguang; Zhang, Alei; Chen, Kequan; Ouyang, Pingkai

    2017-09-01

    This study presents an efficient pretreatment of crayfish shell using high pressure homogenization that enables N-acetyl-d-glucosamine (GlcNAc) production by chitinase. Firstly, the chitinase from Serratia proteamaculans NJ303 was screened for its ability to degrade crayfish shell and produce GlcNAc as the sole product. Secondly, high pressure homogenization, which caused the crayfish shell to adopt a fluffy netted structure as characterized by scanning electron microscopy (SEM), Fourier-transform infrared spectroscopy (FT-IR) and X-ray diffraction (XRD), was evaluated as the best pretreatment method. In addition, the optimal conditions for high pressure homogenization of crayfish shell were determined to be five cycles at a pressure of 400 bar, which achieved a yield of 3.9 g/L of GlcNAc from 25 g/L of crayfish shell in a batch enzymatic reaction over 1.5 h. The results showed that high pressure homogenization may be an efficient method for direct utilization of crayfish shell in the enzymatic production of GlcNAc.

  3. Energy, ropelength, and other physical aspects of equilateral knots

    International Nuclear Information System (INIS)

    Millett, Kenneth C.; Rawdon, Eric J.

    2003-01-01

    Closed macromolecular chains may form physically knotted conformations whose relative occurrence and spatial measurements provide insight into their properties and the mechanisms acting upon them. Under the assumption of a degree of structural homogeneity, equilateral spatial polygons are a productive context within which to create mathematical models of these knots and to study their mathematical and physical properties. The ensembles, or spaces, of these knots are models of the settings within which the knots evolve in ways determined by a physical model. In this paper we describe the mathematical foundation of such models as well as such spatial, geometric, statistical, and physical properties of the configurations as mathematical energies, thickness and ropelength, average crossing number, average writhe, and volumes and surface areas of standard bodies enclosing the knots. We present methods with which the energy and ropelength are optimized within the families of spatially equivalent equilateral configurations. Numerical results from our implementation of these methods are shown to illustrate connections between the physical measurements and spatial characteristics of the optimized knot configurations. In addition, these data suggest potentially new connections involving their spatial properties

  4. Effect of homogenization techniques on reducing the size of microcapsules and the survival of probiotic bacteria therein.

    Science.gov (United States)

    Ding, W K; Shah, N P

    2009-08-01

    This study investigated two different homogenization techniques for reducing the size of calcium alginate beads during the microencapsulation process of eight probiotic bacterial strains, namely Lactobacillus rhamnosus, L. salivarius, L. plantarum, L. acidophilus, L. paracasei, Bifidobacterium longum, B. lactis type Bi-04, and B. lactis type Bi-07. Two different homogenization techniques were used, namely an Ultra-Turrax benchtop homogenizer and a Microfluidics microfluidizer. Various settings of the homogenization equipment were studied, such as the number of passes, speed (rpm), duration (min), and pressure (psi). The traditional mixing method using a magnetic stirrer was used as a control. The size of the microcapsules resulting from each homogenization technique and setting was measured using a light microscope and a stage micrometer. The smallest capsules, measuring 31.2 μm, were created with the microfluidizer using 26 passes at 1200 psi for 40 min. The greatest loss in viability, 3.21 log CFU/mL, was observed when using the Ultra-Turrax benchtop homogenizer at a speed of 1300 rpm for 5 min. Overall, both homogenization techniques reduced capsule sizes; however, homogenization settings at high rpm also greatly reduced the viability of the probiotic organisms.

  5. Evaluating a novel application of optical fibre evanescent field absorbance: rapid measurement of red colour in winegrape homogenates

    Science.gov (United States)

    Lye, Peter G.; Bradbury, Ronald; Lamb, David W.

    Silica optical fibres were used to measure colour (mg anthocyanin/g fresh berry weight) in samples of red wine grape homogenates via optical Fibre Evanescent Field Absorbance (FEFA). Colour measurements from 126 samples of grape homogenate were compared against the standard industry spectrophotometric reference method, which involves chemical extraction and subsequent optical absorption measurements of clarified samples at 520 nm. FEFA absorbance on homogenates at 520 nm (FEFA520h) was correlated with the industry reference method measurements of colour (R2 = 0.46, n = 126). Using a simple regression equation, colour could be predicted with a standard error of cross-validation (SECV) of 0.21 mg/g, over a range of 0.6 to 2.2 mg anthocyanin/g and a standard deviation of 0.33 mg/g. With a Ratio of Performance to Deviation (RPD) of 1.6, the technique, when utilizing only a single detection wavelength, is not robust enough to apply in a diagnostic sense; however, the results do demonstrate the potential of the FEFA method as a fast and low-cost assay of colour in homogenized samples.
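The calibration statistics quoted above (regression, SECV, RPD) can be reproduced generically: fit a line, compute the leave-one-out standard error of cross-validation, and divide the standard deviation of the reference values by it. The data used in the test are made up for illustration, not the paper's measurements.

```python
import math

def fit_line(x, y):
    """Ordinary least-squares slope and intercept for y ~ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

def secv(x, y):
    """Leave-one-out standard error of cross-validation for the line fit."""
    errs = []
    for i in range(len(x)):
        a, b = fit_line(x[:i] + x[i + 1:], y[:i] + y[i + 1:])
        errs.append((y[i] - (a * x[i] + b)) ** 2)
    return math.sqrt(sum(errs) / len(errs))

def rpd(y, secv_value):
    """Ratio of performance to deviation: SD of reference values over the
    cross-validation error; around 1.6 is only screening quality."""
    n = len(y)
    my = sum(y) / n
    sd = math.sqrt(sum((yi - my) ** 2 for yi in y) / (n - 1))
    return sd / secv_value
```

An RPD near 1 means the model predicts no better than quoting the mean, which is why the paper treats 1.6 as insufficient for diagnostic use.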

  6. A new numerical scheme for non uniform homogenized problems: Application to the non linear Reynolds compressible equation

    Directory of Open Access Journals (Sweden)

    Buscaglia Gustavo C.

    2001-01-01

    A new numerical approach is proposed to alleviate the computational cost of solving non-linear non-uniform homogenized problems. The article details the application of the proposed approach to lubrication problems with roughness effects. The method is based on a two-parameter Taylor expansion of the implicit dependence of the homogenized coefficients on the average pressure and on the local value of the air gap thickness. A fourth-order Taylor expansion provides an approximation that is accurate enough to be used in the global problem solution instead of the exact dependence, without introducing significant errors. In this way, when solving the global problem, the solution of local problems is simply replaced by the evaluation of a polynomial. Moreover, the method leads naturally to Newton-Raphson nonlinear iterations, which further reduce the cost. The overall efficiency of the numerical methodology makes it feasible to apply rigorous homogenization techniques in the analysis of compressible fluid contact considering roughness effects. Previous work makes use of a heuristic averaging technique. Numerical comparison proves that homogenization-based methods are superior when the roughness is strongly anisotropic and not aligned with the flow direction.
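The core idea, replacing the solution of local cell problems by the evaluation of a polynomial in the average pressure and gap thickness, can be sketched with a finite-difference Taylor surrogate (second order here; the paper goes to fourth order). The function and step-size names are hypothetical, not the authors' notation.

```python
def taylor_surrogate(f, p0, h0, dp=1e-3, dh=1e-3):
    """Build a second-order two-variable Taylor surrogate of an expensive
    homogenized coefficient f(p, h) around (p0, h0) via central finite
    differences; evaluating the surrogate replaces re-solving the local
    problems at every global iteration."""
    f00 = f(p0, h0)
    fp = (f(p0 + dp, h0) - f(p0 - dp, h0)) / (2 * dp)
    fh = (f(p0, h0 + dh) - f(p0, h0 - dh)) / (2 * dh)
    fpp = (f(p0 + dp, h0) - 2 * f00 + f(p0 - dp, h0)) / dp ** 2
    fhh = (f(p0, h0 + dh) - 2 * f00 + f(p0, h0 - dh)) / dh ** 2
    fph = (f(p0 + dp, h0 + dh) - f(p0 + dp, h0 - dh)
           - f(p0 - dp, h0 + dh) + f(p0 - dp, h0 - dh)) / (4 * dp * dh)

    def surrogate(p, h):
        a, b = p - p0, h - h0
        return (f00 + fp * a + fh * b
                + 0.5 * (fpp * a * a + 2 * fph * a * b + fhh * b * b))
    return surrogate
```

Because the surrogate is polynomial, its derivatives needed by Newton-Raphson iterations come essentially for free, which is the cost reduction the abstract describes.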

  7. The application of polynomial chaos methods to a point kinetics model of MIPR: An Aqueous Homogeneous Reactor

    International Nuclear Information System (INIS)

    Cooling, C.M.; Williams, M.M.R.; Nygaard, E.T.; Eaton, M.D.

    2013-01-01

    Highlights: • A point kinetics model for the Medical Isotope Production Reactor is formulated. • Reactivity insertions are simulated using this model. • Polynomial chaos is used to simulate uncertainty in reactor parameters. • The computational efficiency of polynomial chaos is compared to that of Monte Carlo. -- Abstract: This paper models a conceptual Medical Isotope Production Reactor (MIPR) using a point kinetics model which is used to explore power excursions in the event of a reactivity insertion. The effect of uncertainty of key parameters is modelled using intrusive polynomial chaos. It is found that the system is stable against reactivity insertions and power excursions are all bounded and tend towards a new equilibrium state due to the negative feedbacks inherent in Aqueous Homogeneous Reactors (AHRs). The Polynomial Chaos Expansion (PCE) method is found to be much more computationally efficient than that of Monte Carlo simulation in this application
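A minimal point kinetics sketch with one delayed neutron group shows the kind of model the paper builds on. The kinetic parameters below are illustrative textbook values, not MIPR's, and the AHR feedback terms central to the paper's stability result are omitted.

```python
def point_kinetics(rho, beta=0.0065, lam=0.08, Lambda=1e-4,
                   t_end=5.0, dt=1e-4):
    """Explicit-Euler integration of one-delayed-group point kinetics:
        dn/dt = (rho - beta)/Lambda * n + lam * C
        dC/dt = beta/Lambda * n - lam * C
    starting from equilibrium at unit power; returns power n(t_end)."""
    n = 1.0
    C = beta / (Lambda * lam) * n  # equilibrium precursor concentration
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / Lambda * n + lam * C) * dt
        dC = (beta / Lambda * n - lam * C) * dt
        n += dn
        C += dC
    return n
```

A polynomial chaos treatment, as in the paper, would propagate uncertainty in parameters like `beta` or `rho` through this model far more cheaply than brute-force Monte Carlo sampling of the same ODEs.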

  8. Built environment and physical activity: a brief review of evaluation methods

    Directory of Open Access Journals (Sweden)

    Adriano Akira Ferreira Hino

    2010-08-01

    There is strong evidence indicating that the environment where people live has a marked influence on physical activity. The current understanding of this relationship is based on studies conducted in developed and culturally distinct countries and may not be applicable to the context of Brazil. In this respect, a better understanding of methods evaluating the relationship between the environment and physical activity may contribute to the development of new studies in this area in Brazil. The objective of the present study was to briefly describe the main methods used to assess the relationship between built environment and physical activity. Three main approaches are used to obtain information about the environment: (1) environmental perception; (2) systematic observation; and (3) geoprocessing. These methods are mainly applied to evaluate population density, mixed land use, physical activity facilities, street patterns, sidewalk/bike path coverage, public transportation, and safety/esthetics. In Brazil, studies investigating the relationship between the environment and physical activity are scarce, but the number of studies is growing. Thus, further studies are necessary, and methods applicable to the context of Brazil need to be developed in order to increase the understanding of this subject.

  9. Linking biotic homogenization to habitat type, invasiveness and growth form of naturalized alien plants in North America

    Science.gov (United States)

    Hong Qian; Qinfeng. Guo

    2010-01-01

    Aim: Biotic homogenization is a growing phenomenon and has recently attracted much attention. Here, we analyse a large dataset of native and alien plants in North America to examine whether biotic homogenization is related to several ecological and biological attributes. Location: North America (north of Mexico). Methods: We assembled...

  10. The evaporative vector: Homogeneous systems

    International Nuclear Information System (INIS)

    Klots, C.E.

    1987-05-01

    Molecular beams of van der Waals molecules are the subject of much current research. Among the methods used to form these beams, three (sputtering, laser ablation, and the sonic nozzle expansion of neat gases) yield what are now recognized to be "warm clusters." They contain enough internal energy to undergo a number of first-order processes, in particular evaporation. Because of this evaporation and its attendant cooling, the properties of such clusters are time-dependent. The states of matter which can be arrived at via an evaporative vector on a typical laboratory time-scale are discussed. Topics include (1) temperatures, (2) metastability, (3) phase transitions, (4) kinetic energies of fragmentation, and (5) the expression of magical properties, all for evaporating homogeneous clusters.

  11. Physical acoustics principles and methods

    CERN Document Server

    Mason, Warren P

    1964-01-01

    Physical Acoustics: Principles and Methods, Volume 1, Part A focuses on high-frequency sound waves in gases, liquids, and solids, which have proven to be powerful tools for analyzing molecular, defect, domain wall, and other types of motion. The selection first tackles wave propagation in fluids and normal solids and guided wave propagation in elongated cylinders and plates. Discussions focus on fundamentals of continuum mechanics; small-amplitude waves in a linear viscoelastic medium; representation of oscillations and waves; and special effects associated with guided elastic waves in plates

  12. Verification of homogenization in fast critical assembly analyses

    International Nuclear Information System (INIS)

    Chiba, Go

    2006-01-01

    In the present paper, homogenization procedures for fast critical assembly analyses are investigated. Errors caused by homogenization are evaluated by the exact perturbation theory. In order to obtain reference solutions, three-dimensional plate-wise transport calculations are performed. It is found that the angular neutron flux along plate boundaries has a significant peak in the fission source energy range. To treat this angular dependence accurately, the double-Gaussian Chebyshev angular quadrature set with S24 is applied. It is shown that the difference between the heterogeneous leakage theory and the homogeneous theory is negligible, and that transport cross sections homogenized with neutron flux significantly underestimate neutron leakage. The error in criticality caused by homogenization is estimated at about 0.1% Δk/kk' in a small fast critical assembly. In addition, the neutron leakage is overestimated by both leakage theories when sodium plates in fuel lattices are voided. (author)

  13. Arc melting and homogenization of ZrC and ZrC + B alloys

    Science.gov (United States)

    Darolia, R.; Archbold, T. F.

    1973-01-01

    A description is given of the methods used to arc-melt and to homogenize near-stoichiometric ZrC and ZrC-boron alloys, giving attention to the oxygen contamination problem. The starting material for the carbide preparation was ZrC powder with an average particle size of 4.6 micron. Pellets weighing approximately 3 g each were prepared at room temperature from the powder by the use of an isostatic press operated at 50,000 psi. These pellets were individually melted in an arc furnace containing a static atmosphere of purified argon. A graphite resistance furnace was used for the homogenization process.

  14. Multilevel Monte Carlo Approaches for Numerical Homogenization

    KAUST Repository

    Efendiev, Yalchin R.; Kronsbein, Cornelia; Legoll, Frédéric

    2015-01-01

    When it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison

  15. Homogeneity analysis with k sets of variables: An alternating least squares method with optimal scaling features

    NARCIS (Netherlands)

    van der Burg, Eeke; de Leeuw, Jan; Verdegaal, Renée

    1988-01-01

    Homogeneity analysis, or multiple correspondence analysis, is usually applied to k separate variables. In this paper we apply it to sets of variables by using sums within sets. The resulting technique is called OVERALS. It uses the notion of optimal scaling, with transformations that can be multiple

  16. Fabrication of homogeneous titania/MWNT composite materials

    International Nuclear Information System (INIS)

    Korbely, Barbara; Nemeth, Zoltan; Reti, Balazs; Seo, Jin Won; Magrez, Arnaud; Forro, Laszlo; Hernadi, Klara

    2011-01-01

    Highlights: → Homogeneous titania coverage on the MWNT surface in a controllable way. → Various titanium alkoxide precursors are suitable for layer formation. → Acetone and ethanol are the best solvents to promote interaction between MWNTs and titania. -- Abstract: MWNT/titania nanocomposites were prepared by an impregnation method and subsequent heat treatment at 400 °C. Precursor compounds such as titanium(IV) propoxide and titanium(IV) ethoxide were used to cover the surface of the CNTs under solution conditions. Electron microscopy and X-ray diffraction techniques were carried out to characterize the as-prepared titania layers.

  17. Numerical perturbative methods in the quantum theory of physical systems

    International Nuclear Information System (INIS)

    Adam, G.

    1980-01-01

    During the last two decades, the development of digital electronic computers has led to the deployment of new, distinct methods in theoretical physics. These methods, based on the advances of modern numerical analysis as well as on specific equations describing physical processes, have enabled precise calculations of high complexity which have complemented and sometimes changed our picture of many physical phenomena. Our efforts have concentrated on the development of numerical methods with such intrinsic performance as to allow a successful approach to some key issues in present theoretical physics on smaller computing systems. The basic principle of such methods is to translate, into the language of numerical analysis, the theory of perturbations, which is suited to numerical rather than analytical computation. This idea is illustrated by working out two problems which arise from the time-independent Schroedinger equation in the non-relativistic approximation, for quantum systems with a small number of particles and for systems with a large number of particles, respectively. In the first case, we are led to the numerical solution of some quadratic ordinary differential equations (first section of the thesis) and, in the second case, to the solution of some secular equations in the Brillouin zone (second section). (author)

  18. PhySIC: a veto supertree method with desirable properties.

    Science.gov (United States)

    Ranwez, Vincent; Berry, Vincent; Criscuolo, Alexis; Fabre, Pierre-Henri; Guillemot, Sylvain; Scornavacca, Celine; Douzery, Emmanuel J P

    2007-10-01

    This paper focuses on veto supertree methods; i.e., methods that aim at producing a conservative synthesis of the relationships agreed upon by all source trees. We propose desirable properties that a supertree should satisfy in this framework, namely the non-contradiction property (PC) and the induction property (PI). The former requires that the supertree does not contain relationships that contradict one or a combination of the source topologies, whereas the latter requires that all topological information contained in the supertree is present in a source tree or collectively induced by several source trees. We provide simple examples to illustrate their relevance and that allow a comparison with previously advocated properties. We show that these properties can be checked in polynomial time for any given rooted supertree. Moreover, we introduce the PhySIC method (PHYlogenetic Signal with Induction and non-Contradiction). For k input trees spanning a set of n taxa, this method produces a supertree that satisfies the above-mentioned properties in O(kn³ + n⁴) computing time. The polytomies of the produced supertree are also tagged by labels indicating areas of conflict as well as those with insufficient overlap. As a whole, PhySIC enables the user to quickly summarize consensual information of a set of trees and localize groups of taxa for which the data require consolidation. Lastly, we illustrate the behaviour of PhySIC on primate data sets of various sizes, and propose a supertree covering 95% of all extant primate genera. The PhySIC algorithm is available at http://atgc.lirmm.fr/cgi-bin/PhySIC.

  19. Experimental and numerical investigation of hetero-/homogeneous combustion-based HCCI of methane–air mixtures in free-piston micro-engines

    International Nuclear Information System (INIS)

    Chen, Junjie; Liu, Baofang; Gao, Xuhui; Xu, Deguang

    2016-01-01

    Highlights: • Single-shot experiments and a transient model of the micro-engine were presented. • Coupled combustion can significantly improve in-cylinder temperatures. • Coupled combustion can reduce mass losses and compression ratios. • Heterogeneous reactions cause earlier ignition. • Heat losses result in higher mass losses. - Abstract: The hetero-/homogeneous combustion-based HCCI (homogeneous charge compression ignition) of fuel-lean methane–air mixtures over alumina-supported platinum catalysts was investigated experimentally and numerically in free-piston micro-engines without ignition sources. Single-shot experiments were carried out in the purely homogeneous and coupled hetero-/homogeneous combustion modes, involving temperature measurements, capture of the visible combustion image sequences, exhaust gas analysis, and physicochemical characterization of the catalysts. Simulations were performed with a two-dimensional transient model that includes detailed hetero-/homogeneous chemistry and transport, leakage, and free-piston motion to gain physical insight and to explore the hetero-/homogeneous combustion characteristics. The micro-engine performance concerning combustion efficiency, mass loss, energy density, and free-piston dynamics was investigated. The results reveal that both purely homogeneous and coupled hetero-/homogeneous combustion of methane–air mixtures in a narrow cylinder with a diameter of 3 mm and a height of approximately 0.3 mm are possible. The coupled hetero-/homogeneous mode can not only significantly improve the combustion efficiency, in-cylinder temperature and pressure, output power and energy density, but also reduce the mass loss because of its lower compression ratio and less time spent around TDC (top dead center) and during the expansion stroke, indicating that this coupled mode is a promising combustion scheme for micro-engines. Heat losses result in higher mass losses. Heterogeneous reactions cause earlier ignition.

  20. Chaos Control on a Duopoly Game with Homogeneous Strategy

    Directory of Open Access Journals (Sweden)

    Manying Bai

    2016-01-01

    Full Text Available. We study the dynamics of a nonlinear discrete-time duopoly game, where the players have homogeneous knowledge of the market demand and decide their outputs based on adaptive expectations. The Nash equilibrium and its local stability are investigated. The numerical simulation results show that the model may exhibit chaotic phenomena. Quasiperiodicity is also found by setting the parameters at specific values. The system can be stabilized to a stable state by using the delayed feedback control method. The discussion of control strategy shows that the effect of both firms taking the control method is better than that of a single firm taking the control method.
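
    As a hedged illustration of the dynamics described above, the sketch below simulates a standard Cournot duopoly with adaptive expectations and a delayed-feedback control term; the demand and cost parameters (a, b, c), the adjustment speed alpha, and the control gain k are illustrative assumptions, not the paper's model.

```python
# Hedged sketch: Cournot duopoly with adaptive expectations plus a
# Pyragas-style delayed feedback control (DFC) term. Linear demand
# p = a - b*(q1 + q2) and constant marginal cost c are assumed here.

def best_response(q_other, a=10.0, b=1.0, c=1.0):
    """Cournot best response to the rival's output."""
    return max(0.0, (a - c - b * q_other) / (2 * b))

def simulate(alpha=0.9, k=0.0, steps=400, a=10.0, b=1.0, c=1.0):
    """Both firms adjust adaptively; k is the DFC gain on the one-step delay."""
    q1, q2 = 1.0, 2.0          # arbitrary starting outputs
    prev1, prev2 = q1, q2
    for _ in range(steps):
        n1 = (1 - alpha) * q1 + alpha * best_response(q2, a, b, c) + k * (prev1 - q1)
        n2 = (1 - alpha) * q2 + alpha * best_response(q1, a, b, c) + k * (prev2 - q2)
        prev1, prev2, q1, q2 = q1, q2, n1, n2
    return q1, q2

q_star = (10.0 - 1.0) / (3 * 1.0)   # Nash equilibrium output, (a - c) / (3b)
print(simulate(alpha=0.9, k=0.3), q_star)
```

    With these stable parameter values the controlled and uncontrolled systems both settle at the Nash output; chaos in the paper arises for more aggressive adjustment speeds.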

  1. Homogenization in powder compacts of UO2-PuO2

    International Nuclear Information System (INIS)

    Verma, R.

    1979-01-01

    The homogenization kinetics in mixed UO2-PuO2 compacts have been studied by adopting a concentric core-shell model of diffusion. An equation relating the extent of homogenization, expressed in terms of the fraction of UO2 remaining undissolved, to the annealing time has been derived. From the equation, the periods required at different annealing temperatures to attain a specified level of homogenization have been calculated. These calculated homogenization times have been found to be in fair agreement with the experimentally observed homogenization times. The derived relationship has also been shown to satisfactorily predict homogenization in Cu-Ni powder compacts. (Auth.)
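
    The abstract does not reproduce the derived equation, but the generic scaling behind such annealing-time calculations can be sketched: with an Arrhenius interdiffusion coefficient D(T) = D0·exp(-Q/RT), the time to reach a fixed homogenization level scales as L²/D(T). The values of D0, Q, and L below are illustrative placeholders, not the paper's data.

```python
import math

# Hedged sketch: Arrhenius scaling of homogenization (annealing) time,
# t(T) ~ L^2 / D(T). D0, Q and the diffusion length L are illustrative.

R = 8.314  # gas constant, J/(mol*K)

def diffusivity(T, D0=1e-4, Q=300e3):
    """Interdiffusion coefficient D(T) in m^2/s."""
    return D0 * math.exp(-Q / (R * T))

def homogenization_time(T, L=5e-6, D0=1e-4, Q=300e3):
    """Time (s) for diffusion to span a shell of thickness L at temperature T."""
    return L**2 / diffusivity(T, D0, Q)

for T in (1700.0, 1900.0, 2100.0):
    print(T, homogenization_time(T))
```

    The steep Arrhenius dependence is why modest increases in annealing temperature shorten the required homogenization period by orders of magnitude.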

  2. A Proposed Stochastic Finite Difference Approach Based on Homogenous Chaos Expansion

    Directory of Open Access Journals (Sweden)

    O. H. Galal

    2013-01-01

    Full Text Available. This paper proposes a stochastic finite difference approach based on homogeneous chaos expansion (SFDHC). The approach can handle time-dependent nonlinear as well as linear systems with deterministic or stochastic initial and boundary conditions. In this approach, the included stochastic parameters are modeled as second-order stochastic processes and are expanded using the Karhunen-Loève expansion, while the response function is approximated using a homogeneous chaos expansion. Galerkin projection is used to convert the original stochastic partial differential equation (PDE) into a set of coupled deterministic partial differential equations, which are then solved using the finite difference method. Two well-known equations were used to validate the efficiency of the proposed method: the linear diffusion equation with a stochastic parameter, and the nonlinear Burgers' equation with a stochastic parameter and stochastic initial and boundary conditions. In both of these examples, the probability distribution function of the response manifested close conformity to the results obtained from Monte Carlo simulation, with optimized computational cost.
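
    As a minimal illustration of a homogeneous (Wiener-Hermite) chaos expansion, the sketch below expands the lognormal variable exp(ξ), ξ ~ N(0,1), in probabilists' Hermite polynomials. The exact coefficients c_n = e^{1/2}/n! follow from the Hermite generating function; this one-dimensional toy stands in for the Karhunen-Loève/Galerkin machinery of the paper.

```python
import math

# Hedged sketch: 1-D homogeneous chaos expansion of exp(xi), xi ~ N(0,1).
# Generating function: exp(x*t - t^2/2) = sum He_n(x) t^n / n!, so with t = 1
# exp(x) = e^{1/2} * sum He_n(x) / n!  (coefficients c_n = e^{1/2} / n!).

def hermite_He(n, x):
    """Probabilists' Hermite polynomial He_n(x) via the three-term recurrence."""
    h0, h1 = 1.0, x
    if n == 0:
        return h0
    for k in range(1, n):
        h0, h1 = h1, x * h1 - k * h0
    return h1

def chaos_eval(x, order=12):
    """Truncated chaos expansion of exp(x)."""
    c0 = math.exp(0.5)
    return sum(c0 / math.factorial(n) * hermite_He(n, x) for n in range(order + 1))

print(chaos_eval(1.0), math.exp(1.0))  # truncated series vs exact value
print(math.exp(0.5))                   # mean of exp(xi) is the c_0 coefficient
```

    The mean of the expanded quantity is read off directly as the zeroth chaos coefficient, which is the practical payoff of such expansions.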

  3. Functional homogeneous zones (fHZs) in viticultural zoning procedure: an Italian case study on Aglianico vine

    Science.gov (United States)

    Bonfante, A.; Agrillo, A.; Albrizio, R.; Basile, A.; Buonomo, R.; De Mascellis, R.; Gambuti, A.; Giorio, P.; Guida, G.; Langella, G.; Manna, P.; Minieri, L.; Moio, L.; Siani, T.; Terribile, F.

    2015-06-01

    This paper aims to test a new physically oriented approach to viticulture zoning at the farm scale that is strongly rooted in hydropedology and aims to achieve a better use of environmental features with respect to plant requirements and wine production. The physical basis of our approach lies in the use of soil-plant-atmosphere simulation models, applying physically based equations to describe the soil hydrological processes and to solve the soil-plant water status. This study (part of the ZOVISA project) was conducted on a farm devoted to the production of high-quality wines (Aglianico DOC), located in southern Italy (Campania region, Mirabella Eclano, AV). The soil spatial distribution was obtained from a standard soil survey informed by a geophysical survey. Two homogeneous zones (HZs) were identified; in each one a physically based model was applied to solve the soil water balance and estimate the soil functional behaviour (crop water stress index, CWSI), defining the functional homogeneous zones (fHZs). For the second process, experimental plots were established and monitored to investigate soil-plant water status, crop development (biometric and physiological parameters) and daily climate variables (temperature, solar radiation, rainfall, wind). The effects of crop water status on crop response, must quality and wine quality were then evaluated in the fHZs. This was performed by comparing crop water stress with (i) crop physiological measurements (leaf gas exchange, chlorophyll a fluorescence, leaf water potential, chlorophyll content, leaf area index (LAI) measurement), (ii) grape bunch measurements (berry weight, sugar content, titratable acidity, etc.) and (iii) wine quality (aromatic response). This experiment proved the usefulness of the physically based approach, also in the case of viticulture microzoning mapping.

  4. Homogeneity and Entropy

    Science.gov (United States)

    Tignanelli, H. L.; Vazquez, R. A.; Mostaccio, C.; Gordillo, S.; Plastino, A.

    1990-11-01

    RESUMEN: We present a methodology for the analysis of homogeneity based on Information Theory, applicable to samples of observational data. ABSTRACT: Standard concepts that underlie Information Theory are employed in order to design a methodology that enables one to analyze the homogeneity of a given data sample. Key words: DATA ANALYSIS
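
    A minimal sketch of an entropy-based homogeneity measure in the spirit of this abstract; the binning and normalization choices below are ours, not the authors'.

```python
import math
from collections import Counter

# Hedged sketch: Shannon entropy of a binned sample, normalized by its maximum
# log(K). An index of 1.0 means a perfectly homogeneous (uniform) sample;
# values near 0 mean the sample is concentrated in few bins.

def homogeneity_index(labels):
    counts = Counter(labels)
    n = sum(counts.values())
    k = len(counts)
    if k <= 1:
        return 0.0  # a single occupied bin carries no spread to measure
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(k)

print(homogeneity_index(["a", "b", "c", "d"]))  # uniform sample -> 1.0
print(homogeneity_index(["a"] * 9 + ["b"]))     # skewed sample -> well below 1
```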

  5. Selection of suitable prodrug candidates for in vivo studies via in vitro studies; the correlation of prodrug stability in between cell culture homogenates and human tissue homogenates.

    Science.gov (United States)

    Tsume, Yasuhiro; Amidon, Gordon L

    2012-01-01

    To determine the correlations/discrepancies of drug stability between homogenates of cultured human cells and homogenates of human tissues. Amino acid/dipeptide monoester prodrugs of floxuridine were chosen as the model drugs. The stabilities (half-lives) of floxuridine prodrugs in homogenates of human tissues (pancreas, liver, and small intestine) were obtained and compared with those in cell culture homogenates (AsPC-1, Capan-2, and Caco-2 cells) as well as human liver microsomes. The correlations of prodrug stability in human small bowel tissue homogenate vs. Caco-2 cell homogenate, human liver tissue homogenate vs. human liver microsomes, and human pancreatic tissue homogenate vs. pancreatic cell (AsPC-1 and Capan-2) homogenates were examined. The stabilities of floxuridine prodrugs in human small bowel homogenate correlated strongly with those in Caco-2 cell homogenate (slope = 1.0-1.3, r2 = 0.79-0.98). The stabilities of those prodrugs in human pancreas tissue homogenate also correlated well with those in AsPC-1 and Capan-2 cell homogenates (slope = 0.5-0.8, r2 = 0.58-0.79). However, the correlations of prodrug stabilities between human liver tissue homogenates and human liver microsomes were weaker than the others (slope = 1.3-1.9, r2 = 0.07-0.24). Overall, the comparisons exhibited a wide range of correlations between cell homogenates and human tissue homogenates (r2 = 0.07-0.98). Such in vitro studies in cell homogenates would be good tools for predicting drug stability in vivo and for selecting drug candidates for further development. In this series of experiments, 5'-O-D-valyl-floxuridine and 5'-O-L-phenylalanyl-L-tyrosyl-floxuridine would be selected as candidates for oral targeted drug delivery in cancer chemotherapy owing to their relatively good stability compared with the other tested prodrugs.
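
    The slope/r² figures quoted above are ordinary least-squares statistics on paired half-lives. A minimal sketch with made-up illustrative half-life pairs (not the paper's data):

```python
# Hedged sketch: OLS slope and r^2 for paired half-lives, e.g. Caco-2 cell
# homogenate (x) vs. human small-bowel homogenate (y). Data are illustrative.

def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    slope = sxy / sxx
    r2 = sxy * sxy / (sxx * syy)  # squared Pearson correlation
    return slope, r2

x = [12.0, 25.0, 40.0, 66.0, 90.0]    # half-lives (min), cell homogenate
y = [14.5, 29.0, 49.0, 80.0, 110.0]   # half-lives (min), tissue homogenate
print(ols(x, y))
```

    A slope near 1 with high r², as reported for Caco-2 vs. small bowel, is what makes the cell system a usable surrogate for the tissue.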

  6. Quasi-homogeneous approximation for description of the properties of dispersed systems. The basic approaches to model hardening processes in nanodispersed silica systems. Part 1. Statistical polymer method

    Directory of Open Access Journals (Sweden)

    KUDRYAVTSEV Pavel Gennadievich

    2015-02-01

    Full Text Available. The paper deals with possibilities of using the quasi-homogeneous approximation to describe the properties of dispersed systems. The authors applied the statistical polymer method, based on consideration of the averaged structures of all possible macromolecules of the same weight. Equations which allow evaluating many additive parameters of macromolecules, and of the systems containing them, were derived. The statistical polymer method makes it possible to model branched, cross-linked macromolecules and the systems containing them, in equilibrium or non-equilibrium states. Fractal analysis of a statistical polymer allows modeling of different types of random fractals and other objects examined with the methods of fractal theory. The fractal polymer method can be applied not only to polymers but also to composites, gels, associates in polar liquids and other packed systems. There is also a description of the states of colloidal solutions of silicon dioxide from the point of view of statistical physics. This approach is based on the idea that a colloidal solution of silicon dioxide, a silica sol, consists of an enormous number of interacting particles which are in constant motion. The paper is devoted to the study of an idealized system of colliding but non-interacting sol particles. The behavior of the silica sol was analyzed according to the Maxwell-Boltzmann distribution, and the mean free path was calculated. Using these data, the number of particles which can overcome the potential barrier in a collision was calculated. Different approaches to modeling the kinetics of the sol-gel transition were studied.
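
    The Maxwell-Boltzmann step described above can be sketched as follows: in simple collision theory, the fraction of collisions with energy along the line of centers exceeding a potential barrier E is exp(-E/kBT). The barrier heights used here are illustrative, not the paper's values.

```python
import math

# Hedged sketch: Boltzmann fraction of collisions able to overcome a barrier E.

KB = 1.380649e-23  # Boltzmann constant, J/K

def barrier_fraction(E_joule, T=298.15):
    """Fraction of collisions with line-of-centers energy exceeding E."""
    return math.exp(-E_joule / (KB * T))

for barrier_kT in (1.0, 5.0, 10.0):
    E = barrier_kT * KB * 298.15
    print(barrier_kT, barrier_fraction(E))
```

    The exponential sensitivity to the barrier height is what controls how fast a sol can gel: only the energetic tail of the Maxwell-Boltzmann distribution contributes.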

  7. BPS black holes in a non-homogeneous deformation of the stu model of N=2, D=4 gauged supergravity

    Energy Technology Data Exchange (ETDEWEB)

    Klemm, Dietmar [Dipartimento di Fisica, Università di Milano, and INFN - Sezione di Milano,Via Celoria 16, I-20133 Milano (Italy); Marrani, Alessio [Centro Studi e Ricerche ‘Enrico Fermi’, Via Panisperna 89A, I-00184 Roma (Italy); Dipartimento di Fisica e Astronomia ‘Galileo Galilei’, Università di Padova, and INFN - Sezione di Padova,Via Marzolo 8, I-35131 Padova (Italy); Petri, Nicolò; Santoli, Camilla [Dipartimento di Fisica, Università di Milano, and INFN - Sezione di Milano,Via Celoria 16, I-20133 Milano (Italy)

    2015-09-29

    We consider a deformation of the well-known stu model of N=2, D=4 supergravity, characterized by a non-homogeneous special Kähler manifold, and by the smallest electric-magnetic duality Lie algebra consistent with its upliftability to five dimensions. We explicitly solve the BPS attractor equations and construct static supersymmetric black holes with radial symmetry, in the context of U(1) dyonic Fayet-Iliopoulos gauging, focussing on axion-free solutions. Due to non-homogeneity of the scalar manifold, the model evades the analysis recently given in the literature. The relevant physical properties of the resulting black hole solution are discussed.

  8. Non-homogeneous Markov process models with informative observations with an application to Alzheimer's disease.

    Science.gov (United States)

    Chen, Baojiang; Zhou, Xiao-Hua

    2011-05-01

    Identifying risk factors for transition rates among normal cognition, mild cognitive impairment, dementia, and death in an Alzheimer's disease study is very important. It is known that transition rates among these states are strongly time dependent. While Markov process models are often used to describe these disease progressions, the literature mainly focuses on time-homogeneous processes, and limited tools are available for dealing with non-homogeneity. Further, patients may choose when they want to visit the clinics, which creates informative observations. In this paper, we develop methods to deal with non-homogeneous Markov processes through time-scale transformation when observation times are pre-planned with some observations missing. Maximum likelihood estimation via the EM algorithm is derived for parameter estimation. Simulation studies demonstrate that the proposed method works well under a variety of situations. An application to the Alzheimer's disease study identifies a significant increase in transition rates as a function of time. Furthermore, our models reveal that the non-ignorable missing mechanism is perhaps reasonable. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
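
    A minimal sketch of the time-scale-transformation idea for a two-state non-homogeneous process; the Weibull-type intensity used here is an illustrative choice, not the paper's fitted model.

```python
import math

# Hedged sketch: a two-state process (e.g. alive -> dead) with non-homogeneous
# intensity q(t) = q0 * gamma * t**(gamma - 1). On the operational time scale
# u = t**gamma the process is time-homogeneous with constant rate q0, so the
# probability of still being in state 1 at time t is exp(-q0 * t**gamma).

def intensity(t, q0=0.1, gamma=1.5):
    """Instantaneous transition rate at calendar time t."""
    return q0 * gamma * t ** (gamma - 1)

def survival(t, q0=0.1, gamma=1.5):
    """P(no transition by time t), via the transformed homogeneous clock."""
    return math.exp(-q0 * t ** gamma)

print(intensity(1.0), intensity(2.0))  # gamma > 1: rate increases with time
print(survival(1.0), survival(2.0))
```

    The choice gamma > 1 mirrors the abstract's finding of transition rates that increase with time; gamma = 1 recovers the homogeneous special case.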

  9. Substrate specificity and pH dependence of homogeneous wheat germ acid phosphatase.

    Science.gov (United States)

    Van Etten, R L; Waymack, P P

    1991-08-01

    The broad substrate specificity of a homogeneous isoenzyme of wheat germ acid phosphatase (WGAP) was extensively investigated by chromatographic, electrophoretic, NMR, and kinetic procedures. WGAP exhibited no divalent metal ion requirement and was unaffected upon incubation with EDTA or o-phenanthroline. A comparison of two catalytically homogeneous isoenzymes revealed little difference in substrate specificity. The specificity of WGAP was established by determining the Michaelis constants for a wide variety of substrates. p-Nitrophenyl phosphate, pyrophosphate, tripolyphosphate, and ATP were preferred substrates while lesser activities were seen toward sugar phosphates, trimetaphosphate, phosphoproteins, and (much less) phosphodiesters. An extensive table of Km and Vmax values is given. The pathway for the hydrolysis of trimetaphosphate was examined by colorimetric and 31P NMR methods and it was found that linear tripolyphosphate is not a free intermediate in the enzymatic reaction. In contrast to literature reports, homogeneous wheat germ acid phosphatase exhibits no measurable carboxylesterase activity, nor does it hydrolyze phenyl phosphonothioate esters or phytic acid at significant rates.
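
    The tabulated Km and Vmax values translate into reaction rates through the Michaelis-Menten equation v = Vmax·S/(Km + S). A minimal sketch with illustrative constants (not the paper's measured values for WGAP):

```python
# Hedged sketch: Michaelis-Menten rate as a function of substrate concentration.

def mm_rate(S, Vmax, Km):
    """Initial velocity v = Vmax * S / (Km + S)."""
    return Vmax * S / (Km + S)

Vmax, Km = 100.0, 0.5  # arbitrary units; Km in mM (illustrative)
for S in (0.1, 0.5, 5.0, 50.0):
    print(S, mm_rate(S, Vmax, Km))
```

    At S = Km the rate is exactly half of Vmax, which is how Km values like those tabulated in the paper are read: a lower Km marks a preferred substrate at low concentration.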

  10. Application of nuclear-physical methods for studies in the solid state physics area

    International Nuclear Information System (INIS)

    Gorlachev, I.D.; Knyazev, B.B.; Platov, A.B.

    2004-01-01

    The set of nuclear-physical methods developed at the heavy ion accelerator of the Institute of Nuclear Physics of the National Nuclear Center of the Republic of Kazakhstan allows examination of the elemental content of a sample as well as determination of the element distributions in depth and across the surface. This information can be very important for studying the integral parameters of a wide range of samples and the characteristics of sputtered layers and implanted films. The ion-beam analysis methods in the complex include Rutherford backscattering spectrometry (RBS), nuclear reaction analysis (NRA), and proton-induced X-ray emission analysis (PIXE). In addition, to extend the range of analyzable elements and to increase the precision of the quantitative characterization of the elemental content of samples, X-ray fluorescence analysis with isotope excitation (RFA) is used as a complement to the PIXE method. Modernization of the proton beam transport system at the heavy ion accelerator has allowed the development of a new analytical capability: the combination of a proton microprobe with PIXE analysis. In this case, information about the elemental content of the examined sample is obtained within a region of size ∼10 μm. Scanning the beam across the surface makes it possible to obtain the element distributions in the two spatial coordinates of the surface. This information may be useful when micro-inclusions are present in the sample

  11. Radiochemical analysis of homogeneously solidified low level radioactive waste from nuclear power plants

    International Nuclear Information System (INIS)

    Sato, Kaneaki; Ikeuchi, Yoshihiro; Higuchi, Hideo

    1995-01-01

    As mentioned above, we have reliable radioanalytical methods for all kinds of homogeneously solidified wastes. We are now studying an analytical method for pellets made from evaporator concentrates or resins, and in the near future we intend to establish a new analytical method for radwaste containing metal, cloth, and similar materials. (J.P.N.)

  12. Sewage sludge disintegration by high-pressure homogenization: a sludge disintegration model.

    Science.gov (United States)

    Zhang, Yuxuan; Zhang, Panyue; Ma, Boqiang; Wu, Hao; Zhang, Sheng; Xu, Xin

    2012-01-01

    High-pressure homogenization (HPH) technology was applied as a pretreatment to disintegrate sewage sludge. The effects of homogenization pressure, homogenization cycle number, and total solid content on sludge disintegration were investigated. The sludge disintegration degree (DD(COD)), protein concentration, and polysaccharide concentration increased with increasing homogenization pressure and homogenization cycle number, and decreased with increasing sludge total solid (TS) content. The maximum DD(COD) of 43.94% was achieved at 80 MPa with four homogenization cycles for a 9.58 g/L TS sludge sample. A HPH sludge disintegration model, DD(COD) = k·N^a·P^b, was established by multivariable linear regression to quantify the effects of the homogenization parameters. The homogenization cycle exponent a and homogenization pressure exponent b were 0.4763 and 0.7324 respectively, showing that the effect of homogenization pressure (P) was more significant than that of homogenization cycle number (N). The value of the rate constant k decreased with increasing sludge total solid content. The specific energy consumption increased with sludge disintegration efficiency. Lower specific energy consumption was required for sludge with higher total solid content.
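
    The fitted power-law model can be sketched directly. With noise-free points the exponents fall out of log-ratios; the constant k below is an arbitrary placeholder, while a and b are the exponents reported in the abstract.

```python
import math

# Hedged sketch of the disintegration model DD(COD) = k * N**a * P**b.
# K is an illustrative constant; A and B are the abstract's fitted exponents.

K, A, B = 0.01, 0.4763, 0.7324

def dd(N, P):
    """Disintegration degree after N homogenization cycles at pressure P (MPa)."""
    return K * N ** A * P ** B

# Recover the exponents from log-ratios of noise-free model points:
# vary N at fixed P for a, vary P at fixed N for b.
a = math.log(dd(4, 60) / dd(1, 60)) / math.log(4 / 1)
b = math.log(dd(2, 80) / dd(2, 20)) / math.log(80 / 20)
print(a, b)
```

    Since b > a, doubling the pressure raises DD(COD) more than doubling the cycle count, which is the abstract's conclusion about the relative importance of P and N.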

  13. Genome-Wide Linkage Analysis of Hemodynamic Parameters Under Mental and Physical Stress in Extended Omani Arab Pedigrees : The Oman Family Study

    NARCIS (Netherlands)

    Hassan, Mohammed O.; Jaju, Deepali; Voruganti, V. Saroja; Bayoumi, Riad A.; Albarwani, Sulayma; Al-Yahyaee, Saeed; Aslani, Afshin; Snieder, Harold; Lopez-Alvarenga, Juan C.; Al-Anqoudi, Zahir M.; Alizadeh, Behrooz Z.; Comuzzie, Anthony G.

    Background: We performed a genome-wide scan in a homogeneous Arab population to identify genomic regions linked to blood pressure (BP) and its intermediate phenotypes during mental and physical stress tests. Methods: The Oman Family Study subjects (N = 1277) were recruited from five extended

  14. On the existence of conformal Killing vectors for ST-homogeneous Godel type space-times

    Energy Technology Data Exchange (ETDEWEB)

    Parra, Y.; Patino, A.; Percoco, U. [Laboratorio de Fisica Teorica, Facultad de Ciencias Universidad de los Andes, Merida 5101 (Venezuela); Tsamparlis, M. [seccion de Astronomia-Astrofisica-Mecanica, Universidad de Atenas, Atenas 157 83 (Greece)

    2006-07-01

    Tsamparlis and coauthors have developed a systematic method for computing the conformal algebra of 1+3 space-times. The proper CKVs are found in terms of gradient CKVs of the 3-space. In this paper we apply Tsamparlis' results to the study of CKVs of the ST-homogeneous Godel-type space-times. We find that the only space-time admitting proper CKVs is the ST-homogeneous Godel type with m² = 4ω² (RT). (Author)

  15. Monte Carlo-based validation of the ENDF/MC2-II/SDX cell homogenization path

    International Nuclear Information System (INIS)

    Wade, D.C.

    1979-04-01

    The results are presented of a program of validation of the unit cell homogenization prescriptions and codes used for the analysis of Zero Power Reactor (ZPR) fast breeder reactor critical experiments. The ZPR drawer loading patterns comprise both plate-type and pin-calandria-type unit cells. A prescription is used to convert the three-dimensional physical geometry of the drawer loadings into one-dimensional calculational models. The ETOE-II/MC²-II/SDX code sequence is used to transform ENDF/B basic nuclear data into unit-cell-average broad-group cross sections based on the 1D models. Cell-average, broad-group anisotropic diffusion coefficients are generated using the methods of Benoist or of Gelbard. The resulting broad (approx. 10 to 30) group parameters are used in multigroup diffusion and Sn transport calculations of full-core XY or RZ models which employ smeared atom densities to represent the contents of the unit cells
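
    The cell-averaging step described above follows the standard flux-volume weighting recipe: Sigma_hom = Σ_i φ_i Σ_i V_i / Σ_i φ_i V_i over the regions i of the unit cell. A minimal sketch with illustrative region data (not ZPR values):

```python
# Hedged sketch: flux-volume weighting of region cross sections into a single
# homogenized cell cross section. Region data below are illustrative.

def homogenize(sigmas, fluxes, volumes):
    """Flux-volume-weighted average cross section over cell regions."""
    num = sum(s * f * v for s, f, v in zip(sigmas, fluxes, volumes))
    den = sum(f * v for f, v in zip(fluxes, volumes))
    return num / den

# three-plate cell: fuel, sodium, structure
sigma = [0.35, 0.004, 0.02]   # macroscopic cross sections (1/cm)
phi   = [1.00, 1.20, 1.10]    # region-average fluxes (relative)
vol   = [0.50, 0.30, 0.20]    # region volumes (relative)
print(homogenize(sigma, phi, vol))
```

    The result always lies between the smallest and largest region cross sections; the abstract's point is that this flux weighting, applied to transport cross sections, can misrepresent leakage.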

  16. Monte Carlo-based validation of the ENDF/MC2-II/SDX cell homogenization path

    International Nuclear Information System (INIS)

    Wade, D.C.

    1978-11-01

    The results are summarized of a program of validation of the unit cell homogenization prescriptions and codes used for the analysis of Zero Power Reactor (ZPR) fast breeder reactor critical experiments. The ZPR drawer loading patterns comprise both plate-type and pin-calandria-type unit cells. A prescription is used to convert the three-dimensional physical geometry of the drawer loadings into one-dimensional calculational models. The ETOE-II/MC²-II/SDX code sequence is used to transform ENDF/B basic nuclear data into unit-cell-average broad-group cross sections based on the 1D models. Cell-average, broad-group anisotropic diffusion coefficients are generated using the methods of Benoist or of Gelbard. The resulting broad (approx. 10 to 30) group parameters are used in multigroup diffusion and Sn transport calculations of full-core XY or RZ models which employ smeared atom densities to represent the contents of the unit cells

  17. Fluidic delivery of homogeneous solutions through carbon tube bundles

    International Nuclear Information System (INIS)

    Srikar, R; Yarin, A L; Megaridis, C M

    2009-01-01

    A wide array of technological applications requires localized high-rate delivery of dissolved compounds (in particular, biological ones), which can be achieved by forcing the solutions or suspensions of such compounds through nano or microtubes and their bundled assemblies. Using a water-soluble compound, the fluorescent dye Rhodamine 610 chloride, frequently used as a model drug release compound, it is shown that deposit buildup on the inner walls of the delivery channels and its adverse consequences pose a severe challenge to implementing pressure-driven long-term fluidic delivery through nano and microcapillaries, even in the case of such homogeneous solutions. Pressure-driven delivery (3-6 bar) of homogeneous dye solutions through macroscopically-long (∼1 cm) carbon nano and microtubes with inner diameters in the range 100 nm-1 μm and their bundled parallel assemblies is studied experimentally and theoretically. It is shown that the flow delivery gradually shifts from fast convection-dominated (unobstructed) to slow jammed convection, and ultimately to diffusion-limited transport through a porous deposit. The jamming/clogging phenomena appear to be rather generic: they were observed in a wide concentration range for two fluorescent dyes in carbon nano and microtubes, as well as in comparable transparent glass microcapillaries. The aim of the present work is to study the physics of jamming, rather than the chemical reasons for the affinity of dye molecules to the tube walls.

  18. Homogeneous cosmology with aggressively expanding civilizations

    International Nuclear Information System (INIS)

    Jay Olson, S

    2015-01-01

    In the context of a homogeneous Universe, we note that the appearance of aggressively expanding advanced life is geometrically similar to the process of nucleation and bubble growth in a first-order cosmological phase transition. We exploit this similarity to describe the dynamics of life saturating the Universe on a cosmic scale, adapting the phase transition model to incorporate probability distributions of expansion and resource consumption strategies. Through a series of numerical solutions spanning several orders of magnitude in the input assumption parameters, the resulting cosmological model is used to address basic questions related to the intergalactic spreading of life, dealing with issues such as timescales, observability, competition between strategies, and first-mover advantage. Finally, we examine physical effects on the Universe itself, such as reheating and the backreaction on the evolution of the scale factor, if such life is able to control and convert a significant fraction of the available pressureless matter into radiation. We conclude that the existence of life, if certain advanced technologies are practical, could have a significant influence on the future large-scale evolution of the Universe. (paper)

  19. The generator coordinate method in nuclear physics

    International Nuclear Information System (INIS)

    Giraud, B.G.

    1981-01-01

    The generator coordinate method is introduced as a physical description of a N-body system in a subspace of a reduced number of degrees of freedom. Special attention is placed on the identification of these special, 'collective' degrees of freedom. It is shown in particular that the method has close links with the Born-Oppenheimer approximation and also that considerations of differential geometry are useful in the theory. A set of applications is discussed and in particular the case of nuclear collisions is considered. (Author) [pt

  20. Modelization of physical phenomena in research reactors with the help of new developments in transport methods, and methodology validation with experimental data

    International Nuclear Information System (INIS)

    Rauck, St.

    2000-10-01

    The aim of this work is to develop a calculation scheme for research reactors based on the transport equation. This type of reactor is characterized by a small core, a complex and very heterogeneous geometry, and large leakage. The possible insertion of neutron beam tubes in the reflector and the presence of absorbers in the core add to the difficulty of the 3D geometrical description and of the physical modeling of the reactor's component parameters. The Orphee reactor has been chosen for our study. Physical models (homogenization, cross-section collapsing into few groups, multigroup albedo boundary conditions) have been developed in the APOLLO2 and CRONOS2 codes to calculate flux and power maps in a 3D geometry at different burnups using the transport equation. Comparisons with experimental measurements have shown the value of accounting for anisotropy and steep flux gradients by using Sn methods, and of using a 12-group cross-section library. The neutron beam tubes have been modeled separately from the core, through Monte Carlo calculations on the full geometry, including a large thickness of heavy water. These calculations allow the anti-reactivity of the beam tubes to be evaluated and the core cycle length to be determined. These methods, more accurate than the usual transport-diffusion calculations, will be used in the design of new research reactors. (author)

  1. Combined use of nanocarriers and physical methods for percutaneous penetration enhancement.

    Science.gov (United States)

    Dragicevic, Nina; Maibach, Howard

    2018-02-06

    Dermal and transdermal drug delivery (due to its non-invasiveness, avoidance of the first-pass metabolism, controlling the rate of drug input over a prolonged time, etc.) have gained significant acceptance. Several methods are employed to overcome the permeability barrier of the skin, improving drug penetration into/through skin. Among chemical penetration enhancement methods, nanocarriers have been extensively studied. When applied alone, nanocarriers mostly deliver drugs to skin and can be used to treat skin diseases. To achieve effective transdermal drug delivery, nanocarriers should be applied with physical methods, as they act synergistically in enhancing drug penetration. This review describes combined use of frequently used nanocarriers (liposomes, novel elastic vesicles, lipid-based and polymer-based nanoparticles and dendrimers) with the most efficient physical methods (microneedles, iontophoresis, ultrasound and electroporation) and demonstrates superiority of the combined use of nanocarriers and physical methods in drug penetration enhancement compared to their single use. Copyright © 2018. Published by Elsevier B.V.

  2. Solidification Segregation and Homogenization Behavior of 1Cr-1.25Mo-0.25V Steel Ingot

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong-Bae [Dae-gu Mechatronics and Materials Institute, Daegu (Korea, Republic of); Na, Young-Sang; Seo, Seong-Moon [Korea Institute of Materials Science, Changwon (Korea, Republic of); Lee, Je-Hyun [Changwon National University, Changwon (Korea, Republic of)

    2016-09-15

    As a first step to optimizing the homogenization heat treatment following high temperature upset forging, the solidification segregation and the homogenization behaviors of solute elements were quantitatively analyzed for 1Cr-1.25Mo-0.25V steel ingot by electron probe micro-analysis (EPMA). The random sampling approach, which was designed to generate continuous compositional profiles of each solute element, was employed to clarify the segregation and homogenization behaviors. In addition, ingot castings of lab-scale and a 16-ton-sized 1Cr-1.25Mo-0.25V steel were simulated using the finite element method in three dimensions to understand the size effect of the ingot on the microsegregation and its reduction during the homogenization heat treatment. It was found that the microsegregation in a large-sized ingot was significantly reduced by the promotion of solid state diffusion due to the extremely low cooling rate. On the other hand, from the homogenization point of view, increasing the ingot size causes a dramatic increase in the dendrite arm spacing, and hence the homogenization of microsegregation in a large-sized ingot appears to be practically difficult.

  3. The offset-midpoint traveltime pyramid of P-waves in homogeneous orthorhombic media

    KAUST Repository

    Hao, Qi

    2016-07-18

    The offset-midpoint traveltime pyramid describes the diffraction traveltime of a point diffractor in homogeneous media. We have developed an analytic approximation for the P-wave offset-midpoint traveltime pyramid for homogeneous orthorhombic media. In this approximation, a perturbation method and the Shanks transform were implemented to derive the analytic expressions for the horizontal slowness components of P-waves in orthorhombic media. Numerical examples were shown to analyze the proposed traveltime pyramid formula and determined its accuracy and the application in calculating migration isochrones and reflection traveltime. The proposed offset-midpoint traveltime formula is useful for Kirchhoff prestack time migration and migration velocity analysis for orthorhombic media.
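The Shanks transform used in this approximation is a generic sequence-acceleration device; the minimal sketch below applies it to a textbook slowly converging alternating series rather than to the traveltime expansion itself:

```python
import math

def shanks(seq):
    """Shanks transformation: accelerates convergence of a sequence via
    S_n = (a_{n+1}*a_{n-1} - a_n^2) / (a_{n+1} + a_{n-1} - 2*a_n)."""
    out = []
    for i in range(1, len(seq) - 1):
        num = seq[i + 1] * seq[i - 1] - seq[i] ** 2
        den = seq[i + 1] + seq[i - 1] - 2.0 * seq[i]
        out.append(num / den if den != 0 else seq[i])
    return out

# Partial sums of the alternating harmonic series, which converges to ln 2
# only as O(1/n); two Shanks passes shrink the error by orders of magnitude.
partial = [sum((-1) ** (k + 1) / k for k in range(1, n + 1)) for n in range(1, 12)]
twice = shanks(shanks(partial))
print(abs(partial[-1] - math.log(2)))  # raw truncation error
print(abs(twice[-1] - math.log(2)))    # much smaller after acceleration
```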

  4. The offset-midpoint traveltime pyramid of P-waves in homogeneous orthorhombic media

    KAUST Repository

    Hao, Qi; Stovas, Alexey; Alkhalifah, Tariq Ali

    2016-01-01

    The offset-midpoint traveltime pyramid describes the diffraction traveltime of a point diffractor in homogeneous media. We have developed an analytic approximation for the P-wave offset-midpoint traveltime pyramid for homogeneous orthorhombic media. In this approximation, a perturbation method and the Shanks transform were implemented to derive the analytic expressions for the horizontal slowness components of P-waves in orthorhombic media. Numerical examples were shown to analyze the proposed traveltime pyramid formula and determined its accuracy and the application in calculating migration isochrones and reflection traveltime. The proposed offset-midpoint traveltime formula is useful for Kirchhoff prestack time migration and migration velocity analysis for orthorhombic media.

  5. Inverse operator theory method and its applications in nonlinear physics

    International Nuclear Information System (INIS)

    Fang Jinqing

    1993-01-01

    The inverse operator theory method, developed by G. Adomian in recent years, and its applications in nonlinear physics are described systematically. The method provides a unified and effective procedure for solving nonlinear and/or stochastic continuous dynamical systems without the usual restrictive assumptions. We have implemented it through mathematical mechanization. It will have a profound impact on the modelling of problems in physics, mathematics, engineering, economics, biology, and so on. Some typical examples of its application are given and reviewed

  6. Estimating soil hydrological response by combining precipitation-runoff modeling and hydro-functional soil homogeneous units

    Science.gov (United States)

    Aroca-Jimenez, Estefania; Bodoque, Jose Maria; Diez-Herrero, Andres

    2015-04-01

    Flash floods constitute one of the natural hazards with the greatest capacity to generate risk, particularly to society. The complexity of this process and its dependence on various factors related to the characteristics of the basin and of the rainfall make flash floods difficult to characterize in terms of their hydrological response. A proper analysis of the so-called 'initial abstractions' is therefore essential. Among these processes, infiltration plays a crucial role in explaining the occurrence of floods in mountainous basins. It has been characterized in this work using the Green-Ampt model, which depends on the characteristics of the rainfall and on the physical properties of the soil. This method makes it possible to simulate floods in mountainous basins where the hydrological response is sub-daily. However, it has the disadvantage of relying on soil physical properties that show high spatial variability. To address this difficulty, soil mapping units were delineated according to geomorphological landforms and elements. They represent hydro-functional mapping units that are theoretically homogeneous with respect to the pedostructure parameters of the pedon. The soil texture of each homogeneous group of landform units was therefore studied by granulometric analyses using standardized sieves and Sedigraph devices. In addition, the uncertainty associated with the parameterization of the Green-Ampt method was estimated with a Monte Carlo approach, which required assigning a proper distribution function to each parameter. The suitability of this method was assessed by calibrating and validating a hydrological model in which the runoff hydrograph was simulated using the SCS unit hydrograph (HEC-GeoHMS software), while flood wave routing was characterized using the Muskingum-Cunge method. Calibration and validation of the model relied on an automatic routine based on a search algorithm
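As a rough illustration of the approach described in this record, the sketch below solves the implicit Green-Ampt equation by fixed-point iteration and propagates parameter uncertainty with a simple Monte Carlo loop; the parameter ranges and distributions are hypothetical placeholders, not those calibrated in the study:

```python
import math, random

def green_ampt_F(t, K, psi, dtheta, tol=1e-8):
    """Cumulative infiltration F(t) [cm] from the implicit Green-Ampt
    equation  F - psi*dtheta*ln(1 + F/(psi*dtheta)) = K*t,
    solved by fixed-point iteration (ponded conditions assumed)."""
    a = psi * dtheta
    F = max(K * t, 1e-6)  # initial guess
    for _ in range(200):
        F_new = K * t + a * math.log(1.0 + F / a)
        if abs(F_new - F) < tol:
            break
        F = F_new
    return F

# Hypothetical Monte Carlo over soil-parameter uncertainty: K [cm/h] and
# suction head psi [cm] drawn from plausible ranges for a loam-textured unit.
random.seed(0)
samples = sorted(
    green_ampt_F(t=2.0,
                 K=random.lognormvariate(math.log(1.0), 0.5),  # sat. conductivity
                 psi=random.uniform(5.0, 15.0),                # wetting-front suction
                 dtheta=random.uniform(0.2, 0.4))              # moisture deficit
    for _ in range(1000))
print(samples[len(samples) // 2])  # median 2-hour cumulative infiltration
```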

  7. Numerical computation of homogeneous slope stability.

    Science.gov (United States)

    Xiao, Shuangshuang; Li, Kemin; Ding, Xiaohua; Liu, Tong

    2015-01-01

    To simplify the computation of homogeneous slope stability, improve its accuracy, and find multiple potential slip surfaces of a slope with complex geometry, this study used the limit equilibrium method to derive expressions for the overall and partial factors of safety. The search for the minimum factor of safety (FOS) was recast as a constrained nonlinear programming problem, which was solved with both an exhaustive method (EM) and a particle swarm optimization (PSO) algorithm. In simple slope examples, the computational results using the EM and PSO were close to those obtained using other methods. Compared to the EM, the PSO had a small computational error and a significantly shorter computation time; as a result, the PSO could calculate the slope FOS precisely and efficiently. The multistage slope example indicated that this slope had two potential slip surfaces, with factors of safety of 1.1182 and 1.1560, respectively. The differences between these and the minimum FOS (1.0759) were small, but the positions of the slip surfaces were completely different from that of the critical slip surface (CSS).
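A minimal particle swarm optimizer of the kind applied here can be sketched as follows; the factor-of-safety objective is a toy quadratic stand-in, since the actual limit-equilibrium FOS expressions are not reproduced in this record:

```python
import random

def pso_minimize(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimizer for a box-constrained objective."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# Toy stand-in for a factor-of-safety surface: slip circles parameterized by
# centre coordinates (x, y), with a minimum FOS of 1.08 at (2, 3).
fos = lambda p: 1.08 + 0.05 * ((p[0] - 2.0) ** 2 + (p[1] - 3.0) ** 2)
best, val = pso_minimize(fos, [(0.0, 5.0), (0.0, 5.0)])
print(round(val, 3))  # ≈ 1.08
```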

  8. Japanese Fast Reactor Program for Homogeneous Actinide Recycling

    International Nuclear Information System (INIS)

    Ishikawa, Makoto; Nagata, Takashi; Kondo, Satoru

    2008-01-01

    In the present report, the homogeneous actinide recycling scenario of the Fast Reactor (FR) Cycle Technology Development Project (FaCT) is summarized. First, the nuclear energy policy scenario in Japan is briefly reviewed. Second, Japan's basic plan to manage all minor actinides (MAs) by recycling is summarized; its objectives are more efficient use of uranium resources, reduction of the environmental burden, and enhancement of the nuclear non-proliferation potential. Third, recent results of reactor physics studies related to MA-loaded FR cores are briefly described. Fourth, typical nuclear designs of MA-loaded FR cores in the FaCT project and their main features are presented, demonstrating the feasibility of recycling all MAs in a future FR equilibrium society. Finally, the research and development program to realize MA recycling in Japan is introduced, including international cooperation projects. (authors)

  9. Assessment of the characteristics of MRI coils in terms of RF non-homogeneity using routine spin echo sequences

    International Nuclear Information System (INIS)

    Oghabian, M. A.; Mehdipour, Sh.; RiahicAlam, N.; Rafie, B.; Ghanaati, H.

    2005-01-01

    One of the major causes of image non-uniformity in MRI is non-homogeneity in the RF receive and transmit fields. This can be the most significant source of error in quantitative MRI studies. Part of this non-homogeneity reflects the characteristics of the RF coil, and part is due to the interaction of the RF field with the material being imaged. In this study, the RF field non-homogeneity of surface and volume coils was measured using an oil phantom, employing a routine spin-echo-based sequence as proposed previously by this group. Materials and Methods: For the determination of RF non-uniformity, a method based on a (θ-180) spin echo sequence was used, as reported previously by the same authors. Several images were obtained from one slice using different flip angles while keeping all other imaging parameters constant. The signal intensity in an ROI was then measured in each of these images and fitted to the corresponding mathematical model. Since this model describes the relation between signal intensity and flip angle in a (θ-180) spin echo sequence, the variation in receive and transmit sensitivity can be obtained from the deviation of the signal intensity from its expected value. Since surface coils function only as receivers (RF transmission is done by the body coil), the receive-coil homogeneity was measured first, and then the characteristics of the transmit (body) coil were evaluated. Results: The coefficient of variation (C.V.) found for the T(r) value obtained from head-coil images was on the order of 0.6%. Since the head coil functions as both transmitter and receiver, any non-uniformity in either the transmit or the receive stage can lead to non-homogeneity in the RF field. Apart from the surface coils, the non-homogeneity due to the receive coil was less than that of the transmit coil. In the case of the surface coils the variation in receive
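The flip-angle fitting idea can be sketched with a deliberately simplified signal model S(θ) = A·sin(b·θ), where b captures the local transmit-field scale and A absorbs receive sensitivity; this model is an assumption for illustration, not the full spin-echo expression used by the authors:

```python
import math

def fit_flip_angle_scale(thetas_deg, signals):
    """Least-squares fit of S(theta) = A*sin(b*theta) by grid search over b
    (the transmit scale; b != 1 means the delivered flip angle deviates from
    nominal), with A solved in closed form at each b."""
    best = (None, None, float("inf"))
    for b in [x / 1000.0 for x in range(500, 1501)]:  # scan b in 0.5 .. 1.5
        num = sum(s * math.sin(b * math.radians(t)) for t, s in zip(thetas_deg, signals))
        den = sum(math.sin(b * math.radians(t)) ** 2 for t in thetas_deg)
        A = num / den if den else 0.0
        sse = sum((s - A * math.sin(b * math.radians(t))) ** 2
                  for t, s in zip(thetas_deg, signals))
        if sse < best[2]:
            best = (A, b, sse)
    return best[0], best[1]

# Synthetic data: true receive gain 100, transmit scale 0.9 (the coil delivers
# 90% of the prescribed flip angle at this voxel).
thetas = [10, 20, 30, 45, 60, 75, 90]
data = [100.0 * math.sin(0.9 * math.radians(t)) for t in thetas]
A, b = fit_flip_angle_scale(thetas, data)
print(round(A), round(b, 2))  # 100 0.9
```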

  10. Breakthrough Propulsion Physics Project: Project Management Methods

    Science.gov (United States)

    Millis, Marc G.

    2004-01-01

    To leap past the limitations of existing propulsion, the NASA Breakthrough Propulsion Physics (BPP) Project seeks further advancements in physics from which new propulsion methods can eventually be derived. Three visionary breakthroughs are sought: (1) propulsion that requires no propellant, (2) propulsion that circumvents existing speed limits, and (3) breakthrough methods of energy production to power such devices. Because these propulsion goals are presumably far from fruition, a special emphasis is to identify credible research that will make measurable progress toward these goals in the near-term. The management techniques to address this challenge are presented, with a special emphasis on the process used to review, prioritize, and select research tasks. This selection process includes these key features: (a) research tasks are constrained to only address the immediate unknowns, curious effects or critical issues, (b) reliability of assertions is more important than the implications of the assertions, which includes the practice where the reviewers judge credibility rather than feasibility, and (c) total scores are obtained by multiplying the criteria scores rather than by adding. Lessons learned and revisions planned are discussed.
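The multiplicative scoring rule mentioned at the end of the abstract has a concrete consequence worth making explicit: a near-zero score on any single criterion suppresses the total, whereas additive scoring lets strong criteria mask a fatal weakness. A tiny sketch, with hypothetical 1-5 scores:

```python
from math import prod

def multiplicative_score(criteria):
    """Combine review-criteria scores by multiplication: a low score on any
    single criterion suppresses the total, unlike additive scoring where a
    weak criterion can be offset by strong ones."""
    return prod(criteria)

# Two hypothetical proposals scored 1-5 on three criteria.
solid = [4, 4, 4]    # uniformly good
flashy = [5, 5, 1]   # brilliant but with one fatal weakness
print(multiplicative_score(solid), sum(solid))    # 64 12
print(multiplicative_score(flashy), sum(flashy))  # 25 11
```

Under addition the two proposals are nearly tied (12 vs 11); under multiplication the uniformly credible one wins decisively (64 vs 25), which is exactly the behavior the selection process describes.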

  11. New trends in reactor physics design methods

    International Nuclear Information System (INIS)

    Jagannathan, V.

    1993-01-01

    Reactor physics design methods are aimed at the safe and efficient management of nuclear materials in a reactor core. The design methodologies require a high level of integration of the calculational modules of several key areas, such as neutronics, thermal hydraulics and radiation transport, in order to follow different 3-D phenomena under normal and transient operating conditions. The evolution of computer hardware technology has been far more rapid than software development and has made such integration a meaningful and realizable proposition. The aim of this paper is to assess the state of the art of the physics design codes used in Indian thermal power reactor applications with respect to meeting design, operational and safety requirements. (author). 50 refs

  12. Benchmarking lattice physics data and methods for boiling water reactor analysis

    International Nuclear Information System (INIS)

    Cacciapouti, R.J.; Edenius, M.; Harris, D.R.; Hebert, M.J.; Kapitz, D.M.; Pilat, E.E.; VerPlanck, D.M.

    1983-01-01

    The objective of the work reported was to verify the adequacy of lattice physics modeling for the analysis of the Vermont Yankee BWR using a multigroup, two-dimensional transport theory code. The BWR lattice physics methods have been benchmarked against reactor physics experiments, higher order calculations, and actual operating data

  13. A Synthetic Approach to the Transfer Matrix Method in Classical and Quantum Physics

    Science.gov (United States)

    Pujol, O.; Perez, J. P.

    2007-01-01

    The aim of this paper is to propose a synthetic approach to the transfer matrix method in classical and quantum physics. This method is an efficient tool to deal with complicated physical systems of practical importance in geometrical light or charged particle optics, classical electronics, mechanics, electromagnetics and quantum physics. Teaching…
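As a concrete instance of the method, ray (ABCD) transfer matrices compose by multiplication; the short sketch below locates the image plane of a thin lens, a standard worked example rather than one taken from the paper:

```python
def matmul(a, b):
    """2x2 matrix product."""
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def free_space(d):
    """Ray-transfer matrix for propagation over a distance d."""
    return [[1.0, d], [0.0, 1.0]]

def thin_lens(f):
    """Ray-transfer matrix for a thin lens of focal length f."""
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

# A system is analyzed by multiplying element matrices right-to-left in the
# order the ray meets them: object plane -> 30 cm -> lens (f = 10 cm) -> image.
# The image plane is where the B element of the total matrix vanishes; for an
# object at 30 cm and f = 10 cm, the lens equation puts the image at 15 cm.
M = matmul(free_space(15.0), matmul(thin_lens(10.0), free_space(30.0)))
print(M[0][1])  # B ≈ 0 (image condition satisfied)
print(M[0][0])  # A ≈ -0.5 (transverse magnification)
```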

  14. An overview of currently available methods and future trends for physical activity

    Directory of Open Access Journals (Sweden)

    Alexander Kiško

    2013-12-01

    Full Text Available Background: Although the number of publications on physical activity assessment has increased substantially, methodological limitations make comparison of the various instruments difficult. Systematization of techniques and definitions is therefore essential for improving knowledge in the area. Objective: This paper systematically describes and compares up-to-date methods for assessing habitual physical activity and discusses the main issues regarding the use and interpretation of data collected with these techniques. Methods: A general outline of the measures and techniques is presented in review form, along with their respective definitions, usual applications, strengths and shortcomings. Results and conclusions: The factors to be considered in selecting a physical activity assessment method include the goals, sample size, budget, cultural and social/environmental factors, the physical burden for the subject, and statistical factors such as accuracy and precision. It is concluded that no single current technique can quantify all aspects of physical activity under free-living conditions, so complementary methods must be used. In the not-too-distant future, devices will take advantage of consumer technologies such as mobile phones and GPS receivers, and will detect and respond to physical activity in real time, creating new opportunities in measurement, remote compliance monitoring, data-driven discovery and intervention.

  15. Exercises and problems in mathematical methods of physics

    CERN Document Server

    Cicogna, Giampaolo

    2018-01-01

    This book presents exercises and problems in the mathematical methods of physics with the aim of offering undergraduate students an alternative way to explore and fully understand the mathematical notions on which modern physics is based. The exercises and problems are proposed not in a random order but rather in a sequence that maximizes their educational value. Each section and subsection starts with exercises based on first definitions, followed by groups of problems devoted to intermediate and, subsequently, more elaborate situations. Some of the problems are unavoidably "routine", but others bring to the fore nontrivial properties that are often omitted or barely mentioned in textbooks. There are also problems where the reader is guided to obtain important results that are usually stated in textbooks without complete proofs. In all, some 350 solved problems covering all mathematical notions useful to physics are included. While the book is intended primarily for undergraduate students of physics, students...

  16. Evaluation of effective material properties of spiral wound gasket through homogenization

    International Nuclear Information System (INIS)

    Mathan, G.; Siva Prasad, N.

    2010-01-01

    In this paper, a homogenization methodology is proposed to determine the material properties of spiral wound gaskets (SWGs) using finite element analysis through representative volume elements (RVE) of the gaskets. The constituents of this RVE are described by elasto-plastic material properties. The RVE are subjected to six load cases and the volume averaged responses are analyzed simultaneously to predict the anisotropic properties. The mechanical behaviour is simplified to an orthotropic material model with Hill's plasticity model and the properties are verified with micro-mechanical simulation and experimental results available in the literature. Reasonable agreement is obtained between the results. Formulae for elastic properties are also derived by a simplified analytical method based on lamination theory and compared with those obtained from homogenization.
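The lamination-theory estimates mentioned at the end of the abstract reduce, in the simplest two-phase case, to the classical Voigt and Reuss averages; the constituent moduli below are hypothetical values for illustration, not the gasket properties reported in the paper:

```python
def voigt_reuss(E1, E2, v1):
    """Effective moduli of a two-phase laminate by classical homogenization:
    Voigt (iso-strain, loading parallel to the layers) and Reuss (iso-stress,
    loading normal to the layers). v1 is the volume fraction of phase 1."""
    v2 = 1.0 - v1
    E_parallel = v1 * E1 + v2 * E2        # Voigt upper bound
    E_normal = 1.0 / (v1 / E1 + v2 / E2)  # Reuss lower bound
    return E_parallel, E_normal

# Hypothetical SWG-like constituents: steel winding (200 GPa) and a soft
# filler (5 GPa) at 60% metal volume fraction.
Ep, En = voigt_reuss(200.0, 5.0, 0.6)
print(round(Ep, 1), round(En, 2))  # 122.0 12.05
```

The wide gap between the two bounds is itself the argument for the full RVE-based finite element homogenization used in the paper: a simple mixture rule cannot capture the strongly anisotropic response of the spiral winding.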

  17. Evaluation of effective material properties of spiral wound gasket through homogenization

    Energy Technology Data Exchange (ETDEWEB)

    Mathan, G. [Department of Mechanical Engineering, Indian Institute of Technology Madras, Chennai 600036 (India); Siva Prasad, N., E-mail: siva@iitm.ac.i [Department of Mechanical Engineering, Indian Institute of Technology Madras, Chennai 600036 (India)

    2010-12-15

    In this paper, a homogenization methodology is proposed to determine the material properties of spiral wound gaskets (SWGs) using finite element analysis through representative volume elements (RVE) of the gaskets. The constituents of this RVE are described by elasto-plastic material properties. The RVE are subjected to six load cases and the volume averaged responses are analyzed simultaneously to predict the anisotropic properties. The mechanical behaviour is simplified to an orthotropic material model with Hill's plasticity model and the properties are verified with micro-mechanical simulation and experimental results available in the literature. Reasonable agreement is obtained between the results. Formulae for elastic properties are also derived by a simplified analytical method based on lamination theory and compared with those obtained from homogenization.

  18. Numerical Studies of Homogenization under a Fast Cellular Flow

    KAUST Repository

    Iyer, Gautam

    2012-09-13

    We consider a two-dimensional particle diffusing in the presence of a fast cellular flow confined to a finite domain. If the flow amplitude A is held fixed and the number of cells L² → ∞, then the problem homogenizes; this has been well studied. Also well studied is the limit when L is fixed and A → ∞; in this case the solution averages along stream lines. The double limit as both the flow amplitude A → ∞ and the number of cells L² → ∞ was recently studied [G. Iyer et al., preprint, arXiv:1108.0074]; one observes a sharp transition between the homogenization and averaging regimes occurring at A = L². This paper numerically studies a few theoretically unresolved aspects of this problem when both A and L are large that were left open in [G. Iyer et al., preprint, arXiv:1108.0074], using the numerical method devised in [G. A. Pavliotis, A. M. Stewart, and K. C. Zygalakis, J. Comput. Phys., 228 (2009), pp. 1030-1055]. Our treatment of the numerical method uses recent developments in the theory of modified equations for numerical integrators of stochastic differential equations [K. C. Zygalakis, SIAM J. Sci. Comput., 33 (2011), pp. 102-130]. © 2012 Society for Industrial and Applied Mathematics.
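The underlying stochastic dynamics can be integrated with a basic Euler-Maruyama scheme (the actual study uses a higher-order integrator with modified-equation corrections); this sketch uses the standard cellular stream function ψ = sin x sin y as an assumed example:

```python
import math, random

def simulate_cellular(A=10.0, kappa=1.0, T=1.0, dt=1e-4, seed=2):
    """Euler-Maruyama integration of dX = A*v(X) dt + sqrt(2*kappa) dW,
    where v is the cellular flow with stream function psi = sin(x)*sin(y),
    i.e. v = (d psi/dy, -d psi/dx)."""
    rng = random.Random(seed)
    x, y = 0.5, 0.5
    root = math.sqrt(2.0 * kappa * dt)
    for _ in range(int(T / dt)):
        vx = math.sin(x) * math.cos(y)   #  d psi / dy
        vy = -math.cos(x) * math.sin(y)  # -d psi / dx
        x += A * vx * dt + root * rng.gauss(0.0, 1.0)
        y += A * vy * dt + root * rng.gauss(0.0, 1.0)
    return x, y

# Averaging squared displacement over many such paths gives an effective
# diffusivity; sweeping A and the domain size L would trace out the
# homogenization/averaging transition studied in the paper (a single
# trajectory endpoint is shown here).
print(simulate_cellular())
```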

  19. Numerical Studies of Homogenization under a Fast Cellular Flow

    KAUST Repository

    Iyer, Gautam; Zygalakis, Konstantinos C.

    2012-01-01

    We consider a two-dimensional particle diffusing in the presence of a fast cellular flow confined to a finite domain. If the flow amplitude A is held fixed and the number of cells L² → ∞, then the problem homogenizes; this has been well studied. Also well studied is the limit when L is fixed and A → ∞; in this case the solution averages along stream lines. The double limit as both the flow amplitude A → ∞ and the number of cells L² → ∞ was recently studied [G. Iyer et al., preprint, arXiv:1108.0074]; one observes a sharp transition between the homogenization and averaging regimes occurring at A = L². This paper numerically studies a few theoretically unresolved aspects of this problem when both A and L are large that were left open in [G. Iyer et al., preprint, arXiv:1108.0074], using the numerical method devised in [G. A. Pavliotis, A. M. Stewart, and K. C. Zygalakis, J. Comput. Phys., 228 (2009), pp. 1030-1055]. Our treatment of the numerical method uses recent developments in the theory of modified equations for numerical integrators of stochastic differential equations [K. C. Zygalakis, SIAM J. Sci. Comput., 33 (2011), pp. 102-130]. © 2012 Society for Industrial and Applied Mathematics.

  20. The design of coils for the production of high homogeneous fields; Calcul des bobinages pour la production de champs magnetiques tres homogenes

    Energy Technology Data Exchange (ETDEWEB)

    Desportes, H [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1964-07-01

    The discovery of type II superconductors has considerably increased the possibilities of air-core coils, in particular with regard to the production of highly homogeneous fields. The design of such magnets calls for elaborate calculations which, in practice, can only be carried out on computers. The present report describes a complete set of programs for the calculation, in the case of cylindrical systems, of the magnetic field components at any point, the lines of flux, the forces, the self and mutual inductances, as well as the design of compensated coils for the production of highly homogeneous fields. These programs have been employed for the calculation of two magnets which are described in detail. (author) [French] L'interet des bobines sans fer s'est considerablement accru depuis l'apparition recente des supraconducteurs de la deuxieme espece, en particulier pour la realisation d'aimants a champ tres homogene. Le calcul de tels bobinages fait appel a des methodes complexes dont l'execution pratique necessite l'emploi de machines a calculer. Le present rapport decrit un ensemble de programmes permettant de calculer, dans le cas de systemes de revolution de structure quelconque, le champ dans tout l'espace, les lignes de force du champ, les efforts electromagnetiques, les selfs et mutuelles, et de determiner des enroulements de compensation destines a uniformiser le champ. Ces programmes ont servi au calcul de deux aimants particuliers dont les caracteristiques detaillees sont fournies a titre d'application. (auteur)
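The kind of axial-field computation the report automates can be illustrated with the textbook on-axis formula for a circular loop, together with the Helmholtz compensation that flattens the field at the centre; the coil dimensions below are arbitrary illustrative values, not those of the two magnets described in the report:

```python
import math

MU0 = 4.0e-7 * math.pi  # vacuum permeability [T*m/A]

def loop_axial_field(I, R, z):
    """On-axis magnetic field [T] of a circular loop of radius R [m] carrying
    current I [A], at axial distance z [m] from its centre:
    B = mu0 * I * R^2 / (2 * (R^2 + z^2)^(3/2))."""
    return MU0 * I * R**2 / (2.0 * (R**2 + z**2) ** 1.5)

def helmholtz_field(I, R, z):
    """Axial field of a Helmholtz pair: two coaxial loops at z = +-R/2.
    This spacing cancels the second derivative of B at the centre -- the
    simplest example of the field-homogenizing compensation the report's
    programs compute for general coil systems."""
    return loop_axial_field(I, R, z - R / 2.0) + loop_axial_field(I, R, z + R / 2.0)

# Homogeneity check: for R = 0.1 m and I = 100 A, the field 1 cm off-centre
# differs from the central field by well under 0.1%.
b0 = helmholtz_field(100.0, 0.1, 0.0)
b1 = helmholtz_field(100.0, 0.1, 0.01)
print(abs(b1 - b0) / b0)
```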