WorldWideScience

Sample records for homogenization modelisation numerique

  1. Numerical modeling and experimental validation of a piezoelectric ice protection system; Modelisation numerique et validation experimentale d'un systeme de protection contre le givre par elements piezoelectriques

    Science.gov (United States)

    Harvey, Derek

    De-icing by means of piezoelectric actuators is considered a promising avenue for the development of low-power ice protection systems applicable to light helicopters. This type of system excites the resonance frequencies of a structure to produce deformations large enough to break the adhesion of the ice. However, the design of such systems remains generally poorly understood. This master's project studies the use of numerical methods to assist the design of piezoelectric-based ice protection systems. The methodology adopted was to model various simple structures and to simulate the harmonic excitation of their resonance frequencies by piezoelectric actuators. The computation of the resonance frequencies and the simulation of their excitation were then validated against experimental set-ups. The procedure was carried out for a cantilever beam and for a flat plate using the finite element code Abaqus. In addition, the flat-plate model was used for a parametric study of actuator positioning and of the effect of the plate's stiffness and thickness. Finally, the flat plate was de-iced in a climatic chamber, and de-icing cases were simulated numerically in order to study the possibility of using a strain-based criterion to predict the success of the system. The experimental validation confirmed the ability of the software to compute accurately both the resonance frequencies and mode shapes of a structure and to simulate their excitation by piezoelectric actuators. The study reveals that the definition of damping in the numerical model is essential for obtaining accurate results. The results of the parametric study demonstrated the importance of minimizing thickness and stiffness in order to reduce the resonance frequencies.

  2. New modelling method for fast reactor neutronic behaviours analysis; Nouvelles methodes de modelisation neutronique des reacteurs rapides de quatrieme Generation

    Energy Technology Data Exchange (ETDEWEB)

    Jacquet, P.

    2011-05-23

    The safety criteria governing the development of fourth-generation reactor cores require neutronics simulation tools of unprecedented accuracy. The first part of this report reviews every step of fast-reactor neutronics simulation as implemented in the current reference code, ECCO. Over the range of fast reactors meeting fourth-generation criteria, the ability of the models to describe the self-shielding phenomenon, to represent neutron leakage in a lattice of fuel assemblies, and to produce representative macroscopic cross sections is assessed. The second part of the thesis is dedicated to the modelling of fast reactor cores with steel reflectors, which require the development of advanced condensation and homogenization methods. Several methods are proposed and compared on a typical case: the ZONA2B core of the MASURCA mock-up reactor. (author)

  3. A study of teaching practices related to modeling in science and technology with secondary-school teachers; Etude de pratiques d'enseignement relatives a la modelisation en sciences et technologies avec des enseignants du secondaire

    Science.gov (United States)

    Aurousseau, Emmanuelle

    Models are tools widely used in science and technology (S&T) to represent and explain phenomena that are difficult to access, or even abstract. The modeling process is presented explicitly in the Quebec school curriculum (PFEQ), notably in the second cycle of secondary school (Quebec, Ministere de l'Education, du Loisir et du Sport, 2007a). It is thus one of the seven processes that students and teachers are expected to use. However, numerous studies highlight the difficulty teachers have in structuring their teaching practices around models and the modeling process, even though these are recognized as indispensable. Models indeed help reconcile the concrete and abstract domains between which the scientist, even a budding one, moves back and forth in order to connect the experimental reference field, which is manipulated and observed, with the related theoretical field, which is constructed. The objective of this research is therefore to understand how models and the modeling process help articulate the concrete and the abstract in the teaching of S&T in the second cycle of secondary school. To answer this question, we worked with teachers in a collaborative perspective through focus groups and classroom observation. These devices made it possible to examine the teaching practices that four teachers implement when using models and modeling processes. The analysis of these teaching practices, and of the adjustments the teachers envisage in their own practice, allows us to draw lessons both for research and for teachers' practice regarding the use of models and the modeling process in secondary-school S&T.

  4. Modelling of the dynamical behaviour of LWR internals by homogenization methods

    International Nuclear Information System (INIS)

    Brochard, D.; Lepareux, M.; Gibert, R.J.; Delaigue, D.; Planchard, J.

    1987-01-01

    The upper plenum of PWR internals, the steam generator bundle, and the nuclear reactor core may be schematically represented as a beam bundle immersed in a fluid. The dynamical study of such a system needs to take fluid-structure interaction into account. A refined model at the scale of the individual tubes can be used, but it leads to problems so large that they are difficult to solve even on the biggest computers. The homogenization method provides an approximation of the fluid-structure interaction for the global behaviour of the bundle. It consists in replacing the heterogeneous physical medium (tubes and fluid) by an equivalent homogeneous medium whose characteristics are determined by solving a set of problems on the elementary cell. The aim of this paper is to present the main steps of the determination of this equivalent medium in the case of small displacements (acoustic behaviour of the fluid), using displacement variables for both fluid and tubes. Some details on the implementation of this method in computer codes are then given. (orig.)
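As a toy illustration of the homogenization idea (replacing a heterogeneous tubes-plus-fluid medium by an equivalent homogeneous one), the classical Voigt and Reuss averages bound the effective modulus that the cell problems would deliver. The one-dimensional setting, function names and numerical values below are ours, not the paper's:

```python
# Illustrative sketch: effective stiffness of a periodic two-phase cell.
# The Voigt (uniform-strain) and Reuss (uniform-stress) averages bound the
# true homogenized modulus obtained by solving the cell problem.

def voigt_modulus(e1, e2, f1):
    """Arithmetic (uniform-strain) average; f1 = volume fraction of phase 1."""
    return f1 * e1 + (1.0 - f1) * e2

def reuss_modulus(e1, e2, f1):
    """Harmonic (uniform-stress) average."""
    return 1.0 / (f1 / e1 + (1.0 - f1) / e2)

if __name__ == "__main__":
    e_tube, e_fluid, f_tube = 200.0, 2.0, 0.4   # arbitrary moduli (GPa) and fraction
    ev = voigt_modulus(e_tube, e_fluid, f_tube)
    er = reuss_modulus(e_tube, e_fluid, f_tube)
    print(f"Reuss {er:.2f} <= E_eff <= Voigt {ev:.2f}")
```

The true equivalent medium of the paper comes from solving boundary-value problems on the elementary cell; these two closed-form averages merely bracket that result.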

  5. Thermohydraulic modeling of flow and heat transfer during the reflooding phase of a PWR core

    International Nuclear Information System (INIS)

    Raymond, Patrick

    1978-04-01

    Some generalities about the L.O.C.A. are first recalled, and the French experimental studies on Emergency Core Cooling Systems are briefly described. The different heat transfer mechanisms to take into account, according to the flow pattern in the dry zone, and the correlations or methods used to calculate them, are defined. The thermohydraulic computer code FLIRA, which describes the reflooding phase, and a model taking the different flow patterns into account are then set up. A first interpretation of the ERSEC experiments with a tubular test section shows that this model, together with some classical heat transfer correlations, can describe the reflooding phase. [fr]

  6. Experimental study and modeling of a pulse tube refrigerator

    International Nuclear Information System (INIS)

    Ravex, A.; Rolland, P.; Liang, J.

    1992-01-01

    A test bench for pulse tube refrigerator characterization has been built. In various configurations (basic pulse tube, orifice pulse tube and double-inlet pulse tube), the ultimate temperature and the cooling power have been measured as functions of pressure wave amplitude and frequency for various geometries. A lowest temperature of 28 K has been achieved in a single-stage double-inlet configuration. A model taking into account wall heat pumping, enthalpy flow and regenerator inefficiency is under development. Preliminary calculation results are compared with experimental data.

  7. Thermal tests on UF6 containers and valves: modeling and extrapolation to real fire situations

    International Nuclear Information System (INIS)

    Duret, B.; Warniez, P.

    1988-12-01

    From realistic tests on containers and valves, we propose a model which we apply to three particular problems: the resistance of a 48Y container during a fire; the influence of the presence of a valve; and the evaluation of leakage through a breach mechanically created before a fire.

  8. Development of a regional climate model: FIZR simulation of January conditions on the North American West Coast; Developpement d'un modele climatique regional: FIZR simulation des conditions de janvier de la Cote Ouest nord americaine

    Science.gov (United States)

    Goyette, Stephane

    1995-11-01

    The subject of this thesis is the numerical modeling of regional climate. The main objective is to develop a regional climate model capable of simulating phenomena at the spatial mesoscale. Our study area is the North American West Coast, chosen because of the complexity of its relief and of the control the relief exerts on the climate. The motivations for this study are multiple: on the one hand, we cannot in practice increase the coarse spatial resolution of atmospheric general circulation models (GCMs) without prohibitively increasing the cost of integration; on the other hand, environmental management increasingly demands regional climate data at higher spatial resolution. Until now, GCMs have been the models most valued for their ability to simulate the climate and global climate change. However, fine-scale climate phenomena still escape GCMs because of their coarse spatial resolution, and the socio-economic repercussions of possible climate change are closely tied to phenomena imperceptible to current GCMs. To circumvent some of the problems inherent in resolution, a practical approach is to take a limited spatial domain of a GCM and to nest within it another numerical model with a high-resolution grid. This nesting process then implies a new numerical simulation. This "retro-simulation" is driven within the restricted domain by information supplied by the GCM and forced by mechanisms handled only by the nested model. Thus, in order to refine the spatial precision of large-scale climate predictions, we develop here a numerical model called FIZR, which provides regional climate information valid at the fine spatial scale.

  9. Modelisation mathematique et numerique d'un capteur stockeur d ...

    African Journals Online (AJOL)


  10. A homogenization method applied to the seismic analysis of LMFBR cores

    International Nuclear Information System (INIS)

    Brochard, D.; Hammami, L.

    1991-01-01

    Important structures such as nuclear reactor cores and steam generator bundles are schematically composed of a great number of beams immersed in a fluid. Fluid-structure interaction is an important phenomenon influencing the dynamical response of the bundle. Studying this interaction with classical methods would require a refined model at the scale of the individual beams and lead to problems of considerable size. The homogenization method constitutes an alternative approach when the global behaviour of the bundle is the main interest; similar approaches have already been used for other types of industrial structures (Sanchez-Palencia 1980, Bergman et al. 1985, Theodory 1984, Benner et al. 1981). The method consists in replacing the heterogeneous physical medium by a homogeneous medium whose characteristics are determined by solving a set of problems on the elementary cell. The first part of this paper summarizes the main assumptions of the method. Another important phenomenon may also contribute to the dynamical behaviour of the industrial structures mentioned above: impacts between the beams. These impacts may be due to supports limiting the displacements of the beams or to differences in the vibratory characteristics of the various beams. The second part of the paper concerns the way impacts are taken into account in the linear homogeneous formalism. Finally, an application to the seismic analysis of the RAPSODIE FBR core mock-up is presented.

  11. Modeling of type II supernovae: neutrino transport simulation

    International Nuclear Information System (INIS)

    Mellor, P.

    1988-10-01

    A model of neutrino transport in type II supernovae is presented. The first part is a description of the hydrodynamics and radiative processes responsible for supernova explosions. Macroscopic aspects of these are displayed in part two, where neutrino transport theory and the usual numerical methods are also developed. A new technique for the coherent scattering of neutrinos on nuclei or free nucleons is proposed within the framework of the Lorentz bifluid approximation. This method deals with all the numerical artifices (flux-limiting schemes, closure relations for the Eddington moments) and allows a complete and consistent determination of the time-dependent neutrino distribution function for any value of the opacity and opacity gradient, and for all (relativistic) velocity fields of the diffusive medium. Part three is dedicated to the microscopic phenomena (electron capture, chemical composition, etc.) which govern the neutrino emission-absorption mechanisms. Their numerical treatment is presented, together with some applications useful for their parametrization. Finally, an extension of the method to inelastic scattering on light particles (electrons) is described with a view to studying the neutrino thermalization mechanism. [fr]

  12. Aerodynamic study of a turbulent jet impinging on a concave wall; Etude aerodynamique d'un jet turbulent impactant une paroi concave

    Science.gov (United States)

    LeBlanc, Benoit

    ... three-dimensional. The Reynolds numbers used in the numerical study, based on the diameter of the observed linear jet, are Red = 3333 and 6667, considered to be in transition towards turbulence. In this study, a numerical set-up is built; the mesh, the numerical scheme, the boundary conditions and the discretization are discussed and chosen, and the results are then validated against experimental turbulence data. In numerical modeling of turbulence, Reynolds-Averaged Navier-Stokes (RANS) models have difficulty with unsteady flows in the transitional regime. Large Eddy Simulation (LES) offers a more precise solution, but at a cost still out of reach for this study. The method employed here is Detached Eddy Simulation (DES), a hybrid of the two (RANS and LES). To analyze the topology of the flow, proper orthogonal decomposition (POD) was also performed on the numerical results. The study first showed the relatively high computation time associated with the DES runs needed to keep the Courant number low. The numerical results nevertheless succeeded in correctly reproducing the asynchronous flapping observed in the experiments. The observed flapping seems to be caused by transitional effects, which would explain the difficulty RANS models have in correctly reproducing the aerodynamics of the flow. The jet flow, in turn, is three-dimensional and turbulent most of the time, except for short periods when it is stable and independent of the third dimension. The topological study of the flow also allowed the identification of underlying principal structures that were blurred by the turbulence. Keywords: impinging jet, concave wall, turbulence, transitional, detached eddy simulation (DES), OpenFOAM.

  13. Modeling of the concentration of macromolecules moving in a Newtonian fluid

    International Nuclear Information System (INIS)

    Hijazi, A.; Zoaeter, M.; Khater, A.; Aussere, D.

    1998-01-01

    This article presents a model of the distribution of a dilute solution of macromolecules subjected to a simple flow in the neighborhood of a non-absorbing solid surface. These macromolecules (length L, negligible diameter) are subjected to two kinds of forces, rotational and translational, of Brownian and hydrodynamic origin. The evolution of the orientation of these molecules in time has been studied, given the Einstein relation ⟨x²⟩ = 2Dt, with D the coefficient of translational and rotational diffusion. Taking as parameters the orientation θ of the macromolecules with respect to a horizontal axis and the distance Z between the macromolecules and the surface, a statistical study has made it possible to determine the distribution. For that purpose, the Brownian motion considered is assumed to follow a law of random probability.
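The Einstein relation the abstract invokes, ⟨x²⟩ = 2Dt for one-dimensional diffusion, can be checked with a short Monte Carlo simulation. The diffusion coefficient, times and sample sizes below are arbitrary illustrative choices, not values from the article:

```python
# Monte Carlo check of the Einstein relation <x^2> = 2*D*t for 1-D Brownian
# motion: simulate many independent Gaussian random walks and compare the
# mean square displacement with the analytic value.
import math
import random

def mean_square_displacement(d_coef, t_total, n_steps=100, n_walkers=20000, seed=1):
    """Simulate n_walkers independent Brownian paths and return <x^2> at t_total."""
    rng = random.Random(seed)
    sigma = math.sqrt(2.0 * d_coef * t_total / n_steps)  # std dev of one Gaussian step
    total = 0.0
    for _ in range(n_walkers):
        x = 0.0
        for _ in range(n_steps):
            x += rng.gauss(0.0, sigma)
        total += x * x
    return total / n_walkers

if __name__ == "__main__":
    msd = mean_square_displacement(d_coef=0.5, t_total=1.0)
    print(f"<x^2> = {msd:.3f}, 2*D*t = {2 * 0.5 * 1.0:.3f}")
```

With 20 000 walkers the simulated ⟨x²⟩ agrees with 2Dt to within about one percent, which is the expected statistical accuracy for this sample size.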

  14. Recent numerical developments in aeroelasticity at Dassault Aviation for the design of modern combat aircraft and business jets; Developpements numeriques recents realises en aeroelasticite chez Dassault Aviation pour la conception des avions de combat modernes et des avions d'affaires

    Science.gov (United States)

    2003-03-01

    (Abstract not recoverable from the scanned source. Legible fragments refer to reducing acquisition cost and lead time through advanced modelling and virtual simulation, and to the model's restitution of aeroelastic mode frequencies and dampings measured at 37800 Pa and 60000 Pa, points well below the critical speed.)

  15. Metrological characterization of the numerical system Adonis for gamma spectrometry; Caracterisation metrologique du systeme de spectrometrie gamma numerique Adonis

    Energy Technology Data Exchange (ETDEWEB)

    Plagnard, J.; Morel, J.; Tran Tuan, A

    2005-07-01

    In gamma spectrometry, new acquisition systems based on digital processing of the signals are now available on the market. In order to determine their performance at high count rates, the CEA-LNHB (Commissariat a l'Energie Atomique - Laboratoire National Henri Becquerel) has tested several of these systems. These tests clearly showed that the performances announced by the manufacturers were generally not met. It was therefore interesting to include in these tests the ADONIS system (Atelier de Developpement Numerique pour l'Instrumentation en Spectrometrie), the new gamma spectrometry system developed by the CEA-SIAR (Service d'Instrumentation et d'Application des Rayonnements). (authors)

  16. Direct digital control of furnaces irradiated in nuclear reactors; Surveillance et regulation multiplexee par calculateur numerique de fours irradies

    Energy Technology Data Exchange (ETDEWEB)

    Joumard, R. [Commissariat a l' Energie Atomique, Grenoble (France). Centre d' Etudes Nucleaires

    1969-07-01

    An experimental direct digital control system was built at the C.E.N.G. in order to verify that a computer makes the operation of experiments in nuclear reactors easier, and to identify the theoretical and technical difficulties. The regulation is applied to thermal processes. Sampled-data systems theory permits the choice of a simple and efficient type of digital compensator, and the establishment of a chart giving the values of the correcting parameters (obtained by minimizing the difference between the output and the setpoint when perturbations occur). The program performs supervision and regulation simultaneously, with extensive facilities for printing out measurements and alarms. The computer works out an incremental correction which drives stepping motors, and these motors act on the heating elements. The theoretical values and responses were confirmed; the accuracy was limited essentially by the input quantification (1/1000th). The convenience of such a system was noticeable. (author)
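The incremental correction driving stepping motors that the abstract describes is the classic shape of a velocity-form (incremental) PI compensator: the controller outputs a position increment rather than an absolute command. The sketch below is a generic illustration with made-up gains and a made-up first-order thermal plant, not the compensator of the report:

```python
# Sketch of an incremental (velocity-form) PI compensator: the output is a
# correction *increment*, suitable for driving a stepping motor. Gains,
# sampling period and plant model are illustrative assumptions.

def make_incremental_pi(kp, ki, dt):
    prev_error = 0.0
    def step(setpoint, measurement):
        nonlocal prev_error
        error = setpoint - measurement
        # Velocity form: delta_u depends on the error and its change, so the
        # absolute command is recovered by accumulation (the motor position).
        delta_u = kp * (error - prev_error) + ki * dt * error
        prev_error = error
        return delta_u
    return step

if __name__ == "__main__":
    # Made-up first-order thermal plant: dT/dt = (u - T) / tau
    ctrl = make_incremental_pi(kp=2.0, ki=1.0, dt=0.1)
    temp, u, tau = 0.0, 0.0, 5.0
    for _ in range(600):                      # simulate 60 s
        u += ctrl(100.0, temp)                # accumulate increments (motor position)
        temp += 0.1 * (u - temp) / tau
    print(f"temperature after 60 s: {temp:.1f}")
```

Because the integral action lives in the accumulation of increments, the loop settles with zero steady-state error, which is why this form suits step-motor actuation.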


  18. Study and achievement of a digital-analog-divider; Etude et realisation d'un diviseur-analogique-numerique

    Energy Technology Data Exchange (ETDEWEB)

    Petin, A [Commissariat a l' Energie Atomique, Cadarache (France). Centre d' Etudes Nucleaires

    1969-04-01

    This apparatus is designed to give directly, in digital form, the value of the ratio V1/V2 of two analog voltages. It consists essentially of an analog-to-digital coder operating by successive weighings, in which the comparison voltage is made proportional to the divisor V2. The input dynamics are such that the voltages V1 and V2 may lie anywhere in the range -50 mV to -5 V, and each input has an impedance of about 10 kΩ. The quotient is a binary number delivered in both serial and parallel form; it is made up of 8 bits, giving a range of 1/16 to 16 in steps of 1/16 in the zone of highest accuracy (V2 ≥ 800 mV). The time required for a division is, at best, 15 µs. During the computation, the voltages V1 and V2 must not vary by more than 1 per cent and 0.5 per cent respectively. The theory of the system, the investigation of a block diagram, the study of the circuits and the actual construction are presented. (author)
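The "successive weighings" scheme with a comparison voltage proportional to V2 amounts to an 8-bit successive-approximation search on the ratio. The sketch below uses the abstract's full scale of 16 and step of 1/16; the positive voltages and function names are our simplification, not the device's circuitry:

```python
# Sketch of the successive-approximation ("successive weighings") divider:
# each trial bit is kept if trial_code * LSB * V2 <= V1, i.e. the comparison
# voltage is proportional to the divisor V2. Full scale 16, step 1/16, as in
# the abstract; positive voltages assumed for clarity.

def sar_divide(v1, v2, bits=8, full_scale=16.0):
    """Return the 8-bit code and the quantized value of v1/v2 (step = 1/16)."""
    lsb = full_scale / (1 << bits)            # 16 / 256 = 1/16
    code = 0
    for bit in range(bits - 1, -1, -1):       # weigh from MSB down to LSB
        trial = code | (1 << bit)
        if trial * lsb * v2 <= v1:            # comparator test against V1
            code = trial                      # keep the bit
    return code, code * lsb

if __name__ == "__main__":
    code, ratio = sar_divide(v1=3.0, v2=1.0)
    print(code, ratio)                        # quantized ratio in steps of 1/16
```

Each of the 8 weighings halves the remaining uncertainty, which is why the hardware needs only 8 comparator decisions (about 15 µs in the real device) per division.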

  19. Calculation of reactivity by digital processing; Calcul de la reactivite par traitement numerique

    Energy Technology Data Exchange (ETDEWEB)

    Hedde, J. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1968-12-01

    With a view to exploring the new possibilities offered by digital techniques, the optimum theoretical conditions for a real-time computation of reactivity from counting samples (obtained from a nuclear reactor) are described. The degree to which these optimum conditions can be attained depends on the complexity of the processing that can be accepted; a compromise thus has to be made between the accuracy required and the simplicity of the equipment carrying out the processing. An example using a relatively simple structure gives an idea of the accuracy of the results obtained over a wide range of reactor power. (author)
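Real-time reactivity from count-rate samples, as discussed in this report, is conventionally computed by inverse point kinetics. The sketch below uses a single delayed-neutron group with illustrative constants; it is not the report's algorithm or data:

```python
# Sketch of inverse point kinetics with one delayed-neutron group:
# rho(t) = Lambda * (dn/dt)/n + beta * (1 - precursor_term / n),
# where the precursor term assumes equilibrium at t = 0. All constants
# are illustrative assumptions, not values from the report.
import math

BETA, LAMBDA, GEN_TIME = 0.0065, 0.08, 1e-4   # illustrative kinetics parameters

def reactivity_in_dollars(counts, dt):
    """counts: equally spaced count-rate samples n(t); returns rho/beta at the end."""
    n = counts
    t_end = (len(n) - 1) * dt
    # Exponentially weighted history integral, trapezoid rule
    integral = 0.0
    for i, ni in enumerate(n):
        w = 0.5 if i in (0, len(n) - 1) else 1.0
        integral += w * ni * math.exp(-LAMBDA * (t_end - i * dt)) * dt
    precursor = n[0] * math.exp(-LAMBDA * t_end) + LAMBDA * integral
    dndt = (n[-1] - n[-2]) / dt               # backward difference on the last sample
    rho = GEN_TIME * dndt / n[-1] + BETA * (1.0 - precursor / n[-1])
    return rho / BETA

if __name__ == "__main__":
    flat = [100.0] * 101                      # steady count rate -> zero reactivity
    print(f"{reactivity_in_dollars(flat, 0.1):.5f} $")
```

For a constant count rate the precursor term cancels exactly and the computed reactivity is zero; for a rising count rate the lagged precursor term yields a positive value, as expected.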


  1. Numerical modeling of the thermomechanical behavior of networks of underground galleries for the storage of the radioactive waste: approach by homogenization; Modelisation numerique du comportement thermomecanique de reseaux de galeries souterraines pour le stockage des dechets radioactifs: Approche par homogeneisation

    Energy Technology Data Exchange (ETDEWEB)

    Zokimila, P

    2005-10-15

    Deep geological disposal is one of the privileged options for the storage of High Level radioactive waste. A good knowledge of the behavior and properties of the potential geological formations as well as theirs evolution in time under the effect of the stress change induced by a possible installation of storage is required. The geological formation host will be subjected to mechanical and thermal solicitations due respectively to the excavation of the disposal tunnels and the release of heat of the canisters of radioactive waste. These thermomechanical solicitations will generate a stress relief in the host layer and disposal tunnels deformations as well as the extension of the damaged zones (EDZ) could cause local and global instabilities. This work aims to develop calculation methods to optimize numerical modeling of the thermoelastic behavior of the disposal at a large scale and to evaluate thermomechanical disturbance induced by storage on the geological formation host. Accordingly, after a presentation of the state of knowledge on the thermomechanical aspects of the rocks related to deep storage, of numerical modeling 2D and 3D of the thermoelastic behavior of individual disposal tunnel and a network of tunnels were carried out by a discrete approach. However, this classical approach is penalizing to study the global behavior of disposal storage. To mitigate that, an approach of numerical modeling, based on homogenization of periodic structures, was proposed. Formulations as numerical procedures were worked out to calculate the effective thermoelastic behavior of an equivalent heterogeneous structure. The model, obtained by this method, was validated with existing methods of homogenization such as the self-consistent model, as well as the Hashin-Shtrikman bounds. The comparison between the effective thermoelastic behavior and current thermoelastic behavior of reference showed a good coherence of the results. 
For an application to deep geological storage, the
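
    The Hashin-Shtrikman bounds used above to validate the effective behavior have a closed form for a two-phase isotropic composite. A minimal sketch of that check, with illustrative moduli (not values from the report; `hs_bulk_bounds` is a hypothetical helper name):

    ```python
    def hs_bulk_bounds(K1, G1, K2, G2, f1):
        """Hashin-Shtrikman bounds on the effective bulk modulus of a
        two-phase isotropic composite; phase 1 is the compliant phase
        (K1 <= K2, G1 <= G2) and f1 its volume fraction."""
        f2 = 1.0 - f1
        # lower bound: compliant phase taken as the comparison medium
        lower = K1 + f2 / (1.0 / (K2 - K1) + 3.0 * f1 / (3.0 * K1 + 4.0 * G1))
        # upper bound: stiff phase taken as the comparison medium
        upper = K2 + f1 / (1.0 / (K1 - K2) + 3.0 * f2 / (3.0 * K2 + 4.0 * G2))
        return lower, upper

    # illustrative moduli in GPa
    lo, hi = hs_bulk_bounds(K1=10.0, G1=5.0, K2=50.0, G2=30.0, f1=0.5)
    ```

    An effective bulk modulus produced by a homogenization procedure should fall between `lo` and `hi`; a value outside the bounds signals an inconsistency in the effective model.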

  2. Computation of 2D compressible flows with a finite element method

    International Nuclear Information System (INIS)

    Montagne, J.L.

    1981-04-01

    When the homogeneous modeling of two-phase flow is used, the set of equations describing the flow is similar to an Euler system. Mixed finite elements are appropriate for discretizing these equations. First, the main properties of this kind of element are recalled. Then, some stability and entropy properties of semi-implicit schemes are given. Numerical tests have been performed, and the scheme gave satisfactory results.

  3. Study and achievement of a digital-analog-divider; Etude et realisation d'un diviseur-analogique-numerique

    Energy Technology Data Exchange (ETDEWEB)

    Petin, A. [Commissariat a l' Energie Atomique, Cadarache (France). Centre d' Etudes Nucleaires

    1969-04-01

    This apparatus is designed to give directly, in digital form, the value of the ratio V1/V2 of two analog voltages. It consists essentially of an analog-digital coder operating by successive weighings, in which the comparison voltage is made proportional to the divider V2. The input dynamics are such that the voltages V1 and V2 can lie anywhere in the range -50 mV to -5 V. Each of the channels has an input impedance of about 10 kΩ. The quotient is a binary number delivered in serial and parallel form; it is made up of 8 bits, giving a range of 1/16 to 16 in steps of 1/16 in the zone of highest accuracy (V2 ≥ 800 mV). The time required for a division is, at best, 15 µs. During the computation time, the voltages V1 and V2 must not vary by more than 1 per cent and 0.5 per cent, respectively. The theory of the system, the investigation of a synoptic diagram, the study of the circuits and the actual construction are presented. (author)
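
    The successive-weighing principle with a comparison voltage proportional to V2 can be sketched in software. This is an illustration of the coding logic only, not a model of the analog circuitry; the 8-bit quotient q encodes the ratio as q/16, matching the 1/16-to-16 range stated in the record:

    ```python
    def sar_quotient(v1, v2, bits=8):
        """Successive-approximation coding of v1 against a reference
        proportional to v2: each trial weight is compared after scaling
        by the divider, so the accepted code q satisfies q/16 ~= v1/v2."""
        lsb = v2 / 16.0          # reference LSB made proportional to the divider
        q = 0
        for k in reversed(range(bits)):   # weigh from MSB down to LSB
            trial = q | (1 << k)
            if trial * lsb <= v1:         # keep the weight if it is not too heavy
                q = trial
        return q                          # binary quotient, value q/16

    q = sar_quotient(2.0, 0.8)   # ratio 2.5 encodes as 40 (= 2.5 * 16)
    ```

    Because the reference scales with V2, the same weighing hardware that digitizes a single voltage delivers the quotient directly, which is the point of the design described above.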

  4. Modeling of acoustic wave propagation and scattering for telemetry of complex structures; Modelisation de la propagation et de l'interaction d'une onde acoustique pour la telemetrie de structures complexes

    Energy Technology Data Exchange (ETDEWEB)

    LU, B.

    2011-11-07

    mean homogeneous medium by modifying the travel times of the homogeneous rays through a correction provided by the stochastic model. The stochastic propagation model thus developed was validated by comparison with a deterministic model; it proves much simpler to implement within the CIVA non-destructive testing simulation software platform and less expensive in computation time than the deterministic model. To model the acoustic wave/target interaction, classical diffraction models were evaluated for rigid structures, among them the geometrical theory of diffraction (GTD) and the Kirchhoff approximation (KA), two approaches that turn out to be complementary. By combining them so as to retain only their advantages, we developed a hybrid model (refined KA) using a procedure similar to the physical theory of diffraction (PTD). The refined KA model improves the near-field prediction for a rigid target. The initial (non-refined) KA diffraction model was then extended to handle a realistic target of finite impedance. The resulting 'general' KA model proves to be a satisfactory solution for telemetry applications. Finally, coupling the stochastic propagation model with the general KA diffraction model allowed us to build a complete simulation tool for telemetry in an inhomogeneous medium. (author)

  5. The relationship between continuum homogeneity and statistical homogeneity in cosmology

    International Nuclear Information System (INIS)

    Stoeger, W.R.; Ellis, G.F.R.; Hellaby, C.

    1987-01-01

    Although the standard Friedmann-Lemaitre-Robertson-Walker (FLRW) Universe models are based on the concept that the Universe is spatially homogeneous, up to the present time no definition of this concept has been proposed that could in principle be tested by observation. Such a definition is here proposed, based on a simple spatial averaging procedure, which relates observable properties of the Universe to the continuum homogeneity idea that underlies the FLRW models. It turns out that the statistical homogeneity often used to describe the distribution of matter on a large scale does not imply spatial homogeneity according to this definition, and so cannot be simply related to a FLRW Universe model. Values are proposed for the homogeneity parameter and length scale of homogeneity of the Universe. (author)

  6. Mechanical Homogenization Increases Bacterial Homogeneity in Sputum

    Science.gov (United States)

    Stokell, Joshua R.; Khan, Ammad

    2014-01-01

    Sputum obtained from patients with cystic fibrosis (CF) is highly viscous and often heterogeneous in bacterial distribution. Adding dithiothreitol (DTT) is the standard method for liquefaction prior to processing sputum for molecular detection assays. To determine if DTT treatment homogenizes the bacterial distribution within sputum, we measured the difference in mean total bacterial abundance and abundance of Burkholderia multivorans between aliquots of DTT-treated sputum samples with and without a mechanical homogenization (MH) step using a high-speed dispersing element. Additionally, we measured the effect of MH on bacterial abundance. We found a significant difference between the mean bacterial abundances in aliquots that were subjected to only DTT treatment and those of the aliquots which included an MH step (all bacteria, P = 0.04; B. multivorans, P = 0.05). There was no significant effect of MH on bacterial abundance in sputum. Although our results are from a single CF patient, they indicate that mechanical homogenization increases the homogeneity of bacteria in sputum. PMID:24759710

  7. Homogenization versus homogenization-free method to measure muscle glycogen fractions.

    Science.gov (United States)

    Mojibi, N; Rasouli, M

    2016-12-01

    Glycogen is extracted from animal tissues with or without homogenization, using cold perchloric acid. Three methods were compared for the determination of glycogen in rat muscle at different physiological states. Two groups of five rats were kept at rest or subjected to 45 minutes of muscular activity. The glycogen fractions were extracted and measured using the three methods. The data from the homogenization method show that total glycogen decreased following 45 min of physical activity and that the change occurred entirely in the acid-soluble glycogen (ASG), while the acid-insoluble glycogen (AIG) did not change significantly. Similar results were obtained using the "total glycogen fractionation" method. The findings of the "homogenization-free" method indicate that the acid-insoluble fraction (AIG) was the main portion of muscle glycogen and that the majority of changes occurred in the AIG fraction. The results of the "homogenization" method are identical to those of "total glycogen fractionation", but differ from the "homogenization-free" protocol. The ASG fraction is the major portion of muscle glycogen and is the more metabolically active form.

  8. 7 CFR 58.920 - Homogenization.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Homogenization. 58.920 Section 58.920 Agriculture... Procedures § 58.920 Homogenization. Where applicable concentrated products shall be homogenized for the... homogenization and the pressure at which homogenization is accomplished will be that which accomplishes the most...

  9. Numerical modelling of coupled phenomena within molten glass heated by induction and mechanically stirred; Modelisation numerique de phenomenes couples dans des bains de verre brasses mecaniquement et elabores en creuset froid inductif

    Energy Technology Data Exchange (ETDEWEB)

    Jacoutot, L

    2006-11-15

    This study reports on a new vitrification process developed by the French Atomic Energy Commission (CEA, Marcoule). This process is used for the treatment of high activity nuclear waste. It is characterized by the cooling of all the metal walls and by currents directly induced inside the molten glass. In addition, a mechanical stirring device is used to homogenize the molten glass. The goal of this study is to develop numerical tools to understand phenomena which take place within the bath and which involve thermal, hydrodynamic and electromagnetic aspects. The numerical studies are validated using experimental results obtained from pilot vitrification facilities. (author)

  10. Homogenization of Mammalian Cells.

    Science.gov (United States)

    de Araújo, Mariana E G; Lamberti, Giorgia; Huber, Lukas A

    2015-11-02

    Homogenization is the name given to the methodological steps necessary for releasing organelles and other cellular constituents as a free suspension of intact individual components. Most homogenization procedures used for mammalian cells (e.g., cavitation pump and Dounce homogenizer) rely on mechanical force to break the plasma membrane and may be supplemented with osmotic or temperature alterations to facilitate membrane disruption. In this protocol, we describe a syringe-based homogenization method that does not require specialized equipment, is easy to handle, and gives reproducible results. The method may be adapted for cells that require hypotonic shock before homogenization. We routinely use it as part of our workflow to isolate endocytic organelles from mammalian cells. © 2015 Cold Spring Harbor Laboratory Press.

  11. Functionality and homogeneity.

    NARCIS (Netherlands)

    2011-01-01

    Functionality and homogeneity are two of the five Sustainable Safety principles. The functionality principle aims for roads to have but one exclusive function and distinguishes between traffic function (flow) and access function (residence). The homogeneity principle aims at differences in mass,

  12. Homogenization of resonant chiral metamaterials

    DEFF Research Database (Denmark)

    Andryieuski, Andrei; Menzel, C.; Rockstuhl, Carsten

    2010-01-01

    Homogenization of metamaterials is a crucial issue as it allows to describe their optical response in terms of effective wave parameters as, e.g., propagation constants. In this paper we consider the possible homogenization of chiral metamaterials. We show that for meta-atoms of a certain size...... an analytical criterion for performing the homogenization and a tool to predict the homogenization limit. We show that strong coupling between meta-atoms of chiral metamaterials may prevent their homogenization at all....

  13. Homogeneous crystal nucleation in polymers.

    Science.gov (United States)

    Schick, C; Androsch, R; Schmelzer, J W P

    2017-11-15

    The pathway of crystal nucleation significantly influences the structure and properties of semi-crystalline polymers. Crystal nucleation is normally heterogeneous at low supercooling, and homogeneous at high supercooling, of the polymer melt. Homogeneous nucleation in bulk polymers has been, so far, hardly accessible experimentally, and was even doubted to occur at all. This topical review summarizes experimental findings on homogeneous crystal nucleation in polymers. Recently developed fast scanning calorimetry, with cooling and heating rates up to 10^6 K s^-1, allows for detailed investigations of nucleation near and even below the glass transition temperature, including analysis of nuclei stability. As for other materials, the maximum homogeneous nucleation rate for polymers is located close to the glass transition temperature. In the experiments discussed here, it is shown that polymer nucleation is homogeneous at such temperatures. Homogeneous nucleation in polymers is discussed in the framework of the classical nucleation theory. The majority of our observations are consistent with the theory. The discrepancies may guide further research, particularly experiments to progress theoretical development. Progress in the understanding of homogeneous nucleation is much needed, since most of the modelling approaches dealing with polymer crystallization exclusively consider homogeneous nucleation. This is also the basis for advancing theoretical approaches to the much more complex phenomena governing heterogeneous nucleation.
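
    The classical-nucleation-theory picture sketched above, a thermodynamic barrier that vanishes near the melting point combined with kinetics that freeze out on cooling, can be illustrated numerically. All parameter values below are invented for illustration and are not data from the review; `sigma`, `dHv`, `Ed` and `J0` are assumed placeholders:

    ```python
    import math

    kB = 1.380649e-23  # Boltzmann constant, J/K

    def homogeneous_nucleation_rate(T, Tm=500.0, sigma=0.02, dHv=1.0e8,
                                    Ed=1.0e-19, J0=1.0e35):
        """Classical nucleation theory sketch: steady-state rate
        J = J0 * exp(-(dG_star + Ed) / (kB * T)), with barrier
        dG_star = 16*pi*sigma^3 / (3*dg^2) and Turnbull driving
        force dg = dHv * (Tm - T) / Tm. Illustrative parameters only."""
        dT = Tm - T
        if dT <= 0.0:
            return 0.0                      # no driving force above the melting point
        dg = dHv * dT / Tm                  # driving force per unit volume, J/m^3
        dG_star = 16.0 * math.pi * sigma**3 / (3.0 * dg**2)
        return J0 * math.exp(-(dG_star + Ed) / (kB * T))

    # the rate peaks at intermediate supercooling, as described above
    rates = {T: homogeneous_nucleation_rate(T) for T in (150.0, 280.0, 480.0)}
    ```

    With these made-up numbers the rate at 280 K exceeds the rates at both 150 K and 480 K, reproducing the qualitative maximum at deep supercooling that the review reports near the glass transition.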

  14. Numerical modeling of transferred arc melting bath heating; Modelisation numerique du chauffage de bains par arc transfere

    Energy Technology Data Exchange (ETDEWEB)

    Bouvier, A. [Electricite de France, 77 - Moret sur Loing (France). Direction des Etudes et Recherches; Trenty, L.; Guillot, J.B. [Ecole Centrale de Paris, Laboratoire EM2C. CNRS, 92 - Chatenay-Malabry (France); Delalondre, C. [Electricite de France (EDF), 78 - Chatou (France). Direction des Etudes et Recherches

    1997-12-31

    This paper presents the modeling of a transferred electric arc inside a bath of molten metal. After a recall of the context of the study, the modeling problem, which involves magnetohydrodynamic coupling between the arc and the bath, is described. The equations that govern the phenomena inside the arc and the bath are recalled, and the approach used for modeling the anode region of the arc is explained using a 1-D sub-model. The conditions for coupling the arc and bath calculations are explained, and calculation results obtained with a 200 kW laboratory furnace geometry are presented. (J.S.) 8 refs.

  15. Feasibility Study of Aseptic Homogenization: Affecting Homogenization Steps on Quality of Sterilized Coconut Milk

    Directory of Open Access Journals (Sweden)

    Phungamngoen Chanthima

    2016-01-01

    Full Text Available Coconut milk is one of the most important protein-rich food sources available today. Separation of the emulsion into an aqueous phase and a cream phase commonly occurs, and this leads to an unacceptable physical defect in both fresh and processed coconut milk. Since homogenization steps are known to affect the stability of coconut milk, this work aimed to study their effect on its quality. The samples were subjected to high-speed homogenization in the range of 5000-15000 rpm and sterilized at temperatures of 120-140 °C for 15 min. The results showed that emulsion stability increases with increasing homogenization speed: smaller fat particles were generated, which dispersed easily in the continuous phase and led to high stability. On the other hand, the stability of coconut milk decreased, fat globules enlarged, the L value decreased and the b value increased when a high sterilization temperature was applied. Homogenization after heating led to higher stability than homogenization before heating, because it reduced the particle size of coconut milk after aggregation during the sterilization process. The results imply that homogenization after the sterilization process may play an important role in the quality of sterilized coconut milk.

  16. 7 CFR 58.636 - Homogenization.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Homogenization. 58.636 Section 58.636 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards... Procedures § 58.636 Homogenization. Homogenization of the pasteurized mix shall be accomplished to...

  17. Benchmarking monthly homogenization algorithms

    Science.gov (United States)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data
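
    Two of the performance metrics named above, the centered root mean square error and the error in linear trend estimates, are straightforward to compute once a homogenized series is aligned with its true counterpart. A plain-Python sketch (not code from the HOME benchmark):

    ```python
    def centered_rmse(homogenized, truth):
        """Root mean square error after removing the mean difference,
        so a constant offset between the series does not count."""
        n = len(homogenized)
        diff = [h - t for h, t in zip(homogenized, truth)]
        bias = sum(diff) / n
        return (sum((d - bias) ** 2 for d in diff) / n) ** 0.5

    def linear_trend(series):
        """Ordinary least-squares slope per time step."""
        n = len(series)
        xm = (n - 1) / 2.0
        ym = sum(series) / n
        num = sum((i - xm) * (y - ym) for i, y in enumerate(series))
        den = sum((i - xm) ** 2 for i in range(n))
        return num / den

    truth = [0.0, 0.1, 0.2, 0.3, 0.4]
    hom = [0.5, 0.6, 0.7, 0.8, 0.9]      # perfect shape, constant offset
    crmse = centered_rmse(hom, truth)     # offset is ignored, error is zero
    trend_err = linear_trend(hom) - linear_trend(truth)
    ```

    Centering matters because a homogenization algorithm cannot recover the absolute level of a series, only its shape; the trend error captures exactly the quantity climate-trend studies care about.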

  18. Value distribution of meromorphic solutions of homogeneous and non-homogeneous complex linear differential-difference equations

    Directory of Open Access Journals (Sweden)

    Luo Li-Qin

    2016-01-01

    Full Text Available In this paper, we investigate the value distribution of meromorphic solutions of homogeneous and non-homogeneous complex linear differential-difference equations, and obtain results on the relations between the order of the solutions and the convergence exponents of their zeros, poles, a-points and small-function value points. These results show that the relations are sharper in the non-homogeneous case than in the homogeneous one.

  19. Study of problems arising from the use of thermal neutron detectors in a pulsed regime. Application to the development of a digital transferometer adapted to receive signals from these detectors; Etude des problemes poses par l'utilisation des detecteurs de neutrons thermiques fonctionnant en regime impulsionnel. Application a la realisation d'un transferometre numerique adapte aux signaux fournis par ces detecteurs

    Energy Technology Data Exchange (ETDEWEB)

    Le Tilly, Y [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1966-12-01

    The instantaneous value of the counting rate of the pulses given by a fission detector placed in a reactor follows the neutron flux, but it is shown that the detector adds a white noise to the measured signal. This report deals with some possibilities of on-line numerical processing afforded by this kind of signal. The influence of a divide-by-N numerical divider is considered first, and it is shown that, acting like a quantizer, it adds to the signal a white noise with power N²/12. The principle of a digital filter intended to perform the Fourier analysis of the signal is then studied, and the construction of this device is described. It can be used for transfer-function measurements at frequencies below 125 kHz. Some examples of experiments performed with this apparatus are presented. Finally, the design, on the same principle, of a power spectral density analyser for random signals of the same kind in the frequency range 0.01 - 10 000 Hz is discussed. (author)
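
    The N²/12 figure for the noise added by the divide-by-N stage is the variance of a quantization residual that is uniform over N states; the exact value is (N² - 1)/12, which the report rounds to N²/12 for large N. A small numerical check (not code from the report):

    ```python
    def residual_variance(N, counts):
        """Variance of the pulse count held back by a divide-by-N stage;
        when the counts sweep all residue classes uniformly, this is the
        quantization noise power of the divider, (N**2 - 1) / 12."""
        residues = [c % N for c in counts]
        mean = sum(residues) / len(residues)
        return sum((r - mean) ** 2 for r in residues) / len(residues)

    N = 16
    var = residual_variance(N, range(16000))   # 1000 full cycles of residues
    # var == (N**2 - 1) / 12 == 21.25, close to N**2 / 12 ~ 21.33
    ```

    The residual behaves like a uniform quantization error, which is why the divider's contribution appears as white noise added to the counting-rate signal.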

  20. The SPH homogeneization method

    International Nuclear Information System (INIS)

    Kavenoky, Alain

    1978-01-01

    The homogeneization of a uniform lattice is a rather well understood topic while difficult problems arise if the lattice becomes irregular. The SPH homogeneization method is an attempt to generate homogeneized cross sections for an irregular lattice. Section 1 summarizes the treatment of an isolated cylindrical cell with an entering surface current (in one velocity theory); Section 2 is devoted to the extension of the SPH method to assembly problems. Finally Section 3 presents the generalisation to general multigroup problems. Numerical results are obtained for a PXR rod bundle assembly in Section 4

  1. Homogeneity of Inorganic Glasses

    DEFF Research Database (Denmark)

    Jensen, Martin; Zhang, L.; Keding, Ralf

    2011-01-01

    Homogeneity of glasses is a key factor determining their physical and chemical properties and overall quality. However, quantification of the homogeneity of a variety of glasses is still a challenge for glass scientists and technologists. Here, we show a simple approach by which the homogeneity...... of different glass products can be quantified and ranked. This approach is based on determination of both the optical intensity and dimension of the striations in glasses. These two characteristic values are obtained using the image processing method established recently. The logarithmic ratio between......

  2. Reflector homogenization

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, R.; Ragusa, J.; Santandrea, S. [Commissariat a l' Energie Atomique, Direction de l' Energie Nucleaire, Service d' Etudes de Reacteurs et de Modelisation Avancee, CEA de Saclay, DM2S/SERMA 91 191 Gif-sur-Yvette cedex (France)]. e-mail: richard.sanchez@cea.fr

    2004-07-01

    The problem of the determination of a homogeneous reflector that preserves a set of prescribed albedos is considered. Duality is used for a direct estimation of the derivatives needed in the iterative calculation of the optimal homogeneous cross sections. The calculation is based on the preservation of collapsed multigroup albedos obtained from detailed reference calculations and depends on the low-order operator used for core calculations. In this work we analyze diffusion and transport as low-order operators and argue that the P0 transfers are the best choice for the unknown cross sections to be adjusted. Numerical results illustrate the new approach for SPN core calculations. (Author)
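
    The idea of adjusting homogeneous cross sections so that a prescribed albedo is preserved can be illustrated in the simplest possible setting: one-group diffusion theory for a semi-infinite homogeneous reflector, where the albedo has a closed form and can be inverted exactly. This is a toy sketch of the inverse problem only, far simpler than the multigroup, duality-based method of the paper:

    ```python
    def albedo(D, sig_a):
        """One-group diffusion albedo of a semi-infinite homogeneous slab:
        alpha = (1 - 2*D*kappa) / (1 + 2*D*kappa), kappa = sqrt(sig_a / D)."""
        kappa = (sig_a / D) ** 0.5
        return (1.0 - 2.0 * D * kappa) / (1.0 + 2.0 * D * kappa)

    def sig_a_from_albedo(D, alpha):
        """Invert the relation: pick the absorption cross section that
        reproduces a prescribed albedo for a given diffusion coefficient."""
        kappa = (1.0 - alpha) / (2.0 * D * (1.0 + alpha))
        return D * kappa ** 2

    target = albedo(D=1.0, sig_a=0.01)       # reference albedo
    sig_a = sig_a_from_albedo(1.0, target)   # recovers the 0.01 used above
    ```

    In the multigroup case no such closed form exists, which is why the paper resorts to duality to obtain the derivatives for an iterative adjustment.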

  3. Reflector homogenization

    International Nuclear Information System (INIS)

    Sanchez, R.; Ragusa, J.; Santandrea, S.

    2004-01-01

    The problem of the determination of a homogeneous reflector that preserves a set of prescribed albedos is considered. Duality is used for a direct estimation of the derivatives needed in the iterative calculation of the optimal homogeneous cross sections. The calculation is based on the preservation of collapsed multigroup albedos obtained from detailed reference calculations and depends on the low-order operator used for core calculations. In this work we analyze diffusion and transport as low-order operators and argue that the P0 transfers are the best choice for the unknown cross sections to be adjusted. Numerical results illustrate the new approach for SPN core calculations. (Author)

  4. Hybrid diffusion–transport spatial homogenization method

    International Nuclear Information System (INIS)

    Kooreman, Gabriel; Rahnema, Farzad

    2014-01-01

    Highlights: • A new hybrid diffusion–transport homogenization method. • An extension of the consistent spatial homogenization (CSH) transport method. • Auxiliary cross section makes homogenized diffusion consistent with heterogeneous diffusion. • An on-the-fly re-homogenization in transport. • The method is faster than fine-mesh transport by 6–8 times. - Abstract: A new hybrid diffusion–transport homogenization method has been developed by extending the consistent spatial homogenization (CSH) transport method to include diffusion theory. As in the CSH method, an “auxiliary cross section” term is introduced into the source term, making the resulting homogenized diffusion equation consistent with its heterogeneous counterpart. The method then utilizes an on-the-fly re-homogenization in transport theory at the assembly level in order to correct for core environment effects on the homogenized cross sections and the auxiliary cross section. The method has been derived in general geometry and tested in a 1-D boiling water reactor (BWR) core benchmark problem for both controlled and uncontrolled configurations. The method has been shown to converge to the reference solution with less than 1.7% average flux error in less than one third the computational time as the CSH method – 6 to 8 times faster than fine-mesh transport

  5. Electro-magnetostatic homogenization of bianisotropic metamaterials

    OpenAIRE

    Fietz, Chris

    2012-01-01

    We apply the method of asymptotic homogenization to metamaterials with microscopically bianisotropic inclusions to calculate a full set of constitutive parameters in the long wavelength limit. Two different implementations of electromagnetic asymptotic homogenization are presented. We test the homogenization procedure on two different metamaterial examples. Finally, the analytical solution for long wavelength homogenization of a one dimensional metamaterial with microscopically bi-isotropic i...

  6. Bilipschitz embedding of homogeneous fractals

    OpenAIRE

    Lü, Fan; Lou, Man-Li; Wen, Zhi-Ying; Xi, Li-Feng

    2014-01-01

    In this paper, we introduce a class of fractals named homogeneous sets based on some measure versions of homogeneity, uniform perfectness and doubling. This fractal class includes all Ahlfors-David regular sets, but most of them are irregular in the sense that they may have different Hausdorff dimensions and packing dimensions. Using Moran sets as main tool, we study the dimensions, bilipschitz embedding and quasi-Lipschitz equivalence of homogeneous fractals.

  7. Homogeneous anisotropic solutions of topologically massive gravity with a cosmological constant and their homogeneous deformations

    International Nuclear Information System (INIS)

    Moutsopoulos, George

    2013-01-01

    We solve the equations of topologically massive gravity (TMG) with a potentially non-vanishing cosmological constant for homogeneous metrics without isotropy. We only reproduce known solutions. We also discuss their homogeneous deformations, possibly with isotropy. We show that de Sitter space and hyperbolic space cannot be infinitesimally homogeneously deformed in TMG. We clarify some of their Segre–Petrov types and discuss the warped de Sitter spacetime. (paper)

  8. Homogeneous anisotropic solutions of topologically massive gravity with a cosmological constant and their homogeneous deformations

    Science.gov (United States)

    Moutsopoulos, George

    2013-06-01

    We solve the equations of topologically massive gravity (TMG) with a potentially non-vanishing cosmological constant for homogeneous metrics without isotropy. We only reproduce known solutions. We also discuss their homogeneous deformations, possibly with isotropy. We show that de Sitter space and hyperbolic space cannot be infinitesimally homogeneously deformed in TMG. We clarify some of their Segre-Petrov types and discuss the warped de Sitter spacetime.

  9. Benchmarking homogenization algorithms for monthly data

    Science.gov (United States)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratiannil, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.; Willett, K.

    2013-09-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies. The algorithms were validated against a realistic benchmark dataset. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including i) the centered root mean square error relative to the true homogeneous values at various averaging scales, ii) the error in linear trend estimates and iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.

  10. Homogenization of resonant chiral metamaterials

    OpenAIRE

    Andryieuski, Andrei; Menzel, Christoph; Rockstuhl, Carsten; Malureanu, Radu; Lederer, Falk; Lavrinenko, Andrei

    2010-01-01

    Homogenization of metamaterials is a crucial issue as it allows one to describe their optical response in terms of effective wave parameters such as propagation constants. In this paper we consider the possible homogenization of chiral metamaterials. We show that for meta-atoms of a certain size a critical density exists above which increasing coupling between neighboring meta-atoms prevents a reasonable homogenization. On the contrary, a dilution in excess will induce features reminiscent of pho...

  11. Modeling and numerical study of transfers in fissured environments; Modelisation et etude numerique des transferts en milieux fissures

    Energy Technology Data Exchange (ETDEWEB)

    Granet, S.

    2000-01-28

    Oil recovery from fractured reservoirs plays a very important role in the petroleum industry. Some of the world's most productive oil fields are located in naturally fractured reservoirs. Modelling flow in such a fracture network is a very complex problem. This is conventionally done using a specific idealized model based on the Warren and Root representation and on a dual-porosity, dual-permeability approach. A simplified formulation of matrix-fracture fluid transfers uses a pseudo-steady-state transfer equation involving a constant exchange coefficient; the choice of this coefficient is one of the main difficulties of the approach. To get a better understanding of the simplifications involved in the dual-porosity approach, a reference model must be available. To obtain such a fine description, we have developed a new methodology. This technique, called 'the fissure element methodology', is based on a specific gridding of the fractured medium: the fissure network is gridded with linear elements coupled with an unstructured triangular grid of the matrix. An appropriate finite volume scheme has been developed to provide a good description of the flow, and its numerical development is precisely described. A simulator has been developed using this method and several simulations have been carried out. Comparisons have been made with different dual-porosity, dual-permeability models, and a discussion concerning the choice of the exchange coefficient used in the dual-porosity model is proposed. This new tool has provided a better understanding of the production mechanisms of a complex fractured reservoir. (author)

  12. Experimental study and numerical simulation of free pulsed jets; Etude experimentale et modelisation numerique des jets libres pulses

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Salwa; Mhiri, Hatem [Ecole Nationale d' Ingenieurs de Monastir, Lab. de Mecanique des Fluides et Thermique, Monastir (Tunisia); Caminat, Ph.; Le Palec, G.; Bournot, Ph. [UNIMECA, 13 - Marseille (France)

    2001-07-01

    A plane pulsed jet flow has been simulated by a finite difference method. Experimental results have also been obtained by laser tomography and particle image velocimetry. The results show that the flow is affected by the pulsation in the vicinity of the nozzle to reach an asymptotic state of a permanent jet. (A.L.B.)

  13. Modelisation numerique d'un actionneur plasma de type decharge a barriere dielectrique par la methode de derive-diffusion

    Science.gov (United States)

    Xing, Jacques

    The dielectric barrier discharge (DBD) plasma actuator is a proposed device for active flow control intended to improve the performance of aircraft and turbomachines. Essentially, these actuators are made of two electrodes separated by a layer of dielectric material and convert electrical energy directly into fluid motion. Because of the high costs associated with experiments in realistic operating conditions, there is a need to develop a robust numerical model that can predict the plasma body force and the effects of various parameters on it. Indeed, this plasma body force can be affected by atmospheric conditions (temperature, pressure, and humidity), the velocity of the neutral flow, the applied voltage (amplitude, frequency, and waveform), and the actuator geometry. In that respect, the purpose of this thesis is to implement a plasma model for DBD actuators that has the potential to account for the effects of these various parameters. In DBD actuator modelling, two types of approach are commonly proposed: low-order (or phenomenological) modelling and high-order (or scientific) modelling. However, a critical analysis presented in this thesis shows that phenomenological models are not robust enough to predict the plasma body force without artificial calibration for each specific case; moreover, they are based on erroneous assumptions. Hence, the selected approach to model the plasma body force is a scientific drift-diffusion model with four chemical species (electrons, positive ions, negative ions, and neutrals). This model was chosen because it gives numerical results consistent with experimental data. Moreover, it has great potential to include the effects of temperature, pressure, and humidity on the plasma body force and requires only a reasonable computational time. The model was independently implemented in the C++ programming language and validated with several test cases. It was then used to simulate the effect of the plasma body force on the laminar-turbulent transition over an airfoil in order to assess its performance in a practical CFD simulation. Numerical results show that this model gives a better prediction of the effect of the plasma on the fluid flow for a practical aerospace case than a phenomenological model.
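The drift-diffusion approach described above advances each species density under drift in the local electric field plus diffusion. As a minimal sketch only (the function name and all parameters are illustrative, not the thesis's C++ implementation, which couples four species with Poisson's equation for the field), an explicit 1D transport step for one species might look like:

```python
import numpy as np

def drift_diffusion_step(n, E_face, mu, D, dx, dt, charge_sign=-1.0):
    """One explicit Euler step of dn/dt = -d/dx(sign*mu*E*n - D*dn/dx)
    for a single species on a uniform grid (boundary cells held fixed)."""
    v = charge_sign * mu * E_face                 # drift velocity at cell faces
    n_up = np.where(v > 0, n[:-1], n[1:])         # upwind density at each face
    flux = v * n_up - D * (n[1:] - n[:-1]) / dx   # drift + diffusion flux
    dn = np.zeros_like(n)
    dn[1:-1] = -(flux[1:] - flux[:-1]) / dx       # flux divergence, interior
    return n + dt * dn

# Usage: an electron-like density pulse drifting through a uniform field.
x = np.linspace(0.0, 1.0, 101)
n = np.exp(-((x - 0.3) / 0.05) ** 2)              # initial density pulse
E_face = np.full(len(x) - 1, 1.0)                 # field at the 100 cell faces
n_new = drift_diffusion_step(n, E_face, mu=1.0, D=1e-3,
                             dx=x[1] - x[0], dt=1e-4)
```

The upwinding on the drift term is the standard stabilizing choice for this kind of transport step; a production plasma solver would add source terms for ionization, attachment, and recombination.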

  14. Homogenization of neutronic diffusion models

    International Nuclear Information System (INIS)

    Capdebosq, Y.

    1999-09-01

    In order to study and simulate nuclear reactor cores, one needs access to the neutron distribution in the core. In practice, this neutron density is described by a system of diffusion equations coupled by non-differential exchange terms. The strong heterogeneity of the medium constitutes a major obstacle to the numerical computation of these models at reasonable cost, so homogenization is compulsory. Heuristic methods have been developed from the outset by nuclear physicists under a periodicity assumption on the coefficients. They consist of performing a fine computation on a single periodicity cell, solving the system on the whole domain with homogeneous coefficients, and reconstructing the neutron density by multiplying the solutions of the two computations. The objectives of this work are to provide a mathematically rigorous basis for this factorization method, to obtain exact formulas for the homogenized coefficients, and to address geometries where two periodic media are placed side by side. The first result of this thesis concerns the eigenvalue problem models used to characterize the criticality of the reactor, under a symmetry assumption on the coefficients. The convergence of the homogenization process is proved, and formulas for the homogenized coefficients are given. We then show that, without symmetry assumptions, a drift phenomenon appears; it is characterized by means of a real Bloch wave method, which gives the homogenized limit in the general case. These results for the critical problem are then adapted to the evolution model. Finally, the homogenization of the critical problem in the case of two side-by-side periodic media is studied on a one-dimensional one-equation model. (authors)

  15. Orthogonality Measurement for Homogenous Projects-Bases

    Science.gov (United States)

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

    The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  16. Analytical solutions of time-fractional models for homogeneous Gardner equation and non-homogeneous differential equations

    Directory of Open Access Journals (Sweden)

    Olaniyi Samuel Iyiola

    2014-09-01

    Full Text Available In this paper, we obtain analytical solutions of the homogeneous time-fractional Gardner equation and non-homogeneous time-fractional models (including the Buckmaster equation) using the q-Homotopy Analysis Method (q-HAM). Our work displays the elegant nature of q-HAM as applied not only to homogeneous non-linear fractional differential equations but also to non-homogeneous fractional differential equations. The presence of the auxiliary parameter h helps to obtain better approximations comparable to exact solutions. The fraction-factor in this method gives it an edge over other existing analytical methods for non-linear differential equations. Comparisons are made where exact solutions to these models exist. The analysis shows that our analytical solutions converge very rapidly to the exact solutions.

  17. Homogeneous Spaces and Equivariant Embeddings

    CERN Document Server

    Timashev, DA

    2011-01-01

    Homogeneous spaces of linear algebraic groups lie at the crossroads of algebraic geometry, theory of algebraic groups, classical projective and enumerative geometry, harmonic analysis, and representation theory. By standard reasons of algebraic geometry, in order to solve various problems on a homogeneous space it is natural and helpful to compactify it keeping track of the group action, i.e. to consider equivariant completions or, more generally, open embeddings of a given homogeneous space. Such equivariant embeddings are the subject of this book. We focus on classification of equivariant em

  18. Internal homogenization: effective permittivity of a coated sphere.

    Science.gov (United States)

    Chettiar, Uday K; Engheta, Nader

    2012-10-08

    The concept of internal homogenization is introduced as a complementary approach to the conventional homogenization schemes, which could be termed external homogenization. The theory for the internal homogenization of the permittivity of subwavelength coated spheres is presented. The effective permittivity derived from the internal homogenization of core-shells is discussed for plasmonic and dielectric constituent materials. The effective model provided by the homogenization is a useful design tool for constructing coated particles with desired resonant properties.
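The record does not reproduce the derivation, but the standard quasi-static (dipole-order) equivalence for a coated sphere, which the internal homogenization concept builds on, can be sketched as follows; treat the formula and the helper name as an illustrative assumption rather than the paper's exact result:

```python
def coated_sphere_eff_permittivity(eps_core, eps_shell, r_core, r_shell):
    """Permittivity of the homogeneous sphere whose dipole response matches
    that of a core-shell particle in the quasi-static limit."""
    f = (r_core / r_shell) ** 3                   # core volume fraction
    num = eps_core + 2 * eps_shell + 2 * f * (eps_core - eps_shell)
    den = eps_core + 2 * eps_shell - f * (eps_core - eps_shell)
    return eps_shell * num / den

# Limiting checks: no core -> shell permittivity; no shell -> core permittivity.
print(coated_sphere_eff_permittivity(4.0, 2.0, 0.0, 50e-9))    # -> 2.0
print(coated_sphere_eff_permittivity(4.0, 2.0, 50e-9, 50e-9))  # -> 4.0
```

With complex permittivities for the core (e.g. a Drude metal), the same formula exposes the shifted plasmonic resonances that make coated particles useful design targets.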

  19. Homogenization methods for heterogeneous assemblies

    International Nuclear Information System (INIS)

    Wagner, M.R.

    1980-01-01

    The third session of the IAEA Technical Committee Meeting is concerned with the problem of homogenization of heterogeneous assemblies. Six papers will be presented on the theory of homogenization and on practical procedures for deriving homogenized group cross sections and diffusion coefficients. The problem of finding so-called ''equivalent'' diffusion theory parameters for use in global reactor calculations is of great practical importance. In spite of this, it is fair to say that the present state of the theory of second homogenization is far from satisfactory. In fact, there is not even a uniquely accepted approach to the problem of deriving equivalent group diffusion parameters. Common agreement exists only about the fact that the conventional flux-weighting technique provides only a first approximation, which might lead to acceptable results in certain cases, but certainly does not guarantee the basic requirement of conservation of reaction rates.

  20. Homogeneous versus heterogeneous zeolite nucleation

    NARCIS (Netherlands)

    Dokter, W.H.; Garderen, van H.F.; Beelen, T.P.M.; Santen, van R.A.; Bras, W.

    1995-01-01

    Aggregates of fractal dimension were found in the intermediate gel phases that organize prior to nucleation and crystallization (shown right) of silicalite from a homogeneous reaction mixture. Small- and wide-angle X-ray scattering studies prove that for zeolites nucleation may be homogeneous or

  1. Benchmarking homogenization algorithms for monthly data

    Directory of Open Access Journals (Sweden)

    V. K. C. Venema

    2012-01-01

    Full Text Available The COST (European Cooperation in Science and Technology) Action ES0601: advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added.

    Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve
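The first two performance metrics named above can be sketched as follows; the helper names are hypothetical and HOME's exact metric definitions may differ in detail:

```python
import numpy as np

def centered_rmse(homogenized, truth):
    """Centered RMSE: subtract each series' mean before comparing, so a
    constant offset between the series is not penalized."""
    a = homogenized - np.mean(homogenized)
    b = truth - np.mean(truth)
    return float(np.sqrt(np.mean((a - b) ** 2)))

def trend_error(homogenized, truth):
    """Difference in least-squares linear trend (per time step)."""
    t = np.arange(len(truth))
    return float(np.polyfit(t, homogenized, 1)[0] - np.polyfit(t, truth, 1)[0])

# A series that matches the truth up to a constant offset scores perfectly
# on both metrics.
truth = np.sin(np.linspace(0.0, 6.0, 120)) + 0.01 * np.arange(120)
shifted = truth + 0.5
print(centered_rmse(shifted, truth))  # ~ 0.0 (offset ignored)
print(trend_error(shifted, truth))    # ~ 0.0 (same slope)
```

Averaging the series over stations or years before applying the same functions gives the "various averaging scales" variants mentioned in the abstract.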

  2. A second stage homogenization method

    International Nuclear Information System (INIS)

    Makai, M.

    1981-01-01

    A second homogenization is needed before the diffusion calculation of the core of large reactors. Such a second-stage homogenization is outlined here. Our starting point is the Floquet theorem, which states that the diffusion equation for a periodic core always has a particular solution of the form e^(jBx) u(x). It is pointed out that the perturbation series expansion of the function u can be derived by solving eigenvalue problems, and the eigenvalues serve to define homogenized cross sections. With the help of these eigenvalues a homogenized diffusion equation can be derived, whose solution is cos(Bx), the macroflux. It is shown that the flux can be expressed as a series in the buckling, whose leading term is the well-known Wigner-Seitz formula. Finally three examples are given: periodic absorption, a cell with an absorber pin in the cell centre, and a cell of three regions. (orig.)

  3. Sewage sludge solubilization by high-pressure homogenization.

    Science.gov (United States)

    Zhang, Yuxuan; Zhang, Panyue; Guo, Jianbin; Ma, Weifang; Fang, Wei; Ma, Boqiang; Xu, Xiangzhe

    2013-01-01

    The behavior of sludge solubilization using high-pressure homogenization (HPH) treatment was examined by investigating the sludge solid reduction and organics solubilization. The sludge volatile suspended solids (VSS) decreased from 10.58 to 6.67 g/L for the sludge sample with a total solids content (TS) of 1.49% after HPH treatment at a homogenization pressure of 80 MPa with four homogenization cycles; total suspended solids (TSS) correspondingly decreased from 14.26 to 9.91 g/L. About 86.15% of the TSS reduction was attributed to the VSS reduction. The increase of homogenization pressure from 20 to 80 MPa or homogenization cycle number from 1 to 4 was favorable to the sludge organics solubilization, and the protein and polysaccharide solubilization linearly increased with the soluble chemical oxygen demand (SCOD) solubilization. More proteins were solubilized than polysaccharides. The linear relationship between SCOD solubilization and VSS reduction had no significant change under different homogenization pressures, homogenization cycles and sludge solid contents. The SCOD of 1.65 g/L was solubilized for the VSS reduction of 1.00 g/L for the three experimental sludge samples with a TS of 1.00, 1.49 and 2.48% under all HPH operating conditions. The energy efficiency results showed that the HPH treatment at a homogenization pressure of 30 MPa with a single homogenization cycle for the sludge sample with a TS of 2.48% was the most energy efficient.

  4. A literature review on biotic homogenization

    OpenAIRE

    Guangmei Wang; Jingcheng Yang; Chuangdao Jiang; Hongtao Zhao; Zhidong Zhang

    2009-01-01

    Biotic homogenization is the process whereby the genetic, taxonomic and functional similarity of two or more biotas increases over time. As a new research agenda for conservation biogeography, biotic homogenization has become a rapidly emerging topic of interest in ecology and evolution over the past decade. However, research on this topic is rare in China. Herein, we introduce the development of the concept of biotic homogenization, and then discuss methods to quantify its three components (...

  5. Assembly homogenization techniques for light water reactor analysis

    International Nuclear Information System (INIS)

    Smith, K.S.

    1986-01-01

    Recent progress in development and application of advanced assembly homogenization methods for light water reactor analysis is reviewed. Practical difficulties arising from conventional flux-weighting approximations are discussed and numerical examples given. The mathematical foundations for homogenization methods are outlined. Two methods, Equivalence Theory and Generalized Equivalence Theory which are theoretically capable of eliminating homogenization error are reviewed. Practical means of obtaining approximate homogenized parameters are presented and numerical examples are used to contrast the two methods. Applications of these techniques to PWR baffle/reflector homogenization and BWR bundle homogenization are discussed. Nodal solutions to realistic reactor problems are compared to fine-mesh PDQ calculations, and the accuracy of the advanced homogenization methods is established. Remaining problem areas are investigated, and directions for future research are suggested. (author)

  6. Improving homogeneity by dynamic speed limit systems.

    NARCIS (Netherlands)

    Nes, N. van; Brandenberg, S.; Twisk, D.A.M.

    2010-01-01

    Homogeneity of driving speeds is an important variable in determining road safety; more homogeneous driving speeds increase road safety. This study investigates the effect of introducing dynamic speed limit systems on homogeneity of driving speeds. A total of 46 subjects twice drove a route along 12

  7. Mechanized syringe homogenization of human and animal tissues.

    Science.gov (United States)

    Kurien, Biji T; Porter, Andrew C; Patel, Nisha C; Kurono, Sadamu; Matsumoto, Hiroyuki; Scofield, R Hal

    2004-06-01

    Tissue homogenization is a prerequisite to any fractionation schedule. A plethora of hands-on methods are available to homogenize tissues. Here we report a mechanized method for homogenizing animal and human tissues rapidly and easily. The Bio-Mixer 1200 (manufactured by Innovative Products, Inc., Oklahoma City, OK) utilizes the back-and-forth movement of two motor-driven disposable syringes, connected to each other through a three-way stopcock, to homogenize animal or human tissue. Using this method, we were able to homogenize human or mouse tissues (brain, liver, heart, and salivary glands) in 5 min. From sodium dodecyl sulfate-polyacrylamide gel electrophoresis analysis and a matrix-assisted laser desorption/ionization time-of-flight mass spectrometric enzyme assay for prolidase, we have found that the homogenates obtained were as good as or even better than those obtained using a manual glass-on-Teflon (DuPont, Wilmington, DE) homogenization protocol (all-glass tube and Teflon pestle). Use of the Bio-Mixer 1200 to homogenize animal or human tissue precludes the need to stay in the cold room, as is the case with the other hands-on homogenization methods available, in addition to freeing up time for other experiments.

  8. Homogeneity and thermodynamic identities in geometrothermodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Quevedo, Hernando [Universidad Nacional Autonoma de Mexico, Instituto de Ciencias Nucleares (Mexico); Universita di Roma ' ' La Sapienza' ' , Dipartimento di Fisica, Rome (Italy); ICRANet, Rome (Italy); Quevedo, Maria N. [Universidad Militar Nueva Granada, Departamento de Matematicas, Facultad de Ciencias Basicas, Bogota (Colombia); Sanchez, Alberto [CIIDET, Departamento de Posgrado, Queretaro (Mexico)

    2017-03-15

    We propose a classification of thermodynamic systems in terms of the homogeneity properties of their fundamental equations. Ordinary systems correspond to homogeneous functions and non-ordinary systems are given by generalized homogeneous functions. This affects the explicit form of the Gibbs-Duhem relation and Euler's identity. We show that these generalized relations can be implemented in the formalism of black hole geometrothermodynamics in order to completely fix the arbitrariness present in Legendre invariant metrics. (orig.)

  9. Homogeneity of Prototypical Attributes in Soccer Teams

    Directory of Open Access Journals (Sweden)

    Christian Zepp

    2015-09-01

    Full Text Available Research indicates that the homogeneous perception of prototypical attributes influences several intragroup processes. The aim of the present study was to describe the homogeneous perception of the prototype and to identify specific prototypical subcategories, which are perceived as homogeneous within sport teams. The sample consists of N = 20 soccer teams with a total of N = 278 athletes (age M = 23.5 years, SD = 5.0 years. The results reveal that subcategories describing the cohesiveness of the team and motivational attributes are mentioned homogeneously within sport teams. In addition, gender, identification, team size, and the championship ranking significantly correlate with the homogeneous perception of prototypical attributes. The results are discussed on the basis of theoretical and practical implications.

  10. Homogenization theory in reactor lattices

    International Nuclear Information System (INIS)

    Benoist, P.

    1986-02-01

    The purpose of the theory of homogenization of reactor lattices is to determine, by means of transport theory, the constants of a homogeneous medium equivalent to a given lattice, which allows one to treat the reactor as a whole by diffusion theory. In this note, the problem is presented with emphasis on simplicity, as far as possible [fr

  11. Enhancement of anaerobic sludge digestion by high-pressure homogenization.

    Science.gov (United States)

    Zhang, Sheng; Zhang, Panyue; Zhang, Guangming; Fan, Jie; Zhang, Yuxuan

    2012-08-01

    To improve anaerobic sludge digestion efficiency, the effects of high-pressure homogenization (HPH) conditions on the anaerobic sludge digestion were investigated. The VS and TCOD were significantly removed with the anaerobic digestion, and the VS removal and TCOD removal increased with increasing the homogenization pressure and homogenization cycle number; correspondingly, the accumulative biogas production also increased with increasing the homogenization pressure and homogenization cycle number. The optimal homogenization pressure was 50 MPa for one homogenization cycle and 40 MPa for two homogenization cycles. The SCOD of the sludge supernatant significantly increased with increasing the homogenization pressure and homogenization cycle number due to the sludge disintegration. The relationship between the biogas production and the sludge disintegration showed that the accumulative biogas and methane production were mainly enhanced by the sludge disintegration, which accelerated the anaerobic digestion process and improved the methane content in the biogas. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Layered Fiberconcrete with Non-Homogeneous Fibers Distribution

    OpenAIRE

    Lūsis, V; Krasņikovs, A

    2013-01-01

    The aim of the present research is to create a fiberconcrete construction with a non-homogeneous fiber distribution. Traditionally, fibers are homogeneously dispersed in concrete. However, in many situations fiberconcretes with homogeneously dispersed fibers are not optimal (the majority of the added fibers do not participate in the load-bearing process).

  13. Radiotracer investigation of cement raw meal homogenizers. Pt. 2

    International Nuclear Information System (INIS)

    Baranyai, L.

    1983-01-01

    Based on a radioisotopic tracer technique, a method has been worked out to study the homogenization and segregation processes of cement-industrial raw meal homogenizers. On-site measurements were carried out by this method in some Hungarian cement works to determine the optimal homogenization parameters of operating homogenizers. The motion and distribution of different raw meal fractions traced with the 198 Au radioisotope were studied in homogenization processes proceeding with different parameters. In the first part of the publication, the change of charge homogeneity in time was discussed, measured as the resultant of mixing and separating processes. In the second part, the parameters and types of homogenizers influencing the efficiency of homogenization are detailed. (orig.) [de

  14. Radiotracer investigation of cement raw meal homogenizers. Pt. 2

    Energy Technology Data Exchange (ETDEWEB)

    Baranyai, L

    1983-12-01

    Based on a radioisotopic tracer technique, a method has been worked out to study the homogenization and segregation processes of cement-industrial raw meal homogenizers. On-site measurements were carried out by this method in some Hungarian cement works to determine the optimal homogenization parameters of operating homogenizers. The motion and distribution of different raw meal fractions traced with the 198Au radioisotope were studied in homogenization processes proceeding with different parameters. In the first part of the publication, the change of charge homogeneity in time was discussed, measured as the resultant of mixing and separating processes. In the second part, the parameters and types of homogenizers influencing the efficiency of homogenization are detailed.

  15. Homogenization approach in engineering

    International Nuclear Information System (INIS)

    Babuska, I.

    1975-10-01

    Homogenization is an approach which studies the macrobehavior of a medium by its microproperties. Problems with a microstructure play an essential role in such fields as mechanics, chemistry, physics, and reactor engineering. Attention is concentrated on a simple specific model problem to illustrate results and problems typical of the homogenization approach. Only the diffusion problem is treated here, but some statements are made about the elasticity of composite materials. The differential equation is solved for linear cases with and without boundaries and for the nonlinear case. 3 figures, 1 table
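A classical closed-form instance of the diffusion homogenization discussed above is the 1D periodic layered medium, where the homogenized coefficient is the harmonic volume average of the layer coefficients (a textbook result, sketched here for illustration):

```python
import numpy as np

def effective_diffusion_1d(coeffs, fractions):
    """Homogenized coefficient of a 1D periodic layered medium: the
    harmonic volume average of the layer coefficients (fractions sum to 1)."""
    coeffs = np.asarray(coeffs, dtype=float)
    fractions = np.asarray(fractions, dtype=float)
    return float(1.0 / np.sum(fractions / coeffs))

# Two layers of equal width with D = 1 and D = 100: the slow layer dominates.
d_eff = effective_diffusion_1d([1.0, 100.0], [0.5, 0.5])
print(d_eff)   # ~ 1.98, far below the arithmetic mean 50.5
```

The gap between the harmonic and arithmetic averages is exactly the point of the homogenization approach: the macrobehavior is not a naive average of the microproperties.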

  16. Genetic Homogenization of Composite Materials

    Directory of Open Access Journals (Sweden)

    P. Tobola

    2009-04-01

    Full Text Available The paper is focused on numerical studies of electromagnetic properties of composite materials used for the construction of small airplanes. Discussions concentrate on the genetic homogenization of composite layers and composite layers with a slot. The homogenization is aimed to reduce CPU-time demands of EMC computational models of electrically large airplanes. First, a methodology of creating a 3-dimensional numerical model of a composite material in CST Microwave Studio is proposed focusing on a sufficient accuracy of the model. Second, a proper implementation of a genetic optimization in Matlab is discussed. Third, an association of the optimization script and a simplified 2-dimensional model of the homogeneous equivalent model in Comsol Multiphysics is proposed considering EMC issues. Results of computations are experimentally verified.

  17. Stimulus homogeneity enhances implicit learning: evidence from contextual cueing.

    Science.gov (United States)

    Feldmann-Wüstefeld, Tobias; Schubö, Anna

    2014-04-01

    Visual search for a target object is faster if the target is embedded in a repeatedly presented invariant configuration of distractors ('contextual cueing'). It has also been shown that the homogeneity of a context affects the efficiency of visual search: targets receive prioritized processing when presented in a homogeneous context compared to a heterogeneous context, presumably due to grouping processes at early stages of visual processing. The present study investigated in three experiments whether context homogeneity also affects contextual cueing. In Experiment 1, context homogeneity varied on three levels of the task-relevant dimension (orientation) and contextual cueing was most pronounced for context configurations with high orientation homogeneity. When context homogeneity varied on three levels of the task-irrelevant dimension (color) and orientation homogeneity was fixed, no modulation of contextual cueing was observed: high orientation homogeneity led to large contextual cueing effects (Experiment 2) and low orientation homogeneity led to small contextual cueing effects (Experiment 3), irrespective of color homogeneity. Enhanced contextual cueing for homogeneous context configurations suggests that grouping processes affect not only visual search but also implicit learning. We conclude that memory representations of context configurations are more easily acquired when context configurations can be processed as larger, grouped perceptual units. However, this form of implicit perceptual learning is only improved by stimulus homogeneity when stimulus homogeneity facilitates grouping processes on a dimension that is currently relevant in the task. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Modeling and numerical analysis of non-equilibrium two-phase flows; Modelisation et analyse numerique des ecoulements diphasiques en desequilibre

    Energy Technology Data Exchange (ETDEWEB)

    Rascle, P.; El Amine, K. [Electricite de France (EDF), Direction des Etudes et Recherches, 92 - Clamart (France)

    1997-12-31

    We are interested in the numerical approximation of two-fluid models of nonequilibrium two-phase flows described by six balance equations. We introduce an original splitting technique for the system of equations, derived in such a way that single-phase Riemann solvers may be used; moreover, it allows a straightforward extension to various and detailed exchange source terms. The properties of the fluids are first approximated by ideal-gas-type equations of state and then extended to real fluids. For the construction of the numerical schemes, the hyperbolicity of the full system is not necessary. When based on suitable kinetic upwind schemes, the algorithm can compute flow regimes evolving from mixture to single-phase flows and vice versa. The whole scheme preserves the physical features of all the variables, which remain in the set of physical states. Several stiff numerical tests, such as phase separation and phase transition, are displayed in order to highlight the efficiency of the proposed method. The document is a PhD thesis divided into 6 chapters and two annexes, entitled: 1. Introduction (in French); 2. Two-phase flow, modelling and hyperbolicity (in French); 3. A numerical method using upwind schemes for the resolution of two-phase flows without exchange terms (in English); 4. A numerical scheme for one-phase flow of real fluids (in English); 5. An upwind numerical scheme for non-equilibrium two-phase flows (in English); 6. The treatment of boundary conditions (in English); A.1. The Perthame scheme (in English); A.2. The Roe scheme (in English). 136 refs. This document represents a PhD thesis in the speciality Applied Mathematics presented by Khalid El Amine to the Universite Paris 6.

  19. Modelling and numerical simulation of liquid-vapor phase transitions; Modelisation et simulation numerique des transitions de phase liquide-vapeur

    Energy Technology Data Exchange (ETDEWEB)

    Caro, F

    2004-11-15

    This work deals with the modelling and numerical simulation of liquid-vapor phase transition phenomena. The study is divided into two parts: first we investigate phase transition phenomena with a Van der Waals equation of state (a non-monotonic equation of state), then we adopt an alternative approach with two equations of state. In the first part, we study the classical viscous criteria for selecting weak solutions of the system when the equation of state is non-monotonic. Those criteria do not select physical solutions, and we therefore focus on a more recent criterion: the visco-capillary criterion. We use this criterion to solve the Riemann problem exactly (which requires solving a scalar nonlinear algebraic equation). Unfortunately, this step is quite costly in terms of CPU time, which prevents using this method as a basis for building Godunov solvers. That is why we propose an alternative approach with two equations of state. Using the least action principle, we propose a two-phase flow model with phase change that is based on the second principle of thermodynamics. We then describe two equilibrium submodels derived from the relaxation processes when instantaneous equilibrium is assumed. Despite the weak hyperbolicity of the last submodel, we propose stable numerical schemes based on a two-step strategy involving a convective step followed by a relaxation step. We show the ability of the model to simulate vapor bubble nucleation. (author)

  20. Multilevel Monte Carlo Approaches for Numerical Homogenization

    KAUST Repository

    Efendiev, Yalchin R.

    2015-10-01

    In this article, we study the application of multilevel Monte Carlo (MLMC) approaches to numerical random homogenization. Our objective is to compute the expectation of some functionals of the homogenized coefficients, or of the homogenized solutions. This is accomplished within MLMC by considering different sizes of representative volume elements (RVEs). Many inexpensive computations with the smallest RVE size are combined with fewer expensive computations performed on larger RVEs. Likewise, when it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison to a standard Monte Carlo method. Numerical results are presented for both one-dimensional and two-dimensional test cases that illustrate the efficiency of the approach.
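
    The telescoping estimator behind MLMC can be sketched with a toy surrogate in place of an actual RVE computation (the uniform-average `coeff` model, its true mean of 2.0, and the sample counts below are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def coeff(level, rng):
    """Toy stand-in for a homogenized coefficient computed on an RVE of
    2**level cells: the sample average converges to the true mean (2.0)
    as the RVE grows, at a cost proportional to 2**level."""
    return rng.uniform(1.0, 3.0, size=2 ** level).mean()

def coupled_diff(level, rng):
    """Fine/coarse difference computed from the *same* random microstructure,
    so its variance shrinks as the level grows (the key to MLMC)."""
    xs = rng.uniform(1.0, 3.0, size=2 ** level)
    return xs.mean() - xs[: 2 ** (level - 1)].mean()

def mlmc_estimate(n_samples, rng):
    """Telescoping sum E[Y_L] = E[Y_0] + sum_l E[Y_l - Y_{l-1}]:
    many cheap small-RVE samples, few expensive large-RVE samples."""
    est = np.mean([coeff(0, rng) for _ in range(n_samples[0])])
    for level in range(1, len(n_samples)):
        est += np.mean([coupled_diff(level, rng) for _ in range(n_samples[level])])
    return est

estimate = mlmc_estimate([4000, 1000, 250, 60], rng)
```

The decreasing sample counts per level mirror the paper's central point: most realizations are spent where they are cheap, while the few large-RVE samples only correct the bias.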

  1. String pair production in non homogeneous backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Bolognesi, S. [Department of Physics “E. Fermi” University of Pisa, and INFN - Sezione di Pisa,Largo Pontecorvo, 3, Ed. C, 56127 Pisa (Italy); Rabinovici, E. [Racah Institute of Physics, The Hebrew University of Jerusalem,91904 Jerusalem (Israel); Tallarita, G. [Departamento de Ciencias, Facultad de Artes Liberales,Universidad Adolfo Ibáñez, Santiago 7941169 (Chile)

    2016-04-28

    We consider string pair production in non homogeneous electric backgrounds. We study several particular configurations which can be addressed with the Euclidean world-sheet instanton technique, the analogue of the world-line instanton for particles. In the first case the string is suspended between two D-branes in flat space-time, in the second case the string lives in AdS and terminates on one D-brane (this realizes the holographic Schwinger effect). In some regions of parameter space the result is well approximated by the known analytical formulas, either the particle pair production in non-homogeneous background or the string pair production in homogeneous background. In other cases we see effects which are intrinsically stringy and related to the non-homogeneity of the background. The pair production is enhanced already for particles in time-dependent electric field backgrounds. The string nature enhances this even further. For spatially varying electric background fields the string pair production is less suppressed than the rate of particle pair production. We discuss in some detail how the critical field is affected by the non-homogeneity, for both time- and space-dependent electric field backgrounds. We also comment on what could be an interesting new prediction for the small field limit. The third case we consider is pair production in holographic confining backgrounds with homogeneous and non-homogeneous fields.

  2. String pair production in non homogeneous backgrounds

    International Nuclear Information System (INIS)

    Bolognesi, S.; Rabinovici, E.; Tallarita, G.

    2016-01-01

    We consider string pair production in non homogeneous electric backgrounds. We study several particular configurations which can be addressed with the Euclidean world-sheet instanton technique, the analogue of the world-line instanton for particles. In the first case the string is suspended between two D-branes in flat space-time, in the second case the string lives in AdS and terminates on one D-brane (this realizes the holographic Schwinger effect). In some regions of parameter space the result is well approximated by the known analytical formulas, either the particle pair production in non-homogeneous background or the string pair production in homogeneous background. In other cases we see effects which are intrinsically stringy and related to the non-homogeneity of the background. The pair production is enhanced already for particles in time-dependent electric field backgrounds. The string nature enhances this even further. For spatially varying electric background fields the string pair production is less suppressed than the rate of particle pair production. We discuss in some detail how the critical field is affected by the non-homogeneity, for both time- and space-dependent electric field backgrounds. We also comment on what could be an interesting new prediction for the small field limit. The third case we consider is pair production in holographic confining backgrounds with homogeneous and non-homogeneous fields.

  3. Homogeneous M2 duals

    International Nuclear Information System (INIS)

    Figueroa-O’Farrill, José; Ungureanu, Mara

    2016-01-01

    Motivated by the search for new gravity duals to M2 branes with N>4 supersymmetry — equivalently, M-theory backgrounds with Killing superalgebra osp(N|4) for N>4 — we classify (except for a small gap) homogeneous M-theory backgrounds with symmetry Lie algebra so(n)⊕so(3,2) for n=5,6,7. We find that there are no new backgrounds with n=6,7 but we do find a number of new (to us) backgrounds with n=5. All backgrounds are metrically products of the form AdS 4 ×P 7 , with P Riemannian and homogeneous under the action of SO(5), or S 4 ×Q 7 with Q Lorentzian and homogeneous under the action of SO(3,2). At least one of the new backgrounds is supersymmetric (albeit with only N=2) and we show that it can be constructed from a supersymmetric Freund-Rubin background via a Wick rotation. Two of the new backgrounds have only been approximated numerically.

  4. Homogeneous M2 duals

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa-O’Farrill, José [School of Mathematics and Maxwell Institute for Mathematical Sciences,The University of Edinburgh,James Clerk Maxwell Building, The King’s Buildings, Peter Guthrie Tait Road,Edinburgh EH9 3FD, Scotland (United Kingdom); Ungureanu, Mara [Humboldt-Universität zu Berlin, Institut für Mathematik,Unter den Linden 6, 10099 Berlin (Germany)

    2016-01-25

    Motivated by the search for new gravity duals to M2 branes with N>4 supersymmetry — equivalently, M-theory backgrounds with Killing superalgebra osp(N|4) for N>4 — we classify (except for a small gap) homogeneous M-theory backgrounds with symmetry Lie algebra so(n)⊕so(3,2) for n=5,6,7. We find that there are no new backgrounds with n=6,7 but we do find a number of new (to us) backgrounds with n=5. All backgrounds are metrically products of the form AdS{sub 4}×P{sup 7}, with P Riemannian and homogeneous under the action of SO(5), or S{sup 4}×Q{sup 7} with Q Lorentzian and homogeneous under the action of SO(3,2). At least one of the new backgrounds is supersymmetric (albeit with only N=2) and we show that it can be constructed from a supersymmetric Freund-Rubin background via a Wick rotation. Two of the new backgrounds have only been approximated numerically.

  5. Two-Dimensional Homogeneous Fermi Gases

    Science.gov (United States)

    Hueck, Klaus; Luick, Niclas; Sobirey, Lennart; Siegl, Jonas; Lompe, Thomas; Moritz, Henning

    2018-02-01

    We report on the experimental realization of homogeneous two-dimensional (2D) Fermi gases trapped in a box potential. In contrast to harmonically trapped gases, these homogeneous 2D systems are ideally suited to probe local as well as nonlocal properties of strongly interacting many-body systems. As a first benchmark experiment, we use a local probe to measure the density of a noninteracting 2D Fermi gas as a function of the chemical potential and find excellent agreement with the corresponding equation of state. We then perform matter wave focusing to extract the momentum distribution of the system and directly observe Pauli blocking in a near unity occupation of momentum states. Finally, we measure the momentum distribution of an interacting homogeneous 2D gas in the crossover between attractively interacting fermions and bosonic dimers.

  6. Diffusion piecewise homogenization via flux discontinuity ratios

    International Nuclear Information System (INIS)

    Sanchez, Richard; Dante, Giorgio; Zmijarevic, Igor

    2013-01-01

    We analyze piecewise homogenization with flux-weighted cross sections and preservation of averaged currents at the boundary of the homogenized domain. Introduction of a set of flux discontinuity ratios (FDR) that preserve reference interface currents leads to preservation of averaged region reaction rates and fluxes. We consider the class of numerical discretizations with one degree of freedom per volume and per surface and prove that when the homogenization and computing meshes are equal there is a unique solution for the FDRs which exactly preserve interface currents. For diffusion sub-meshing we introduce a Jacobian-Free Newton-Krylov method and for all cases considered obtain an 'exact' numerical solution (eight digits for the interface currents). The homogenization is completed by extending the familiar full assembly homogenization via flux discontinuity factors to the sides of regions lying on the boundary of the piecewise homogenized domain. Finally, for the familiar nodal discretization we numerically find that the FDRs obtained with no sub-mesh (nearly at no cost) can be effectively used for whole-core diffusion calculations with sub-mesh. This is not the case, however, for cell-centered finite differences. (authors)

  7. Homogeneous deuterium exchange using rhenium and platinum chloride catalysts

    International Nuclear Information System (INIS)

    Fawdry, R.M.

    1979-01-01

    Previous studies of homogeneous hydrogen isotope exchange are mostly confined to one catalyst, the tetrachloroplatinite salt. Recent reports have indicated that chloride salts of iridium and rhodium may also be homogeneous exchange catalysts similar to the tetrachloroplatinite, but with much lower activities. Exchange by these homogeneous catalysts is frequently accompanied by metal precipitation with the termination of homogeneous exchange, particularly in the case of alkane exchange. The studies presented in this thesis describe two different approaches to overcome this limitation of homogeneous hydrogen isotope exchange catalysts. The first approach was to improve the stability of an existing homogeneous catalyst and the second was to develop a new homogeneous exchange catalyst which is free of the instability limitation

  8. Slowing down of test particles in a plasma (1961); Ralentissement de particules test dans un plasma (1961)

    Energy Technology Data Exchange (ETDEWEB)

    Belayche, P; Chavy, P; Dupoux, M; Salmon, J [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1961-07-01

    Numerical solution of the Fokker-Planck equation applied to the slowing down of tritons in a deuterium plasma. After the equations and the boundary conditions have been written, some attention is paid to the numerical techniques used to run the problem on a high-speed electronic computer. The numerical results thus obtained are then analyzed and, as far as possible, explained mathematically. In particular, they can be related to those obtained by direct application of Spitzer's formula. (authors)

  9. The homogeneous geometries of real hyperbolic space

    DEFF Research Database (Denmark)

    Castrillón López, Marco; Gadea, Pedro Martínez; Swann, Andrew Francis

    We describe the holonomy algebras of all canonical connections of homogeneous structures on real hyperbolic spaces in all dimensions. The structural results obtained then lead to a determination of the types, in the sense of Tricerri and Vanhecke, of the corresponding homogeneous tensors. We use our analysis to show that the moduli space of homogeneous structures on real hyperbolic space has two connected components.

  10. Spinor structures on homogeneous spaces

    International Nuclear Information System (INIS)

    Lyakhovskii, V.D.; Mudrov, A.I.

    1993-01-01

    For multidimensional models of the interaction of elementary particles, the problem of constructing and classifying spinor fields on homogeneous spaces is exceptionally important. An algebraic criterion for the existence of spinor structures on homogeneous spaces used in multidimensional models is developed. A method of explicit construction of spinor structures is proposed, and its effectiveness is demonstrated in examples. The results are of particular importance for harmonic decomposition of spinor fields

  11. Investigations into homogenization of electromagnetic metamaterials

    DEFF Research Database (Denmark)

    Clausen, Niels Christian Jerichau

    This dissertation encompasses homogenization methods, with a special interest into their applications to metamaterial homogenization. The first method studied is the Floquet-Bloch method, that is based on the assumption of a material being infinite periodic. Its field can then be expanded in term...

  12. Poisson-Jacobi reduction of homogeneous tensors

    International Nuclear Information System (INIS)

    Grabowski, J; Iglesias, D; Marrero, J C; Padron, E; Urbanski, P

    2004-01-01

    The notion of homogeneous tensors is discussed. We show that there is a one-to-one correspondence between multivector fields on a manifold M, homogeneous with respect to a vector field Δ on M, and first-order polydifferential operators on a closed submanifold N of codimension 1 such that Δ is transversal to N. This correspondence relates the Schouten-Nijenhuis bracket of multivector fields on M to the Schouten-Jacobi bracket of first-order polydifferential operators on N and generalizes the Poissonization of Jacobi manifolds. Actually, it can be viewed as a super-Poissonization. This procedure of passing from a homogeneous multivector field to a first-order polydifferential operator can also be understood as a sort of reduction; in the standard case, a half of a Poisson reduction. A dual version of the above correspondence yields in particular the correspondence between Δ-homogeneous symplectic structures on M and contact structures on N

  13. Non-homogeneous dynamic Bayesian networks for continuous data

    NARCIS (Netherlands)

    Grzegorczyk, Marco; Husmeier, Dirk

    Classical dynamic Bayesian networks (DBNs) are based on the homogeneous Markov assumption and cannot deal with non-homogeneous temporal processes. Various approaches to relax the homogeneity assumption have recently been proposed. The present paper presents a combination of a Bayesian network with

  14. Homogeneous Poisson structures

    International Nuclear Information System (INIS)

    Shafei Deh Abad, A.; Malek, F.

    1993-09-01

    We provide an algebraic definition for the Schouten product and give a decomposition for any homogeneous Poisson structure in any n-dimensional vector space. A large class of n-homogeneous Poisson structures in R k is also characterized. (author). 4 refs

  15. Homogenization of High-Contrast Brinkman Flows

    KAUST Repository

    Brown, Donald L.

    2015-04-16

    Modeling porous flow in complex media is a challenging problem. Not only is the problem inherently multiscale but, due to high contrast in permeability values, flow velocities may differ greatly throughout the medium. To avoid complicated interface conditions, the Brinkman model is often used for such flows [O. Iliev, R. Lazarov, and J. Willems, Multiscale Model. Simul., 9 (2011), pp. 1350--1372]. Instead of permeability variations and contrast being contained in the geometric media structure, this information is contained in a highly varying and high-contrast coefficient. In this work, we present two main contributions. First, we develop a novel homogenization procedure for the high-contrast Brinkman equations by constructing correctors and carefully estimating the residuals. Understanding the relationship between scales and contrast values is critical to obtaining useful estimates. Therefore, standard convergence-based homogenization techniques [G. A. Chechkin, A. L. Piatniski, and A. S. Shamev, Homogenization: Methods and Applications, Transl. Math. Monogr. 234, American Mathematical Society, Providence, RI, 2007, G. Allaire, SIAM J. Math. Anal., 23 (1992), pp. 1482--1518], although a powerful tool, are not applicable here. Our second point is that the Brinkman equations, in certain scaling regimes, are invariant under homogenization. Unlike in the case of Stokes-to-Darcy homogenization [D. Brown, P. Popov, and Y. Efendiev, GEM Int. J. Geomath., 2 (2011), pp. 281--305, E. Marusic-Paloka and A. Mikelic, Boll. Un. Mat. Ital. A (7), 10 (1996), pp. 661--671], the results presented here under certain velocity regimes yield a Brinkman-to-Brinkman upscaling that allows using a single software platform to compute on both microscales and macroscales. In this paper, we discuss the homogenized Brinkman equations. We derive auxiliary cell problems to build correctors and calculate effective coefficients for certain velocity regimes. 
Due to the boundary effects, we construct

  16. A personal view on homogenization

    International Nuclear Information System (INIS)

    Tartar, L.

    1987-02-01

    The evolution of some ideas is first described. Under the name homogenization are collected all the mathematical results which help in understanding the relations between the microstructure of a material and its macroscopic properties. Homogenization results are given through a critically detailed bibliography. The mathematical models considered are systems of partial differential equations, supposed to describe some properties at a scale ε, and we want to understand what happens to the solutions as ε tends to 0
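
    The microstructure-to-macroscale relation described here has a fully explicit one-dimensional instance: for -(a(x/ε)u')' = f, the homogenized coefficient is the harmonic mean of a, not its arithmetic mean. A small numerical check, with a hypothetical two-phase periodic coefficient:

```python
import numpy as np

a1, a2 = 1.0, 10.0                     # hypothetical two-phase conductivities
a_eff = 2.0 / (1.0 / a1 + 1.0 / a2)    # harmonic mean ~ 1.82 (arithmetic mean would be 5.5)

periods = 200                          # number of microstructure periods, i.e. eps = 1/200
n = periods * 100
x = (np.arange(n) + 0.5) / n           # midpoint grid on (0, 1)
a = np.where((x * periods) % 1.0 < 0.5, a1, a2)

# Exact 1D solution of -(a u')' = 1 with u(0) = u(1) = 0, by quadrature:
# u'(x) = (C - x) / a(x), with the constant C fixed by u(1) = 0.
C = np.sum(x / a) / np.sum(1.0 / a)
u = np.cumsum((C - x) / a) / n

# Homogenized solution u0(x) = x (1 - x) / (2 a_eff); compare at the midpoint.
mid = n // 2
u0_mid = 0.5 * 0.5 / (2.0 * a_eff)
```

As ε tends to 0, the oscillatory solution converges to the homogenized one; with ε = 1/200 the midpoint values already agree to a few parts per thousand, while the arithmetic mean would miss by a factor of three.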

  17. Homogeneous turbulence dynamics

    CERN Document Server

    Sagaut, Pierre

    2018-01-01

    This book provides state-of-the-art results and theories in homogeneous turbulence, including anisotropy and compressibility effects with extension to quantum turbulence, magneto-hydrodynamic turbulence and turbulence in non-Newtonian fluids. Each chapter is devoted to a given type of interaction (strain, rotation, shear, etc.), and presents and compares experimental data, numerical results, analysis of the Reynolds stress budget equations and advanced multipoint spectral theories. The role of both linear and non-linear mechanisms is emphasized. The link between the statistical properties and the dynamics of coherent structures is also addressed. Despite its restriction to homogeneous turbulence, the book is of interest to all people working in turbulence, since the basic physical mechanisms which are present in all turbulent flows are explained. The reader will find a unified presentation of the results and a clear presentation of existing controversies. Special attention is given to bridge the results obta...

  18. A generalized model for homogenized reflectors

    International Nuclear Information System (INIS)

    Pogosbekyan, Leonid; Kim, Yeong Il; Kim, Young Jin; Joo, Hyung Kook

    1996-01-01

    A new concept of equivalent homogenization is proposed. The concept employs a new set of homogenized parameters: homogenized cross sections (XS) and an interface matrix (IM), which relates partial currents at the cell interfaces. The idea of the interface matrix generalizes the idea of discontinuity factors (DFs), proposed and developed by K. Koebke and K. Smith. The method of K. Smith can be simulated within the framework of the new method, while the new method approximates the heterogeneous cell better in the case of steep flux gradients at the cell interfaces. The attractive features of the new concept are: improved accuracy, simplicity of incorporation in existing codes, and numerical expenses equal to those of K. Smith's approach. The new concept is useful for: (a) explicit reflector/baffle simulation; (b) control blade simulation; (c) mixed UO 2 /MOX core simulation. The offered model has been incorporated in a finite difference code and in the nodal code PANBOX. The numerical results show good accuracy of core calculations and insensitivity of the homogenized parameters with respect to in-core conditions

  19. Dissolution test for homogeneity of mixed oxide fuel pellets

    International Nuclear Information System (INIS)

    Lerch, R.E.

    1979-08-01

    Experiments were performed to determine the relationship between fuel pellet homogeneity and pellet dissolubility. Although, in general, the amount of pellet residue decreased with increased homogeneity, as measured by the pellet figure of merit, the relationship was not absolute. Thus, all pellets with high figure of merit (excellent homogeneity) do not necessarily dissolve completely and all samples that dissolve completely do not necessarily have excellent homogeneity. It was therefore concluded that pellet dissolubility measurements could not be substituted for figure of merit determinations as a measurement of pellet homogeneity. 8 figures, 3 tables

  20. Homogenization patterns of the world's freshwater fish faunas.

    Science.gov (United States)

    Villéger, Sébastien; Blanchet, Simon; Beauchard, Olivier; Oberdorff, Thierry; Brosse, Sébastien

    2011-11-01

    The world is currently undergoing an unprecedented decline in biodiversity, which is mainly attributable to human activities. For instance, nonnative species introduction, combined with the extirpation of native species, affects biodiversity patterns, notably by increasing the similarity among species assemblages. This biodiversity change, called taxonomic homogenization, has rarely been assessed at the world scale. Here, we fill this gap by assessing the current homogenization status of one of the most diverse vertebrate groups (i.e., freshwater fishes) at global and regional scales. We demonstrate that current homogenization of the freshwater fish faunas is still low at the world scale (0.5%) but reaches substantial levels (up to 10%) in some highly invaded river basins from the Nearctic and Palearctic realms. In these realms experiencing high changes, nonnative species introductions rather than native species extirpations drive taxonomic homogenization. Our results suggest that the "Homogocene era" is not yet the case for freshwater fish fauna at the worldwide scale. However, the distressingly high level of homogenization noted for some biogeographical realms stresses the need for further understanding of the ecological consequences of homogenization processes.
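
    Taxonomic homogenization of the kind measured here is typically quantified as the change in mean pairwise similarity between species assemblages before and after introductions and extirpations. A toy sketch with hypothetical basins and Jaccard similarity (the study's actual index and data differ):

```python
def jaccard(a, b):
    """Jaccard similarity of two species assemblages (sets of species)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Hypothetical historical faunas of two river basins.
basin1_hist = {"A", "B", "C"}
basin2_hist = {"C", "D", "E"}

# The same nonnative species "X" establishes in both basins.
basin1_now = basin1_hist | {"X"}
basin2_now = basin2_hist | {"X"}

before = jaccard(basin1_hist, basin2_hist)   # 1 shared / 5 total = 0.2
after = jaccard(basin1_now, basin2_now)      # 2 shared / 6 total ~ 0.33
homogenization = after - before              # positive => faunas became more similar
```

A positive change means the two faunas became more alike, which is exactly the effect the study attributes mainly to nonnative introductions rather than native extirpations.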

  1. Homogen Mur - et udviklingsprojekt

    DEFF Research Database (Denmark)

    Dahl, Torben; Beim, Anne; Sørensen, Peter

    1997-01-01

    Mølletorvet in Slagelse is the first building project in Denmark in which the outer walls are built of homogeneous, load-bearing and insulating clay blocks. The project demonstrates a range of the possibilities that the use of homogeneous block masonry offers with respect to structure, energy performance and architecture.

  2. At-tank Low-Activity Feed Homogeneity Analysis Verification

    International Nuclear Information System (INIS)

    DOUGLAS, J.G.

    2000-01-01

    This report evaluates the merit of selecting sodium, aluminum, and cesium-137 as analytes to indicate homogeneity of soluble species in low-activity waste (LAW) feed and recommends possible analytes and physical properties that could serve as rapid screening indicators for LAW feed homogeneity. The three analytes are adequate as screening indicators of soluble species homogeneity for tank waste when a mixing pump is used to thoroughly mix the waste in the waste feed staging tank and when all dissolved species are present at concentrations well below their solubility limits. If either of these conditions is violated, then the three indicators may not be sufficiently chemically representative of other waste constituents to reliably indicate homogeneity in the feed supernatant. Additional homogeneity indicators that should be considered are anions such as fluoride, sulfate, and phosphate, total organic carbon/total inorganic carbon, and total alpha to estimate the transuranic species. Physical property measurements such as gamma profiling, conductivity, specific gravity, and total suspended solids are recommended as possible at-tank methods for indicating homogeneity. Indicators of LAW feed homogeneity are needed to reduce the U.S. Department of Energy, Office of River Protection (ORP) Program's contractual risk by assuring that the waste feed is within the contractual composition and can be supplied to the waste treatment plant within the schedule requirements

  3. Verification of homogenization in fast critical assembly analyses

    International Nuclear Information System (INIS)

    Chiba, Go

    2006-01-01

    In the present paper, homogenization procedures for fast critical assembly analyses are investigated. Errors caused by homogenizations are evaluated by the exact perturbation theory. In order to obtain reference solutions, three-dimensional plate-wise transport calculations are performed. It is found that the angular neutron flux along plate boundaries has a significant peak in the fission source energy range. To treat this angular dependence accurately, the double-Gaussian Chebyshev angular quadrature set with S 24 is applied. It is shown that the difference between the heterogeneous leakage theory and the homogeneous theory is negligible, and that transport cross sections homogenized with neutron flux significantly underestimate neutron leakage. The error in criticality caused by a homogenization is estimated at about 0.1%Δk/kk' in a small fast critical assembly. In addition, the neutron leakage is overestimated by both leakage theories when sodium plates in fuel lattices are voided. (author)

  4. Cosmic homogeneity: a spectroscopic and model-independent measurement

    Science.gov (United States)

    Gonçalves, R. S.; Carvalho, G. C.; Bengaly, C. A. P., Jr.; Carvalho, J. C.; Bernui, A.; Alcaniz, J. S.; Maartens, R.

    2018-03-01

    Cosmology relies on the Cosmological Principle, i.e. the hypothesis that the Universe is homogeneous and isotropic on large scales. This implies in particular that the counts of galaxies should approach a homogeneous scaling with volume at sufficiently large scales. Testing homogeneity is crucial to obtain a correct interpretation of the physical assumptions underlying the current cosmic acceleration and structure formation of the Universe. In this letter, we use the Baryon Oscillation Spectroscopic Survey to make the first spectroscopic and model-independent measurements of the angular homogeneity scale θh. Applying four statistical estimators, we show that the angular distribution of galaxies in the range 0.46 < z < 0.62 is consistent with homogeneity at large scales, and that θh varies with redshift, indicating a smoother Universe in the past. These results are in agreement with the foundations of the standard cosmological paradigm.
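
    The scaling idea behind a homogeneity-scale measurement can be sketched simply: for a homogeneous point distribution, counts within spheres grow as r raised to the ambient dimension, so the estimated correlation dimension approaches that dimension. A simplified 2D illustration with synthetic points (not the BOSS data or the paper's four estimators):

```python
import numpy as np

rng = np.random.default_rng(1)

# A homogeneous (Poisson) point set in the unit square: the mean number of
# neighbours within radius r scales as r**2, so the correlation dimension
# D2 approaches the ambient dimension 2. Clustered (fractal) sets give D2 < 2.
pts = rng.uniform(0.0, 1.0, size=(4000, 2))
centers = pts[:200]
dists = np.linalg.norm(pts[None, :, :] - centers[:, None, :], axis=2)

def mean_count(r):
    # Use only centers at least r from the boundary, to avoid edge effects.
    inner = np.all((centers > r) & (centers < 1.0 - r), axis=1)
    # Subtract 1 to exclude each center counting itself.
    return (dists[inner] < r).sum(axis=1).mean() - 1.0

r1, r2 = 0.05, 0.10
d2 = np.log(mean_count(r2) / mean_count(r1)) / np.log(r2 / r1)
```

In a survey analysis, the homogeneity scale is then the radius above which the estimated dimension stays within some tolerance of the ambient value.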

  5. A Modified Homogeneous Balance Method and Its Applications

    International Nuclear Information System (INIS)

    Liu Chunping

    2011-01-01

    A modified homogeneous balance method is proposed by improving some key steps in the homogeneous balance method. Bilinear equations of some nonlinear evolution equations are derived by using the modified homogeneous balance method. Generalized Boussinesq equation, KP equation, and mKdV equation are chosen as examples to illustrate our method. This approach is also applicable to a large variety of nonlinear evolution equations. (general)

  6. Multilevel Monte Carlo Approaches for Numerical Homogenization

    KAUST Repository

    Efendiev, Yalchin R.; Kronsbein, Cornelia; Legoll, Fré dé ric

    2015-01-01

    it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison

  7. Homogenization in powder compacts of UO2-PuO2

    International Nuclear Information System (INIS)

    Verma, R.

    1979-01-01

    The homogenization kinetics in mixed UO 2 -PuO 2 compacts have been studied by adopting a concentric core-shell model of diffusion. An equation relating the extent of homogenization expressed in terms of the fraction of UO 2 remaining undissolved and the time of annealing has been derived. From the equation, the periods required at different annealing temperatures to attain a specified level of homogenization have been calculated. These calculated homogenization times have been found to be in fair agreement with the experimentally observed homogenization times. The derived relationship has also been shown to satisfactorily predict homogenization in Cu-Ni powder compacts. (Auth.)
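
    Without reproducing the thesis's derived core-shell equation, the temperature dependence of such annealing times can be sketched generically: if the interdiffusion coefficient follows an Arrhenius law, the time to reach a fixed degree of homogenization scales as 1/D(T). All numerical values below are hypothetical, chosen only to illustrate the scaling:

```python
import math

# If D(T) = D0 * exp(-Q / (R * T)), then the time to reach a fixed level of
# homogenization scales as 1/D(T), so annealing times at two temperatures obey
#   t2 / t1 = D(T1) / D(T2) = exp((Q / R) * (1/T2 - 1/T1)).
R = 8.314      # gas constant, J/(mol K)
Q = 400e3      # activation energy, J/mol (hypothetical value)

def time_ratio(T1, T2):
    """Factor by which the annealing time changes when going from T1 to T2 (K)."""
    return math.exp((Q / R) * (1.0 / T2 - 1.0 / T1))

# Raising the anneal from 1600 C to 1700 C (temperatures in kelvin):
speedup = 1.0 / time_ratio(1873.0, 1973.0)
```

With these illustrative numbers, a 100 K increase shortens the required annealing time severalfold, which is the qualitative behaviour such diffusion-controlled homogenization models predict.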

  8. Homogeneity and Entropy

    Science.gov (United States)

    Tignanelli, H. L.; Vazquez, R. A.; Mostaccio, C.; Gordillo, S.; Plastino, A.

    1990-11-01

    RESUMEN: We present a methodology for analyzing homogeneity, based on Information Theory, applicable to samples of observational data. ABSTRACT: Standard concepts that underlie Information Theory are employed in order to design a methodology that enables one to analyze the homogeneity of a given data sample. Key words: DATA ANALYSIS

  9. Selection of suitable prodrug candidates for in vivo studies via in vitro studies; the correlation of prodrug stability between cell culture homogenates and human tissue homogenates.

    Science.gov (United States)

    Tsume, Yasuhiro; Amidon, Gordon L

    2012-01-01

    To determine the correlations/discrepancies of drug stabilities between homogenates of cultured human cells and of human tissues. Amino acid/dipeptide monoester prodrugs of floxuridine were chosen as the model drugs. The stabilities (half-lives) of floxuridine prodrugs in human tissue homogenates (pancreas, liver, and small intestine) were obtained and compared with those in cell culture homogenates (AsPC-1, Capan-2, and Caco-2 cells) as well as human liver microsomes. The correlations of prodrug stability in human small bowel tissue homogenate vs. Caco-2 cell homogenate, human liver tissue homogenate vs. human liver microsomes, and human pancreatic tissue homogenate vs. pancreatic cell (AsPC-1 and Capan-2) homogenates were examined. The stabilities of floxuridine prodrugs in human small bowel homogenate correlated strongly with those in Caco-2 cell homogenate (slope = 1.0-1.3, r2 = 0.79-0.98). The stabilities of those prodrugs in human pancreas tissue homogenate also correlated well with those in AsPC-1 and Capan-2 cell homogenates (slope = 0.5-0.8, r2 = 0.58-0.79). However, the correlations between human liver tissue homogenates and human liver microsomes were weaker than the others (slope = 1.3-1.9, r2 = 0.07-0.24). Overall, the comparison of drug stabilities in cultured cell homogenates and in human tissue homogenates exhibited a wide range of correlations (r2 = 0.07-0.98). Such in vitro studies in cell homogenates would be good tools for predicting drug stability in vivo and for selecting drug candidates for further development. In this series of experiments, 5'-O-D-valyl-floxuridine and 5'-O-L-phenylalanyl-L-tyrosyl-floxuridine would be selected as candidates for orally targeted drug delivery in cancer chemotherapy owing to their relatively good stability compared with the other tested prodrugs.

  10. Toward whole-core neutron transport without spatial homogenization

    International Nuclear Information System (INIS)

    Lewis, E. E.

    2009-01-01

    Full text of publication follows: A long-term goal of computational reactor physics is the deterministic analysis of power reactor core neutronics without incurring significant discretization errors in the energy, spatial or angular variables. In principle, given large enough parallel configurations with unlimited CPU time and memory, this goal could be achieved using existing three-dimensional neutron transport codes. In practice, however, solving the Boltzmann equation for neutrons over the six-dimensional phase space is made intractable by the nature of neutron cross sections and the complexity and size of power reactor cores. Tens of thousands of energy groups would be required for faithful cross-section representation. Likewise, the numerous material interfaces present in power reactor lattices require exceedingly fine spatial mesh structures; these ubiquitous interfaces preclude effective implementation of adaptive-grid, mesh-less methods and related techniques that have been applied so successfully in other areas of engineering science. These challenges notwithstanding, substantial progress continues in the pursuit of more robust deterministic methods for whole-core neutronics analysis. This paper examines the progress over roughly the last decade, emphasizing the space-angle variables and the quest to eliminate errors attributable to spatial homogenization. As prologue, we briefly assess the 1990s methods used in light water reactor analysis and review the lessons learned from the C5G7 benchmark exercises, which were originated in 1999 to appraise the ability of transport codes to perform core calculations without homogenization. We proceed by examining progress over the last decade, much of which falls into three areas. These may be broadly characterized as reduced homogenization, dynamic homogenization and planar-axial synthesis. In the first, homogenization in three-dimensional calculations is reduced from the fuel assembly to the pin-cell level. In the second…

  11. Homogenization models for thin rigid structured surfaces and films.

    Science.gov (United States)

    Marigo, Jean-Jacques; Maurel, Agnès

    2016-07-01

    A homogenization method for thin microstructured surfaces and films is presented. In both cases, sound-hard materials are considered, associated with Neumann boundary conditions, and the wave equation in the time domain is examined. For a structured surface, a boundary condition is obtained on an equivalent flat wall, which links the acoustic velocity to its normal and tangential derivatives (of the Myers type). For a structured film, jump conditions are obtained for the acoustic pressure and the normal velocity across an equivalent interface (of the Ventcel type). This interface homogenization is based on a matched asymptotic expansion technique, and differs slightly from classical homogenization, which is known to fail for small structuration thicknesses. In order to gain insight into what causes this failure, a two-step homogenization is proposed, mixing classical homogenization and matched asymptotic expansion. The results of the two homogenizations are analyzed in light of the associated elementary problems, which correspond to problems of fluid mechanics, namely potential flows around rigid obstacles.

  12. Sewage sludge disintegration by high-pressure homogenization: a sludge disintegration model.

    Science.gov (United States)

    Zhang, Yuxuan; Zhang, Panyue; Ma, Boqiang; Wu, Hao; Zhang, Sheng; Xu, Xin

    2012-01-01

    High-pressure homogenization (HPH) technology was applied as a pretreatment to disintegrate sewage sludge. The effects of homogenization pressure, homogenization cycle number, and total solid content on sludge disintegration were investigated. The sludge disintegration degree (DD(COD)), protein concentration, and polysaccharide concentration increased with increasing homogenization pressure and homogenization cycle number, and decreased with increasing sludge total solid (TS) content. The maximum DD(COD) of 43.94% was achieved at 80 MPa with four homogenization cycles for a 9.58 g/L TS sludge sample. A HPH sludge disintegration model, DD(COD) = k·N^a·P^b, was established by multivariable linear regression to quantify the effects of the homogenization parameters. The homogenization cycle exponent a and homogenization pressure exponent b were 0.4763 and 0.7324 respectively, showing that the effect of homogenization pressure (P) was more significant than that of homogenization cycle number (N). The value of the rate constant k decreased with increasing sludge total solid content. The specific energy consumption increased with increasing sludge disintegration efficiency; lower specific energy consumption was required for sludge with higher total solid content.
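The fitted power law lends itself to a one-line predictor. A minimal sketch follows; the exponents a and b are the values reported in the abstract, while the rate constant k (which the abstract says depends on total solid content) is a purely illustrative placeholder.

```python
def disintegration_degree(N, P, k=0.01, a=0.4763, b=0.7324):
    """Sludge disintegration degree from the power-law model
    DD(COD) = k * N**a * P**b, with N homogenization cycles and
    P the homogenization pressure in MPa.  Exponents a, b are the
    reported fitted values; k here is illustrative only."""
    return k * N ** a * P ** b

# The pressure exponent dominates: doubling P multiplies DD(COD) by
# 2**0.7324 (about 1.66), doubling N only by 2**0.4763 (about 1.39).
```

This makes the abstract's conclusion concrete: four cycles at 80 MPa yield a higher predicted disintegration than eight cycles at 40 MPa, because b > a.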

  13. A non-asymptotic homogenization theory for periodic electromagnetic structures.

    Science.gov (United States)

    Tsukerman, Igor; Markel, Vadim A

    2014-08-08

    Homogenization of electromagnetic periodic composites is treated as a two-scale problem and solved by approximating the fields on both scales with eigenmodes that satisfy Maxwell's equations and boundary conditions as accurately as possible. Built into this homogenization methodology is an error indicator whose value characterizes the accuracy of homogenization. The proposed theory allows one to define not only bulk, but also position-dependent material parameters (e.g. in proximity to a physical boundary) and to quantify the trade-off between the accuracy of homogenization and its range of applicability to various illumination conditions.

  14. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large-scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small-scale (10-100 m) habitat variability on large-scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
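A minimal sketch of the large-scale parameter this line of work produces: for ecological diffusion u_t = (μ(x)u)_xx, homogenization yields a large-scale motility that is a harmonic-type average of the patch motilities. The 1D patch landscape and the length weighting below are my simplified reading of that result, not a reproduction of the paper's derivation.

```python
import numpy as np

def homogenized_motility(mu, weights=None):
    """Harmonic-mean motility over small-scale habitat patches.

    mu: motility in each patch; weights: relative patch lengths
    (uniform if omitted). Sketch of the homogenized coefficient for
    ecological diffusion, where slow habitat dominates the average.
    """
    mu = np.asarray(mu, dtype=float)
    w = np.ones_like(mu) if weights is None else np.asarray(weights, dtype=float)
    w = w / w.sum()
    return 1.0 / np.sum(w / mu)

# An animal moving slowly through 10% of the landscape (mu = 1) and fast
# through the rest (mu = 100) is dominated by the slow habitat: the
# harmonic mean is about 9.2, far below the arithmetic mean of 90.1.
slow_dominated = homogenized_motility([1.0, 100.0], weights=[0.1, 0.9])
```

This is the sense in which 10-100 m habitat variability controls 10-100 km movement: a small fraction of slow habitat drags the effective large-scale motility down disproportionately.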

  15. Persymmetric Adaptive Detectors of Subspace Signals in Homogeneous and Partially Homogeneous Clutter

    Directory of Open Access Journals (Sweden)

    Ding Hao

    2015-08-01

    In the field of adaptive radar detection, an effective strategy to improve the detection performance is to exploit the structural information of the covariance matrix, especially in the case of insufficient reference cells. Thus, in this study, the problem of detecting multidimensional subspace signals is discussed by considering the persymmetric structure of the clutter covariance matrix, which implies that the covariance matrix is persymmetric about its cross diagonal. Persymmetric adaptive detectors are derived on the basis of the one-step principle as well as the two-step Generalized Likelihood Ratio Test (GLRT) in homogeneous and partially homogeneous clutter. The proposed detectors consider the structural information of the covariance matrix at the design stage. Simulation results suggest performance improvement compared with existing detectors when reference cells are insufficient. Moreover, the detection performance is assessed with respect to the effects of the covariance matrix, signal subspace dimension, mismatch of the signal subspace, and signal fluctuations.

  16. Layout optimization using the homogenization method

    Science.gov (United States)

    Suzuki, Katsuyuki; Kikuchi, Noboru

    1993-01-01

    A generalized layout problem involving sizing, shape, and topology optimization is solved by using the homogenization method for three-dimensional linearly elastic shell structures in order to seek a possibility of establishment of an integrated design system of automotive car bodies, as an extension of the previous work by Bendsoe and Kikuchi. A formulation of a three-dimensional homogenized shell, a solution algorithm, and several examples of computing the optimum layout are presented in this first part of the two articles.

  17. Peripheral nerve magnetic stimulation: influence of tissue non-homogeneity

    Directory of Open Access Journals (Sweden)

    Papazov Sava P

    2003-12-01

    Background: Peripheral nerves are situated in a highly non-homogeneous environment, including muscles, bones, blood vessels, etc. Time-varying magnetic field stimulation of the median and ulnar nerves in the carpal region is studied, with special consideration of the influence of non-homogeneities. Methods: A detailed three-dimensional finite element model (FEM) of the anatomy of the wrist region was built to assess the induced current distribution due to external magnetic stimulation. The electromagnetic field distribution in the non-homogeneous domain was defined as an internal Dirichlet problem using the finite element method. The boundary conditions were obtained by analysis of the vector potential field excited by external current-driven coils. Results: The results include evaluation and graphical representation of the induced current field distribution at various stimulation coil positions. A comparative study between the real non-homogeneous structure with anisotropic tissue conductivities and a mock homogeneous medium is also presented. The possibility of achieving selective stimulation of either of the two nerves is assessed. Conclusion: The model developed could be useful in theoretical prediction of the current distribution in the nerves during diagnostic stimulation and therapeutic procedures involving electromagnetic excitation. The errors in applying homogeneous domain modelling rather than real non-homogeneous biological structures are demonstrated. The practical implications of the applied approach are valid for any arbitrary weakly conductive medium.

  18. Spatial homogenization method based on the inverse problem

    International Nuclear Information System (INIS)

    Tóta, Ádám; Makai, Mihály

    2015-01-01

    Highlights: • We derive a spatial homogenization method in slab and cylindrical geometries. • The fluxes and the currents on the boundary are preserved. • The reaction rates and the integral of the fluxes are preserved. • We present verification computations utilizing two- and four-energy groups. - Abstract: We present a method for deriving homogeneous multi-group cross sections to replace a heterogeneous region's multi-group cross sections, provided that the fluxes, the currents on the external boundary, the reaction rates and the integral of the fluxes are preserved. We consider one-dimensional geometries: a symmetric slab and a homogeneous cylinder. Assuming that the boundary fluxes are given, two response matrices (RMs) can be defined concerning the current and the flux integral. The first derives the boundary currents from the boundary fluxes, while the second derives the flux integrals from the boundary fluxes. Further RMs can be defined that connect the reaction rates to the boundary fluxes. Assuming that these matrices are known, we present formulae that reconstruct the multi-group diffusion cross-section matrix, the diffusion coefficients and the reaction cross sections in the case of one-dimensional (1D) homogeneous regions. We apply these formulae to 1D heterogeneous regions and thus obtain a homogenization method. This method produces an equivalent homogeneous material such that the fluxes and the currents on the external boundary, the reaction rates and the integral of the fluxes are preserved for any boundary fluxes. We carry out the exact derivations in 1D slab and cylindrical geometries. Verification computations for the presented homogenization method were performed using two- and four-group material cross sections, both in a slab and in a cylindrical geometry.
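For context, the simplest homogenization that preserves region-integrated reaction rates (though not, by itself, the boundary currents this abstract's method additionally preserves) is classical flux-volume weighting. A minimal sketch, with made-up two-region numbers:

```python
import numpy as np

def flux_weighted_xs(sigma, phi, vol):
    """Flux-volume-weighted homogenized cross section.

    Preserves the region-integrated reaction rate sum(sigma*phi*vol)
    given the heterogeneous flux phi:  Sigma_hom = RR / integral(phi).
    This is the classical weighting that response-matrix methods
    generalize; the numbers used below are illustrative.
    """
    sigma = np.asarray(sigma, dtype=float)
    phi = np.asarray(phi, dtype=float)
    vol = np.asarray(vol, dtype=float)
    rr = np.sum(sigma * phi * vol)        # total reaction rate
    return rr / np.sum(phi * vol)         # homogenized Sigma

# Two sub-regions with different cross sections and fluxes:
sigma_hom = flux_weighted_xs(sigma=[0.1, 0.3], phi=[2.0, 1.0], vol=[1.0, 1.0])
```

By construction, multiplying sigma_hom by the flux integral reproduces the heterogeneous reaction rate exactly, which is the conservation statement the abstract extends to currents and boundary fluxes.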

  19. Homogeneous bilateral block shifts

    Indian Academy of Sciences (India)

    Douglas class were classified in [3]; they are unilateral block shifts of arbitrary block size (i.e. dim H(n) can be anything). However, no examples of irreducible homogeneous bilateral block shifts of block size larger than 1 were known until now.

  20. Homogenization patterns of the world’s freshwater fish faunas

    Science.gov (United States)

    Villéger, Sébastien; Blanchet, Simon; Beauchard, Olivier; Oberdorff, Thierry; Brosse, Sébastien

    2011-01-01

    The world is currently undergoing an unprecedented decline in biodiversity, which is mainly attributable to human activities. For instance, nonnative species introduction, combined with the extirpation of native species, affects biodiversity patterns, notably by increasing the similarity among species assemblages. This biodiversity change, called taxonomic homogenization, has rarely been assessed at the world scale. Here, we fill this gap by assessing the current homogenization status of one of the most diverse vertebrate groups (i.e., freshwater fishes) at global and regional scales. We demonstrate that current homogenization of the freshwater fish faunas is still low at the world scale (0.5%) but reaches substantial levels (up to 10%) in some highly invaded river basins from the Nearctic and Palearctic realms. In these realms experiencing high changes, nonnative species introductions rather than native species extirpations drive taxonomic homogenization. Our results suggest that the “Homogocene era” is not yet the case for freshwater fish fauna at the worldwide scale. However, the distressingly high level of homogenization noted for some biogeographical realms stresses the need for further understanding of the ecological consequences of homogenization processes. PMID:22025692
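Taxonomic homogenization of this kind is typically quantified as the change in mean pairwise similarity of species assemblages between historical and current faunas. A minimal sketch using the Jaccard index; the index choice and the three toy "basins" are illustrative assumptions, not the study's actual data or metric.

```python
from itertools import combinations

def jaccard(a, b):
    """Jaccard similarity of two species sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def mean_pairwise_similarity(assemblages):
    pairs = list(combinations(assemblages, 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

def homogenization_pct(historical, current):
    """Percentage-point change in mean pairwise similarity.
    Positive = homogenization, negative = differentiation."""
    return 100 * (mean_pairwise_similarity(current)
                  - mean_pairwise_similarity(historical))

# One widely introduced nonnative species raises the similarity of every
# basin pair, i.e. drives homogenization:
historical = [{"a", "b"}, {"b", "c"}, {"c", "d"}]
current = [basin | {"carp"} for basin in historical]
change = homogenization_pct(historical, current)  # about +17.8 points
```

This illustrates the abstract's mechanism: shared introductions raise every pairwise similarity at once, which is why introductions rather than extirpations dominate the signal in heavily invaded realms.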

  1. Observational homogeneity of the Universe

    International Nuclear Information System (INIS)

    Bonnor, W.B.; Ellis, G.F.R.

    1986-01-01

    A new approach to observational homogeneity is presented. The observation that stars and galaxies in distant regions appear similar to those nearby may be taken to imply that matter has had a similar thermodynamic history in widely separated parts of the Universe (the Postulate of Uniform Thermal Histories, or PUTH). The supposition is now made that similar thermodynamic histories imply similar dynamical histories. Then the distant apparent similarity is evidence for spatial homogeneity of the Universe. General Relativity is used to test this idea, taking a perfect fluid model and implementing PUTH by the condition that the density and entropy per baryon shall be the same function of the proper time along all galaxy world-lines. (author)

  2. Cross section homogenization analysis for a simplified Candu reactor

    International Nuclear Information System (INIS)

    Pounders, Justin; Rahnema, Farzad; Mosher, Scott; Serghiuta, Dumitru; Turinsky, Paul; Sarsour, Hisham

    2008-01-01

    The effect of using zero-current (infinite medium) boundary conditions to generate bundle-homogenized cross sections for a stylized half-core Candu reactor problem is examined. Homogenized cross sections from infinite-medium lattice calculations are compared with cross sections homogenized using the exact flux from the reference core environment. The impact of these cross-section differences is quantified by generating nodal diffusion theory solutions with both sets of cross sections. It is shown that the infinite-medium spatial approximation is not negligible, and that ignoring the impact of the heterogeneous core environment on cross-section homogenization leads to increased errors, particularly near control elements and the core periphery. (authors)

  3. Hydrogen Production by Homogeneous Catalysis: Alcohol Acceptorless Dehydrogenation

    DEFF Research Database (Denmark)

    Nielsen, Martin

    2015-01-01

    …in hydrogen production from biomass using homogeneous catalysis. Homogeneous catalysis has the advantage of generally performing transformations under much milder conditions than traditional heterogeneous catalysis, and hence it constitutes a promising tool for future applications in a sustainable energy sector…

  4. Dynamics of homogeneous nucleation

    DEFF Research Database (Denmark)

    Toxværd, Søren

    2015-01-01

    The classical nucleation theory for homogeneous nucleation is formulated as a theory for a density fluctuation in a supersaturated gas at a given temperature. But molecular dynamics simulations reveal that it is small cold clusters which initiate the nucleation. The temperature in the nucleating…

  5. A convenient procedure for magnetic field homogeneity evaluation

    International Nuclear Information System (INIS)

    Teles, J; Garrido, C E; Tannus, A

    2004-01-01

    In many areas of research that utilize magnetic fields, it is important to obtain fields with a spatial distribution as homogeneous as possible. A procedure usually utilized to evaluate and to optimize field homogeneity is the expansion of the measured field in spherical harmonic components. In addition to the methods proposed in the literature, we present a more convenient procedure for the evaluation of field homogeneity inside a spherical volume. The procedure uses the orthogonality property of the spherical harmonics to find the field variance. It is shown that the total field variance is equal to the sum of the individual variances of each field component in the spherical harmonic expansion. Besides the advantage of the linear behaviour of the individual variances, the field variance and standard deviation are the best parameters for characterizing global field homogeneity.
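The additivity claim rests on orthogonality alone, so it can be checked numerically in one dimension. The sketch below uses Legendre polynomials on [-1, 1] (the zonal analogue of spherical harmonics) with illustrative coefficients: because the components are orthogonal and zero-mean, the cross terms in the variance vanish and the total variance equals the sum of the per-component variances.

```python
import numpy as np
from numpy.polynomial import legendre

# Gauss-Legendre nodes/weights on [-1, 1]; 16 points is exact for the
# polynomial degrees used below.  Normalize weights so avg() is an average.
x, w = legendre.leggauss(16)
w = w / w.sum()

def avg(values):
    return np.sum(w * values)

# "Field" expanded in P_1..P_4 (all zero-mean on [-1, 1]); these stand in
# for the spherical-harmonic components of a measured magnet field.
coeffs = [0.7, -0.3, 0.2, 0.05]
comps = [c * legendre.Legendre.basis(l + 1)(x) for l, c in enumerate(coeffs)]
field = sum(comps)

total_var = avg(field ** 2) - avg(field) ** 2
sum_of_vars = sum(avg(g ** 2) - avg(g) ** 2 for g in comps)
# Orthogonality kills the cross terms: total_var == sum_of_vars.
```

This is exactly the property the abstract exploits: each harmonic's contribution to the inhomogeneity can be evaluated (and shimmed) independently, and the contributions simply add.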

  6. A new concept of equivalent homogenization method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Pogoskekyan, Leonid; Kim, Young Il; Ju, Hyung Kook; Chang, Moon Hee [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-07-01

    A new concept of equivalent homogenization is proposed. The concept employs a new set of homogenized parameters: homogenized cross sections (XS) and an interface matrix (IM), which relates partial currents at the cell interfaces. The idea of the interface matrix generalizes the idea of discontinuity factors (DFs), proposed and developed by K. Koebke and K. Smith. The offered concept covers both of theirs; either can be simulated within the framework of the new concept. It also covers the Siemens KWU approach for baffle/reflector simulation, where the equivalent homogenized reflector XS are derived from the conservation of the response matrix at the interface in 1D semi-infinite slab geometry. The IM and XS of the new concept satisfy the same assumption about response matrix conservation in 1D semi-infinite slab geometry. It is expected that the new concept provides a more accurate approximation of a heterogeneous cell, especially in the case of steep flux gradients at the cell interfaces. The attractive features of the new concept are improved accuracy, simplicity of incorporation into existing codes, and numerical expense equal to that of K. Smith's approach. The new concept is useful for: (a) explicit reflector/baffle simulation; (b) control blade simulation; (c) mixed UO2/MOX core simulation. The offered model has been incorporated in a finite difference code and in the nodal code PANDOX. The numerical results show good accuracy of core calculations and insensitivity of the homogenized parameters with respect to in-core conditions. 9 figs., 7 refs. (Author).

  7. Homogenization of aligned “fuzzy fiber” composites

    KAUST Repository

    Chatzigeorgiou, George

    2011-09-01

    The aim of this work is to study composites in which carbon fibers coated with radially aligned carbon nanotubes are embedded in a matrix. The effective properties of these composites are identified using the asymptotic expansion homogenization method in two steps. Homogenization is performed in different coordinate systems, the cylindrical and the Cartesian, and a numerical example is presented. © 2011 Elsevier Ltd. All rights reserved.

  8. Structural changes in heat resisting high nickel alloys during homogenization

    International Nuclear Information System (INIS)

    Kleshchev, A.S.; Korneeva, N.N.; Yurina, O.M.; Guzej, L.S.

    1981-01-01

    The effect of homogenization on the structure and technological plasticity of the KhN73MBTYu and KhN62BMKTYu alloys during pressure working is investigated, taking into account peculiarities of the phase composition. It is shown that homogenization of the KhN73MBTYu and KhN62BMKTYu alloys increases the technological plasticity. The efficiency of homogenization is conditioned by the change in grain-boundary and carbide morphology as well as by the homogeneous distribution of the coarse γ'-phase. [ru]

  9. Homogeneous group, research, institution

    Directory of Open Access Journals (Sweden)

    Francesca Natascia Vasta

    2014-09-01

    The work outlines the complex connection among empirical research, therapeutic programs and the host institution, and considers the current state of research in Italy. The Italian research field is analyzed and critical gaps are outlined: a lack of results regarding both the therapeutic processes and the effectiveness of group-analytic treatment of eating disorders. The work investigates a homogeneous eating-disorders group run in an eating-disorders outpatient service. First, we present the methodological steps the research is based on, including the strong connection between theory and clinical tools. Secondly, the clinical tools are described and the results commented on. Finally, our results suggest the necessity of validating some more specific hypotheses: verifying the relationship between clinical improvement (reduction of the sense of exclusion and of painful emotions) and specific group therapeutic processes; and verifying the relationship between depressive feelings, relapses and the transition through a more differentiated group field. Keywords: Homogeneous group; Eating disorders; Institutional field; Therapeutic outcome

  10. Is it possible to homogenize resonant chiral metamaterials ?

    DEFF Research Database (Denmark)

    Andryieuski, Andrei; Menzel, Christoph; Rockstuhl, Carsten

    2010-01-01

    Homogenization of metamaterials is very important as it makes a description in terms of effective parameters possible. In this contribution we consider the homogenization of chiral metamaterials. We show that for some metamaterials there is an optimal meta-atom size which depends on the coupling…

  11. The Digital Autofluoroscope; L'Autofluoroscope Numerique; Tsifrovoj avtofluoroskop; Autofluoroscopio Numerico

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M. A. [Roswell Park Memorial Institute, Buffalo, NY (United States)

    1964-10-15

    cinephotographic techniques are now used for the visualization and quantitation of the passage of I-131-labelled Hippuran through the kidneys and Ba-137m through the chambers of the heart. (author) [French] The autofluoroscope is a stationary instrument designed to give a graphical representation of the distribution of gamma emitters in the human body. It has the principal characteristics of modern scanners: collimation giving good depth response and adequate resolution, high efficiency and strong contrast. The detector consists of 300 NaI(Tl) crystals, 5 cm thick and 1 cm in diameter, arranged in 15 rows of 20 crystals each, the whole forming a rectangle 24 cm long by 15 cm wide. Each of the 300 crystals is coupled to two Plexiglas light guides; the 20 guides corresponding to the crystals of a given row lead to one photocell, and the 15 guides corresponding to the crystals of a given column lead to another photocell. Pulses produced simultaneously in any two of the 35 photocells identify the crystal in which an interaction occurred. The position signal given by the photocell array is independent of pulse amplitude. Anticoincidence circuits eliminate simultaneous pulses due to Compton scattering followed by absorption of the scattered photon in a neighbouring crystal. Because of their digital character, the data supplied by the light-guide assembly can easily be stored in a magnetic core memory and then read out continuously, without being destroyed, on a full-scale, CRT or digital record for quantitative analysis. The chief advantage of the autofluoroscope over the scanner is the considerable reduction in the time needed for a scan. With the same radioisotope dose, one localizes tumours of the brain and of the…

  12. Matrix-dependent multigrid-homogenization for diffusion problems

    Energy Technology Data Exchange (ETDEWEB)

    Knapek, S. [Institut fuer Informatik, TU Muenchen (Germany)

    1996-12-31

    We present a method to approximately determine the effective diffusion coefficient on the coarse-scale level of problems with strongly varying or discontinuous diffusion coefficients. It is based on techniques also used in multigrid, like Dendy's matrix-dependent prolongations and the construction of coarse-grid operators by means of the Galerkin approximation. In numerical experiments, we compare our multigrid-homogenization method with homogenization, renormalization and averaging approaches.
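The Galerkin coarse-grid construction the abstract refers to can be sketched in a few lines: build the fine-grid operator A, a prolongation P, and form the coarse operator A_c = P^T A P. The 1D diffusion model problem and the simple linear (rather than matrix-dependent) prolongation below are illustrative simplifications of the method described.

```python
import numpy as np

def fine_operator(a):
    """Unscaled 1D finite-difference matrix for -(a(x) u')' with Dirichlet
    ends.  a: coefficients on the n+1 cell interfaces -> n x n tridiagonal."""
    n = len(a) - 1
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = a[i] + a[i + 1]
        if i > 0:
            A[i, i - 1] = -a[i]
        if i < n - 1:
            A[i, i + 1] = -a[i + 1]
    return A

def linear_prolongation(nc, nf):
    """Standard linear interpolation from nc coarse to nf = 2*nc + 1 fine nodes."""
    P = np.zeros((nf, nc))
    for j in range(nc):
        P[2 * j, j] = 0.5
        P[2 * j + 1, j] = 1.0
        P[2 * j + 2, j] = 0.5
    return P

nc, nf = 7, 15
a = np.where(np.arange(nf + 1) % 2 == 0, 1.0, 100.0)  # strongly varying coefficient
A = fine_operator(a)
P = linear_prolongation(nc, nf)
Ac = P.T @ A @ P  # Galerkin coarse-grid operator R A P with R = P^T
```

The entries of Ac encode an effective coarse-scale coefficient; matrix-dependent prolongations in the Dendy style replace the fixed 0.5 interpolation weights with weights derived from A itself, which is what makes the coarse coefficient track the strong variation better.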

  13. Surface integral formulation of Maxwell's equations for simulation of non-destructive testing by eddy currents. Preliminary study on the implementation of the fast multipole method; Formulation integrale surfacique des equations de Maxwell pour la simulation de controles non destructifs par courant de Foucault. Etude preliminaire a la mise en oeuvre de la methode multipole rapide.

    Energy Technology Data Exchange (ETDEWEB)

    Lim, T.

    2011-04-28

    …to simulate numerically an eddy current non-destructive test (EC NDT), the sensor response can be modelled via a semi-analytical volume integral approach. Although faster than the finite element method, this approach is limited to the study of planar or cylindrical parts (without accounting for edge effects) because of the complexity of the expression of the Green dyad for more general configurations. There is, however, strong industrial demand to extend the capabilities of eddy current modelling to complex configurations (deformed plates, part edges, etc.). We were therefore led to formulate the electromagnetic problem differently, with the objective of retaining a semi-analytical approach. The surface integral formulation (SIE) expresses the volume problem as an equivalent transmission problem at the (2D) interface between homogeneous subdomains. This problem reduces to the solution of a linear system (by the method of moments) whose number of unknowns is reduced owing to the surface character of the mesh. The system can then be solved by a direct solver for small configurations, which allowed us to treat several right-hand sides (i.e. different probe positions) for a single inversion of the impedance matrix. The numerical results obtained with this formulation concern plates with edge effects (edge and corner) taken into account; they agree with results obtained by the finite element method. For large configurations, we carried out a study preliminary to the adaptation of a method that accelerates the matrix-vector products arising in an iterative solver (the fast multipole method, FMM), in order to define the conditions under which the FMM computation works correctly (accuracy, convergence, etc.) in the NDT context.

  14. Nonlinear vibration of a traveling belt with non-homogeneous boundaries

    Science.gov (United States)

    Ding, Hu; Lim, C. W.; Chen, Li-Qun

    2018-06-01

    Free and forced nonlinear vibrations of a traveling belt with non-homogeneous boundary conditions are studied. Axially moving materials in operation are always externally excited and produce strong vibrations, and they are usually modelled with homogeneous boundary conditions. In this paper, non-homogeneous boundaries are introduced by the support wheels, which produce an equilibrium deformation of the belt. In order to solve for the equilibrium deformation, the differential and integral quadrature methods (DIQMs) are utilized to develop an iterative scheme. The influence of the equilibrium deformation on free and forced nonlinear vibrations of the belt is explored. The DIQMs are applied to solve for the natural frequencies and forced resonance responses of transverse vibration around the equilibrium deformation. The Galerkin truncation method (GTM) is utilized to confirm the DIQMs' results. The numerical results demonstrate that the non-homogeneous boundary conditions cause the transverse vibration to deviate from the straight equilibrium, increase the natural frequencies, and lead to coexistence of square and cubic nonlinear terms. Moreover, the influence of non-homogeneous boundaries can be exacerbated by the axial speed. Therefore, non-homogeneous boundary conditions of axially moving materials should especially be taken into account.

  15. Homogenization of Stokes and Navier-Stokes equations

    International Nuclear Information System (INIS)

    Allaire, G.

    1990-04-01

    This thesis is devoted to the homogenization of the Stokes and Navier-Stokes equations with a Dirichlet boundary condition in a domain containing many tiny obstacles. Typically, those obstacles are distributed at the nodes of a periodic lattice with the same small period in each axis direction, and their size is always asymptotically smaller than the lattice's step. With the help of the energy method, and thanks to a suitable extension of the pressure, we prove the convergence of the homogenization process when the lattice's step tends to zero (and thus the number of obstacles tends to infinity). For a so-called critical size of the obstacles, the homogenized problem turns out to be a Brinkman law (i.e., the Stokes or Navier-Stokes equation plus a linear zero-order term for the velocity in the momentum equation). For obstacles smaller than the critical size, the limit problem reduces to the initial Stokes or Navier-Stokes equations, while for larger sizes the homogenized problem is a Darcy law. Furthermore, these results have been extended to the case of obstacles included in a hyperplane, and we establish a simple model of fluid flow through grids, based on a special form of the Brinkman law. [fr]

  16. Homogenized thermal conduction model for particulate foods

    OpenAIRE

    Chinesta , Francisco; Torres , Rafael; Ramón , Antonio; Rodrigo , Mari Carmen; Rodrigo , Miguel

    2002-01-01

    International audience; This paper deals with the definition of an equivalent thermal conductivity for particulate foods. A homogenized thermal model is used to assess the effect of the particulate spatial distribution and of differences in thermal conductivities. We prove that the spatial average of the conductivity can be used in a homogenized heat transfer model if the conductivity differences among the food components are not very large, usually the highest conductivity ratio between the foods ...

  17. Soy Protein Isolate-Phosphatidylcholine Nanoemulsions Prepared Using High-Pressure Homogenization.

    Science.gov (United States)

    Li, Yang; Wu, Chang-Ling; Liu, Jun; Zhu, Ying; Zhang, Xiao-Yuan; Jiang, Lian-Zhou; Qi, Bao-Kun; Zhang, Xiao-Nan; Wang, Zhong-Jiang; Teng, Fei

    2018-05-07

    Soy protein isolate-phosphatidylcholine (SPI-PC) nanoemulsions prepared under different emulsification conditions were studied. Homogenization pressure and the number of homogenization cycles were varied, along with the SPI and PC concentrations. Evaluations included turbidity, particle size, ζ-potential, particle distribution index, and turbiscan stability index (TSI). The nanoemulsions had the best stability when SPI was at 1.5%, PC was at 0.22%, the homogenization pressure was 100 MPa and homogenization was performed 4 times. The average particle size of the SPI-PC nanoemulsions was 217 nm, the TSI was 3.02 and the emulsification yield was 93.4%.

  18. Pattern and process of biotic homogenization in the New Pangaea.

    Science.gov (United States)

    Baiser, Benjamin; Olden, Julian D; Record, Sydne; Lockwood, Julie L; McKinney, Michael L

    2012-12-07

    Human activities have reorganized the earth's biota resulting in spatially disparate locales becoming more or less similar in species composition over time through the processes of biotic homogenization and biotic differentiation, respectively. Despite mounting evidence suggesting that this process may be widespread in both aquatic and terrestrial systems, past studies have predominantly focused on single taxonomic groups at a single spatial scale. Furthermore, change in pairwise similarity is itself dependent on two distinct processes, spatial turnover in species composition and changes in gradients of species richness. Most past research has failed to disentangle the effect of these two mechanisms on homogenization patterns. Here, we use recent statistical advances and collate a global database of homogenization studies (20 studies, 50 datasets) to provide the first global investigation of the homogenization process across major faunal and floral groups and elucidate the relative role of changes in species richness and turnover. We found evidence of homogenization (change in similarity ranging from -0.02 to 0.09) across nearly all taxonomic groups, spatial extent and grain sizes. Partitioning of change in pairwise similarity shows that overall change in community similarity is driven by changes in species richness. Our results show that biotic homogenization is truly a global phenomenon and put into question many of the ecological mechanisms invoked in previous studies to explain patterns of homogenization.
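    The partitioning of change in pairwise similarity into spatial turnover and richness-gradient components can be illustrated with Baselga-style partitioning of Sørensen dissimilarity into turnover and nestedness parts. The abstract does not name the exact statistical method used, so this is one standard choice, sketched for two hypothetical species sets:

```python
def beta_partition(site1, site2):
    """Partition Sorensen dissimilarity between two species sets into
    turnover (beta_sim) and nestedness/richness-difference (beta_sne)
    components, following Baselga's (2010) scheme."""
    a = len(site1 & site2)   # species shared by both sites
    b = len(site1 - site2)   # species unique to site 1
    c = len(site2 - site1)   # species unique to site 2
    beta_sor = (b + c) / (2 * a + b + c)         # total dissimilarity
    beta_sim = min(b, c) / (a + min(b, c))       # spatial turnover
    beta_sne = beta_sor - beta_sim               # nestedness component
    return beta_sor, beta_sim, beta_sne
```

    Two sites with equal richness but different composition yield pure turnover (beta_sne = 0), whereas a site whose species are a strict subset of another's yields pure nestedness (beta_sim = 0), which is the distinction the study's partitioning exploits.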

  19. Higher-order asymptotic homogenization of periodic materials with low scale separation

    NARCIS (Netherlands)

    Ameen, M.M.; Peerlings, R.H.J.; Geers, M.G.D

    2016-01-01

    In this work, we investigate the limits of classical homogenization theories pertaining to the homogenization of periodic linear elastic composite materials at low scale separations, and demonstrate the effectiveness of higher-order periodic homogenization in alleviating this limitation. Classical

  20. Sewage sludge disintegration by combined treatment of alkaline+high pressure homogenization.

    Science.gov (United States)

    Zhang, Yuxuan; Zhang, Panyue; Zhang, Guangming; Ma, Weifang; Wu, Hao; Ma, Boqiang

    2012-11-01

    Alkaline pretreatment combined with high pressure homogenization (HPH) was applied to promote sewage sludge disintegration. For sewage sludge with a total solid content of 1.82%, the sludge disintegration degree (DD(COD)) under combined treatment was higher than the sum of the DD(COD) values for single alkaline and single HPH treatment. NaOH dosage ⩽0.04 mol/L, homogenization pressure ⩽60 MPa and a single homogenization cycle were the suitable conditions for combined sludge treatment. The combined sludge treatment showed a maximum DD(COD) of 59.26%. By regression analysis, the combined sludge disintegration model was established as DD(COD) = 0.713·C^0.334·P^0.234·N^0.119, showing that the effect of the operating parameters on sludge disintegration followed the order: NaOH dosage > homogenization pressure > number of homogenization cycles. The energy efficiency of the combined sludge treatment increased significantly compared with single HPH treatment, and high energy efficiency was achieved at low homogenization pressure with a single homogenization cycle. Copyright © 2012 Elsevier Ltd. All rights reserved.
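    The fitted power law can be evaluated directly; note that the exponents (0.334 for NaOH dosage C, 0.234 for pressure P, 0.119 for cycle count N) encode the stated sensitivity ordering. A small sketch, reading the garbled regression formula in the abstract as DD = 0.713·C^0.334·P^0.234·N^0.119 (this reading is an assumption):

```python
def dd_cod(C, P, N):
    """Sludge disintegration degree from the fitted power-law model
    (reconstructed form; coefficients as reported in the abstract).

    C: NaOH dosage (mol/L), P: homogenization pressure (MPa),
    N: number of homogenization cycles."""
    return 0.713 * C ** 0.334 * P ** 0.234 * N ** 0.119
```

    Doubling any one parameter multiplies DD by 2 raised to that parameter's exponent, so NaOH dosage (exponent 0.334) has the strongest effect, consistent with the ranking given in the abstract.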

  1. Central Andean temperature and precipitation measurements and its homogenization

    Science.gov (United States)

    Hunziker, Stefan; Gubler, Stefanie

    2015-04-01

    Observation of climatological parameters and the homogenization of the resulting time series have a well-established history in western countries. This is not the case for many other countries, such as Bolivia and Peru, where the organization of measurements, the quality of measurement equipment, equipment maintenance, staff training and data management differ fundamentally from western standards. The data need special attention, because many problems are not detected by standard quality-control procedures. Information about the weather stations, best obtained through station visits, is very beneficial: if the cause of a problem is known, some of the data may be corrected. In this study, cases of typical problems and measurement errors are demonstrated. Much research on homogenization techniques (up to the subdaily scale) has been completed in recent years. However, these studies have used data sets with the quality of western station networks, and little is known about the performance of homogenization methods on data sets from countries such as Bolivia and Peru. HOMER (HOMogenizaton softwarE in R) is one of the most recent and widely used homogenization software packages. Its performance is tested on Peruvian-like data sourced from Swiss stations (similar station density and metadata availability). The Swiss station network is a suitable test bed, because climate gradients are strong and the terrain is complex, as is also the case in the Central Andes. On the other hand, the Swiss station network is dense, and long time series and extensive metadata are available. By subsampling the station network and omitting the metadata, the conditions of a Peruvian test region are mimicked. Results are compared to a dataset homogenized by THOMAS (Tool for Homogenization of Monthly Data Series), the homogenization tool used by MeteoSwiss.

  2. Land-use intensification causes multitrophic homogenization of grassland communities.

    Science.gov (United States)

    Gossner, Martin M; Lewinsohn, Thomas M; Kahl, Tiemo; Grassein, Fabrice; Boch, Steffen; Prati, Daniel; Birkhofer, Klaus; Renner, Swen C; Sikorski, Johannes; Wubet, Tesfaye; Arndt, Hartmut; Baumgartner, Vanessa; Blaser, Stefan; Blüthgen, Nico; Börschig, Carmen; Buscot, Francois; Diekötter, Tim; Jorge, Leonardo Ré; Jung, Kirsten; Keyel, Alexander C; Klein, Alexandra-Maria; Klemmer, Sandra; Krauss, Jochen; Lange, Markus; Müller, Jörg; Overmann, Jörg; Pašalić, Esther; Penone, Caterina; Perović, David J; Purschke, Oliver; Schall, Peter; Socher, Stephanie A; Sonnemann, Ilja; Tschapka, Marco; Tscharntke, Teja; Türke, Manfred; Venter, Paul Christiaan; Weiner, Christiane N; Werner, Michael; Wolters, Volkmar; Wurst, Susanne; Westphal, Catrin; Fischer, Markus; Weisser, Wolfgang W; Allan, Eric

    2016-12-08

    Land-use intensification is a major driver of biodiversity loss. Alongside reductions in local species diversity, biotic homogenization at larger spatial scales is of great concern for conservation. Biotic homogenization means a decrease in β-diversity (the compositional dissimilarity between sites). Most studies have investigated losses in local (α)-diversity and neglected biodiversity loss at larger spatial scales. Studies addressing β-diversity have focused on single or a few organism groups (for example, ref. 4), and it is thus unknown whether land-use intensification homogenizes communities at different trophic levels, above- and belowground. Here we show that even moderate increases in local land-use intensity (LUI) cause biotic homogenization across microbial, plant and animal groups, both above- and belowground, and that this is largely independent of changes in α-diversity. We analysed a unique grassland biodiversity dataset, with abundances of more than 4,000 species belonging to 12 trophic groups. LUI, and, in particular, high mowing intensity, had consistent effects on β-diversity across groups, causing a homogenization of soil microbial, fungal pathogen, plant and arthropod communities. These effects were nonlinear and the strongest declines in β-diversity occurred in the transition from extensively managed to intermediate intensity grassland. LUI tended to reduce local α-diversity in aboveground groups, whereas the α-diversity increased in belowground groups. Correlations between the β-diversity of different groups, particularly between plants and their consumers, became weaker at high LUI. This suggests a loss of specialist species and is further evidence for biotic homogenization. The consistently negative effects of LUI on landscape-scale biodiversity underscore the high value of extensively managed grasslands for conserving multitrophic biodiversity and ecosystem service provision. Indeed, biotic homogenization rather than local diversity

  3. Applications of a systematic homogenization theory for nodal diffusion methods

    International Nuclear Information System (INIS)

    Zhang, Hong-bin; Dorning, J.J.

    1992-01-01

    The authors have recently developed a self-consistent and systematic lattice cell and fuel bundle homogenization theory, based on a multiple-spatial-scales asymptotic expansion of the transport equation in the ratio of the mean free path to the reactor characteristic dimension, for use with nodal diffusion methods. The mathematical development leads naturally to self-consistent analytical expressions for homogenized diffusion coefficients and cross sections and for flux discontinuity factors to be used in nodal diffusion calculations. The expressions for the homogenized nuclear parameters that follow from the systematic homogenization theory (SHT) are different from those for the traditional flux- and volume-weighted (FVW) parameters. The calculations summarized here show that the systematic homogenization theory developed recently for nodal diffusion methods yields accurate values for k_eff and assembly powers even when compared with the results of a fine-mesh transport calculation. Thus, it provides a practical alternative to equivalence theory and GET (Ref. 3) and to simplified equivalence theory, which requires auxiliary fine-mesh calculations for assemblies embedded in a typical environment to determine the discontinuity factors and the equivalent diffusion coefficient for a homogenized assembly.
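    For reference, the traditional FVW parameters mentioned above weight each region's cross section by its flux-volume product, sigma_hom = sum(phi_i V_i sigma_i) / sum(phi_i V_i). A minimal sketch with illustrative numbers (not values from the paper):

```python
def fvw_homogenized(regions):
    """Flux-and-volume-weighted (FVW) homogenized cross section.

    regions: iterable of (phi, V, sigma) tuples giving the average flux,
    volume and cross section of each sub-region of the assembly."""
    num = sum(phi * V * sigma for phi, V, sigma in regions)
    den = sum(phi * V for phi, V, _ in regions)
    return num / den

# Two-region example: high-flux region pulls the average toward its sigma.
sigma_hom = fvw_homogenized([(1.0, 1.0, 0.5), (2.0, 1.0, 1.0)])
```

    The reaction-rate-preserving character of this weighting (regions with more flux count more) is exactly what the SHT expressions refine with additional correction terms.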

  4. Spray structure as generated under homogeneous flash boiling nucleation regime

    International Nuclear Information System (INIS)

    Levy, M.; Levy, Y.; Sher, E.

    2014-01-01

    We show the effect of the initial pressure and temperature on the spatial distribution of droplets size and their velocity profile inside a spray cloud that is generated by a flash boiling mechanism under homogeneous nucleation regime. We used TSI's Phase Doppler Particle Analyzer (PDPA) to characterize the spray. We conclude that the homogeneous nucleation process is strongly affected by the initial liquid temperature while the initial pressure has only a minor effect. The spray shape is not affected by temperature or pressure under homogeneous nucleation regime. We noted that the only visible effect is in the spray opacity. Finally, homogeneous nucleation may be easily achieved by using a simple atomizer construction, and thus is potentially suitable for fuel injection systems in combustors and engines. - Highlights: • We study the characteristics of a spray that is generated by a flash boiling process. • In this study, the flash boiling process occurs under homogeneous nucleation regime. • We used Phase Doppler Particle Analyzer (PDPA) to characterize the spray. • The SMD has been found to be strongly affected by the initial liquid temperature. • Homogeneous nucleation may be easily achieved by using a simple atomizer unit

  5. Qualitative analysis of homogeneous universes

    International Nuclear Information System (INIS)

    Novello, M.; Araujo, R.A.

    1980-01-01

    The qualitative behaviour of cosmological models is investigated in two cases: homogeneous and isotropic universes containing viscous fluids in a Stokesian non-linear regime; rotating expanding universes in a state in which matter is out of thermal equilibrium. (Author) [pt]

  6. An iterative homogenization technique that preserves assembly core exchanges

    International Nuclear Information System (INIS)

    Mondot, Ph.; Sanchez, R.

    2003-01-01

    A new iterative homogenization procedure for reactor core calculations is proposed that requires iterative transport assembly and diffusion core calculations. At each iteration the transport solution of every assembly type is used to produce homogenized cross sections for the core calculation. The converged solution gives assembly fine multigroup transport fluxes that preserve macro-group assembly exchanges in the core. This homogenization avoids the periodic lattice-leakage model approximation and gives detailed assembly transport fluxes without the need for an approximate flux reconstruction. Preliminary results are given for a one-dimensional core model. (authors)

  7. Metallographic Index-Based Quantification of the Homogenization State in Extrudable Aluminum Alloys

    Directory of Open Access Journals (Sweden)

    Panagiota I. Sarafoglou

    2016-05-01

    Full Text Available Extrudability of aluminum alloys of the 6xxx series is highly dependent on the microstructure of the homogenized billets. It is therefore very important to characterize quantitatively the state of homogenization of the as-cast billets. The quantification of the homogenization state was based on the measurement of specific microstructural indices, which describe the size and shape of the intermetallics and indicate the state of homogenization. The indices evaluated were the following: aspect ratio (AR, which is the ratio of the maximum to the minimum diameter of the particles, feret (F, which is the maximum caliper length, and circularity (C, which is a measure of how closely a particle resembles a circle in a 2D metallographic section. The method included extensive metallographic work and the measurement of a large number of particles, including a statistical analysis, in order to investigate the effect of homogenization time. Among the indices examined, the circularity index exhibited the most consistent variation with homogenization time. The lowest value of the circularity index coincided with the metallographic observation for necklace formation. Shorter homogenization times resulted in intermediate homogenization stages involving rounding of edges or particle pinching. The results indicated that the index-based quantification of the homogenization state could provide a credible method for the selection of homogenization process parameters towards enhanced extrudability.
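    The circularity index described above is conventionally defined in 2D image analysis as C = 4*pi*A/P^2 (the definition used, e.g., by ImageJ): exactly 1 for a circle and approaching 0 for elongated particles. A small sketch under that assumption (the paper's exact operational definition may differ):

```python
import math

def circularity(area, perimeter):
    """Shape circularity C = 4*pi*A / P^2 for a 2D particle section.
    Equals 1.0 for a perfect circle; decreases for elongated shapes."""
    return 4.0 * math.pi * area / perimeter ** 2

def ngon(n, r=1.0):
    """Area and perimeter of a regular n-gon inscribed in a circle of
    radius r; as n grows it approximates the circle itself."""
    area = 0.5 * n * r * r * math.sin(2.0 * math.pi / n)
    perimeter = 2.0 * n * r * math.sin(math.pi / n)
    return area, perimeter
```

    A near-circular intermetallic section scores close to 1, while a thin plate-like particle (e.g. a 10x1 rectangle, C ≈ 0.26) scores much lower, which is why the index tracks the rounding of particle edges during homogenization.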

  8. Effect of high-pressure homogenization on different matrices of food supplements.

    Science.gov (United States)

    Martínez-Sánchez, Ascensión; Tarazona-Díaz, Martha Patricia; García-González, Antonio; Gómez, Perla A; Aguayo, Encarna

    2016-12-01

    There is a growing demand for food supplements containing high amounts of vitamins, phenolic compounds and minerals that provide health benefits. These functional compounds have different solubility properties, and preserving them while guaranteeing their homogeneity requires the application of novel technologies. The quality of different drinkable functional foods after thermal processing (0.1 MPa) or high-pressure homogenization under two different conditions (80 MPa, 33 ℃ and 120 MPa, 43 ℃) was studied. Physicochemical characteristics and sensory qualities were evaluated throughout six months of accelerated storage at 40 ℃ and 75% relative humidity (RH). Aroma and color were better maintained in the high-pressure homogenization-treated samples than in the thermally treated ones, which contributed significantly to extending their shelf life. The small particle size obtained after high-pressure homogenization treatments caused differences in turbidity and viscosity with respect to the heat-treated samples. The use of high-pressure homogenization, more specifically at 120 MPa, provided active-ingredient homogeneity that ensures uniform content in functional food supplements. Although the effect of high-pressure homogenization can depend on the food matrix, it can be implemented as an alternative to conventional heat treatments in a commercial setting within the functional food supplement or pharmaceutical industry. © The Author(s) 2016.

  9. Rapid biotic homogenization of marine fish assemblages

    Science.gov (United States)

    Magurran, Anne E.; Dornelas, Maria; Moyes, Faye; Gotelli, Nicholas J.; McGill, Brian

    2015-01-01

    The role human activities play in reshaping biodiversity is increasingly apparent in terrestrial ecosystems. However, the responses of entire marine assemblages are not well-understood, in part, because few monitoring programs incorporate both spatial and temporal replication. Here, we analyse an exceptionally comprehensive 29-year time series of North Atlantic groundfish assemblages monitored over 5° latitude to the west of Scotland. These fish assemblages show no systematic change in species richness through time, but steady change in species composition, leading to an increase in spatial homogenization: the species identity of colder northern localities increasingly resembles that of warmer southern localities. This biotic homogenization mirrors the spatial pattern of unevenly rising ocean temperatures over the same time period suggesting that climate change is primarily responsible for the spatial homogenization we observe. In this and other ecosystems, apparent constancy in species richness may mask major changes in species composition driven by anthropogenic change. PMID:26400102

  10. Spontaneous compactification to homogeneous spaces

    International Nuclear Information System (INIS)

    Mourao, J.M.

    1988-01-01

    The spontaneous compactification of extra dimensions to compact homogeneous spaces is studied. The methods developed within the framework of coset space dimensional reduction scheme and the most general form of invariant metrics are used to find solutions of spontaneous compactification equations

  11. On integral representation, relaxation and homogenization for unbounded functionals

    International Nuclear Information System (INIS)

    Carbone, L.; De Arcangelis, R.

    1997-01-01

    A theory of integral representation, relaxation and homogenization is presented for some types of variational functionals that take extended real values and may fail to be finite even on large classes of regular functions. Some applications to gradient-constrained relaxation and homogenization problems are given.

  12. Flows and chemical reactions in homogeneous mixtures

    CERN Document Server

    Prud'homme, Roger

    2013-01-01

    Flows with chemical reactions can occur in various fields such as combustion, process engineering, aeronautics, the atmospheric environment and aquatics. The examples of application chosen in this book mainly concern homogeneous reactive mixtures that can occur in propellers within the fields of process engineering and combustion: - propagation of sound and monodimensional flows in nozzles, which may include disequilibria of the internal modes of the energy of molecules; - ideal chemical reactors, stabilization of their steady operation points in the homogeneous case of a perfect mixture and c

  13. Large-scale Homogenization of Bulk Materials in Mammoth Silos

    NARCIS (Netherlands)

    Schott, D.L.

    2004-01-01

    This doctoral thesis concerns the large-scale homogenization of bulk materials in mammoth silos. The objective of this research was to determine the best stacking and reclaiming method for homogenization in mammoth silos. For this purpose a simulation program was developed to estimate the

  14. Soluble Molecularly Imprinted Nanorods for Homogeneous Molecular Recognition

    Directory of Open Access Journals (Sweden)

    Rongning Liang

    2018-03-01

    Full Text Available Nowadays, it is still difficult for molecularly imprinted polymers (MIPs) to achieve homogeneous recognition, since they cannot be easily dissolved in the organic or aqueous phase. To address this issue, soluble molecularly imprinted nanorods have been synthesized by using soluble polyaniline doped with a functionalized organic protonic acid as the polymer matrix. By employing 1-naphthoic acid as a model, the proposed imprinted nanorods exhibit excellent solubility and good homogeneous recognition ability. The imprinting factor for the soluble imprinted nanorods is 6.8. The equilibrium dissociation constant and the apparent maximum binding amount of the proposed imprinted nanorods are 248.5 μM and 22.1 μmol/g, respectively. We believe that such imprinted nanorods may provide an appealing substitute for natural receptors in homogeneous recognition related fields.
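    A dissociation constant together with a maximum binding amount suggests the usual single-site (Langmuir-type) binding picture, B = Bmax·C/(Kd + C). A small sketch using the values reported in the abstract; the single-site functional form itself is an illustrative assumption, not stated in the record:

```python
def bound_amount(C_uM, Kd=248.5, Bmax=22.1):
    """Single-site binding isotherm B = Bmax*C/(Kd + C).

    Kd (in uM) and Bmax (in umol/g) default to the values reported in
    the abstract; the Langmuir form is an assumption for illustration."""
    return Bmax * C_uM / (Kd + C_uM)
```

    At a template concentration equal to Kd the nanorods are half-saturated (B = Bmax/2), and B approaches Bmax asymptotically at high concentration, which is how the two reported constants jointly characterize the binding.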

  15. Soluble Molecularly Imprinted Nanorods for Homogeneous Molecular Recognition

    Science.gov (United States)

    Liang, Rongning; Wang, Tiantian; Zhang, Huan; Yao, Ruiqing; Qin, Wei

    2018-03-01

    Nowadays, it is still difficult for molecularly imprinted polymers (MIPs) to achieve homogeneous recognition, since they cannot be easily dissolved in the organic or aqueous phase. To address this issue, soluble molecularly imprinted nanorods have been synthesized by using soluble polyaniline doped with a functionalized organic protonic acid as the polymer matrix. By employing 1-naphthoic acid as a model, the proposed imprinted nanorods exhibit excellent solubility and good homogeneous recognition ability. The imprinting factor for the soluble imprinted nanorods is 6.8. The equilibrium dissociation constant and the apparent maximum binding amount of the proposed imprinted nanorods are 248.5 μM and 22.1 μmol/g, respectively. We believe that such imprinted nanorods may provide an appealing substitute for natural receptors in homogeneous recognition related fields.

  16. Homogeneity Study of UO2 Pellet Density for Quality Control

    International Nuclear Information System (INIS)

    Moon, Je Seon; Park, Chang Je; Kang, Kwon Ho; Moon, Heung Soo; Song, Kee Chan

    2005-01-01

    A homogeneity study has been performed on the density of UO2 pellets as part of quality control work. The densities of the UO2 pellets are distributed randomly owing to several factors such as the milling conditions and the sintering environment. After sintering, a total of fourteen bottles was chosen for UO2 density measurements, with three samples per bottle. With these bottles, the between-bottle and within-bottle homogeneity were investigated via analysis of variance (ANOVA). From the ANOVA results, the calculated F-value is used to determine whether the distribution is accepted or rejected from the viewpoint of homogeneity at a given confidence level. All the homogeneity checks followed the international standard, ISO Guide 35.
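    The between-bottle versus within-bottle comparison described above is a standard one-way ANOVA. A minimal sketch of the F-statistic computation (the data below are illustrative, not the paper's measurements); the computed F would then be compared against the critical value of the F distribution at the chosen confidence level:

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA: ratio of between-group (bottle)
    to within-group (sample) mean squares.

    groups: list of lists of measurements, one inner list per bottle."""
    k = len(groups)                             # number of bottles
    N = sum(len(g) for g in groups)             # total number of samples
    grand = sum(sum(g) for g in groups) / N
    means = [sum(g) / len(g) for g in groups]
    ssb = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ssw = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    msb = ssb / (k - 1)                         # between-bottle mean square
    msw = ssw / (N - k)                         # within-bottle mean square
    return msb / msw

# Three "bottles" with three density readings each (illustrative units).
F = one_way_anova_F([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0], [3.0, 4.0, 5.0]])
```

    An F close to 1 indicates that the bottle-to-bottle scatter is no larger than the within-bottle scatter, i.e. the batch can be accepted as homogeneous at the chosen confidence level.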

  17. A comparison of maximal bioenergetic enzyme activities obtained with commonly used homogenization techniques.

    Science.gov (United States)

    Grace, M; Fletcher, L; Powers, S K; Hughes, M; Coombes, J

    1996-12-01

    Homogenization of tissue for analysis of bioenergetic enzyme activities is a common practice in studies examining metabolic properties of skeletal muscle adaptation to disease, aging, inactivity or exercise. While numerous homogenization techniques are in use today, limited information exists concerning the efficacy of specific homogenization protocols. Therefore, the purpose of this study was to compare the efficacy of four commonly used approaches to homogenizing skeletal muscle for analysis of bioenergetic enzyme activity. The maximal enzyme activities (Vmax) of citrate synthase (CS) and lactate dehydrogenase (LDH) were measured from homogenous muscle samples (N = 48 per homogenization technique) and used as indicators to determine which protocol had the highest efficacy. The homogenization techniques were: (1) glass-on-glass pestle; (2) a combination of a mechanical blender and a teflon pestle (Potter-Elvehjem); (3) a combination of the mechanical blender and a biological detergent; and (4) the combined use of a mechanical blender and a sonicator. The glass-on-glass pestle homogenization protocol produced significantly higher (P < 0.05) enzyme activities than the other techniques, suggesting that the glass-on-glass pestle homogenization protocol is the technique of choice for studying bioenergetic enzyme activity in skeletal muscle.

  18. Homogeneous Finsler Spaces

    CERN Document Server

    Deng, Shaoqiang

    2012-01-01

    "Homogeneous Finsler Spaces" is the first book to emphasize the relationship between Lie groups and Finsler geometry, and the first to show the validity in using Lie theory for the study of Finsler geometry problems. This book contains a series of new results obtained by the author and collaborators during the last decade. The topic of Finsler geometry has developed rapidly in recent years. One of the main reasons for its surge in development is its use in many scientific fields, such as general relativity, mathematical biology, and phycology (study of algae). This monograph introduc

  19. Homogenization of tissues via picosecond-infrared laser (PIRL) ablation: Giving a closer view on the in-vivo composition of protein species as compared to mechanical homogenization.

    Science.gov (United States)

    Kwiatkowski, M; Wurlitzer, M; Krutilin, A; Kiani, P; Nimer, R; Omidi, M; Mannaa, A; Bussmann, T; Bartkowiak, K; Kruber, S; Uschold, S; Steffen, P; Lübberstedt, J; Küpker, N; Petersen, H; Knecht, R; Hansen, N O; Zarrine-Afsar, A; Robertson, W D; Miller, R J D; Schlüter, H

    2016-02-16

    Posttranslational modifications and proteolytic processing regulate almost all physiological processes. Dysregulation can potentially result in pathologic protein species causing diseases. Thus, tissue-specific proteomes of diseased individuals provide diagnostic information. Since the composition of tissue proteomes can rapidly change during tissue homogenization by the action of enzymes released from their compartments, disease-specific protein species patterns can vanish. Recently, we described a novel, ultrafast and soft method for cold vaporization of tissue via desorption by impulsive vibrational excitation (DIVE) using a picosecond infrared laser (PIRL). Given that DIVE extraction may provide improved access to the original composition of protein species in tissues, we compared the proteome composition of tissue protein homogenates after DIVE homogenization with that after conventional homogenization. A higher number of intact protein species was observed in DIVE homogenates. Owing to the ultrafast transfer of proteins from tissues via the gas phase into frozen condensates of the aerosols, intact protein species were exposed to enzymatic degradation reactions to a lesser extent than with conventional protein extraction. In addition, the total protein yield is higher in DIVE homogenates, because they are very homogeneous and contain almost no insoluble particles, allowing direct analysis with subsequent analytical methods without the need for centrifugation. Enzymatic protein modifications during tissue homogenization are responsible for changes in the in-vivo protein species composition. Cold vaporization of tissues by PIRL-DIVE is comparable with taking a snapshot, at the time of the laser irradiation, of the dynamic changes that occur continuously under in-vivo conditions. At that time point all biomolecules are transferred into an aerosol, which is immediately frozen. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  20. The Perron-Frobenius theorem for multi-homogeneous mappings

    OpenAIRE

    Gautier, Antoine; Tudisco, Francesco; Hein, Matthias

    2018-01-01

    The Perron-Frobenius theory for nonnegative matrices has been generalized to order-preserving homogeneous mappings on a cone and more recently to nonnegative multilinear forms. We unify both approaches by introducing the concept of order-preserving multi-homogeneous mappings, their associated nonlinear spectral problems and spectral radii. We show several Perron-Frobenius type results for these mappings addressing existence, uniqueness and maximality of nonnegative and positive eigenpairs. We...
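    The multi-homogeneous theory above generalizes the classical case of a single irreducible nonnegative matrix, where power iteration recovers the spectral radius and a positive Perron eigenvector. A minimal sketch of that classical special case (not the paper's method, and the matrix is illustrative):

```python
def perron_root(A, iters=200):
    """Power iteration for the spectral radius and a positive eigenvector
    of an irreducible nonnegative matrix (classical Perron-Frobenius).

    Normalizes by the sup-norm each step; the normalization factor
    converges to the spectral radius."""
    n = len(A)
    x = [1.0] * n                      # positive starting vector
    rho = 0.0
    for _ in range(iters):
        y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        rho = max(y)                   # sup-norm of A x
        x = [yi / rho for yi in y]
    return rho, x

rho, v = perron_root([[2.0, 1.0], [1.0, 3.0]])
```

    Existence, uniqueness and maximality of the positive eigenpair are exactly the properties the cited work extends to order-preserving multi-homogeneous mappings.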

  1. Ground observations and remote sensing data for integrated modelisation of water budget in the Merguellil catchment, Tunisia

    Science.gov (United States)

    Mougenot, Bernard

    2016-04-01

    The Mediterranean region is affected by water scarcity. Some countries, such as Tunisia, have reached the limit of 550 m3/year/capita owing to overexploitation of scarce water resources for irrigation, domestic use and industry. Many programs aim to evaluate strategies for improving water consumption at the regional level. In central Tunisia, on the Merguellil catchment, we are developing integrated water-resource models based on social surveys, ground observations and remote sensing data. The main objective is to close the water budget at the regional level and to estimate irrigation and water pumping in order to test scenarios with end-users. Our work benefits from French, bilateral and European projects (ANR, MISTRALS/SICMed, FP6, FP7…, GMES/GEOLAND-ESA) and also from network projects such as JECAM and AERONET, for which the Merguellil site is a reference. This site has specific characteristics, associating irrigated and rainfed crops mixing cereals, market gardening and orchards, and will be proposed as a new environmental observing system connected to the OMERE, TENSIFT and OSR systems in Tunisia, Morocco and France, respectively. We present here an original and large set of ground and remote sensing data, acquired mainly from 2008 to the present, to be used for calibration/validation of water budget processes and of integrated models for the present situation and for scenarios: - Ground data: meteorological stations; water budget at the local scale: flux tower, soil fluxes, soil and surface temperature, soil moisture, drainage, flow, water level in lakes, aquifer; vegetation parameters on selected fields each month (LAI, height, biomass, yield); land cover: 3 times/year; bare-soil roughness; irrigation and pumping estimates; soil texture. - Remote sensing data: products from multi-platform (MODIS, SPOT, LANDSAT, ASTER, PLEIADES, ASAR, COSMO-SkyMed, TerraSAR X…), multi-wavelength (solar, microwave and thermal) and multi-resolution (0.5 m to 1 km) sensors. Ground observations are used (1) to calibrate soil

  2. Heterogenization of Homogeneous Catalysts: the Effect of the Support

    Energy Technology Data Exchange (ETDEWEB)

    Earl, W.L.; Ott, K.C.; Hall, K.A.; de Rege, F.M.; Morita, D.K.; Tumas, W.; Brown, G.H.; Broene, R.D.

    1999-06-29

    We have studied the influence of placing a soluble, homogeneous catalyst onto a solid support. We determined that such a 'heterogenized' homogeneous catalyst can have improved activity and selectivity for the asymmetric hydrogenation of enamides to amino acid derivatives. The route of heterogenization of RhDuPhos(COD)⁺ cations occurs via electrostatic interactions with anions that are capable of strong hydrogen bonding to silica surfaces. This is a novel approach to supported catalysis. Supported RhDuPhos(COD)⁺ is a recyclable, non-leaching catalyst in non-polar media. This is one of the few heterogenized catalysts that exhibits improved catalytic performance as compared to its homogeneous analog.

  3. Homogenization and structural topology optimization theory, practice and software

    CERN Document Server

    Hassani, Behrooz

    1999-01-01

    Structural topology optimization is a fast-growing field that is finding numerous applications in automotive, aerospace and mechanical design processes. Homogenization is a mathematical theory with applications in several engineering problems that are governed by partial differential equations with rapidly oscillating coefficients. Homogenization and Structural Topology Optimization brings the two concepts together and successfully bridges the previously overlooked gap between the mathematical theory and the practical implementation of the homogenization method. The book is presented in a unique self-teaching style that includes numerous illustrative examples, figures and detailed explanations of concepts. The text is divided into three parts, which maintains the book's reader-friendly appeal.

  4. Control rod homogenization in heterogeneous sodium-cooled fast reactors

    International Nuclear Information System (INIS)

    Andersson, Mikael

    2016-01-01

    The sodium-cooled fast reactor is one of the candidates for a sustainable nuclear reactor system. In particular, the French ASTRID project employs an axially heterogeneous design, proposed in the so-called CFV (low sodium void effect) core, to enhance the inherent safety features of the reactor. This thesis focuses on accurate modeling of the control rods through the homogenization method. The control rods in a sodium-cooled fast reactor are used for reactivity compensation during the cycle, for power shaping, and to shut down the reactor. Previous control-rod homogenization procedures implemented only a radial description of the geometry, so the axially heterogeneous features of the CFV core could not be taken into account. This thesis investigates the different axial variations the control rod experiences in a CFV core, to determine the impact these axial environments have on control-rod modeling. The methodology used in this work is based on previous homogenization procedures, the so-called equivalence procedure. The procedure was newly implemented in the PARIS code system in order to use 3D geometries and thereby take axial effects into account. The thesis is divided into three parts. The first part investigates the impact of different neutron spectra on the homogeneous control-rod cross sections. The second part identifies the cases where the traditional radial control-rod homogenization procedure is no longer applicable in the CFV core, which was found to be within 5-10 cm of any material interface. In the third part, based on the results from the second part, a 3D model of the control rod is used to calculate homogenized control-rod cross sections. In a full-core model, a study is made to investigate the impact these axial effects have on control-rod-related core parameters, such as the control-rod worth, the capture rates in the control rod, and the power in the adjacent fuel assemblies. 
All results were compared to a Monte

  5. Testing Homogeneity with the Galaxy Fossil Record

    CERN Document Server

    Hoyle, Ben; Jimenez, Raul; Heavens, Alan; Clarkson, Chris; Maartens, Roy

    2013-01-01

    Observationally confirming spatial homogeneity on sufficiently large cosmological scales is important for testing one of the underpinning assumptions of cosmology, and is also imperative for correctly interpreting dark energy. A challenging aspect of this is that homogeneity must be probed inside our past lightcone, while observations take place on the lightcone. The history of star formation rates (SFH) in the galaxy fossil record provides a novel way to do this. We calculate the SFH of stacked Luminous Red Galaxy (LRG) spectra obtained from the Sloan Digital Sky Survey. We divide the LRG sample into 12 equal-area contiguous sky patches and 10 redshift slices (0.2 < z < 0.5). To test homogeneity, we calculate the posterior distribution for the excess large-scale variance due to inhomogeneity, and find that the most likely solution is n...
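
A calculation of the kind described (posterior over an excess large-scale variance) can be sketched as a simple grid computation. The per-block data, measurement error and Gaussian model below are illustrative stand-ins, not the paper's actual SFH measurements or likelihood:

```python
import math

# Synthetic per-block summary statistics (one value per sky-patch/redshift
# block) with a known per-block measurement uncertainty. Illustrative only.
y = [1.02, 0.97, 1.05, 0.99, 1.01, 0.95, 1.03, 0.98]
sigma_meas = 0.03  # assumed measurement error per block

def log_likelihood(v):
    """Gaussian log-likelihood with excess variance v (from inhomogeneity)
    added in quadrature to the measurement variance; the mean is set to the
    sample mean for simplicity rather than marginalized."""
    var = sigma_meas ** 2 + v
    mu = sum(y) / len(y)
    return sum(-0.5 * (yi - mu) ** 2 / var - 0.5 * math.log(2 * math.pi * var)
               for yi in y)

# Flat prior on v >= 0, evaluated on a grid -> normalized posterior.
grid = [i * 1e-5 for i in range(200)]
logp = [log_likelihood(v) for v in grid]
m = max(logp)
weights = [math.exp(lp - m) for lp in logp]
total = sum(weights)
post = [w / total for w in weights]
v_map = grid[post.index(max(post))]
print("posterior mode of excess variance:", v_map)
```

With these numbers the mode sits near the sample variance minus the measurement variance, as expected for a Gaussian model.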

  6. Homogeneous Buchberger algorithms and Sullivant's computational commutative algebra challenge

    DEFF Research Database (Denmark)

    Lauritzen, Niels

    2005-01-01

    We give a variant of the homogeneous Buchberger algorithm for positively graded lattice ideals. Using this algorithm we solve the Sullivant computational commutative algebra challenge.

  7. Homogenized description and retrieval method of nonlinear metasurfaces

    Science.gov (United States)

    Liu, Xiaojun; Larouche, Stéphane; Smith, David R.

    2018-03-01

    A patterned, plasmonic metasurface can strongly scatter incident light, functioning as an extremely low-profile lens, filter, reflector or other optical device. When the metasurface is patterned uniformly, its linear optical properties can be expressed using effective surface electric and magnetic polarizabilities obtained through a homogenization procedure. The homogenized description of a nonlinear metasurface, however, presents challenges, both because of the inherent anisotropy of the medium and because of the much larger set of potential wave interactions available, making it difficult to assign effective nonlinear parameters to the otherwise inhomogeneous layer of metamaterial elements. Here we show that a homogenization procedure can be developed to describe nonlinear metasurfaces, which derive their nonlinear response from the enhanced local fields arising within the structured plasmonic elements. With the proposed homogenization procedure, we are able to assign effective nonlinear surface polarization densities to a nonlinear metasurface, and link these densities to the effective nonlinear surface susceptibilities and averaged macroscopic pumping fields across the metasurface. These effective nonlinear surface polarization densities are further linked to macroscopic nonlinear fields through the generalized sheet transition conditions (GSTCs). By inverting the GSTCs, the effective nonlinear surface susceptibilities of the metasurface can be solved for, leading to a generalized retrieval method for nonlinear metasurfaces. The application of the homogenization procedure and the GSTCs is demonstrated by retrieving the nonlinear susceptibilities of a SiO2 nonlinear slab. As an example, we investigate a nonlinear metasurface which presents nonlinear magnetoelectric coupling in the near-infrared regime. 
The method is expected to apply to any patterned metasurface whose thickness is much smaller than the wavelengths of operation, with inclusions of arbitrary geometry.

  8. Geant4 simulation of the Elekta XVI kV CBCT unit for accurate description of potential late toxicity effects of image-guided radiotherapy

    International Nuclear Information System (INIS)

    Brochu, F M; Burnet, N G; Jena, R; Plaistow, R; Thomas, S J; Parker, M A

    2014-01-01

    This paper describes the modelling of the Elekta XVI Cone Beam Computed Tomography (CBCT) machine components with Geant4 and its validation against calibration data taken for two commonly used machine setups. Preliminary dose maps of simulated CBCT scans from this modelling work are presented. This study is the first step of a research project, GHOST, which aims to improve the understanding of late toxicity risk in external-beam radiotherapy patients by simulating dose depositions integrated from different sources (imaging, treatment beam) over the entire treatment plan. Second cancer risk will then be derived from different models relating irradiation dose to second cancer risk. (paper)

  9. Numerical analysis for Darcy-Forchheimer flow in presence of homogeneous-heterogeneous reactions

    Directory of Open Access Journals (Sweden)

    Muhammad Ijaz Khan

    Full Text Available A mathematical study is presented to investigate the influence of homogeneous and heterogeneous reactions on locally similar flow caused by a stretching sheet with non-linear velocity and variable thickness. Porous-medium effects are characterized using the Darcy-Forchheimer model. A simple isothermal model of homogeneous-heterogeneous reactions is used. The multiphysical boundary value problem is dictated by ten thermophysical parameters: the ratio of mass-diffusion coefficients, Prandtl number, local inertia coefficient, inverse Darcy number, shape parameter, surface-thickness parameter, Hartman number, homogeneous heat-reaction parameter, strength of homogeneous-heterogeneous reactions and Schmidt number. The resulting systems are computed by the Runge-Kutta-Fehlberg method. Different velocity profiles are observed for n > 1 and n < 1. Keywords: Homogeneous-heterogeneous reactions, Non-Darcy porous medium, Variable sheet thickness, Homogeneous heat reaction with stoichiometric coefficient, Runge-Kutta-Fehlberg method
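
The Runge-Kutta-Fehlberg method named above can be illustrated with a minimal scalar implementation. The test ODE below is a generic example, not the paper's similarity boundary-value system (solving that would additionally need a shooting or collocation layer):

```python
import math

# Classic Fehlberg coefficients for the embedded 4(5) pair.
A = [
    [],
    [1/4],
    [3/32, 9/32],
    [1932/2197, -7200/2197, 7296/2197],
    [439/216, -8, 3680/513, -845/4104],
    [-8/27, 2, -3544/2565, 1859/4104, -11/40],
]
B5 = [16/135, 0, 6656/12825, 28561/56430, -9/50, 2/55]   # 5th-order weights
B4 = [25/216, 0, 1408/2565, 2197/4104, -1/5, 0]          # 4th-order weights

def rkf45(f, t0, y0, t_end, h=0.1, tol=1e-8):
    """Adaptive Runge-Kutta-Fehlberg integration of the scalar ODE y' = f(t, y)."""
    t, y = t0, y0
    while t < t_end - 1e-14:
        h = min(h, t_end - t)
        k = []
        for row in A:                       # the node c_i equals sum(row)
            yi = y + h * sum(a * kj for a, kj in zip(row, k))
            k.append(f(t + h * sum(row), yi))
        y5 = y + h * sum(b * kj for b, kj in zip(B5, k))
        y4 = y + h * sum(b * kj for b, kj in zip(B4, k))
        err = abs(y5 - y4)                  # local error estimate
        if err <= tol or h < 1e-12:
            t, y = t + h, y5                # accept the higher-order result
            if err < tol / 4:
                h *= 2                      # grow the step when comfortably accurate
        else:
            h *= 0.5                        # reject and retry with a smaller step
    return y

# Test problem: y' = -2*t*y, y(0) = 1, exact solution exp(-t**2).
y1 = rkf45(lambda t, y: -2 * t * y, 0.0, 1.0, 1.0)
print(y1, "vs exact", math.exp(-1.0))
```

The embedded pair gives the local error estimate `|y5 - y4|` almost for free, which is what makes the step-size control cheap.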

  10. Non-linear waves in heterogeneous elastic rods via homogenization

    KAUST Repository

    Quezada de Luna, Manuel

    2012-03-01

    We consider the propagation of a planar loop on a heterogeneous elastic rod with a periodic microstructure consisting of two alternating homogeneous regions with different material properties. The analysis is carried out using a second-order homogenization theory based on a multiple scale asymptotic expansion. © 2011 Elsevier Ltd. All rights reserved.
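
For reference, the multiple-scale machinery invoked above starts from the standard two-scale ansatz; the paper carries the expansion to second order, while the classical leading-order result for a 1D two-phase rod is the harmonic mean of the moduli:

```latex
% Two-scale ansatz: x is the slow variable, x/\varepsilon the fast one.
u^{\varepsilon}(x) = u_0(x)
  + \varepsilon\, u_1\!\left(x, \tfrac{x}{\varepsilon}\right)
  + \varepsilon^2\, u_2\!\left(x, \tfrac{x}{\varepsilon}\right) + \cdots

% Leading-order effective modulus of a rod alternating between moduli
% E_1 (volume fraction \theta) and E_2: the harmonic mean.
E^{\ast} = \left( \frac{\theta}{E_1} + \frac{1-\theta}{E_2} \right)^{-1}
```

Higher-order terms in the expansion are what introduce the dispersive corrections relevant to non-linear wave propagation.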

  11. Computational Method for Atomistic-Continuum Homogenization

    National Research Council Canada - National Science Library

    Chung, Peter

    2002-01-01

    The homogenization method is used as a framework for developing a multiscale system of equations involving atoms at zero temperature at the small scale and continuum mechanics at the very large scale...

  12. Statistical methods for assessment of blend homogeneity

    DEFF Research Database (Denmark)

    Madsen, Camilla

    2002-01-01

    In this thesis the use of various statistical methods to address some of the problems related to assessing the homogeneity of powder blends in tablet production is discussed. It is not straightforward to assess the homogeneity of a powder blend. The reason is partly that in bulk materials … it is shown how to set up parametric acceptance criteria for the batch that give high confidence that future samples will, with a probability larger than a specified value, pass the USP three-class criteria. Properties and robustness of proposed changes to the USP test for content uniformity are investigated...
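
A plug-in version of such a parametric acceptance calculation can be sketched as follows. The 85-115% limits and the sample data are illustrative, and a real criterion would add a tolerance-interval correction for estimation uncertainty rather than use this naive plug-in estimate:

```python
import math

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def coverage_probability(mean, sd, lo=85.0, hi=115.0):
    """Probability that a future unit (percent of label claim) falls inside
    [lo, hi] under a fitted normal model. Plug-in estimate only: it ignores
    the uncertainty in the estimated mean and sd."""
    return normal_cdf((hi - mean) / sd) - normal_cdf((lo - mean) / sd)

# Illustrative blend samples (percent of label claim), made-up data.
samples = [98.2, 101.5, 99.8, 100.9, 97.6, 102.3, 100.1, 99.4]
n = len(samples)
mean = sum(samples) / n
sd = math.sqrt(sum((s - mean) ** 2 for s in samples) / (n - 1))
p = coverage_probability(mean, sd)
print(f"mean={mean:.2f}  sd={sd:.2f}  P(unit in 85..115% of claim)={p:.6f}")
```

An acceptance rule of the kind described would require this probability (suitably corrected for sampling error) to exceed a specified threshold.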

  13. Effects of high-speed homogenization and high-pressure homogenization on structure of tomato residue fibers.

    Science.gov (United States)

    Hua, Xiao; Xu, Shanan; Wang, Mingming; Chen, Ying; Yang, Hui; Yang, Ruijin

    2017-10-01

    Tomato residue fibers obtained after derosination and deproteinization were processed by high-speed homogenization (HSH) and high-pressure homogenization (HPH), and the effects of each on fiber structure were investigated. Characterizations including particle size distribution, SEM, TEM and XRD were performed. HSH could break raw fibers into small particles of around 60 μm, while HPH could reshape fibers to build a network structure. Microfibrils were released, and their nanostructure consisting of elementary fibrils was observed by TEM. XRD patterns indicated that neither HSH nor HPH substantially altered the nanostructure of the fibers. Physicochemical properties including expansibility, WHC and OHC were determined. Both HSH and HPH could increase the soluble fiber content by about 8%, but combined HSH-HPH processing did not give a better result. Acid (4 mol/L HCl) was used in place of the water medium, and the acidic degradation of fibers could be promoted by high-speed shearing or high-pressure processing. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Commensurability effects in holographic homogeneous lattices

    International Nuclear Information System (INIS)

    Andrade, Tomas; Krikun, Alexander

    2016-01-01

    An interesting application of the gauge/gravity duality to condensed matter physics is the description of a lattice via breaking translational invariance on the gravity side. By making use of global symmetries, it is possible to do so without sacrificing homogeneity of the pertinent bulk solutions, which we thus term "homogeneous holographic lattices." Due to their technical simplicity, these configurations have received a great deal of attention in the last few years and have been shown to correctly describe momentum relaxation and hence (finite) DC conductivities. However, it is not clear whether they are able to capture other lattice effects which are of interest in condensed matter. In this paper we investigate this question, focusing our attention on the phenomenon of commensurability, which arises when the lattice scale is tuned to be equal to (an integer multiple of) another momentum scale in the system. We do so by studying the formation of spatially modulated phases in various models of homogeneous holographic lattices. Our results indicate that the onset of the instability is controlled by the near-horizon geometry, which for insulating solutions does carry information about the lattice. However, we observe no sharp connection between the characteristic momentum of the broken phase and the lattice pitch, which calls into question the applicability of these models to the physics of commensurability.

  15. Some properties of spatially homogeneous spacetimes

    International Nuclear Information System (INIS)

    Coomer, G.C.

    1979-01-01

    This paper discusses two features of the universe which are influenced in a fundamental way by the spacetime geometry of the universe. The first is the growth of density fluctuations in the early stages of the evolution of the universe. The second is the propagation of electromagnetic radiation in the universe. A spatially homogeneous universe is assumed in both discussions. The gravitational instability theory of galaxy formation is investigated for a viscous fluid and for a charged, conducting fluid with a magnetic field added as a perturbation. It is found that the growth rate of density perturbations in both cases is lower than in the perfect fluid case. Spatially homogeneous but nonisotropic spacetimes are investigated next. Two perfect fluid solutions of Einstein's field equations are found which have spacelike hypersurfaces with Bianchi type II geometry. An expression for the spectrum of the cosmic microwave background radiation in a spatially homogeneous but nonisotropic universe is found. The expression is then used to determine the angular distribution of the intensity of the radiation in the simpler of the two solutions. When accepted values of the matter density and decoupling temperature are inserted into this solution, values for the age of the universe and the time of decoupling are obtained which agree reasonably well with the values of the standard model of the universe

  16. Stochastic model of milk homogenization process using Markov's chain

    Directory of Open Access Journals (Sweden)

    A. A. Khvostov

    2016-01-01

    Full Text Available The development of a mathematical model of the homogenization of dairy products is considered in this work. The model is based on the theory of Markov chains: a Markov chain with discrete states and a continuous parameter, for which the homogenization pressure is taken, forms the basis of the model structure. The model is implemented in the structural modelling environment MathWorks Simulink™. Model parameters were identified by minimizing the standard deviation from the experimental data for each fraction of the fat phase of the dairy product. The experimental data set consisted of particle-size distributions obtained from micrographic images of fat globules in whole-milk samples homogenized at different pressures. The Pattern Search method with the Latin Hypercube search algorithm from the Global Optimization Toolbox library was used for optimization. The average error over all fractions was 0.88% (relative share of units); the maximum relative error was 3.7% at a homogenization pressure of 30 MPa, which may be due to the abrupt change of the particle-size distribution away from that of the original milk at the beginning of the homogenization process and to the lack of experimental data at homogenization pressures below this value. The proposed mathematical model allows the volume and mass distribution of the fat phase (fat globules) in the product to be calculated as a function of homogenization pressure, and can be used in laboratory research on dairy-product composition as well as in the calculation, design and modelling of process equipment for dairy-industry enterprises.
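
The structure of such a chain (discrete size classes as states, homogenization pressure as the continuous parameter) can be sketched as follows. The four states, the breakup rates and the forward-Euler evolution are illustrative assumptions, not the fitted model:

```python
# States: four fat-globule size classes, coarsest (0) to finest (3).
# Per unit of homogenization pressure, a globule in class i moves to
# class i+1 with rate q[i] (illustrative values, not fitted to data).
q = [0.30, 0.20, 0.10]   # breakup rates per MPa for classes 0..2

def distribution(p0, pressure, dp=0.01):
    """Evolve the class-probability vector with pressure, using a
    forward-Euler approximation of the continuous-parameter chain."""
    p = list(p0)
    for _ in range(int(round(pressure / dp))):
        flow = [q[i] * p[i] * dp for i in range(3)]   # mass leaving class i
        for i in range(3):
            p[i] -= flow[i]
            p[i + 1] += flow[i]                       # ...arrives in class i+1
    return p

p0 = [1.0, 0.0, 0.0, 0.0]   # all mass starts in the coarsest class
for pressure in (0, 10, 30):
    p = distribution(p0, pressure)
    print(pressure, "MPa:", [round(x, 3) for x in p])
```

Because mass only moves between classes, the vector stays normalized, and increasing pressure shifts the distribution toward the finest class, mimicking the measured size distributions.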

  17. Homogenization of compacted blends of Ni and Mo powders

    International Nuclear Information System (INIS)

    Lanam, R.D.; Yeh, F.C.H.; Rovsek, J.E.; Smith, D.W.; Heckel, R.W.

    1975-01-01

    The homogenization behavior of compacted blends of Ni and Mo powders was studied primarily as a function of temperature, mean compact composition, and Mo powder particle size. All compact compositions were in the Ni-rich terminal solid-solution range; temperatures were between 950 and 1200 °C (in the region of the phase diagram where only the Mo-Ni intermediate phase forms); average Mo particle sizes ranged from 8.4 μm to 48 μm. Homogenization was characterized in terms of the rate of decrease of the amounts of the Mo-rich terminal solid-solution phase and the Mo-Ni intermediate phase. The experimental results were compared to predictions based upon the three-phase, concentric-sphere homogenization model. In general, agreement between experimental data and model predictions was fairly good for high-temperature treatments and for compact compositions which were not close to the solubility limit of Mo in Ni. Departures from the model are discussed in terms of surface-diffusion contributions to homogenization and non-uniform mixing effects. (U.S.)
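
The diffusional smoothing that drives such homogenization models can be sketched with a simple explicit finite-difference calculation. This is a planar 1D simplification of the concentric-sphere picture, with an assumed diffusivity and geometry:

```python
# Illustrative explicit finite-difference model of interdiffusion smoothing
# a solute-rich region (all numbers are assumed, not from the study).
D = 1e-14                 # m^2/s, assumed interdiffusion coefficient
L = 50e-6                 # m, cell size, of the order of the particle spacing
n = 100
dx = L / n
dt = 0.4 * dx * dx / D    # stable explicit step (Fourier number 0.4 < 0.5)
r = D * dt / (dx * dx)

# Step profile: the solute-rich phase occupies the first 20% of the cell.
c = [1.0 if i < n // 5 else 0.0 for i in range(n)]

def step(c):
    """One explicit diffusion step in flux form with zero-flux boundaries,
    so total solute is conserved."""
    new = [0.0] * n
    for i in range(n):
        flux_left = c[i] - c[i - 1] if i > 0 else 0.0
        flux_right = c[i + 1] - c[i] if i < n - 1 else 0.0
        new[i] = c[i] + r * (flux_right - flux_left)
    return new

for _ in range(20000):    # roughly 2.3 days of annealing with these numbers
    c = step(c)
spread = max(c) - min(c)
print(f"mean concentration {sum(c)/n:.4f}, residual spread {spread:.2e}")
```

The residual spread after the anneal is the kind of quantity the concentric-sphere model tracks as the "degree of homogenization"; hotter anneals (larger D) or finer powders (smaller L) flatten the profile faster.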

  18. Suggestion for a homogenizer installation in LOFT small break two-phase measurement

    International Nuclear Information System (INIS)

    Rieger, G.

    1981-07-01

    The purpose of this task, which was performed as an Austrian in-kind contribution to the INEL research program, is (a) the evaluation of the literature concerning homogenizers to improve two-phase flow measurements for the LOFT small-break test series, (b) the design of a homogenizer, and (c) a recommendation for the location of the homogenizer in the LOFT piping system. To optimize the location of the homogenizer, LTSF tests should be performed according to the suggestions in this paper. (author)

  19. Effect of homogenization on the properties and microstructure of Mozzarella cheese from buffalo milk.

    Science.gov (United States)

    Abd El-Gawad, Mona A M; Ahmed, Nawal S; El-Abd, M M; Abd El-Rafee, S

    2012-04-02

    The name pasta filata refers to a unique plasticizing and texturing treatment of the fresh curd in hot water that imparts to the finished cheese its characteristic fibrous structure and melting properties. Mozzarella cheese was made from standardized homogenized and non-homogenized buffalo milk with 3 and 1.5% fat, and the effects of homogenization on rheology, microstructure and sensory evaluation were determined. Fresh raw buffalo milk and starter cultures of Streptococcus thermophilus and Lactobacillus delbrueckii ssp. bulgaricus were used. The coagulant was calf rennet powder (HA-LA). Standardized buffalo milk was homogenized at 25 kg/cm² pressure after heating to 60°C. Milk and cheese were analysed. The microstructure of the cheese samples was investigated with transmission or scanning electron microscopy, and statistical analyses were applied to the data obtained. Soluble nitrogen, total volatile free fatty acids, soluble tyrosine and tryptophan increased when homogenized milk was used, and also increased, though relatively less, in the homogenized Mozzarella cheese. Meltability of Mozzarella cheese increased with fat content and storage period and decreased with homogenization. Mozzarella cheese firmness increased with homogenization and also with the progress of storage. Flavour score, appearance and total score of Mozzarella cheese increased with homogenization and storage, while body and texture scores decreased with homogenization and increased with storage. Microstructure showed that the low-fat cheese tends to be harder, more crumbly and less smooth than normal. Curd-granule junctions were prominent in cheese from non-homogenized milk. Homogenization of the milk caused changes in the microstructure of the Mozzarella cheese. Microstructure studies of cheese revealed that cheese made from homogenized milk is smoother and has a finer texture than

  20. Fabrication and characterization of uranium-6--niobium alloy plate with improved homogeneity

    International Nuclear Information System (INIS)

    Snyder, W.B.

    1978-01-01

    Chemical inhomogeneities produced during arc melting of uranium-6 weight percent niobium alloy normally persist during fabrication of the ingot to a finished product. An investigation was directed toward producing a more homogeneous product (approx. 13.0-mm plate) by a combination of mechanical working and homogenization. Ingots were cast, forged to various reductions, homogenized under different conditions, and finally rolled to 13.0-mm-thick plate. It was concluded that increased forging reductions prior to homogenization resulted in a more homogeneous plate. Comparison of calculated and experimentally measured niobium concentration profiles indicated that the activation energy for the diffusion of niobium in uranium-niobium alloys may be lower than previously observed
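
The temperature dependence of the niobium diffusion controlling this homogenization follows an Arrhenius law, D = D0·exp(-Q/RT). The sketch below compares characteristic diffusion distances at two anneal temperatures using illustrative, not measured, values of D0 and Q:

```python
import math

R = 8.314      # J/(mol K), gas constant
D0 = 1.0e-5    # m^2/s, assumed pre-exponential factor (illustrative)
Q = 250e3      # J/mol, assumed activation energy (illustrative)

def diffusion_length(T_celsius, hours):
    """Characteristic diffusion distance sqrt(D*t) for an Arrhenius
    diffusivity D = D0*exp(-Q/(R*T))."""
    T = T_celsius + 273.15
    D = D0 * math.exp(-Q / (R * T))
    return math.sqrt(D * hours * 3600.0)

for T in (950, 1100):
    print(f"{T} C, 24 h anneal: ~{diffusion_length(T, 24) * 1e6:.1f} um")
```

Comparing the resulting length to the solute-rich region size indicates whether an anneal can homogenize the plate; a lower activation energy than assumed would make the low-temperature anneal relatively more effective.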

  1. Superfluid transition of homogeneous and trapped two-dimensional Bose gases.

    Science.gov (United States)

    Holzmann, Markus; Baym, Gordon; Blaizot, Jean-Paul; Laloë, Franck

    2007-01-30

    Current experiments on atomic gases in highly anisotropic traps present the opportunity to study in detail the low temperature phases of two-dimensional inhomogeneous systems. Although, in an ideal gas, the trapping potential favors Bose-Einstein condensation at finite temperature, interactions tend to destabilize the condensate, leading to a superfluid Kosterlitz-Thouless-Berezinskii phase with a finite superfluid mass density but no long-range order, as in homogeneous fluids. The transition in homogeneous systems is conveniently described in terms of dissociation of topological defects (vortex-antivortex pairs). However, trapped two-dimensional gases are more directly approached by generalizing the microscopic theory of the homogeneous gas. In this paper, we first derive, via a diagrammatic expansion, the scaling structure near the phase transition in a homogeneous system, and then study the effects of a trapping potential in the local density approximation. We find that a weakly interacting trapped gas undergoes a Kosterlitz-Thouless-Berezinskii transition from the normal state at a temperature slightly below the Bose-Einstein transition temperature of the ideal gas. The characteristic finite superfluid mass density of a homogeneous system just below the transition becomes strongly suppressed in a trapped gas.

  2. Thermal design of horizontal tube boilers. Numerical and experimental investigation; Modelisation thermique de bouilleurs a tubes horizontaux. Etude numerique et validation experimentale

    Energy Technology Data Exchange (ETDEWEB)

    Roser, R.

    1999-11-26

    This work concerns the thermal design of kettle reboilers. Current methods are highly inaccurate, both with regard to the correlations for the external heat transfer coefficient at the single-tube scale and to two-phase flow modelling at the boiler scale. The aim of this work is to improve these thermal design methods. It contains an experimental investigation under typical operating conditions of such equipment: a hydrocarbon (n-pentane) at low mass flux. This investigation has led to a characterization of the local flow pattern through void-fraction measurements and, from this, to correlations for void fraction, pressure drop and heat transfer coefficient. The approach is original, since the correlations developed are based on the liquid velocity at the minimum cross-section between tubes as the variable characterizing the hydrodynamic effects on pressure drop and heat transfer coefficient. These correlations are shown to give much better results than those suggested in the literature so far, which are empirical transpositions of methods developed for in-tube flows. Furthermore, the numerical code MC3D has been applied using the correlations developed in this work, leading to a model of the two-phase flow in the boiler, which is a significant advance over current simplified methods. (author)
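
The key variable of the correlations, the liquid velocity at the minimum cross-section between tubes, can be estimated from the approach velocity and the bundle geometry. The sketch below assumes an in-line tube layout and made-up operating numbers; staggered layouts and two-phase corrections would change the details:

```python
def gap_velocity(m_dot, rho, width, length, pitch, diameter):
    """Liquid velocity at the minimum cross-section between tubes of an
    in-line bundle: the approach velocity scaled by pitch/(pitch - diameter).
    Illustrative single-phase estimate only."""
    approach_area = width * length                 # frontal area seen by the flow
    v_approach = m_dot / (rho * approach_area)     # superficial approach velocity
    return v_approach * pitch / (pitch - diameter)

# Illustrative n-pentane-like case (all numbers made up).
v = gap_velocity(m_dot=2.0, rho=600.0, width=0.5, length=1.0,
                 pitch=0.025, diameter=0.019)
print(f"velocity in minimum gap: {v * 1000:.2f} mm/s")
```

The narrow gap amplifies the approach velocity by the factor pitch/(pitch - diameter), which is why this velocity captures the local hydrodynamics better than bundle-average quantities.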

  3. The Homogeneous Interior-Point Algorithm: Nonsymmetric Cones, Warmstarting, and Applications

    DEFF Research Database (Denmark)

    Skajaa, Anders

    algorithms for these problems is still limited. The goal of this thesis is to investigate and shed light on two computational aspects of homogeneous interior-point algorithms for convex conic optimization. The first part studies the possibility of devising a homogeneous interior-point method aimed at solving problems involving constraints that require nonsymmetric cones in their formulation. The second part studies the possibility of warmstarting the homogeneous interior-point algorithm for conic problems. The main outcome of the first part is the introduction of a completely new homogeneous interior-point algorithm designed to solve nonsymmetric convex conic optimization problems. The algorithm is presented in detail and then analyzed. We prove its convergence and complexity. From a theoretical viewpoint, it is fully competitive with other algorithms and from a practical viewpoint, we show that it holds lots
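
The "homogeneous" model referred to here is, in its standard textbook form (not necessarily this thesis's specific variant), the self-dual embedding of the primal-dual conic pair:

```latex
% Homogeneous self-dual embedding of
%   min c^T x  s.t.  Ax = b, x \in K,   together with its dual.
\begin{aligned}
A x - b\,\tau &= 0, \\
-A^{\mathsf{T}} y + c\,\tau - s &= 0, \\
b^{\mathsf{T}} y - c^{\mathsf{T}} x - \kappa &= 0, \\
x \in K, \quad s \in K^{*}, \quad \tau \ge 0, \quad \kappa \ge 0.
\end{aligned}
% A solution with \tau > 0 recovers the optimal pair
% (x/\tau,\, y/\tau,\, s/\tau); \kappa > 0 instead certifies primal or
% dual infeasibility.
```

The appeal of the embedding is that it always has a solution and detects infeasibility automatically, which is also what makes warmstarting it delicate.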

  4. Immobilised Homogeneous Catalysts for Sequential Fine Chemical Synthesis : Functionalised Organometallics for Nanotechnology

    NARCIS (Netherlands)

    McDonald, A.R.

    2008-01-01

    The work described in this thesis has demonstrated the application of heterogenised homogeneous catalysts. We have shown that by coupling a homogeneous catalyst to a heterogeneous support we could combine the benefits of two major fields of catalysis: retain the high selectivity of homogeneous

  5. Applications of high and ultra high pressure homogenization for food safety

    Directory of Open Access Journals (Sweden)

    Francesca Patrignani

    2016-08-01

    Full Text Available Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low-temperature long-time (LTLT) and high-temperature short-time (HTST) treatments are the hurdles most commonly used for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, several non-thermal pasteurization processes have been proposed in recent decades, including high hydrostatic pressure (HHP), pulsed electric fields (PEF), ultrasound (US) and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential for providing fresh-like products with prolonged shelf-life. Moreover, recent developments in high-pressure homogenization technology and the design of new homogenization valves able to withstand pressures up to 350-400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. This review therefore deals with the principal mechanisms of action of high pressure homogenization against microorganisms of food concern in relation to the homogenizer adopted and the process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species in relation to the food matrix and to its chemico-physical and process variables are reviewed, as is the combined use of this technology with other non-thermal technologies.

  6. Homogeneous Biosensing Based on Magnetic Particle Labels

    KAUST Repository

    Schrittwieser, Stefan

    2016-06-06

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation.

  7. Homogeneous Biosensing Based on Magnetic Particle Labels

    Science.gov (United States)

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J.; Lentijo-Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation. PMID:27275824

  8. Homogeneous Biosensing Based on Magnetic Particle Labels

    KAUST Repository

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang; Lentijo Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation.

  9. Homogenization of High-Contrast Brinkman Flows

    KAUST Repository

    Brown, Donald L.; Efendiev, Yalchin R.; Li, Guanglian; Savatorova, Viktoria

    2015-01-01

    , Homogenization: Methods and Applications, Transl. Math. Monogr. 234, American Mathematical Society, Providence, RI, 2007, G. Allaire, SIAM J. Math. Anal., 23 (1992), pp. 1482--1518], although a powerful tool, are not applicable here. Our second point

  10. Homogenization and Control of Lattice Structures

    National Research Council Canada - National Science Library

    Blankenship, G. L

    1985-01-01

    ...., trusses may be modeled by beam equations). Using a technique from the mathematics of asymptotic analysis called "homogenization," the author shows how such approximations may be derived in a systematic way that avoids errors made using...

  11. Tests for homogeneity for multiple 2 x 2 contingency tables

    International Nuclear Information System (INIS)

    Carr, D.B.

    1986-01-01

    Frequently data are described by 2 x 2 contingency tables. For example, each 2 x 2 table arises from two dichotomous classifications such as control/treated and respond/did not respond. Multiple 2 x 2 tables result from stratifying the observational units on the basis of other characteristics. For example, stratifying by sex produces separate 2 x 2 tables for males and females. From each table a measure of difference between the response rates for the control and the treated groups is computed. The researcher usually wants to know if the response-rate difference is zero for each table. If the tables are homogeneous, the researcher can generalize from a statement concerning an average to a statement concerning each table. If tables are not homogeneous, homogeneous subsets of the tables should be described separately. This paper presents tests for homogeneity and illustrates their use. 11 refs., 6 tabs
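
    As a concrete illustration of the kind of test described above, the following sketch (a generic weighted chi-square homogeneity test, not necessarily the author's exact procedure; the table counts are invented) computes the per-table response-rate differences and a statistic for their equality across strata:

```python
def homogeneity_test(tables):
    """Chi-square test for homogeneity of risk differences across 2x2 tables.

    Each table is ((a, b), (c, d)): a/b = responders/non-responders in the
    control group, c/d = responders/non-responders in the treated group.
    Returns (Q, df); Q is approximately chi-square with df = K - 1 under
    homogeneity of the K response-rate differences.
    """
    diffs, weights = [], []
    for (a, b), (c, d) in tables:
        n1, n2 = a + b, c + d
        p1, p2 = a / n1, c / n2
        var = p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2
        diffs.append(p1 - p2)
        weights.append(1.0 / var)  # inverse-variance weights
    pooled = sum(w * dd for w, dd in zip(weights, diffs)) / sum(weights)
    q = sum(w * (dd - pooled) ** 2 for w, dd in zip(weights, diffs))
    return q, len(tables) - 1

# Two strata (e.g. males and females), illustrative counts
males = ((20, 80), (35, 65))
females = ((18, 82), (30, 70))
q, df = homogeneity_test([males, females])
```

    A small Q (relative to the chi-square quantile at df degrees of freedom) supports describing the tables by a single average difference; a large Q calls for reporting homogeneous subsets separately.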

  12. Homogeneous-heterogeneous reactions in curved channel with porous medium

    Science.gov (United States)

    Hayat, T.; Ayub, Sadia; Alsaedi, A.

    2018-06-01

    The purpose of the present investigation is to examine the peristaltic flow through a porous medium in a curved conduit. The problem is modeled for an incompressible, electrically conducting Ellis fluid. The influence of the porous medium is tackled via modified Darcy's law. The considered model utilizes homogeneous-heterogeneous reactions with equal diffusivities for reactant and autocatalyst. Constitutive equations are formulated in the presence of viscous dissipation, and the channel walls are compliant in nature. The governing equations are modeled and simplified under the assumptions of small Reynolds number and large wavelength. Graphical results for velocity, temperature, heat transfer coefficient and the homogeneous-heterogeneous reaction are examined for the parameters entering the problem. Results reveal that both the homogeneous-heterogeneous reaction effect and the heat transfer rate are enhanced with increasing curvature of the channel.

  13. Radiotracer application in determining changes in cement mix homogeneity

    International Nuclear Information System (INIS)

    Breda, M.

    1979-01-01

    A small amount of cement labelled with 24Na is added to the concrete mix and the relative activity of the mix is measured using a scintillation detector in preset points at different time intervals of the mixing process. The detector picks up information from a volume of 10 to 15 litres. The values characterize the degree of homogeneity of the cement component in the mix. Mathematical statistics methods are used for assessing mixing or the homogeneity changes. The technique is quick and simple and is used to advantage in determining the effect of the duration and method of transport of the cement mix on its homogeneity, and in monitoring the mixing process and determining the minimum mixing time for all types of concrete mix. (M.S.)
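
    The degree of homogeneity derived from such point measurements is commonly summarized by the coefficient of variation of the relative activities; a minimal sketch of that summary statistic, with invented count values (not data from the paper):

```python
import statistics

def coefficient_of_variation(counts):
    """Relative spread of detector counts; lower means a more homogeneous mix."""
    return statistics.pstdev(counts) / statistics.mean(counts)

# Illustrative relative-activity readings at fixed points, early vs. late in mixing
early = [120, 45, 210, 80, 160]
late = [118, 112, 125, 120, 115]

cv_early = coefficient_of_variation(early)
cv_late = coefficient_of_variation(late)
# The mix is considered homogeneous once the CV falls below a chosen threshold
```

    As mixing proceeds the readings cluster around their mean, so the CV decreases; the minimum mixing time is the time at which the CV first crosses the threshold.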

  14. Homogeneous nucleation in 4He: A corresponding-states analysis

    International Nuclear Information System (INIS)

    Sinha, D.N.; Semura, J.S.; Brodie, L.C.

    1982-01-01

    We report homogeneous-nucleation-temperature measurements in liquid 4He over a bath-temperature range beginning at 2.31 K, in a region far from the critical point. A simple empirical form is presented for estimating the homogeneous nucleation temperature for any liquid with a spherically symmetric interatomic potential. The 4He data are compared with nucleation data for Ar, Kr, Xe, and H; theoretical predictions for 3He are given in terms of reduced quantities. It is shown that the nucleation data for both quantum and classical liquids obey a quantum law of corresponding states (QCS). On the basis of this QCS analysis, predictions of homogeneous nucleation temperatures are made for hydrogen isotopes such as HD, DT, HT, and T2

  15. Homogenization technique for strongly heterogeneous zones in research reactors

    International Nuclear Information System (INIS)

    Lee, J.T.; Lee, B.H.; Cho, N.Z.; Oh, S.K.

    1991-01-01

    This paper reports on an iterative homogenization method using transport theory in a one-dimensional cylindrical cell model, developed to improve the homogenized cross sections for strongly heterogeneous zones in research reactors. The flux-weighting homogenized cross sections are modified by a correction factor, the cell flux ratio under an albedo boundary condition. The albedo at the cell boundary is iteratively determined to reflect the geometry effects of the material properties of the adjacent cells. This method has been tested with a simplified core model of the Korea Multipurpose Research Reactor. The results demonstrate that the reaction rates of an off-center control shroud cell, the multiplication factor, and the power distribution of the reactor core are close to those of the fine-mesh heterogeneous transport model

  16. Homogeneous scintillating LKr/Xe calorimeters

    International Nuclear Information System (INIS)

    Chen, M.; Mullins, M.; Pelly, D.; Shotkin, S.; Sumorok, K.; Akyuz, D.; Chen, E.; Gaudreau, M.P.J.; Bolozdynya, A.; Tchernyshev, V.; Goritchev, P.; Khovansky, V.; Koutchenkov, A.; Kovalenko, A.; Lebedenko, V.; Vinogradov, V.; Gusev, L.; Sheinkman, V.; Krasnokutsky, R.N.; Shuvalov, R.S.; Fedyakin, N.N.; Sushkov, V.; Akopyan, M.; Doke, T.; Kikuchi, J.; Hitachi, A.; Kashiwagi, T.; Masuda, K.; Shibamura, E.; Ishida, N.; Sugimoto, S.

    1993-01-01

    Recent R and D work on full length scintillating homogeneous liquid xenon/krypton (LXe/Kr) cells has established the essential properties for precision EM calorimeters: In-situ calibration using α's, radiation hardness as well as the uniformity required for δE/E≅0.5% for e/γ's above 50 GeV. (orig.)

  17. Numerical computing of elastic homogenized coefficients for periodic fibrous tissue

    Directory of Open Access Journals (Sweden)

    Roman S.

    2009-06-01

    The homogenization theory of linear elasticity is applied to a periodic array of cylindrical inclusions in a rectangular pattern extending to infinity in the inclusions' axial direction, such that the deformation of tissue along this direction is negligible. In the plane of deformation, the homogenization scheme is based on the average strain energy, whereas in the third direction it is based on the average normal stress along that direction. Namely, these average quantities have to be the same on a Repeating Unit Cell (RUC) of the heterogeneous and homogenized media when using a special form of boundary conditions formed by a periodic part and an affine part of the displacement. There exist infinitely many RUCs generating the considered array. The computing procedure is tested with different choices of RUC to verify that the results of the homogenization process are independent of the kind of RUC employed. Then, the dependence of the homogenized coefficients on the microstructure can be studied; for instance, a special anisotropy and the role of the inclusion volume are investigated. In the second part of this work, mechanical traction tests are simulated. We consider two kinds of loading: applying a density of force or imposing a displacement. We test five samples of the periodic array containing one, four, sixteen, sixty-four, and one hundred RUCs. The evolution of mean stresses, strains, and energy with the number of inclusions is studied. These evolutions depend on the kind of loading, but not their limits, which can be predicted by simulating a traction test of the homogenized medium.
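
    As a much simpler analytic counterpart to the finite-element homogenization described above (not the paper's scheme), the classical Voigt and Reuss volume averages bracket the homogenized modulus of any two-phase medium; the numbers below are purely illustrative:

```python
def voigt_reuss_bounds(e_matrix, e_inclusion, vol_frac):
    """Voigt (uniform-strain) and Reuss (uniform-stress) bounds on the
    effective Young's modulus of a two-phase medium.
    vol_frac is the inclusion volume fraction (0..1)."""
    f = vol_frac
    e_voigt = (1 - f) * e_matrix + f * e_inclusion          # upper bound
    e_reuss = 1.0 / ((1 - f) / e_matrix + f / e_inclusion)  # lower bound
    return e_reuss, e_voigt

# Illustrative numbers: a soft matrix with stiffer fibrous inclusions
lo, hi = voigt_reuss_bounds(e_matrix=1.0, e_inclusion=10.0, vol_frac=0.3)
```

    Any RUC-based homogenized coefficient must fall between these two bounds, which makes them a quick sanity check on a numerical homogenization result.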

  18. Pyroxene Homogenization and the Isotopic Systematics of Eucrites

    Science.gov (United States)

    Nyquist, L. E.; Bogard, D. D.

    1996-01-01

    The original Mg-Fe zoning of eucritic pyroxenes has in nearly all cases been partly homogenized, an observation that has been combined with other petrographic and compositional criteria to establish a scale of thermal "metamorphism" for eucrites. To evaluate hypotheses explaining development of conditions on the HED parent body (Vesta?) leading to pyroxene homogenization against their chronological implications, it is necessary to know whether pyroxene metamorphism was recorded in the isotopic systems. However, identifying the effects of the thermal metamorphism with specific effects in the isotopic systems has been difficult, due in part to a lack of correlated isotopic and mineralogical studies of the same eucrites. Furthermore, isotopic studies often place high demands on analytical capabilities, resulting in slow growth of the isotopic database. Additionally, some isotopic systems would not respond in a direct and sensitive way to pyroxene homogenization. Nevertheless, sufficient data exist to generalize some observations, and to identify directions of potentially fruitful investigations.

  19. Modeling the homogenization kinetics of as-cast U-10wt% Mo alloys

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zhijie, E-mail: zhijie.xu@pnnl.gov [Computational Mathematics Group, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Joshi, Vineet [Energy Processes & Materials Division, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Hu, Shenyang [Reactor Materials & Mechanical Design, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Paxton, Dean [Nuclear Engineering and Analysis Group, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Lavender, Curt [Energy Processes & Materials Division, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Burkes, Douglas [Nuclear Engineering and Analysis Group, Pacific Northwest National Laboratory, Richland, WA 99352 (United States)

    2016-04-01

    Low-enriched U-22at% Mo (U–10Mo) alloy has been considered as an alternative material to replace the highly enriched fuels in research reactors. For U–10Mo to work effectively and replace the existing fuel material, a thorough understanding of the microstructure development from as-cast to the final formed structure is required. The as-cast microstructure is typically inhomogeneous, with molybdenum-rich and -lean regions that may affect the processing and possibly the in-reactor performance. This as-cast structure must be homogenized by thermal treatment to produce a uniform Mo distribution. The development of a modeling capability will improve the understanding of the effect of initial microstructures on the Mo homogenization kinetics. In the current work, we investigated the effect of the as-cast microstructure on the homogenization kinetics. The kinetics of homogenization was modeled based on a rigorous algorithm that relates line-scan data of Mo concentration to the gray scale in energy dispersive spectroscopy images, which was used to generate a reconstructed Mo concentration map. The map was then used as a realistic microstructure input for physics-based homogenization models, with which the entire homogenization kinetics can be simulated and validated against the available experimental data at different homogenization times and temperatures.
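
    A minimal sketch of the diffusion-driven relaxation underlying such homogenization kinetics, assuming simple 1D Fickian diffusion with illustrative grid and diffusivity values (not the paper's physics-based model):

```python
def homogenize_1d(profile, d_coeff, dx, dt, steps):
    """Explicit FTCS scheme for 1D diffusion with zero-flux boundaries.
    Stability requires d_coeff * dt / dx**2 <= 0.5."""
    r = d_coeff * dt / dx ** 2
    assert r <= 0.5, "unstable step size"
    c = list(profile)
    for _ in range(steps):
        nxt = c[:]
        for i in range(1, len(c) - 1):
            nxt[i] = c[i] + r * (c[i - 1] - 2 * c[i] + c[i + 1])
        nxt[0], nxt[-1] = nxt[1], nxt[-2]  # zero-flux ends
        c = nxt
    return c

# Illustrative as-cast Mo profile (wt%): alternating rich and lean bands
initial = [12.0, 8.0] * 10
final = homogenize_1d(initial, d_coeff=1e-2, dx=1.0, dt=1.0, steps=500)
```

    The spread of the concentration profile decays toward a uniform Mo level as annealing time increases, which is the quantity a homogenization-kinetics model tracks against experiment.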

  20. Homogenization of linearly anisotropic scattering cross sections in a consistent B1 heterogeneous leakage model

    International Nuclear Information System (INIS)

    Marleau, G.; Debos, E.

    1998-01-01

    One of the main problems encountered in cell calculations is that of spatial homogenization, where one associates with a heterogeneous cell a homogeneous set of cross sections. The homogenization process is in fact trivial when a totally reflected cell without leakage is fully homogenized, since it involves only a flux-volume weighting of the isotropic cross sections. When anisotropic leakage models are considered, in addition to homogenizing the isotropic cross sections, the anisotropic scattering cross section must also be considered. The simple option, which consists of using the same homogenization procedure for both the isotropic and anisotropic components of the scattering cross section, leads to inconsistencies between the homogeneous and homogenized transport equations. Here we present a method for homogenizing the anisotropic scattering cross sections that resolves these inconsistencies. (author)

  1. Relativistic cosmologies with closed, locally homogeneous space sections

    International Nuclear Information System (INIS)

    Fagundes, H.V.

    1985-01-01

    The homogeneous Bianchi and Kantowski-Sachs metrics of relativistic cosmology are investigated through their correspondence with recent geometrical results of Thurston. These allow a partial classification of the topologies of closed, locally homogeneous spaces according to Thurston's eight geometric types. In addition, it is determined which of the Bianchi-Kantowski-Sachs metrics can be imposed on closed space sections of cosmological models. This is seen as progress toward implementation of a postulate of the closure of space for both classical and quantum gravity. (Author) [pt

  2. Homogeneous scintillating LKr/Xe calorimeters

    Energy Technology Data Exchange (ETDEWEB)

    Chen, M.; Mullins, M.; Pelly, D.; Shotkin, S.; Sumorok, K. (Lab. for Nuclear Science, MIT, Cambridge, MA (United States)); Akyuz, D.; Chen, E.; Gaudreau, M.P.J. (Plasma Fusion Center, MIT, Cambridge, MA (United States)); Bolozdynya, A.; Tchernyshev, V.; Goritchev, P.; Khovansky, V.; Koutchenkov, A.; Kovalenko, A.; Lebedenko, V.; Vinogradov, V.; Gusev, L.; Sheinkman, V. (ITEP, Moscow (Russia)); Krasnokutsky, R.N.; Shuvalov, R.S.; Fedyakin, N.N.; Sushkov, V. (IHEP, Serpukhov (Russia)); Akopyan, M. (Inst. for Nuclear Research, Moscow (Russia)); Doke, T.; Kikuchi, J.; Hitachi, A.; Kashiwagi, T. (Science and Eng. Res. Lab., Waseda Univ., Tokyo (Japan)); Masuda, K.; Shibamura, E. (Saitama Coll. of Health (Japan)); Ishida, N. (Seikei Univ. (Japan)); Sugimoto, S. (INS, Univ. Tokyo (Japan))

    1993-03-20

    Recent R and D work on full length scintillating homogeneous liquid xenon/krypton (LXe/Kr) cells has established the essential properties for precision EM calorimeters: In-situ calibration using α's, radiation hardness as well as the uniformity required for δE/E ≅ 0.5% for e/γ's above 50 GeV. (orig.).

  3. Hardness and microstructure homogeneity of pure copper processed by accumulative back extrusion

    International Nuclear Information System (INIS)

    Bazaz, B.; Zarei-Hanzaki, A.; Fatemi-Varzaneh, S.M.

    2013-01-01

    The present work deals with the microstructure evolution of pure copper processed by a new severe plastic deformation method. A set of pure copper (99.99%) work-pieces with coarse-grained microstructures was processed by the accumulative back extrusion (ABE) method at room temperature. Optical and scanning electron microscopy (SEM) and hardness measurements were utilized to study the microstructural evolution and hardness homogeneity. The results indicated that ABE is a capable process for producing a homogeneous grain-refined microstructure in pure copper. The observed grain refinement was discussed in terms of the occurrence of dynamic restoration processes. The analysis of microstructure and hardness showed outstanding homogeneity improvement throughout the work-pieces as consecutive ABE passes were applied. The homogeneity improvement was attributed to the propagation of the shear bands and also of the heavily deformed regions. A reversing route was also applied in the ABE processing to investigate its effect on the development of microstructural homogeneity. Compared to the conventional route, the reversing route was found to yield better homogeneity after fewer passes of the process.

  4. Overview of homogeneous versus heterogeneous core configuration trade-off studies

    International Nuclear Information System (INIS)

    Chang, Y.I.

    1982-01-01

    The most significant development in core design trend in the U.S. LMFBR program has been the increased attention given to the heterogeneous core design concept. In recent years, numerous core configuration trade-off studies have been carried out to quantify advantages and disadvantages of the heterogeneous concept relative to the homogeneous concept, and a consensus is emerging among the U.S. core designers. It appears that the technical and economic performance differences between homogeneous and heterogeneous core designs are very small; however, the heterogeneous concept provides a definite safety/licensing advantage. The technical and economic performance comparison between homogeneous and heterogeneous core configurations is difficult to quantify. In fact, in most cases, the perceived advantages and/or disadvantages are dictated by the consistency in the comparison (optimized for one concept versus non-optimized for the other, etc.) rather than by any inherent differences. Some of the technical and economic issues relevant to the homogeneous versus heterogeneous comparison are summarized

  5. Overview of homogeneous versus heterogeneous core configuration trade-off studies

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Y I [Applied Physics Division, Argonne National Laboratory, Argonne, IL (United States)

    1982-01-01

    The most significant development in core design trend in the U.S. LMFBR program has been the increased attention given to the heterogeneous core design concept. In recent years, numerous core configuration trade-off studies have been carried out to quantify advantages and disadvantages of the heterogeneous concept relative to the homogeneous concept, and a consensus is emerging among the U.S. core designers. It appears that the technical and economic performance differences between homogeneous and heterogeneous core designs are very small; however, the heterogeneous concept provides a definite safety/licensing advantage. The technical and economic performance comparison between homogeneous and heterogeneous core configurations is difficult to quantify. In fact, in most cases, the perceived advantages and/or disadvantages are dictated by the consistency in the comparison (optimized for one concept versus non-optimized for the other, etc.) rather than by any inherent differences. Some of the technical and economic issues relevant to the homogeneous versus heterogeneous comparison are summarized.

  6. Homogeneity spoil spectroscopy

    International Nuclear Information System (INIS)

    Hennig, J.; Boesch, C.; Martin, E.; Grutter, R.

    1987-01-01

    One of the problems of in vivo MR spectroscopy of P-31 is spectrum localization. Surface-coil spectroscopy, which is the method of choice for clinical applications, suffers from the high-intensity signal from subcutaneous muscle tissue, which masks the spectrum of interest from deeper structures. In order to suppress this signal while maintaining the simplicity of surface-coil spectroscopy, the authors introduced a small sheet of ferromagnetically dotted plastic between the surface coil and the body. This sheet locally destroys the field homogeneity and therefore all signal from structures immediately around the coil. The very high reproducibility of this simple experimental procedure allows long-term studies, important for monitoring tumor therapy

  7. Modeling of turbulent bubbly flows; Modelisation des ecoulements turbulents a bulles

    Energy Technology Data Exchange (ETDEWEB)

    Bellakhal, Ghazi

    2005-03-15

    Two-phase flows involve interfacial interactions which significantly modify the structure of the mean and fluctuating flow fields. The design of two-fluid models adapted to industrial flows requires taking the effect of these interactions into account in the adopted closure relations. The work developed in this thesis concerns the development of first-order two-fluid models deduced by reduction of second-order closures. The adopted reasoning, based on the principle of decomposition of the Reynolds stress tensor into two statistically independent contributions, turbulent and pseudo-turbulent parts, makes it possible to preserve the physical content of the second-order closure relations. Analysis of the turbulence structure in two basic flows, uniform homogeneous bubbly flow and homogeneous bubbly flow with constant shear, allows a formulation of the two-phase turbulent viscosity involving the characteristic scales of bubbly turbulence to be deduced, as well as an analytical description of the modification of the homogeneous turbulence structure induced by the presence of the bubbles. The Eulerian two-fluid model was then generalized to the case of inhomogeneous flows with low void fractions. The numerical results obtained by applying this model, integrated into the computer code MELODIF, to a free sheared turbulent bubbly wake flow showed satisfactory agreement with the experimental data and made it possible to analyze the modification of the characteristic scales of such a flow by the interfacial interactions. The first-order two-fluid model is finally generalized to the case of high-void-fraction bubbly flows in which the hydrodynamic interactions between the bubbles are no longer negligible. (author)

  8. Benchmarking homogenization algorithms for monthly data

    Czech Academy of Sciences Publication Activity Database

    Venema, V. K. C.; Mestre, O.; Aquilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertačník, G.; Szentimrey, T.; Štěpánek, Petr; Zahradníček, Pavel; Viarre, J.; Mueller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Duran, M. P.; Likso, T.; Esteban, P.; Brandsma, T.

    2012-01-01

    Roč. 8, č. 1 (2012), s. 89-115 ISSN 1814-9324 Institutional support: RVO:67179843 Keywords : climate data * instrumental time-series * greater alpine region * homogeneity test * variability * inhomogeneities Subject RIV: EH - Ecology, Behaviour Impact factor: 3.556, year: 2012

  9. Water Filtration through Homogeneous Granulated Charge

    Directory of Open Access Journals (Sweden)

    A. M. Krautsou

    2005-01-01

    A general relationship for the calculation of water filtration through a homogeneous granulated charge has been obtained and compared with experimental data. Discrepancies between calculated and experimental values do not exceed 6% throughout the entire investigated range.

  10. The influence of non-homogenous dielectric material in the waveguide propagation modes

    Directory of Open Access Journals (Sweden)

    Ion VONCILA

    2006-12-01

    The aim of this paper is to present the equations of the electromagnetic wave in homogeneous and non-homogeneous dielectric materials, establishing the boundary conditions, and to solve by FEM the equations of the electromagnetic wave in a rectangular cavity. By numerical simulation of the waveguide cavity, the modifications of both the propagation modes and the field distribution have been studied. A non-homogeneous medium affects the field amplitude, yielding a non-homogeneous distribution. The Poynting vector of the transmitted wave indicates the concentration of the energy flux in the air beside the dielectric material.

  11. Applications of High and Ultra High Pressure Homogenization for Food Safety.

    Science.gov (United States)

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low-temperature long-time and high-temperature short-time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, several non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric fields, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential to provide "fresh-like" products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350-400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. This review therefore deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species in relation to the food matrix and to chemico-physical and process variables will be reviewed. The combined use of this alternative technology with other non-thermal technologies will also be considered.

  12. Preparation of homogeneous isotopic targets with rotating substrate

    International Nuclear Information System (INIS)

    Xu, G.J.; Zhao, Z.G.

    1993-01-01

    Isotopically enriched accelerator targets were prepared using the evaporation-condensation method from a resistance-heated crucible. For high collection efficiency and good homogeneity the substrate was rotated at a vertical distance of 1.3 to 2.5 cm from the evaporation source. Measured collection efficiencies were 13 to 51 μg cm⁻² mg⁻¹, and homogeneity tests showed values close to those calculated theoretically for a point source. Targets, self-supporting or on backings, could be fabricated with this method for elements and some compounds with evaporation temperatures up to 2300 K. (orig.)

  13. Note on integrability of certain homogeneous Hamiltonian systems

    Energy Technology Data Exchange (ETDEWEB)

    Szumiński, Wojciech [Institute of Physics, University of Zielona Góra, Licealna 9, PL-65-407, Zielona Góra (Poland); Maciejewski, Andrzej J. [Institute of Astronomy, University of Zielona Góra, Licealna 9, PL-65-407, Zielona Góra (Poland); Przybylska, Maria, E-mail: M.Przybylska@if.uz.zgora.pl [Institute of Physics, University of Zielona Góra, Licealna 9, PL-65-407, Zielona Góra (Poland)

    2015-12-04

    In this paper we investigate a class of natural Hamiltonian systems with two degrees of freedom. The kinetic energy depends on coordinates but the system is homogeneous. Thanks to this property it admits, in a general case, a particular solution. Using this solution we derive necessary conditions for the integrability of such systems investigating differential Galois group of variational equations. - Highlights: • Necessary integrability conditions for some 2D homogeneous Hamilton systems are given. • Conditions are obtained analysing differential Galois group of variational equations. • New integrable and superintegrable systems are identified.

  14. Homogeneity and Strength of Mortar Joints in Pearl-Chain Bridges

    DEFF Research Database (Denmark)

    Lund, Mia Schou Møller; Arvidsson, Michael; Hansen, Kurt Kielsgaard

    2015-01-01

    -to-mix mortar products are tested. To the authors’ knowledge, no previous published work has documented the homogeneity and properties of mortar joints of such a height. Hence, the present study documents a practical test procedure where the homogeneity of three mortar joints measuring 20 x 220 x 2400 mm has...

  15. Traffic planning for non-homogeneous traffic

    Indian Academy of Sciences (India)

    Western traffic planning methodologies mostly address the concerns of homogeneous traffic and therefore often prove inadequate in solving problems involving ... Transportation Research and Injury Prevention Programme, Indian Institute of Technology, Hauz Khas, New Delhi 110 016; Civil and Architectural Engineering ...

  16. Testing homogeneity in Weibull-regression models.

    Science.gov (United States)

    Bolfarine, Heleno; Valença, Dione M

    2005-10-01

    In survival studies with families or geographical units it may be of interest to test whether such groups are homogeneous for given explanatory variables. In this paper we consider score-type tests for group homogeneity based on a mixing model in which the group effect is modelled as a random variable. As opposed to hazard-based frailty models, this model yields survival times that, conditioned on the random effect, have an accelerated failure time representation. The test statistic requires only estimation of the conventional regression model without the random effect and does not require specifying the distribution of the random effect. The tests are derived for a Weibull regression model, and in the uncensored situation a closed form is obtained for the test statistic. A simulation study is used for comparing the power of the tests. The proposed tests are applied to real data sets with censored data.

  17. Smooth homogeneous structures in operator theory

    CERN Document Server

    Beltita, Daniel

    2005-01-01

    Geometric ideas and techniques play an important role in operator theory and the theory of operator algebras. Smooth Homogeneous Structures in Operator Theory builds the background needed to understand this circle of ideas and reports on recent developments in this fruitful field of research. Requiring only a moderate familiarity with functional analysis and general topology, the author begins with an introduction to infinite dimensional Lie theory with emphasis on the relationship between Lie groups and Lie algebras. A detailed examination of smooth homogeneous spaces follows. This study is illustrated by familiar examples from operator theory and develops methods that allow endowing such spaces with structures of complex manifolds. The final section of the book explores equivariant monotone operators and Kähler structures. It examines certain symmetry properties of abstract reproducing kernels and arrives at a very general version of the construction of restricted Grassmann manifolds from the theory of loo...

  18. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    Science.gov (United States)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality related to the moisture content of aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality management tools at all stages of the technological process. It is established that unaccounted-for moisture in the aggregates adversely affects the homogeneity of the concrete mixture and, accordingly, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity during the mixing of components are proposed, in which the kneading-and-mixing machinery operates under continuous automatic control of homogeneity. Theoretical underpinnings of the control of mixture homogeneity are presented, based on changes in the frequency of vibrodynamic vibrations of the mixer body. The structure of the technical means of the automatic control system for regulating the water supply is determined by the change in concrete mixture homogeneity during continuous mixing of the components. The following technical means for implementing automatic control have been chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To characterize the quality of automatic control, a structure flowchart with transfer functions is proposed that describes the ACS operation in the transient dynamic mode.

  19. Isotopic homogeneity of iron in the early solar nebula.

    Science.gov (United States)

    Zhu, X K; Guo, Y; O'Nions, R K; Young, E D; Ash, R D

    2001-07-19

    The chemical and isotopic homogeneity of the early solar nebula, and the processes producing fractionation during its evolution, are central issues of cosmochemistry. Studies of the relative abundance variations of three or more isotopes of an element can in principle determine if the initial reservoir of material was a homogeneous mixture or if it contained several distinct sources of precursor material. For example, widespread anomalies observed in the oxygen isotopes of meteorites have been interpreted as resulting from the mixing of a solid phase that was enriched in 16O with a gas phase in which 16O was depleted, or as an isotopic 'memory' of Galactic evolution. In either case, these anomalies are regarded as strong evidence that the early solar nebula was not initially homogeneous. Here we present measurements of the relative abundances of three iron isotopes in meteoritic and terrestrial samples. We show that significant variations of iron isotopes exist in both terrestrial and extraterrestrial materials. But when plotted in a three-isotope diagram, all of the data for these Solar System materials fall on a single mass-fractionation line, showing that homogenization of iron isotopes occurred in the solar nebula before both planetesimal accretion and chondrule formation.

  20. Cryogenic homogenization and sampling of heterogeneous multi-phase feedstock

    Science.gov (United States)

    Doyle, Glenn Michael; Ideker, Virgene Linda; Siegwarth, James David

    2002-01-01

    An apparatus and process for producing a homogeneous analytical sample from a heterogeneous feedstock by: providing the mixed feedstock, reducing the temperature of the feedstock below a critical temperature, reducing the size of the feedstock components, blending the reduced-size feedstock to form a homogeneous mixture, and obtaining a representative sample of the homogeneous mixture. The size reduction and blending steps are performed at temperatures below the critical temperature in order to retain organic compounds in the form of solvents, oils, or liquids that may be adsorbed onto or absorbed into the solid components of the mixture, while also improving the efficiency of the size reduction. Preferably, the critical temperature is less than 77 K (-196 °C). Further, with the process of this invention the representative sample may be maintained below the critical temperature until being analyzed.

  1. Radiation Resistance and Gain of Homogeneous Ring Quasi-Array

    DEFF Research Database (Denmark)

    Knudsen, H. L.

    1954-01-01

    In a previous paper homogeneous ring quasi-arrays of tangential or radial dipoles were introduced, i.e. systems of dipoles arranged equidistantly along a circle, the dipoles being oriented in tangential or radial directions and carrying currents with the same amplitude, but with a phase that increases uniformly along the circle. Such quasi-arrays are azimuthally omnidirectional, and the radiated field will be mainly horizontally polarized and concentrated around the plane of the circle. In this paper expressions are obtained for the radiation resistance and the gain of homogeneous ring quasi-arrays.

  2. Homogeneous versus heterogeneous shielding modeling of spent-fuel casks

    International Nuclear Information System (INIS)

    Carbajo, J.J.; Lindner, C.N.

    1992-01-01

    The design of spent-fuel casks for storage and transport requires modeling the cask for criticality, shielding, thermal, and structural analyses. While some parts of the cask are homogeneous, other regions are heterogeneous, with different materials intermixed. For simplicity, some of the heterogeneous regions may be modeled as homogeneous. This paper evaluates the effect of homogenizing some regions of a cask on the calculated radiation dose rates outside the cask. The dose rate calculations were performed with the one-dimensional discrete ordinates shielding code XSDRNPM coupled with the XSDOSE code, and with the three-dimensional QAD-CGGP code. Dose rates were calculated radially at the midplane of the cask at two locations: the cask surface and 2.3 m from the radial surface. The last location corresponds to a point 2 m from the lateral sides of a transport railroad car.

  3. SuPer-Homogenization (SPH) Corrected Cross Section Generation for High Temperature Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sen, Ramazan Sonat [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hummel, Andrew John [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hiruta, Hikaru [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-03-01

    Deterministic full-core simulators require homogenized group constants covering the operating and transient conditions over the entire lifetime. Traditionally, the homogenized group constants are generated using a lattice physics code over an assembly or, in the case of prismatic high temperature reactors (HTRs), a block. Strong absorbers that cause strong local depressions in the flux profile require special techniques during homogenization over a large volume; fuel blocks with burnable poisons and control rod blocks are examples of such cases. Over the past several decades, a tremendous number of studies have been performed to improve the accuracy of full-core calculations through the homogenization procedure. However, those studies were mostly performed for light water reactor (LWR) analyses and thus may not be directly applicable to advanced thermal reactors such as HTRs. This report presents the application of the SuPer-Homogenization correction method to a hypothetical HTR core.
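For context, the SuPer-Homogenization (SPH) correction is conventionally written as a fixed-point iteration on per-region, per-group factors (standard textbook form, not taken from this report):

```latex
% SPH factor for homogenized region i and energy group g at iteration n+1:
\mu_{i,g}^{(n+1)} \;=\; \frac{\bar{\phi}_{i,g}^{\,\mathrm{het}}}{\bar{\phi}_{i,g}^{\,\mathrm{hom},(n)}} ,
\qquad
\Sigma_{i,g}^{(n+1)} \;=\; \mu_{i,g}^{(n+1)} \, \Sigma_{i,g}^{\mathrm{hom}} ,
% where the homogenized flux is recomputed with the corrected cross sections
% at each iteration (under a fixed normalization) until the factors converge;
% at convergence the region-wise reaction rates of the heterogeneous
% reference calculation are preserved.
```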

  4. Salty popcorn in a homogeneous low-dimensional toy model of holographic QCD

    International Nuclear Information System (INIS)

    Elliot-Ripley, Matthew

    2017-01-01

    Recently, a homogeneous ansatz has been used to study cold dense nuclear matter in the Sakai–Sugimoto model of holographic QCD. To justify this homogeneous approximation we here investigate a homogeneous ansatz within a low-dimensional toy version of Sakai–Sugimoto to study finite baryon density configurations and compare it to full numerical solutions. We find the ansatz corresponds to enforcing a dyon salt arrangement in which the soliton solutions are split into half-soliton layers. Within this ansatz we find analogues of the proposed baryonic popcorn transitions, in which solutions split into multiple layers in the holographic direction. The homogeneous results are found to qualitatively match the full numerical solutions, lending confidence to the homogeneous approximations of the full Sakai–Sugimoto model. In addition, we find exact compact solutions in the high density, flat space limit which demonstrate the existence of further popcorn transitions to three layers and beyond. (paper)

  5. Homogeneity in Social Groups of Iraqis

    NARCIS (Netherlands)

    Gresham, J.; Saleh, F.; Majid, S.

    With appreciation to the Royal Institute for Inter-Faith Studies for initiating the Second World Congress for Middle Eastern Studies, this paper summarizes findings on homogeneity in community-level social groups derived from inter-ethnic research conducted during 2005 among Iraqi Arabs and Kurds.

  6. Homogenization Pressure and Temperature Affect Protein Partitioning and Oxidative Stability of Emulsions

    DEFF Research Database (Denmark)

    Horn, Anna Frisenfeldt; Barouh, Nathalie; Nielsen, Nina Skall

    2013-01-01

    The oxidative stability of 10 % fish oil-in-water emulsions was investigated for emulsions prepared under different homogenization conditions. Homogenization was conducted at two different pressures (5 or 22.5 MPa), and at two different temperatures (22 and 72 °C). Milk proteins were used […] prior to homogenization did not have any clear effect on lipid oxidation in either of the two types of emulsions.

  7. HOMOGENEOUS NUCLEAR POWER REACTOR

    Science.gov (United States)

    King, L.D.P.

    1959-09-01

    A homogeneous nuclear power reactor utilizing forced circulation of the liquid fuel is described. The reactor does not require fuel handling outside of the reactor vessel during any normal operation, including complete shutdown to room temperature, the reactor being self-regulating under extreme operating conditions and controlled by the thermal expansion of the liquid fuel. The liquid fuel utilized is a uranium, phosphoric acid, and water solution which requires no gas exhaust system or independent gas recombining system, thereby eliminating the handling of radiolytic gas.

  8. ISOTOPE METHODS IN HOMOGENEOUS CATALYSIS.

    Energy Technology Data Exchange (ETDEWEB)

    BULLOCK,R.M.; BENDER,B.R.

    2000-12-01

    The use of isotope labels has had a fundamentally important role in the determination of mechanisms of homogeneously catalyzed reactions. Mechanistic data are valuable since they can assist in the design and rational improvement of homogeneous catalysts. There are several ways to use isotopes in mechanistic chemistry. Isotopes can be introduced into controlled experiments and followed to see where they go or don't go; in this way, Libby, Calvin, Taube and others used isotopes to elucidate mechanistic pathways for very different, yet important chemistries. Another important isotope method is the study of kinetic isotope effects (KIEs) and equilibrium isotope effects (EIEs). Here the mere observation of where a label winds up is no longer enough: what matters is how much slower (or faster) a labeled molecule reacts than the unlabeled material. The most careful studies essentially involve the measurement of isotope fractionation between a reference ground state and the transition state. Thus kinetic isotope effects provide unique data unavailable from other methods, since information about the transition state of a reaction is obtained. Because getting an experimental glimpse of transition states is really tantamount to understanding catalysis, kinetic isotope effects are very powerful.

  9. Does prescribed burning result in biotic homogenization of coastal heathlands?

    Science.gov (United States)

    Velle, Liv Guri; Nilsen, Liv Sigrid; Norderhaug, Ann; Vandvik, Vigdis

    2014-05-01

    Biotic homogenization due to replacement of native biodiversity by widespread generalist species has been demonstrated in a number of ecosystems and taxonomic groups worldwide, causing growing conservation concern. Human disturbance is a key driver of biotic homogenization, suggesting potential conservation challenges in seminatural ecosystems, where anthropogenic disturbances such as grazing and burning are necessary for maintaining ecological dynamics and functioning. We test whether prescribed burning results in biotic homogenization in the coastal heathlands of north-western Europe, a seminatural landscape where extensive grazing and burning has constituted the traditional land-use practice over the past 6000 years. We compare the beta-diversity before and after fire at three ecological scales: within local vegetation patches, between wet and dry heathland patches within landscapes, and along a 470 km bioclimatic gradient. Within local patches, we found no evidence of homogenization after fire; species richness increased, and the species that entered the burnt Calluna stands were not widespread generalists but native grasses and herbs characteristic of the heathland system. At the landscape scale, we saw a weak homogenization as wet and dry heathland patches became more compositionally similar after fire. This was because of a decrease in habitat-specific species unique to either wet or dry habitats and postfire colonization by a set of heathland specialists that established in both habitat types. Along the bioclimatic gradient, species that increased after fire generally had more specific environmental requirements and narrower geographical distributions than the prefire flora, resulting in a biotic 'heterogenisation' after fire. Our study demonstrates that human disturbance does not necessarily cause biotic homogenization; continuation of traditional land-use practices can instead be crucial for the maintenance of diversity and ecological functioning.

  10. Methods study of homogeneity and stability test from cerium oxide CRM candidate

    International Nuclear Information System (INIS)

    Samin; Susanna TS

    2016-01-01

    The homogeneity and stability test methods for a cerium oxide CRM candidate were studied based on ISO 13528 and KAN DP.01.34. The purpose of this study was to select robust homogeneity and stability test methods for the preparation of a cerium oxide CRM. Ten sub-samples of cerium oxide were prepared and randomly selected for analysis of the analytes representing two compounds, namely CeO_2 and La_2O_3. In the 10 sub-samples, the CeO_2 and La_2O_3 contents were analyzed in duplicate with the same analytical method, by the same analyst, and in the same laboratory. The results were evaluated statistically according to ISO 13528 and KAN DP.01.34. According to ISO 13528, the cerium oxide sample is homogeneous if Ss ≤ 0.3 σ and stable if |Xr - Yr| ≤ 0.3 σ. In this study, the homogeneity test for CeO_2 gave Ss = 2.073 × 10^-4, smaller than 0.3 σ (0.5476), and the stability test gave |Xr - Yr| = 0.225, which is < 0.3 σ. For La_2O_3, the homogeneity test gave Ss = 1.649 × 10^-4, smaller than 0.3 σ (0.4865), and the stability test gave |Xr - Yr| = 0.2185, which is < 0.3 σ. Evaluated with the method from KAN, the cerium oxide sample was likewise homogeneous, since Fcalc < Ftable, and stable, since |Xi - Xhm| < 0.3 × n IQR. Thus the homogeneity and stability test data for the CeO_2 CRM candidate processed with the statistical methods of ISO 13528 are not significantly different from those processed with the statistical methods of KAN DP.01.34; both meet the requirements of homogeneity and stability. Therefore the homogeneity and stability test methods based on ISO 13528 can be used in the preparation of the cerium oxide CRM. (author)
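The ISO 13528 homogeneity criterion used here (Ss ≤ 0.3 σ) can be computed directly from duplicate sub-sample results. The sketch below follows the standard between-sample standard deviation estimate from duplicates; the numbers in the usage example are made up, not the paper's data:

```python
import math

def homogeneity_check(duplicates, sigma):
    """ISO 13528-style homogeneity check from duplicate measurements.

    duplicates: list of (result_1, result_2) pairs, one pair per sub-sample.
    sigma: the standard deviation for the assessment.
    Returns (s_s, passed), where s_s is the between-sample standard
    deviation and passed is True when s_s <= 0.3 * sigma.
    """
    g = len(duplicates)
    means = [(a + b) / 2 for a, b in duplicates]
    ranges = [abs(a - b) for a, b in duplicates]
    grand = sum(means) / g
    s_x2 = sum((m - grand) ** 2 for m in means) / (g - 1)  # variance of sample averages
    s_w2 = sum(w * w for w in ranges) / (2 * g)            # within-sample variance
    s_s = math.sqrt(max(s_x2 - s_w2 / 2, 0.0))             # between-sample SD
    return s_s, s_s <= 0.3 * sigma

# Illustrative data: 5 sub-samples measured in duplicate, sigma = 0.5
s_s, passed = homogeneity_check(
    [(10.0, 10.1), (10.2, 10.1), (9.9, 10.0), (10.1, 10.0), (10.0, 10.2)], 0.5)
```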

  11. Asymptotic Expansion Homogenization for Multiscale Nuclear Fuel Analysis

    International Nuclear Information System (INIS)

    2015-01-01

    Engineering scale nuclear fuel performance simulations can benefit by utilizing high-fidelity models running at a lower length scale. Lower length-scale models provide a detailed view of the material behavior that is used to determine the average material response at the macroscale. These lower length-scale calculations may provide insight into material behavior where experimental data is sparse or nonexistent. This multiscale approach is especially useful in the nuclear field, since irradiation experiments are difficult and expensive to conduct. The lower length-scale models complement the experiments by influencing the types of experiments required and by reducing the total number of experiments needed. This multiscale modeling approach is a central motivation in the development of the BISON-MARMOT fuel performance codes at Idaho National Laboratory. These codes seek to provide more accurate and predictive solutions for nuclear fuel behavior. One critical aspect of multiscale modeling is the ability to extract the relevant information from the lower length-scale simulations. One approach, the asymptotic expansion homogenization (AEH) technique, has proven to be an effective method for determining homogenized material parameters. The AEH technique prescribes a system of equations to solve at the microscale that are used to compute homogenized material constants for use at the engineering scale. In this work, we employ AEH to explore the effect of evolving microstructural thermal conductivity and elastic constants on nuclear fuel performance. We show that the AEH approach fits cleanly into the BISON and MARMOT codes and provides a natural, multidimensional homogenization capability.
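For reference, the AEH technique starts from the standard two-scale expansion; for a thermal conduction problem the cell problem and homogenized conductivity take the usual form (generic notation, not specific to BISON/MARMOT):

```latex
% Two-scale expansion in the fine-scale variable y = x/epsilon:
u^{\varepsilon}(x) = u_0(x,y) + \varepsilon\, u_1(x,y) + \varepsilon^2 u_2(x,y) + \cdots ,
\qquad y = x/\varepsilon .
% Cell problem on the periodic cell Y for the correctors chi_m:
-\,\frac{\partial}{\partial y_i}\!\left( k_{ij}(y)\, \frac{\partial \chi_m}{\partial y_j} \right)
 = \frac{\partial k_{im}(y)}{\partial y_i} \quad \text{in } Y .
% Homogenized conductivity used at the engineering scale:
k^{\mathrm{hom}}_{im} = \frac{1}{|Y|} \int_Y \left( k_{im}(y)
 + k_{ij}(y)\,\frac{\partial \chi_m}{\partial y_j} \right) dy .
```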

  12. Homogenizing Advanced Alloys: Thermodynamic and Kinetic Simulations Followed by Experimental Results

    Science.gov (United States)

    Jablonski, Paul D.; Hawk, Jeffrey A.

    2017-01-01

    Segregation of solute elements occurs in nearly all metal alloys during solidification. The resultant elemental partitioning can severely degrade as-cast material properties and lead to difficulties during post-processing (e.g., hot shorts and incipient melting). Many cast articles are subjected to a homogenization heat treatment in order to minimize segregation and improve their performance. Traditionally, homogenization heat treatments are based upon past practice or time-consuming trial-and-error experiments. Through the use of thermodynamic and kinetic modeling software, NETL has designed a systematic method to optimize homogenization heat treatments. Use of the method allows engineers and researchers to homogenize casting chemistries to levels appropriate for a given application. The method also allows for the adjustment of heat treatment schedules to fit limitations on in-house equipment (capability, reliability, etc.) while maintaining clear numeric targets for segregation reduction. In this approach, the Scheil module within Thermo-Calc is used to predict the as-cast segregation present within an alloy, and diffusion-controlled transformation simulations are then used to model homogenization kinetics as a function of time and temperature. Examples of computationally designed heat treatments and verification of their effects on segregation and properties of real castings are presented.
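The Scheil and diffusion simulations themselves require Thermo-Calc, but the underlying homogenization kinetics can be illustrated with the classical result that a sinusoidal interdendritic segregation profile of wavelength λ decays as exp(-4π²Dt/λ²). The sketch below uses made-up diffusion parameters, not NETL's data:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def diffusivity(D0, Q, T):
    """Arrhenius diffusivity D = D0 * exp(-Q/RT), in m^2/s."""
    return D0 * math.exp(-Q / (R * T))

def homogenization_time(D0, Q, T, wavelength, residual):
    """Time for a sinusoidal segregation profile of the given wavelength to
    decay to `residual` of its initial amplitude:
    delta/delta0 = exp(-4*pi^2*D*t/lambda^2)."""
    D = diffusivity(D0, Q, T)
    return -wavelength ** 2 * math.log(residual) / (4 * math.pi ** 2 * D)

# Illustrative values: D0 = 1e-4 m^2/s, Q = 250 kJ/mol, 100 um dendrite
# arm spacing, homogenize to 1 % residual segregation.
t_1473 = homogenization_time(1e-4, 250e3, 1473.0, 100e-6, 0.01)
t_1573 = homogenization_time(1e-4, 250e3, 1573.0, 100e-6, 0.01)
```

The strong temperature sensitivity of the Arrhenius term is what makes schedule optimization worthwhile: a 100 K hotter soak shortens the required time severalfold.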

  13. Homogenization for rigid suspensions with random velocity-dependent interfacial forces

    KAUST Repository

    Gorb, Yuliya

    2014-12-01

    We study suspensions of solid particles in a viscous incompressible fluid in the presence of random velocity-dependent interfacial forces. The flow at a small Reynolds number is modeled by the Stokes equations, coupled with the motion of rigid particles arranged in a periodic array. The objective is to perform homogenization for the given suspension and obtain an equivalent description of a homogeneous (effective) medium; the macroscopic effect of the interfacial forces and the effective viscosity are determined using analysis on a periodicity cell. In particular, the solutions u_ε^ω to a family of problems corresponding to the size of the microstructure ε and describing suspensions of rigid particles with random surface forces imposed on the interface converge H1-weakly as ε→0 a.s. to a solution of a homogenized Stokes problem with velocity-dependent body forces. A corrector to the homogenized solution that yields strong H1-convergence is also determined. The main technical construction is built upon the Γ-convergence theory. © 2014 Elsevier Inc.

  14. Design of SC solenoid with high homogeneity

    International Nuclear Information System (INIS)

    Yang Xiaoliang; Liu Zhong; Luo Min; Luo Guangyao; Kang Qiang; Tan Jie; Wu Wei

    2014-01-01

    A novel kind of SC (superconducting) solenoid coil is designed to satisfy the homogeneity requirement of the magnetic field. In this paper, we first calculate the current density distribution of the solenoid coil section through the linear programming method. Then a traditional solenoid and a nonrectangular-section solenoid are designed to produce a central field up to 7 T with the highest practicable homogeneity. After comparing the two solenoid coil designs in terms of magnetic field quality, fabrication cost and other aspects, we conclude that the new nonrectangular-section solenoid coil can be realized by improving the techniques of framework fabrication and winding. Finally, the outlook and error analysis of this kind of SC magnet coil are also discussed briefly. (authors)
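The paper's linear-programming formulation is not reproduced here, but the core idea, choosing a current distribution so the on-axis field is uniform, can be sketched with a least-squares stand-in (numpy only; the 7 T target comes from the abstract, while the coil geometry and the use of least squares instead of linear programming are assumptions of this sketch):

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, T m/A

def loop_field_on_axis(z_eval, z_loop, radius):
    """On-axis field (T per ampere) of a circular current loop, Biot-Savart."""
    d = z_eval - z_loop
    return MU0 * radius ** 2 / (2.0 * (radius ** 2 + d ** 2) ** 1.5)

# Discretise a 0.4 m long, 0.1 m radius coil into 41 loops and ask for a
# uniform 7 T field at 21 target points over the central 0.1 m.
z_loops = np.linspace(-0.2, 0.2, 41)
z_targets = np.linspace(-0.05, 0.05, 21)
A = np.array([[loop_field_on_axis(zt, zl, 0.1) for zl in z_loops]
              for zt in z_targets])
b = np.full(len(z_targets), 7.0)

# Minimum-norm least-squares current distribution (the paper instead solves
# a linear program, which can additionally bound the current density).
currents, *_ = np.linalg.lstsq(A, b, rcond=None)

field = A @ currents
inhom = (field.max() - field.min()) / field.mean()  # relative inhomogeneity
```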

  15. Shape optimization in biomimetics by homogenization modelling

    International Nuclear Information System (INIS)

    Hoppe, Ronald H.W.; Petrova, Svetozara I.

    2003-08-01

    Optimal shape design of microstructured materials has recently attracted a great deal of attention in material science. The shape and the topology of the microstructure have a significant impact on the macroscopic properties. The present work is devoted to the shape optimization of new biomorphic microcellular ceramics produced from natural wood by biotemplating. We are interested in finding the best material-and-shape combination in order to achieve the optimal prespecified performance of the composite material. The computation of the effective material properties is carried out using the homogenization method. Adaptive mesh-refinement technique based on the computation of recovered stresses is applied in the microstructure to find the homogenized elasticity coefficients. Numerical results show the reliability of the implemented a posteriori error estimator. (author)

  16. Hydrogen storage materials and method of making by dry homogenation

    Science.gov (United States)

    Jensen, Craig M.; Zidan, Ragaiy A.

    2002-01-01

    Dry homogenized metal hydrides, in particular aluminum hydride compounds, are provided as a material for reversible hydrogen storage. The reversible hydrogen storage material comprises a dry homogenized material having transition metal catalytic sites on a metal aluminum hydride compound, or mixtures of metal aluminum hydride compounds. A method of making such reversible hydrogen storage materials by dry doping is also provided and comprises the steps of dry homogenizing metal hydrides by mechanical mixing, such as by crushing or ball milling a powder of a metal aluminum hydride with a transition metal catalyst. In another aspect of the invention, a method of powering a vehicle apparatus with the reversible hydrogen storage material is provided.

  17. Micro-homogeneity evaluation of a bovine kidney candidate reference material

    Energy Technology Data Exchange (ETDEWEB)

    Castro, Liliana; Moreira, Edson G.; Vasconcellos, Marina B.A., E-mail: lcastroesnal@usp.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    The minimum sample intake for which a reference material remains homogeneous is one of the parameters that must be estimated in the homogeneity assessment study of reference materials. In this work, Instrumental Neutron Activation Analysis was used to evaluate this quantity in a bovine kidney candidate reference material. The mass fractions of 9 inorganic constituents were determined in subsamples between 1 and 2 mg in order to estimate the relative homogeneity factor (H_E) and the minimum sample mass needed to achieve 5% and 10% precision at a 95% confidence level. Results obtained for H_E for all the analyzed elements were satisfactory. The estimated minimum sample intake was between 2 mg and 40 mg, depending on the element. (author)

  18. On the time-homogeneous Ornstein–Uhlenbeck process in the foreign exchange rates

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, Regina C.B. da, E-mail: regina@quimica-industrial.com [Department of Mathematics, Instituto Federal de Goiás, Goiânia, Goiás 74055-110 (Brazil); International Center for Condensed Matter Physics, Instituto de Física, Universidade de Brasília, Caixa Postal 04455, 70919-970, Brasília, Distrito Federal (Brazil); Matsushita, Raul Y. [Department of Statistics, Universidade de Brasília, 70919-970, Brasília, Distrito Federal (Brazil); Castro, Márcio T. de; Figueiredo, Annibal [International Center for Condensed Matter Physics, Instituto de Física, Universidade de Brasília, Caixa Postal 04455, 70919-970, Brasília, Distrito Federal (Brazil)

    2015-10-02

    Since Gaussianity and stationarity assumptions cannot be fulfilled by financial data, the time-homogeneous Ornstein–Uhlenbeck (THOU) process was introduced as a candidate model to describe time series of financial returns [1]. It is an Ornstein–Uhlenbeck (OU) process in which these assumptions are replaced by linearity and time-homogeneity. We employ the OU and THOU processes to analyze daily foreign exchange rates against the US dollar. We confirm that the OU process does not fit the data, while in most cases the patterns of the first four cumulants of the data can be described by the THOU process. However, there are some exceptions in which the data do not follow the linearity or time-homogeneity assumptions. - Highlights: • Gaussianity and stationarity assumptions replaced by linearity and time-homogeneity. • We revisit the time-homogeneous Ornstein–Uhlenbeck (THOU) process. • We employ the THOU process to analyze foreign exchange rates against the US dollar. • The patterns of the first four cumulants of the data can be described by the THOU process.
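Neither the THOU extension nor the FX data are reproduced here, but the baseline OU process the authors start from can be simulated exactly and its stationary variance σ²/(2θ) checked numerically (all parameter values below are illustrative):

```python
import math
import random

def simulate_ou(theta, mu, sigma, x0, dt, n, seed=1):
    """Exact-discretisation sample path of dX = theta*(mu - X) dt + sigma dW.

    Uses the exact transition X_{t+dt} = mu + (X_t - mu) e^{-theta dt} + noise,
    so no time-discretisation bias is introduced.
    """
    rng = random.Random(seed)
    a = math.exp(-theta * dt)
    sd = sigma * math.sqrt((1 - a * a) / (2 * theta))  # conditional std dev
    x = x0
    path = [x]
    for _ in range(n):
        x = mu + (x - mu) * a + sd * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

path = simulate_ou(theta=2.0, mu=0.0, sigma=0.5, x0=0.0, dt=0.1, n=200000)
var = sum(v * v for v in path) / len(path)
# The stationary variance is sigma^2 / (2*theta) = 0.0625.
```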


  20. Back to basics: homogeneous representations of multi-rate synchronous dataflow graphs

    NARCIS (Netherlands)

    de Groote, Robert; Holzenspies, P.K.F.; Kuper, Jan; Broersma, Haitze J.

    2013-01-01

    Exact temporal analyses of multi-rate synchronous dataflow (MRSDF) graphs, such as computing the maximum achievable throughput, or sufficient buffer sizes required to reach a minimum throughput, require a homogeneous representation called a homogeneous synchronous dataflow (HSDF) graph. The size of
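The HSDF expansion itself is beyond a short sketch, but its first step, solving the MRSDF balance equations for the repetition vector (the HSDF graph contains that many copies of each actor), can be shown. The helper below is illustrative, not from the paper, and assumes a connected graph:

```python
from fractions import Fraction
from math import lcm  # Python 3.9+

def repetition_vector(actors, edges):
    """Smallest integer repetition vector of a consistent, connected MRSDF graph.

    edges: (src, dst, produced, consumed) tuples; each edge imposes the
    balance equation r[src] * produced == r[dst] * consumed.
    """
    adj = {}
    for s, d, p, c in edges:
        adj.setdefault(s, []).append((d, Fraction(p, c)))
        adj.setdefault(d, []).append((s, Fraction(c, p)))
    rate = {actors[0]: Fraction(1)}  # propagate rational firing rates
    work = [actors[0]]
    while work:
        u = work.pop()
        for v, f in adj.get(u, []):
            r = rate[u] * f
            if v not in rate:
                rate[v] = r
                work.append(v)
            elif rate[v] != r:
                raise ValueError("inconsistent MRSDF graph")
    # Scale to the smallest all-integer vector.
    m = lcm(*(rate[a].denominator for a in actors))
    return {a: int(rate[a] * m) for a in actors}

# A fires 3 times, B twice, C once per iteration; the HSDF graph has 6 nodes.
rv = repetition_vector(["A", "B", "C"], [("A", "B", 2, 3), ("B", "C", 1, 2)])
```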

  1. Online screening of homogeneous catalyst performance using reaction detection mass spectrometry

    NARCIS (Netherlands)

    Martha, C.T.; Elders, N.; Krabbe, J.G.; Kool, J.; Niessen, W.M.A.; Orru, R.V.A.; Irth, H.

    2008-01-01

    An integrated online screening system was developed to rapidly screen homogeneous catalysts for activity toward a selected synthesis. The continuous-flow system comprises standard HPLC pumps for the delivery of substrates, an HPLC autosampler for the injection of homogeneous catalysts, a

  2. Pi overlapping ring systems contained in a homogeneous assay: a novel homogeneous assay for antigens

    Science.gov (United States)

    Kidwell, David A.

    1993-05-01

    A novel immunoassay, Pi overlapping ring systems contained in a homogeneous assay (PORSCHA), is described. This assay relies upon the change in fluorescence spectral properties that pyrene and its derivatives show with varying concentration. Because antibodies and other biomolecules can bind two molecules simultaneously, they can change the local concentration of the molecules that they bind. This concentration change may be detected spectrally as a change in the fluorescence emission wavelength of an appropriately labeled biomolecule. Several tests of PORSCHA have been performed which demonstrate this principle. For example, with streptavidin as the binding biomolecule and a biotin-labeled pyrene derivative, production of the excimer emitting at 470 nm is observed. Without the streptavidin present, only the monomer emitting at 378 and 390 nm is observed. The ratio of monomer to excimer emission provides the concentration of unlabeled biotin in the sample. Approximately 1 ng/mL of biotin may be detected with this system using a 50 µL sample (2 × 10^-16 mol of biotin). The principles behind PORSCHA and the results with the streptavidin/biotin system are discussed, and extensions of the PORSCHA concept to antibodies as the binding partner and to DNA in homogeneous assays are suggested.

  3. Cell homogenization methods for pin-by-pin core calculations tested in slab geometry

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Kitamura, Yasunori; Yamane, Yoshihiro

    2004-01-01

    In this paper, performances of spatial homogenization methods for fuel or non-fuel cells are compared in slab geometry in order to facilitate pin-by-pin core calculations. Since the spatial homogenization methods were mainly developed for fuel assemblies, a systematic study of their performance for cell-level homogenization has not been carried out. The importance of cell-level homogenization is increasing, since pin-by-pin mesh core calculation in actual three-dimensional geometry, which is a less approximate approach than the current advanced nodal method, is becoming feasible. Four homogenization methods were investigated in this paper: flux-volume weighting, the generalized equivalence theory, the superhomogenization (SPH) method and the nonlinear iteration method. The last one, the nonlinear iteration method, was tested as a homogenization method for the first time. The calculations were carried out in simplified colorset assembly configurations of a PWR, which are simulated by slab geometries, and homogenization performances were evaluated through comparison with reference cell-heterogeneous calculations. The calculation results revealed that the generalized equivalence theory showed the best performance. Though the nonlinear iteration method can significantly reduce homogenization error, its performance was not as good as that of the generalized equivalence theory. Through comparison of the results obtained by the generalized equivalence theory and the superhomogenization method, an important byproduct was obtained: a deficiency of the current superhomogenization method, which could be improved by incorporating a 'cell-level discontinuity factor between assemblies', was identified.
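Of the four methods compared, the simplest, flux-volume weighting, reduces to a one-line reaction-rate-preserving average; a minimal sketch (the two-region numbers are illustrative, not the paper's PWR data):

```python
def flux_volume_homogenize(sigmas, fluxes, volumes):
    """Flux-volume weighted homogenized cross section:
    Sigma_hom = sum(Sigma_i * phi_i * V_i) / sum(phi_i * V_i).

    This preserves the cell-integrated reaction rate of the heterogeneous
    solution; it does not preserve surface quantities, which is why the
    other methods add discontinuity factors or SPH corrections on top.
    """
    num = sum(s * f * v for s, f, v in zip(sigmas, fluxes, volumes))
    den = sum(f * v for f, v in zip(fluxes, volumes))
    return num / den

# Two-region slab cell: absorber-like region and moderator-like region.
sigma_hom = flux_volume_homogenize(
    sigmas=[0.4, 0.02], fluxes=[1.0, 1.5], volumes=[1.0, 2.0])
```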

  4. Biotic homogenization of three insect groups due to urbanization.

    Science.gov (United States)

    Knop, Eva

    2016-01-01

    Cities are growing rapidly and are thereby expected to cause large-scale global biotic homogenization. Evidence for the homogenization hypothesis is mostly derived from plants and birds, whereas arthropods have so far been neglected. Here, I tested the homogenization hypothesis with three insect indicator groups, namely true bugs, leafhoppers, and beetles. In particular, I was interested in whether insect species community composition differs between urban and rural areas, whether communities are more similar between cities than between rural areas, and whether the patterns found are explained by true species turnover, species diversity gradients and geographic distance, or by non-native or specialist species, respectively. I analyzed insect species communities sampled on birch trees in a total of six Swiss cities and six rural areas nearby. In all indicator groups, urban and rural community composition was significantly dissimilar due to native species turnover. Further, for bug and leafhopper communities, I found evidence for large-scale homogenization due to urbanization, driven by reduced species turnover of specialist species in cities. Species turnover of beetle communities was similar between cities and rural areas. Interestingly, when specialist beetle species were excluded from the analyses, cities were more dissimilar than rural areas, suggesting biotic differentiation of beetle communities in cities. Non-native species did not affect species turnover of the insect groups. However, given that non-native arthropod species are increasing rapidly, their homogenizing effect might be detected more often in the future. Overall, the results show that urbanization has a negative large-scale impact on the diversity of specialist species of the investigated insect groups. Specific measures in cities targeted at increasing the persistence of specialist species typical for the respective biogeographic region could help to stop the loss of biodiversity. © 2015 John Wiley & Sons Ltd.

  5. Numerical simulation of homogenization time measurement by probes with different volume size

    International Nuclear Information System (INIS)

    Thyn, J.; Novy, M.; Zitny, R.; Mostek, M.; Jahoda, M.

    2004-01-01

    Results of continuous homogenization time measurement of liquid in a stirred tank depend on the scale of scrutiny. Experimental techniques use a probe, which is situated inside the tank (conductivity method) or outside it (gamma-radiotracer methods). The expected value of the homogenization time evaluated for a given degree of homogenization is higher when using the conductivity method, because the conductivity probe measures a relatively small volume, in contrast to radiotracer applications, where the volume is much greater. Measurement through the wall of the tank is a great advantage of radiotracer application, but comparison of the results with another method requires determination of the measured volume, which is not easy. Simulation of the measurement by a CFD code can help to solve the problem. A methodology for CFD simulation of radiotracer experiments was suggested. Commercial software was used for simulation of liquid homogenization in a stirred vessel with a Rushton turbine. Numerical simulation of the liquid homogenization time by CFD for different values of the detected volume was confronted with measurements of the homogenization time with a conductivity probe and with the radioisotopes ¹⁹⁸Au, ⁸²Br and ²⁴Na. The detected size of the tank volume was affected by the different energies of the radioisotopes used. (author)
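A common way to evaluate a homogenization (mixing) time from a probe's tracer response is the first time after which the signal stays within a tolerance band around its fully mixed value; a hedged sketch assuming a 95% degree of homogenization (an illustration of the standard criterion, not the authors' procedure):

```python
import numpy as np

def homogenization_time(t, c, degree=0.95):
    """Return the first time after which |c - c_inf| stays within
    (1 - degree)*|c_inf| of the final (fully mixed) value c_inf."""
    t, c = np.asarray(t, dtype=float), np.asarray(c, dtype=float)
    c_inf = c[-1]
    tol = (1.0 - degree) * abs(c_inf)
    outside = np.abs(c - c_inf) > tol
    if not outside.any():
        return float(t[0])  # already within the band everywhere
    last = np.where(outside)[0][-1]
    return float(t[min(last + 1, len(t) - 1)])

# For an ideal first-order response c(t) = 1 - exp(-t/tau), the 95% mixing
# time is tau*ln(20), approximately 3*tau.
```

This makes the abstract's point concrete: a probe averaging over a larger detected volume smooths out local fluctuations, so the same tolerance band is reached earlier and the apparent homogenization time is shorter.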

  6. Dark energy homogeneity in general relativity: Are we applying it correctly?

    Science.gov (United States)

    Duniya, Didam G. A.

    2016-04-01

    Thus far, there does not appear to be an agreed (or adequate) definition of homogeneous dark energy (DE). This paper seeks to define a valid, adequate homogeneity condition for DE. Firstly, it is shown that as long as w_x ≠ -1, DE must have perturbations. It is then argued, independent of w_x, that a correct definition of homogeneous DE is one whose density perturbation vanishes in the comoving gauge, and hence in the DE rest frame. Using phenomenological DE, the consequence of this approach is then investigated in the observed galaxy power spectrum, with the power spectrum being normalized on small scales at the present epoch z=0. It is found that for high magnification bias, relativistic corrections in the galaxy power spectrum are able to distinguish the concordance model from both a homogeneous DE and a clustering DE on super-horizon scales.

  7. Theoretical and numerical simulation of the saturation of the stimulated Raman scattering instability that occurs in laser-plasma interaction; Modelisation theorique et numerique de la saturation de l'instabilite de diffusion Raman stimulee se developpant dans l'interaction laser-plasma

    Energy Technology Data Exchange (ETDEWEB)

    Fouquet, T

    2007-01-15

    In this work we present two important results. First, for a relatively moderate laser irradiance (I·λ² ≈ 10¹⁴ W·μm²/cm²), cavitation appears in the Langmuir decay instability (LDI) whenever the plasma wavelength is above a certain limit. Secondly, in the case of an inhomogeneous plasma there is an increase of the Raman reflectivity in the presence of LDI for a plasma density profile that was initially smooth. This work is divided into 5 chapters. The first chapter is dedicated to parametric instabilities, especially the Raman instability and the Langmuir decay instability. The equations that govern these instabilities as well as their numerical solutions are presented in the second chapter. The third chapter deals with the case of a mono-dimensional plasma with homogeneous density. The saturation of the Raman instability in a mono-dimensional plasma with inhomogeneous density is studied in the fourth chapter. The last chapter is dedicated to bi-dimensional simulations for various types of laser beams.

  8. A new consistent definition of the homogenized diffusion coefficient of a lattice, limitations of the homogenization concept, and discussion of previously defined coefficients

    International Nuclear Information System (INIS)

    Deniz, V.C.

    1980-01-01

    The problem concerned with the correct definition of the homogenized diffusion coefficient of a lattice, and the concurrent problem of whether or not a homogenized diffusion equation can be formally set up, is studied by a space-energy-angle dependent treatment for a general lattice cell using an operator notation which applies to any eigen-problem. A new definition of the diffusion coefficient is given, which combines within itself the individual merits of the two definitions of Benoist. The relation between the new coefficient and the 'uncorrected' Benoist coefficient is discussed by considering continuous-spectrum and multi-group diffusion equations. Other definitions existing in the literature are briefly discussed. It is concluded that a diffusion coefficient should represent only leakage effects. A comparison is made between the homogenization approach and the approach via eigen-coefficients, and brief indications are given of a possible scheme for the latter. (author)

  9. Conclusions about homogeneity and devitrification

    International Nuclear Information System (INIS)

    Larche, F.

    1997-01-01

    A lot of experimental data concerning the homogeneity and devitrification of R7T7 glass have been published. It appears that: - the crystallization process is very limited, - the interfaces due to bubbles and the container wall favor crystallization locally, but the ratio of crystallized volume always remains below a few percent, and - crystallization has no damaging long-term effects as far as leaching tests can be trusted. (A.C.)

  10. The combinational structure of non-homogeneous Markov chains with countable states

    Directory of Open Access Journals (Sweden)

    A. Mukherjea

    1983-01-01

    Full Text Available Let P(s,t) denote a non-homogeneous continuous parameter Markov chain with countable state space E and parameter space [a,b], −∞ < a < b < ∞. Let R(s,t) = {(i,j): p_ij(s,t) > 0}. It is shown in this paper that R(s,t) is reflexive, transitive, and independent of (s,t), s < t, whenever a weak homogeneity condition holds. It is also shown that the relation R(s,t), unlike in the finite state space case, cannot be expressed even as an infinite (countable) product of reflexive transitive relations for certain non-homogeneous chains in the case when E is infinite.

  11. Effect of dynamic high pressure homogenization on the aggregation state of soy protein.

    Science.gov (United States)

    Keerati-U-Rai, Maneephan; Corredig, Milena

    2009-05-13

    Although soy proteins are often employed as functional ingredients in oil-water emulsions, very little is known about the aggregation state of the proteins in solution and whether any changes occur to soy protein dispersions during homogenization. The effect of dynamic high pressure homogenization on the aggregation state of the proteins was investigated using microdifferential scanning calorimetry and high performance size exclusion chromatography coupled with multiangle laser light scattering. Soy protein isolates as well as glycinin and beta-conglycinin fractions were prepared from defatted soy flakes and redispersed in 50 mM sodium phosphate buffer at pH 7.4. The dispersions were then subjected to homogenization at two different pressures, 26 and 65 MPa. The results demonstrated that dynamic high pressure homogenization causes changes in the supramolecular structure of the soy proteins. Both beta-conglycinin and glycinin samples had an increased temperature of denaturation after homogenization. The chromatographic elution profile showed a reduction in the aggregate concentration with homogenization pressure for beta-conglycinin and an increase in the size of the soluble aggregates for glycinin and soy protein isolate.

  12. Non-homogeneous flow profiles in sheared bacterial suspensions

    Science.gov (United States)

    Samanta, Devranjan; Cheng, Xiang

    Bacterial suspensions under shear exhibit interesting rheological behaviors including the remarkable "superfluidic" state with vanishing viscosity at low shear rates. Theoretical studies have shown that such a "superfluidic" state is linked with non-homogeneous shear flows, which are induced by coupling between the nematic order of active fluids and the hydrodynamics of shear flows. However, although the bulk rheology of bacterial suspensions has been studied experimentally, shear profiles within bacterial suspensions have not been explored so far. Here, we experimentally investigate the flow behaviors of E. coli suspensions under planar oscillatory shear. Using confocal microscopy and PIV, we measure velocity profiles across the gap between two shear plates. We find that with increasing shear rates, high-concentration bacterial suspensions exhibit an array of non-homogeneous flow behaviors such as yield-stress flows and shear banding. We show that these non-homogeneous flows are due to collective motion of bacterial suspensions. The phase diagram of sheared bacterial suspensions is systematically mapped as functions of shear rates and bacterial concentrations. Our experiments provide new insights into the rheology of bacterial suspensions and shed light on shear-induced dynamics of active fluids.

  13. Eulerian numerical simulation of gas-solid flows with several particles species; Modelisation numerique eulerienne des ecoulements gaz-solide avec plusieurs especes de particules

    Energy Technology Data Exchange (ETDEWEB)

    Patino-Palacios, G

    2007-11-15

    The simulation of multiphase flows is currently an important scientific, industrial and economic challenge. The objective of this work is to improve, via simulations, the comprehension of poly-dispersed flows and to contribute to the modeling and characterization of their hydrodynamics. The study of gas-solid systems involves models that take into account the influence of the particles and the effects of collisions on momentum transfer; this kind of study forms the framework of this thesis. Simulations performed with the Saturne-polyphasique-Tlse code, developed by Electricite de France in collaboration with the Institut de Mecanique des Fluides de Toulouse, confirmed the feasibility of the CFD approach for the hydrodynamic study of injectors and dense fluidized beds. The validation stages concern, on the one hand, assessing the simulation tool in its current state through validation and sensitivity studies of the models and comparison of the numerical results with experimental data. In addition, the development of new physical models and their implementation in the Saturne code will allow the optimization of the industrial process. To carry out this validation satisfactorily, key simulations were performed, in particular a monodisperse injection and the radial injection force in the case of a poly-disperse flow, as well as the fluidization of a column of solid particles. In this last case, three configurations of dense fluidized beds were considered in order to study the influence of the grid on the simulations; the operation of a dense fluidized bed was then simulated to characterize the segregation between two different species of particles.
The study of the injection of poly-disperse flows covers two configurations: a co-current gas-particle flow in gas (Hishida case) and a poly-phase flow in a confined jet configuration with recirculation and stagnation zones (Hercules case). Numerical calculations were compared with the available experimental data and showed satisfactory prediction of the hydrodynamics of multiphase flows. (author)

  14. Selecting for extinction: nonrandom disease-associated extinction homogenizes amphibian biotas.

    Science.gov (United States)

    Smith, Kevin G; Lips, Karen R; Chase, Jonathan M

    2009-10-01

    Studying the patterns in which local extinctions occur is critical to understanding how extinctions affect biodiversity at local, regional and global spatial scales. To understand the importance of patterns of extinction at a regional spatial scale, we use data from extirpations associated with a widespread pathogenic agent of amphibian decline, Batrachochytrium dendrobatidis (Bd) as a model system. We apply novel null model analyses to these data to determine whether recent extirpations associated with Bd have resulted in selective extinction and homogenization of diverse tropical American amphibian biotas. We find that Bd-associated extinctions in this region were nonrandom and disproportionately, but not exclusively, affected low-occupancy and endemic species, resulting in homogenization of the remnant amphibian fauna. The pattern of extirpations also resulted in phylogenetic homogenization at the family level and ecological homogenization of reproductive mode and habitat association. Additionally, many more species were extirpated from the region than would be expected if extirpations occurred randomly. Our results indicate that amphibian declines in this region are an extinction filter, reducing regional amphibian biodiversity to highly similar relict assemblages and ultimately causing amplified biodiversity loss at regional and global scales.
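The homogenization signal in analyses like this is a rise in mean between-site similarity after extirpation; a toy illustration with invented assemblages (not the study's data or its null-model machinery):

```python
from itertools import combinations

def jaccard(a, b):
    """Jaccard similarity of two species sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def mean_similarity(assemblages):
    """Mean pairwise Jaccard similarity across sites."""
    pairs = list(combinations(assemblages, 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# Three hypothetical sites sharing two widespread species, each also holding
# one endemic. Selective loss of the endemics homogenizes the biota.
before = [{"sp1", "sp2", "end_a"}, {"sp1", "sp2", "end_b"}, {"sp1", "sp2", "end_c"}]
after = [{s for s in site if not s.startswith("end")} for site in before]
```

A null model in this spirit would compare the observed post-extirpation similarity against the distribution obtained by deleting the same number of species at random from each site.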

  15. Converting homogeneous to heterogeneous in electrophilic catalysis using monodisperse metal nanoparticles.

    Science.gov (United States)

    Witham, Cole A; Huang, Wenyu; Tsung, Chia-Kuang; Kuhn, John N; Somorjai, Gabor A; Toste, F Dean

    2010-01-01

    A continuing goal in catalysis is to unite the advantages of homogeneous and heterogeneous catalytic processes. To this end, nanoparticles represent a new frontier in heterogeneous catalysis, where this unification can also be supplemented by the ability to obtain new or divergent reactivity and selectivity. We report a novel method for applying heterogeneous catalysts to known homogeneous catalytic reactions through the design and synthesis of electrophilic platinum nanoparticles. These nanoparticles are selectively oxidized by the hypervalent iodine species PhICl(2), and catalyse a range of π-bond activation reactions previously only catalysed through homogeneous processes. Multiple experimental methods are used to unambiguously verify the heterogeneity of the catalytic process. The discovery of treatments for nanoparticles that induce the desired homogeneous catalytic activity should lead to the further development of reactions previously inaccessible in heterogeneous catalysis. Furthermore, a size and capping agent study revealed that Pt PAMAM dendrimer-capped nanoparticles demonstrate superior activity and recyclability compared with larger, polymer-capped analogues.

  16. Converting Homogeneous to Heterogeneous in Electrophilic Catalysis using Monodisperse Metal Nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Witham, Cole A.; Huang, Wenyu; Tsung, Chia-Kuang; Kuhn, John N.; Somorjai, Gabor A.; Toste, F. Dean

    2009-10-15

    A continuing goal in catalysis is the transformation of processes from homogeneous to heterogeneous. To this end, nanoparticles represent a new frontier in heterogeneous catalysis, where this conversion is supplemented by the ability to obtain new or divergent reactivity and selectivity. We report a novel method for applying heterogeneous catalysts to known homogeneous catalytic reactions through the design and synthesis of electrophilic platinum nanoparticles. These nanoparticles are selectively oxidized by the hypervalent iodine species PhICl₂, and catalyze a range of π-bond activation reactions previously only homogeneously catalyzed. Multiple experimental methods are utilized to unambiguously verify the heterogeneity of the catalytic process. The discovery of treatments for nanoparticles that induce the desired homogeneous catalytic activity should lead to the further development of reactions previously inaccessible in heterogeneous catalysis. Furthermore, our size and capping agent study revealed that Pt PAMAM dendrimer-capped nanoparticles demonstrate superior activity and recyclability compared to larger, polymer-capped analogues.

  17. Role of structural barriers for carotenoid bioaccessibility upon high pressure homogenization.

    Science.gov (United States)

    Palmero, Paola; Panozzo, Agnese; Colle, Ines; Chigwedere, Claire; Hendrickx, Marc; Van Loey, Ann

    2016-05-15

    A specific approach to investigate the effect of high pressure homogenization on carotenoid bioaccessibility in tomato-based products was developed. Six different tomato-based model systems were reconstituted in order to target the specific role of the natural structural barriers (chromoplast substructure/cell wall) and of the phases (soluble/insoluble) in determining carotenoid bioaccessibility and viscosity changes upon high pressure homogenization. Results indicated that in the absence of natural structural barriers (carotenoid enriched oil), the soluble and insoluble phases determined the carotenoid bioaccessibility upon processing, whereas in their presence these barriers governed the bioaccessibility. Furthermore, it was shown that the increase in viscosity upon high pressure homogenization is determined by the presence of the insoluble phase; however, this effect was related to the initial ratio of the soluble to insoluble phases in the system. In addition, no relationship between the changes in viscosity and carotenoid bioaccessibility upon high pressure homogenization was found. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Accounting for Fiber Bending Effects in Homogenization of Long Fiber Reinforced Composites

    DEFF Research Database (Denmark)

    Poulios, Konstantinos; Niordson, Christian Frithiof

    2015-01-01

    The present work deals with homogenized finite-element models of long fiber reinforced composite materials in the context of studying compressive failure modes such as the formation of kink bands and fiber micro-buckling. Compared to finite-element models with an explicit discretization of the material micro-structure including individual fibers, homogenized models are computationally more efficient and hence more suitable for modeling of larger and complex structures. Nevertheless, the formulation of homogenized models is more complicated, especially if the bending stiffness of the reinforcing fibers is to be taken into account. In that case, so-called higher order strain terms need to be considered. In this paper, important relevant works from the literature are discussed and numerical results from a new homogenization model are presented. The new model accounts for two independent…

  19. Dosimetric calculations by Monte Carlo for treatments of radiosurgery with the Leksell Gamma Knife, homogeneous and non homogeneous cases

    International Nuclear Information System (INIS)

    Rojas C, E.L.; Lallena R, A.M.

    2004-01-01

    In this work, dose profiles obtained by modeling radiosurgery treatments with the Leksell Gamma Knife are calculated. This was done with the Monte Carlo simulation code PENELOPE for a homogeneous phantom and a non-homogeneous one. Calculations were carried out both with the irradiation focus coinciding with the center of the phantom and in areas near the bone interface. Each calculation was performed for the four skull treatments included with the Gamma Knife, using a simplified model of its 201 ⁶⁰Co sources. It was found that the dose profiles differ by about 2% when the isocenter coincides with the center of the phantom, and by nearly 5% when the isocenter moves toward the skull. (Author)
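Photon-transport Monte Carlo codes of this kind sample free path lengths from an exponential distribution set by the attenuation coefficient of the medium; a minimal sketch of that core idea for a homogeneous slab (purely illustrative, not PENELOPE, and omitting scattering and energy deposition):

```python
import math
import random

def transmission(mu, thickness, n=200_000, seed=1):
    """Fraction of photons crossing a homogeneous slab without interacting.
    Free paths are sampled as s = -ln(U)/mu; the analytic expectation is
    exp(-mu * thickness)."""
    rng = random.Random(seed)
    passed = sum(1 for _ in range(n)
                 if -math.log(rng.random()) / mu > thickness)
    return passed / n
```

Handling a non-homogeneous phantom amounts to re-sampling the free path whenever the photon crosses into a region with a different attenuation coefficient, which is where the homogeneous and bone-interface results in the abstract diverge.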

  20. Homogenization scheme for acoustic metamaterials

    KAUST Repository

    Yang, Min

    2014-02-26

    We present a homogenization scheme for acoustic metamaterials that is based on reproducing the lowest orders of scattering amplitudes from a finite volume of metamaterials. This approach is noted to differ significantly from that of coherent potential approximation, which is based on adjusting the effective-medium parameters to minimize scatterings in the long-wavelength limit. With the aid of metamaterials' eigenstates, the effective parameters, such as mass density and elastic modulus, can be obtained by matching the surface responses of a metamaterial's structural unit cell with a piece of homogenized material. From Green's theorem applied to the exterior domain problem, matching the surface responses is noted to be the same as reproducing the scattering amplitudes. We verify our scheme by applying it to three different examples: a layered lattice, a two-dimensional hexagonal lattice, and a decorated-membrane system. It is shown that the predicted characteristics and wave fields agree almost exactly with numerical simulations and experiments, and that the scheme's validity is constrained by the number of dominant surface multipoles instead of the usual long-wavelength assumption. In particular, the validity extends to the full band in one dimension and to regimes near the boundaries of the Brillouin zone in two dimensions.

  1. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation

    Science.gov (United States)

    Tang, Liang; Cheng, Pengle

    2017-01-01

    Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied on it. An elastic–plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis. PMID:28746390

  2. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation.

    Directory of Open Access Journals (Sweden)

    Liang Tang

    Full Text Available Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied on it. An elastic-plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis.

  3. Homogeneous forming technology of composite materials and its application to dispersion nuclear fuel

    International Nuclear Information System (INIS)

    Hong, Soon Hyun; Ryu, Ho Jin; Sohn, Woong Hee; Kim, Chang Kyu

    1997-01-01

    Powder metallurgy processing of metal matrix composites is reviewed and its application to processing homogeneous dispersion nuclear fuel is considered. The homogeneous mixing of reinforcement with matrix powders is a very important step in processing metal matrix composites. The reinforcement can be ceramic particles, whiskers or chopped fibers having high strength and high modulus. The blended powders are consolidated into billets, followed by various deformation processing such as extrusion, forging, rolling or spinning into final usable shapes. Dispersion nuclear fuel is a class of metal matrix composite consisting of dispersed U-compound fuel particles in a metallic matrix. It is fabricated by a powder metallurgy process such as hot pressing followed by hot extrusion, which is similar to that of SiC/Al metal matrix composites. The fabrication of homogeneous dispersion nuclear fuel is very difficult, mainly due to the inhomogeneous mixing characteristics of powders with quite different densities, i.e. uranium alloy powders and aluminum powders. In order to develop homogeneous dispersion nuclear fuel, it is important to investigate the effect of powder characteristics and mixing techniques on the homogeneity of dispersion nuclear fuel. A new quantitative analysis technique for homogeneity needs to be developed for more accurate analysis of homogeneity in dispersion nuclear fuel. (author). 28 refs., 7 figs., 1 tab

  4. Homogeneous nucleation limit on the bulk formation of metallic glasses

    International Nuclear Information System (INIS)

    Drehman, A.J.

    1983-01-01

    Glassy Pd82Si18 spheres, of up to 1 mm diameter, were formed in a drop tube filled with He gas. The largest spheres were successfully cooled to a glass at a cooling rate of less than 800 K/s. Even at this low cooling rate, crystallization (complete or partial) was the result of heterogeneous nucleation at a high temperature relative to the temperature at which copious homogeneous nucleation would commence. Bulk undercooling experiments demonstrated that this alloy could be cooled to 385 K below its eutectic melting temperature (1083 K) without the occurrence of crystallization. If heterogeneous nucleation can be avoided, it is estimated that a cooling rate of at most 100 K/s would be required to form this alloy in the glassy state. Ingots of glassy Pd40Ni40P20 were formed from the liquid by cooling at a rate of only 1 K/s. It was found that glassy samples of this alloy could be heated well above the glass transition temperature without rapid devitrification occurring. This is due, in part, to the low density of pre-existing nuclei but, more importantly, to the low homogeneous nucleation rate and the slow crystal growth kinetics. Based on the observed devitrification kinetics, the steady-state homogeneous nucleation rate is approximately 1 nucleus/cm³·s at 590 K (the temperature at which the homogeneous nucleation rate is estimated to be a maximum). Two iron-nickel based glass-forming alloys (Fe40Ni40P14B6 and Fe40Ni40B20) were not successfully formed into glassy spheres; however, microstructural examination indicates that crystallization was not the result of copious homogeneous nucleation. In contrast, glass forming iron based alloys (Fe80B20 and Fe79.3B16.4Si4.0C0.3) exhibit copious homogeneous nucleation when cooled at approximately the same rate.

  5. Homogenization of aligned “fuzzy fiber” composites

    KAUST Repository

    Chatzigeorgiou, George; Efendiev, Yalchin; Lagoudas, Dimitris C.

    2011-01-01

    The aim of this work is to study composites in which carbon fibers coated with radially aligned carbon nanotubes are embedded in a matrix. The effective properties of these composites are identified using the asymptotic expansion homogenization

  6. Homogenization of neutronic diffusion models; Homogeneisation des modeles de diffusion en neutronique

    Energy Technology Data Exchange (ETDEWEB)

    Capdebosq, Y

    1999-09-01

    In order to study and simulate nuclear reactor cores, one needs access to the neutron distribution in the core. In practice, this neutron density is described by a system of diffusion equations coupled by non-differential exchange terms. The strong heterogeneity of the medium is a major obstacle to the numerical computation of these models at reasonable cost, so homogenization appears compulsory. Heuristic methods have been developed from the outset by nuclear physicists under a periodicity assumption on the coefficients. They consist in performing a fine computation on a single periodicity cell, solving the system on the whole domain with homogeneous coefficients, and reconstructing the neutron density by multiplying the solutions of the two computations. The objectives of this work are to provide a mathematically rigorous basis for this factorization method, to obtain exact formulas for the homogenized coefficients, and to begin on geometries where two periodic media are placed side by side. The first result of this thesis concerns eigenvalue problem models, which are used to characterize the criticality of the reactor, under a symmetry assumption on the coefficients. The convergence of the homogenization process is proved, and formulas for the homogenized coefficients are given. We then show that without symmetry assumptions a drift phenomenon appears. It is characterized by means of a real Bloch wave method, which gives the homogenized limit in the general case. These results for the critical problem are then adapted to the evolution model. Finally, the homogenization of the critical problem in the case of two side-by-side periodic media is studied on a one-dimensional one-equation model. (authors)
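The factorization method described above can be summarized schematically; a hedged sketch in generic notation (the thesis gives the rigorous two-scale statement, and the coefficient definitions below are the standard ones, not necessarily the thesis's exact formulas):

```latex
% phi_eps: heterogeneous flux; psi: periodic eigenfunction computed on one
% periodicity cell; u: solution of a diffusion problem with constant
% homogenized coefficients D*, Sigma*, obtained from psi-weighted cell averages.
\phi_\varepsilon(x) \;\approx\; \psi\!\Big(\frac{x}{\varepsilon}\Big)\, u(x),
\qquad
-\,\operatorname{div}\!\big(D^{*}\,\nabla u\big) + \Sigma^{*} u
\;=\; \lambda^{*}\, F^{*} u .
```

The fine cell computation yields ψ and the homogenized coefficients; the coarse computation yields u; their product reconstructs the heterogeneous flux.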

  7. Assessment of the effect of homogenized soil on soil hydraulic properties and soil water transport

    Science.gov (United States)

    Mohawesh, O.; Janssen, M.; Maaitah, O.; Lennartz, B.

    2017-09-01

    Soil hydraulic properties play a crucial role in simulating water flow and contaminant transport. They are commonly measured on homogenized soil samples, yet soil structure has a significant effect on the soil's ability to retain and conduct water, particularly in aggregated soils. In order to determine the effect of soil homogenization on soil hydraulic properties and soil water transport, undisturbed soil samples were carefully collected. Five different soil structures were identified: angular-blocky, crumble, angular-blocky (different soil texture), granular, and subangular-blocky. The soil hydraulic properties were determined for undisturbed and homogenized samples of each soil structure and were then used to model soil water transport with HYDRUS-1D. The homogenized soil samples showed a significant increase in wide pores (wCP) and a decrease in narrow pores (nCP): compared to the undisturbed samples, wCP increased by 95.6, 141.2, 391.6, 3.9, and 261.3%, while nCP decreased by 69.5, 10.5, 33.8, 72.7, and 39.3%. The soil water retention curves exhibited a significant decrease in water-holding capacity for homogenized samples compared with undisturbed ones, and the homogenized samples also showed a decrease in soil hydraulic conductivity. The simulations showed that water movement and distribution were affected by homogenization. Thus, soil homogenization affects both soil hydraulic properties and soil water transport; however, field studies are needed to determine the effect of these differences on water, chemical, and pollutant transport under various scenarios.
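
    The retention curves discussed above are commonly parameterized with the van Genuchten model, which HYDRUS-1D also uses. As an illustration only (the parameter values below are hypothetical and not taken from this study), a minimal sketch:

```python
def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Volumetric water content theta(h) for the van Genuchten retention model.

    h       : pressure head in cm (negative under suction)
    theta_r : residual water content
    theta_s : saturated water content
    alpha   : inverse of the air-entry pressure (1/cm)
    n       : pore-size distribution index (n > 1); m = 1 - 1/n
    """
    if h >= 0:
        return theta_s  # saturated: no suction
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * abs(h)) ** n) ** m

# Hypothetical loam-like parameters: water content drops as suction increases.
curve = [van_genuchten_theta(h, 0.078, 0.43, 0.036, 1.56)
         for h in (0, -10, -100, -1000)]
```

    A homogenized sample with more wide pores would show a steeper drop of theta(h) near saturation, which is how the retention differences above feed into the simulation.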

  8. Homogeneity and internal defect detection of infrared Se-based chalcogenide glass

    Science.gov (United States)

    Li, Zupan; Wu, Ligang; Lin, Changgui; Song, Bao'an; Wang, Xunsi; Shen, Xiang; Dai, Shixun

    2011-10-01

    Ge-Sb-Se chalcogenide glasses are excellent infrared optical materials, environmentally friendly and widely used in infrared thermal imaging systems. However, because Se-based glasses are opaque in the visible spectral region, their homogeneity and internal defects cannot be measured as they are for common oxide glasses. In this study, a method based on near-IR imaging was proposed to observe the homogeneity and internal defects of these glasses, and an effective measurement system was constructed. The test results indicate that the method gives clear and intuitive information on the homogeneity and internal defects of infrared Se-based chalcogenide glass.

  9. Modelisation de la diffusion sur les surfaces metalliques: De l'adatome aux processus de croissance

    Science.gov (United States)

    Boisvert, Ghyslain

    This thesis is devoted to the study of surface diffusion processes, with the ultimate goal of understanding and modeling the growth of a thin film. Mastering growth is of prime importance given its role in the miniaturization of electronic circuits. We study here the surfaces of the noble metals and of the metals at the end of the transition series. We first consider the diffusion of a single adatom on a metallic surface. Among other results, we showed that correlations between successive events appear when the temperature is comparable to the diffusion barrier, i.e., the diffusion can no longer be described as a random walk. We propose a simple phenomenological model that reproduces the simulation results well. These calculations also allowed us to show that diffusion obeys the Meyer-Neldel law, which states that, for an activated process, the prefactor increases exponentially with the barrier; in addition, this work clarifies the physical origin of this law. Comparing the dynamical and static results, we find that the barrier extracted from dynamical calculations is essentially the same as that obtained from a much simpler static approach. This barrier can therefore be obtained with more accurate, i.e., ab initio, methods such as density functional theory, which are unfortunately also much more computationally demanding. We did this for several metallic systems, and the results of this latter approach compare very well with experiment. We examined the Pt(111) surface at greater length. This surface exhibits many interesting peculiarities, such as the non-hexagonal equilibrium shape of islands and two different adsorption sites for the adatom.
Moreover, previous ab initio calculations were unable to confirm the

  10. Influence of homogenization treatment on physicochemical properties and enzymatic hydrolysis rate of pure cellulose fibers.

    Science.gov (United States)

    Jacquet, N; Vanderghem, C; Danthine, S; Blecker, C; Paquot, M

    2013-02-01

    The aim of this study is to compare the effect of different homogenization treatments on the physicochemical properties and the hydrolysis rate of a pure bleached cellulose. The results show that homogenization treatments improve the enzymatic hydrolysis rate of the cellulose fibers by 25 to 100%, depending on the treatment applied. Characterization of the samples also showed that homogenization affected some physicochemical properties of the cellulose. For moderate treatment intensities (pressure below 500 bar and degree of homogenization below 25), an increase in water retention value (WRV) was highlighted that correlated with the increase in hydrolysis rate. The results also showed that the overall crystallinity of the cellulose appeared unaffected by the homogenization treatment. At higher treatment intensities, the homogenized cellulose samples developed a stable three-dimensional network that reduces cellulase mobility and slows down the hydrolysis process.

  11. Mach's principle in spatially homogeneous spacetimes

    International Nuclear Information System (INIS)

    Tipler, F.J.

    1978-01-01

    On the basis of Mach's Principle it is concluded that the only singularity-free solution to the empty space Einstein equations is flat space. It is shown that the only singularity-free solution to the empty space Einstein equations which is spatially homogeneous and globally hyperbolic is in fact suitably identified Minkowski space. (Auth.)

  12. How to determine composite material properties using numerical homogenization

    DEFF Research Database (Denmark)

    Andreassen, Erik; Andreasen, Casper Schousboe

    2014-01-01

    Numerical homogenization is an efficient way to determine effective macroscopic properties, such as the elasticity tensor, of a periodic composite material. In this paper an educational description of the method is provided, based on a short, self-contained Matlab implementation. It is shown how the basic code, which computes the effective elasticity tensor of a two-material composite, where one material could be void, is easily extended to include more materials. Furthermore, extensions to homogenization of conductivity, thermal expansion, and fluid permeability are described in detail. The unit...
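
    The idea can be illustrated without finite elements: for a two-material laminate, the effective stiffness along and across the layers reduces to the classical Voigt and Reuss averages, which bound what a full numerical homogenization (such as the Matlab code described above) computes. A minimal sketch, not the authors' implementation:

```python
def laminate_effective_moduli(E1, E2, f1):
    """Voigt/Reuss effective Young's moduli of a two-phase laminate.

    E1, E2 : Young's moduli of the phases
    f1     : volume fraction of phase 1
    """
    f2 = 1.0 - f1
    E_voigt = f1 * E1 + f2 * E2          # uniform strain: loading along the layers
    E_reuss = 1.0 / (f1 / E1 + f2 / E2)  # uniform stress: loading across the layers
    return E_voigt, E_reuss

# A stiff phase (200) mixed 50/50 with a compliant phase (2):
Ev, Er = laminate_effective_moduli(200.0, 2.0, 0.5)
```

    For anything beyond a laminate, the periodic unit cell must be solved numerically, which is exactly what the paper's finite-element code does.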

  13. MHODE: a local-homogeneity theory for improved source-parameter estimation of potential fields

    Science.gov (United States)

    Fedi, Maurizio; Florio, Giovanni; Paoletti, Valeria

    2015-08-01

    We describe a multihomogeneity theory for source-parameter estimation of potential fields. Similar to what happens for random source models, where the monofractal scaling law has been generalized into a multifractal law, we propose to generalize the homogeneity law into a multihomogeneity law. This allows a theoretically correct approach to studying real-world potential fields, which are inhomogeneous and so do not show scale invariance except in the asymptotic regions (very near to or very far from their sources). Since the scaling properties of inhomogeneous fields change with the scale of observation, we show that they may be better studied at a set of scales than at a single scale, and that a multihomogeneous model is needed to explain their complex scaling behaviour. To this end, we first introduce fractional-degree homogeneous fields and show that: (i) homogeneous potential fields may have fractional or integer degree; (ii) the source distributions for a fractional degree are not confined to a bounded region, similarly to some integer-degree models such as the infinite line mass; and (iii) unlike the integer-degree case, fractional-degree source distributions are no longer uniform density functions. Using this enlarged set of homogeneous fields, real-world anomaly fields are studied at different scales by a simple search, in any local window W, for the best homogeneous field of either integer or fractional degree, yielding a multiscale set of local homogeneity degrees and depth estimates that we call a multihomogeneous model. This defines a new source-parameter estimation technique (Multi-HOmogeneity Depth Estimation, MHODE) permitting retrieval of the parameters of complex sources. We test the method with inhomogeneous fields of finite sources, such as faults or cylinders, and show its effectiveness in a real-case example as well. These applications show the usefulness of the new concepts, multihomogeneity and

  14. The coherent state on SUq(2) homogeneous space

    International Nuclear Information System (INIS)

    Aizawa, N; Chakrabarti, R

    2009-01-01

    The generalized coherent states for quantum groups introduced by Jurco and Stovicek are studied in full detail for the simplest example, SU_q(2). It is shown that the normalized SU_q(2) coherent states enjoy the property of completeness and allow a resolution of the unity. This feature is expected to play a key role in the application of these coherent states in physical models. The homogeneous space of SU_q(2), i.e. the q-sphere of Podles, is reproduced in complex coordinates by using the coherent states. Differential calculus in the complex form on the homogeneous space is developed. The high spin limit of the SU_q(2) coherent states is also discussed.

  15. Synthesis of silica nanosphere from homogeneous and ...

    Indian Academy of Sciences (India)

    WINTEC

    avoid it, reaction in heterogeneous system using CTABr was carried out. Nanosized silica sphere with ... Homogeneous system contains a mixture of ethanol, water, aqueous ammonia and ... heated to 823 K (rate, 1 K/min) in air and kept at this.

  16. Abelian gauge theories on homogeneous spaces

    International Nuclear Information System (INIS)

    Vassilevich, D.V.

    1992-07-01

    An algebraic technique of separation of gauge modes in Abelian gauge theories on homogeneous spaces is proposed. An effective potential for the Maxwell-Chern-Simons theory on S^3 is calculated. A generalization of the Chern-Simons action is suggested and analysed with the example of SU(3)/U(1) x U(1). (author). 11 refs

  17. The Effect of pH and High-Pressure Homogenization on Droplet Size

    Directory of Open Access Journals (Sweden)

    Ah Pis Yong

    2017-12-01

    The aims of this study are to revisit the effect of high pressure on homogenization and the influence of pH on emulsion droplet sizes. High-pressure homogenization (HPH) involves two stages of processing: the first blends the coarse emulsion in a blender, and the second disrupts the coarse emulsion into smaller droplets in a high-pressure homogenizer. The pressure range in this review is 10-500 MPa. Homogenized droplet sizes can be reduced by increasing the homogenization recirculation, but there is a threshold beyond which applying pressure alone cannot reduce the size further. Homogenized emulsions are normally classified by their degree of kinetic stability; the dispersed phase is present as droplets suspended in the continuous phase. With proper homogenization recirculation and pressure, a more kinetically stable emulsion can be produced. A side effect of increasing homogenization pressure is over-processing of the emulsion droplets, where the droplet sizes become larger rather than smaller, causing kinetic instability in the emulsion. Droplet size is usually measured by dynamic light scattering or by laser light scattering. The samples covered in this review include chocolate- and vanilla-based powders; basil oil; tomato; lupin protein; oil; skim milk; soymilk; coconut milk; tomato homogenate; corn; egg yolk; rapeseed and sunflower oils; poly(4-vinylpyridine)/silica; and complexes 1 to 4 from the authors' case study. A relationship is developed between emulsion size and pH: the results clearly show that lower pH gives smaller emulsion droplets, and the opposite occurs when the pH is increased.

  18. Economical preparation of extremely homogeneous nuclear accelerator targets

    International Nuclear Information System (INIS)

    Maier, H.J.

    1983-01-01

    Techniques for target preparation with a minimum consumption of isotopic material are described. The rotating substrate method, which generates extremely homogeneous targets, is discussed in some detail

  19. Type of homogenization and fat loss during continuous infusion of human milk.

    Science.gov (United States)

    García-Lara, Nadia Raquel; Escuder-Vieco, Diana; Alonso Díaz, Clara; Vázquez Román, Sara; De la Cruz-Bértolo, Javier; Pallás-Alonso, Carmen Rosa

    2014-11-01

    Substantial fat loss may occur during continuous feeding of human milk (HM). A decrease of fat loss has been described following homogenization. Well-established methods of homogenization of HM for routine use in the neonatal intensive care unit (NICU) would be desirable. We compared the loss of fat based on the use of 3 different methods for homogenizing thawed HM during continuous feeding. Sixteen frozen donor HM samples were thawed, homogenized with ultrasound and separated into 3 aliquots ("baseline agitation," "hourly agitation," and "ultrasound"), and then frozen for 48 hours. Aliquots were thawed again and a baseline agitation was applied. Subsequently, aliquots baseline agitation and hourly agitation were drawn into a syringe, while ultrasound was applied to aliquot ultrasound before it was drawn into a syringe. The syringes were loaded into a pump (2 mL/h; 4 hours). At hourly intervals the hourly agitation infusion was stopped, the syringe was disconnected and gently shaken. During infusion, samples from the 3 groups were collected hourly for analysis of fat and caloric content. The 3 groups of homogenization showed similar fat content at the beginning of the infusion. For fat, mean (SD) hourly changes of -0.03 (0.01), -0.09 (0.01), and -0.09 (0.01) g/dL were observed for the hourly agitation, baseline agitation, and ultrasound groups, respectively. The decrease was smaller for the hourly agitation group (P homogenization is used. © The Author(s) 2014.

  20. High temperature homogenization improves impact toughness of vitamin E-diffused, irradiated UHMWPE.

    Science.gov (United States)

    Oral, Ebru; O'Brien, Caitlin; Doshi, Brinda; Muratoglu, Orhun K

    2017-06-01

    Diffusion of vitamin E into radiation cross-linked ultrahigh molecular weight polyethylene (UHMWPE) is used to increase the oxidative stability of total joint implant components. The dispersion of vitamin E throughout implant preforms has been optimized by a two-step process of doping and homogenization, both performed below the peak melting point of the cross-linked polymer. We investigated whether high-temperature homogenization of antioxidant-doped, radiation cross-linked UHMWPE could improve its toughness. We found that homogenization at 300°C for 8 h increased the impact toughness (74 kJ/m^2 compared to 67 kJ/m^2), the ultimate tensile strength (50 MPa compared to 43 MPa), and the elongation at break (271% compared to 236%). The high-temperature treatment did not compromise the wear resistance or the oxidative stability as measured by oxidation induction time. In addition, the desired homogeneity was achieved in a much shorter time (8 h compared to >240 h) by using high-temperature homogenization. © 2016 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 35:1343-1347, 2017.

  1. Collision-free gases in spatially homogeneous space-times

    International Nuclear Information System (INIS)

    Maartens, R.; Maharaj, S.D.

    1985-01-01

    The kinematical and dynamical properties of one-component collision-free gases in spatially homogeneous, locally rotationally symmetric (LRS) space-times are analyzed. Following Ray and Zimmerman [Nuovo Cimento B 42, 183 (1977)], it is assumed that the distribution function f of the gas inherits the symmetry of space-time, in order to construct solutions of Liouville's equation. The redundancy of their further assumption that f be based on Killing vector constants of the motion is shown. The Ray and Zimmerman results for Kantowski-Sachs space-time are extended to all spatially homogeneous LRS space-times. It is shown that in all these space-times the kinematic average four-velocity u^i can be tilted relative to the homogeneous hypersurfaces. This differs from the perfect fluid case, in which only one space-time admits tilted u^i, as shown by King and Ellis [Commun. Math. Phys. 31, 209 (1973)]. As a consequence, it is shown that all space-times admit nonzero acceleration and heat flow, while a subclass admits nonzero vorticity. The stress π_ij is proportional to the shear σ_ij by virtue of the invariance of the distribution function. The evolution of tilt and the existence of perfect fluid solutions is also discussed

  2. Frequency-dependent homogenized properties of composites using a spectral analysis method

    International Nuclear Information System (INIS)

    Ben Amor, M; Ben Ghozlen, M H; Lanceleur, P

    2010-01-01

    An inverse procedure is proposed to determine the material constants of multilayered composites using a spectral-analysis homogenization method. A recursive process gives the interfacial displacement perpendicular to the layers as a function of depth. A fast Fourier transform (FFT) procedure is used to extract the wave numbers propagating in the multilayer, and the upper frequency bound of the homogenization domain is estimated. Inside the homogenization domain, at most three plane waves can propagate in the medium. A consistent algorithm is adopted to develop an inverse procedure for determining the material constants of a multidirectional composite: the extracted wave numbers are the inputs, and the outputs are the elastic constants. Using this method, the frequency-dependent effective elastic constants are obtained, and an example for [0/90] composites is given.

  3. Homogeneous SLOWPOKE reactor for the production of radio-isotope. A feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Busatta, P.; Bonin, H.W. [Royal Military College of Canada, Kingston, Ontario (Canada)]. E-mail: paul.busatta@rmc.ca; bonin-h@rmc.ca

    2006-07-01

    The purpose of this research is to study the feasibility of replacing the current heterogeneous fuel core of the SLOWPOKE-2 reactor with a reservoir containing a homogeneous fuel for the production of Mo-99. The study addressed three items: using the MCNP Monte Carlo reactor calculation code, develop the parameters required for a homogeneous fuel and evaluate the uranyl sulfate concentration of the aqueous fuel solution needed to maintain a similar excess reactivity; verify whether the homogeneous reactor retains its inherent safety attributes; and, given the new dimensions and geometry of the fuel core, determine with the modeling software FEMLAB whether natural convection can still effectively cool the reactor. It was found that it is indeed feasible to convert the SLOWPOKE-2 into a homogeneous reactor using a solution of uranyl sulfate and water. (author)

  5. Characterization of two-scale gradient Young measures and application to homogenization

    OpenAIRE

    Babadjian, Jean-Francois; Baia, Margarida; Santos, Pedro M.

    2006-01-01

    This work is devoted to the study of two-scale gradient Young measures naturally arising in nonlinear elasticity homogenization problems. Precisely, a characterization of this class of measures is derived and an integral representation formula for homogenized energies, whose integrands satisfy very weak regularity assumptions, is obtained in terms of two-scale gradient Young measures.

  6. Statistical homogeneity tests applied to large data sets from high energy physics experiments

    Science.gov (United States)

    Trusina, J.; Franc, J.; Kůs, V.

    2017-12-01

    Homogeneity tests are used in high energy physics to verify simulated Monte Carlo samples, i.e., to check whether they have the same distribution as measured data from a particle detector. The Kolmogorov-Smirnov, χ^2, and Anderson-Darling tests are the most widely used techniques for assessing sample homogeneity. Since MC generators produce plenty of entries from different models, each entry has to be re-weighted to obtain the same sample size as the measured data. One way to test homogeneity is through binning; if we do not want to lose any information, we can instead apply generalized tests based on weighted empirical distribution functions. In this paper, we propose such generalized weighted homogeneity tests and introduce some of their asymptotic properties. We present results of a numerical analysis focused on estimating the type-I error and the power of the tests. Finally, we apply our homogeneity tests to data from the DØ experiment at Fermilab.
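
    The two-sample Kolmogorov-Smirnov statistic mentioned above can be sketched in a few lines. This is the plain unweighted version, for illustration only; the paper's generalization works with weighted empirical distribution functions:

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample KS statistic: sup_x |F_a(x) - F_b(x)| over empirical CDFs."""
    a, b = sorted(sample_a), sorted(sample_b)
    d = 0.0
    for x in a + b:  # the supremum is attained at a sample point
        fa = bisect.bisect_right(a, x) / len(a)
        fb = bisect.bisect_right(b, x) / len(b)
        d = max(d, abs(fa - fb))
    return d

# Identical samples give 0 (maximally homogeneous); disjoint samples give 1.
same = ks_statistic([1, 2, 3], [1, 2, 3])
split = ks_statistic([1, 2, 3], [10, 11, 12])
```

    A large statistic relative to the null distribution signals that the MC sample does not match the measured data.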

  7. Low-gravity homogenization and solidification of aluminum antimonide. [Apollo-Soyuz test project

    Science.gov (United States)

    Ang, C.-Y.; Lacy, L. L.

    1976-01-01

    The III-V semiconducting compound AlSb shows promise as a highly efficient solar cell material, but it has not been commercially exploited because of difficulties in compound synthesis. Liquid state homogenization and solidification of AlSb were carried out in the Apollo-Soyuz Test Project Experiment MA-044 in the hope that compositional homogeneity would be improved by negating the large density difference between the two constituents. Post-flight analysis and comparative characterization of the space-processed and ground-processed samples indicate that there are major homogeneity improvements in the low-gravity solidified material.

  8. A Comparison of Homogeneous and Multi-layered Berm Breakwaters with Respect to Overtopping and Stability

    DEFF Research Database (Denmark)

    Andersen, Thomas Lykke; Skals, Kasper; Burcharth, Hans F.

    2008-01-01

    The paper deals with homogeneous and multi-layer berm breakwaters designed to maximize the utilization of the quarry material. Two wide stone classes are typically used for berm breakwaters with a homogeneous berm.

  9. Heparin interferes with the radioenzymatic and homogeneous enzyme immunoassays for aminoglycosides

    International Nuclear Information System (INIS)

    Krogstad, D.J.; Granich, G.G.; Murray, P.R.; Pfaller, M.A.; Valdes, R.

    1981-01-01

    Heparin interferes with the measurement of aminoglycosides in serum by biological, radioenzymatic, and homogeneous enzyme immunoassay techniques, but not with radioimmunoassay. At concentrations ≥10^5 and ≥3 × 10^6 USP units/L, respectively, it interferes with the radioenzymatic assay by inhibiting the gentamicin 3-acetyltransferase and kanamycin 6'-acetyltransferase enzymes used in the assay. It interferes with the homogeneous enzyme immunoassays for gentamicin and tobramycin (at concentrations ≥10^5 and ≥10^4 USP units/L, respectively), but not with the commercially available homogeneous enzyme immunoassays for other drugs. Heparin interference with the homogeneous enzyme immunoassay for aminoglycosides requires both the heparin polyanion and glucose-6-phosphate dehydrogenase bound to a cationic aminoglycoside. This interference can be reproduced with dextran sulfate (but not dextran), and does not occur with free enzyme (glucose-6-phosphate dehydrogenase) alone. Heparin interference with these two assays at concentrations that may be present in intravenous infusions or in seriously underfilled blood-collection tubes is described

  10. How the Spectre of Societal Homogeneity Undermines Equitable Healthcare for Refugees

    Science.gov (United States)

    Razum, Oliver; Wenner, Judith; Bozorgmehr, Kayvan

    2017-01-01

    Recourse to a purported ideal of societal homogeneity has become common in the context of the refugee reception crisis – not only in Japan, as Leppold et al report, but also throughout Europe. Calls for societal homogeneity in Europe originate from populist movements as well as from some governments. Often, they go along with reduced social support for refugees and asylum seekers, for example in healthcare provision. The fundamental right to health is then reduced to a citizens’ right, granted fully only to nationals. Germany, in spite of welcoming many refugees in 2015, is a case in point: entitlement and access to healthcare for asylum seekers are restricted during the first 15 months of their stay. We show that arguments brought forward to defend such restrictions do not hold, particularly not those which relate to maintaining societal homogeneity. European societies are not homogeneous, irrespective of migration. But as migration will continue, societies need to invest in what we call "globalization within." Removing entitlement restrictions and access barriers to healthcare for refugees and asylum seekers is one important element thereof. PMID:28812828

  11. Study on critical effect in lattice homogenization via Monte Carlo method

    International Nuclear Information System (INIS)

    Li Mancang; Wang Kan; Yao Dong

    2012-01-01

    In contrast to traditional deterministic lattice codes, generating homogenized multigroup constants via the Monte Carlo method overcomes geometric difficulties and treats energy as a continuum, thus providing more accurate parameters. An infinite lattice of identical symmetric motifs is usually assumed when performing the homogenization. In reality, however, a reactor is finite in size, and this should influence the lattice calculation. In practice, when homogenizing with the Monte Carlo method, B_N theory is applied to take the leakage effect into account. The fundamental mode with buckling B is used as a measure of the finite size, and the critical spectrum from the solution of the 0-dimensional fine-group B_1 equations is used to correct the weighting spectrum for homogenization. A PWR prototype core is examined to verify that the presented method indeed generates few-group constants effectively. In addition, a zero-power physics experiment verification is performed. The results show that B_N theory is adequate for the leakage correction in multigroup constant generation via the Monte Carlo method. (authors)
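
    The condensation step described above amounts to flux-weighting fine-group cross sections into few groups, with the critical spectrum as the weight. A minimal sketch of that collapse (illustrative only; the names and data are hypothetical):

```python
def collapse_xs(sigma_fine, flux_fine, group_map):
    """Flux-weighted condensation of fine-group cross sections into coarse groups.

    sigma_fine : fine-group cross sections
    flux_fine  : weighting spectrum (e.g. the B_N-corrected critical spectrum)
    group_map  : coarse-group index assigned to each fine group
    """
    n_coarse = max(group_map) + 1
    num = [0.0] * n_coarse
    den = [0.0] * n_coarse
    for sigma, phi, g in zip(sigma_fine, flux_fine, group_map):
        num[g] += sigma * phi  # reaction-rate-preserving numerator
        den[g] += phi
    return [n / d for n, d in zip(num, den)]

# Two fine groups collapsed into one coarse group:
few = collapse_xs([1.0, 3.0], [3.0, 1.0], [0, 0])
```

    Replacing the infinite-lattice spectrum with the critical spectrum in `flux_fine` is precisely the leakage correction the abstract discusses.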

  12. Homogeneous dielectric barrier discharges in atmospheric air and its influencing factor

    Science.gov (United States)

    Ran, Junxia; Li, Caixia; Ma, Dong; Luo, Haiyun; Li, Xiaowei

    2018-03-01

    A stable homogeneous dielectric barrier discharge (DBD) is obtained in a 2-3 mm air gap at atmospheric pressure. It is generated with a 1 kHz high-voltage power supply between two plane-parallel electrodes, using specific alumina ceramic plates as the dielectric barriers. The discharge characteristics are studied by measuring the electrical discharge parameters and observing the light emission. The results show a single large current pulse of about 200 μs duration in each voltage pulse, with light emission that is radially homogeneous and covers the entire surface of the two electrodes; the homogeneous discharge is a Townsend discharge. The influences of the barrier material, its thickness, and its surface roughness on the transition between discharge modes are studied. The results show that it is difficult to produce a homogeneous discharge using smooth plates or alumina plates of low surface roughness Ra; a suitable barrier material, dielectric thickness, and dielectric surface roughness should be used, together with appropriate applied voltage amplitude and frequency.

  13. Anthropogenic Matrices Favor Homogenization of Tree Reproductive Functions in a Highly Fragmented Landscape.

    Science.gov (United States)

    Carneiro, Magda Silva; Campos, Caroline Cambraia Furtado; Beijo, Luiz Alberto; Ramos, Flavio Nunes

    2016-01-01

    Species homogenization and floristic differentiation are two possible consequences of the fragmentation process in plant communities. Despite the few studies available, it seems clear that fragments with low forest cover embedded in anthropogenic matrices are more likely to experience floristic homogenization. However, the homogenization process has two other components, genetic and functional, which have not been investigated. The purpose of this study was to verify whether homogenization of tree reproductive functions occurred in a fragmented landscape and, if so, to determine how the process was influenced by landscape composition. The study was conducted in eight fragments in southwestern Brazil. In each fragment, all individual trees with a diameter at breast height ≥3 cm were sampled in ten plots (0.2 ha) and classified into 26 reproductive functional types (RFTs). The process of functional homogenization was evaluated using additive partitioning of diversity. Additionally, the effect of landscape composition on functional diversity and on the number of individuals within each RFT was evaluated using a generalized linear mixed model. The fragments appeared to be undergoing functional homogenization (dominance of a few RFTs, alpha diversity lower than expected by chance, and low beta diversity). More than 50% of the RFTs and the functional diversity were affected by the landscape parameters; in general, the percentage of forest cover had a positive effect on RFTs while the percentage of coffee matrix had a negative one. The process of functional homogenization has serious consequences for biodiversity conservation because some functions may disappear, which in the long term would threaten the fragments. This study contributes to a better understanding of how landscape changes affect functional diversity, the abundance of individuals in RFTs, and the process of functional homogenization, as well as how to

  14. The Application of Homogenate and Filtrate from Baltic Seaweeds in Seedling Growth Tests

    Directory of Open Access Journals (Sweden)

    Izabela Michalak

    2017-02-01

    Full Text Available Algal filtrate and homogenate, obtained from Baltic seaweeds, were applied in seedling growth tests. Radish seeds were used in order to assess the algal products' phytotoxicity and their biostimulant effect on growth and nutrient uptake. Algal filtrate, at concentrations ranging from 5.0% to 100%, was used for seed soaking and as a liquid biostimulant (soil and foliar application). Algal homogenate was developed for seed coating. Algal filtrate and homogenate were also enriched with Zn(II) ions in order to examine the influence on metal ion complexation. The optimal doses of algal filtrate and homogenate, as well as the soaking time, were established. Multi-elemental analyses of the raw biomass, filtrate, homogenate, and radish were also performed using ICP-OES (Inductively Coupled Plasma—Optical Emission Spectrometry). The best results in terms of seedlings' length and weight were obtained using clear filtrate at a concentration of 50% applied to the soil and for homogenate applied at a dose of 50 mg/g of seeds. Clear filtrate at a concentration of 50% used for seed soaking for one hour showed the best results. The applied algal products increased the content of elements in seedlings. Among the tested products, a concentration of 50% algal filtrate is recommended for future pot and field experiments.

  15. The use of models in secondary school biology teaching: Teachers' conceptions and modes of use

    Science.gov (United States)

    Varlet, Madeleine

    The use of models and modelling is mentioned in the scientific literature as a way of fostering constructivist teaching-learning practices that address learning difficulties in science. Studying teachers' relationship to models and modelling beforehand is therefore relevant for understanding their teaching practices and for identifying elements which, if taken into account in initial and disciplinary training, can contribute to the development of constructivist science teaching. Several studies have examined these conceptions without distinguishing between the subjects taught, such as physics, chemistry or biology, even though models are not necessarily used or understood in the same way in these different disciplines. Our research examined the conceptions of secondary school biology teachers regarding scientific models, some forms of representation of these models, and their modes of use in the classroom. The results, obtained through a series of semi-structured interviews, indicate that overall their conceptions of models are compatible with the scientifically accepted one, but vary as to the forms of representation of models. Examination of these conceptions reveals a knowledge of models that is limited and that varies with the subject taught. Level of education, prior training, teaching experience and a possible compartmentalization of subjects could explain the different conceptions identified. In addition, temporal, conceptual and technical difficulties can hinder their attempts at modelling with students. 
Nevertheless, our results support the hypothesis that the conceptions of the teachers themselves regarding models, their forms of representation and their approach

  16. A new consistent definition of the homogenized diffusion coefficient of a lattice, limitations of the homogenization concept, and discussion of previously defined coefficients

    International Nuclear Information System (INIS)

    Deniz, V.C.

    1978-01-01

    The problem of correctly defining the homogenized diffusion coefficient of a lattice, and the concurrent problem of whether or not a homogenized diffusion equation can be formally set up, is studied by a space-energy-angle-dependent treatment for a general lattice cell, using an operator notation that applies to any eigen-problem. It is shown that the diffusion coefficient should represent only leakage effects. A new definition of the diffusion coefficient is given, which combines within itself the individual merits of each of the two definitions of Benoist, and reduces to the 'uncorrected' Benoist coefficient in certain cases. The conditions under which a homogenized diffusion equation can be obtained are discussed. A comparison is made between the approach via a diffusion equation and the approach via the eigen-coefficients of Deniz. Previously defined diffusion coefficients are discussed, and it is shown that the transformed eigen-coefficients proposed by Gelbard and by Larsen are unsuitable as diffusion coefficients, and that the cell-edge normalization of the Bonalumi coefficient is not physically justifiable. (author)

  17. Homogeneous Slowpoke reactor for the production of radio-isotope: a feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Busetta, P.; Bonin, H.W. [Royal Military College of Canada, Kingston, Ontario (Canada)

    2006-09-15

    The purpose of this research is to study the feasibility of replacing the existing heterogeneous fuel core of the present SLOWPOKE-2 with a reservoir containing a homogeneous fuel for the production of Mo-99. The study looked at three items: using the MCNP Monte Carlo reactor calculation code, develop a series of parameters required for a homogeneous fuel and evaluate the uranyl sulfate concentration of the aqueous solution fuel in order to keep a similar excess reactivity; verify whether the homogeneous reactor will retain its inherent safety attributes; and, with the new dimensions and geometry of the fuel core, observe whether natural convection can still effectively cool the reactor, using the modeling software FEMLAB®. It was found that it is indeed feasible to modify the SLOWPOKE-2 reactor into a homogeneous reactor using a solution of uranyl sulfate and water. (author)

  18. Is charity a homogeneous good?

    OpenAIRE

    Backus, Peter

    2010-01-01

    In this paper I estimate income and price elasticities of donations to six different charitable causes to test the assumption that charity is a homogeneous good. In the US, charitable donations can be deducted from taxable income. This has long been recognized as producing a price, or taxprice, of giving equal to one minus the marginal tax rate faced by the donor. A substantial portion of the economic literature on giving has focused on estimating price and income elasticities of giving as th...

  19. Inverse acoustic problem of N homogeneous scatterers

    DEFF Research Database (Denmark)

    Berntsen, Svend

    2002-01-01

    The three-dimensional inverse acoustic medium problem of N homogeneous objects with known geometry and location is considered. It is proven that one scattering experiment is sufficient for the unique determination of the complex wavenumbers of the objects. The mapping from the scattered fields...

  20. The homogeneous marginal utility of income assumption

    NARCIS (Netherlands)

    Demuynck, T.

    2015-01-01

    We develop a test to verify if every agent from a population of heterogeneous consumers has the same marginal utility of income function. This homogeneous marginal utility of income assumption is often (implicitly) used in applied demand studies because it has nice aggregation properties and

  1. Homogeneous nucleation of water in synthetic air

    NARCIS (Netherlands)

    Fransen, M.A.L.J.; Sachteleben, E.; Hruby, J.; Smeulders, D.M.J.; DeMott, P.J.; O'Dowd, C.D.

    2013-01-01

    Homogeneous nucleation rates for water vapor in synthetic air are measured by means of a Pulse-Expansion Wave Tube (PEWT). A comparison of the experimental nucleation rates with the Classical Nucleation Theory (CNT) shows that a more elaborated model is necessary to describe supercooled water

  2. Quasi-single-mode homogeneous 31-core fibre

    DEFF Research Database (Denmark)

    Sasaki, Y.; Saitoh, S.; Amma, Y.

    2015-01-01

    A homogeneous 31-core fibre with a cladding diameter of 230 μm for quasi-single-mode transmission is designed and fabricated. LP01-crosstalk of -38.4 dB/11 km at 1550 nm is achieved by using few-mode trench-assisted cores....

  3. Effects of homogenization treatment on recrystallization behavior of 7150 aluminum sheet during post-rolling annealing

    International Nuclear Information System (INIS)

    Guo, Zhanying; Zhao, Gang; Chen, X.-Grant

    2016-01-01

    The effects of two homogenization treatments applied to the direct chill (DC) cast billet on the recrystallization behavior of 7150 aluminum alloy during post-rolling annealing have been investigated using the electron backscatter diffraction (EBSD) technique. Following hot and cold rolling to sheet, measured orientation maps, the recrystallization fraction and grain size, the misorientation angle and the subgrain size were used to characterize the recovery and recrystallization processes at different annealing temperatures. The results were compared between the conventional one-step homogenization and the new two-step homogenization, with the first step being a pretreatment at 250 °C. Al₃Zr dispersoids with higher densities and smaller sizes were obtained after the two-step homogenization, which strongly retarded subgrain/grain boundary mobility and inhibited recrystallization. Compared with the conventional one-step homogenized samples, a significantly lower recrystallized fraction and a smaller recrystallized grain size were obtained under all annealing conditions after cold rolling in the two-step homogenized samples. - Highlights: • Effects of two homogenization treatments on recrystallization in 7150 Al sheets • Quantitative study on the recrystallization evolution during post-rolling annealing • Al₃Zr dispersoids with higher densities and smaller sizes after two-step treatment • Higher recrystallization resistance of 7150 sheets with two-step homogenization

  4. Lattice Boltzmann model for three-dimensional decaying homogeneous isotropic turbulence

    International Nuclear Information System (INIS)

    Xu Hui; Tao Wenquan; Zhang Yan

    2009-01-01

    We implement a lattice Boltzmann method (LBM) for decaying homogeneous isotropic turbulence based on an analogous Galerkin filter and focus on the fundamental statistical isotropic property. This regularized method is constructed based on orthogonal Hermite polynomial space. For decaying homogeneous isotropic turbulence, this regularized method can simulate the isotropic property very well. Numerical studies demonstrate that the novel regularized LBM is a promising approximation of turbulent fluid flows, which paves the way for coupling various turbulent models with LBM

  5. Homogeneity in Luxury Fashion Consumption: an Exploration of Arab Women

    OpenAIRE

    Marciniak, R.; Gad Mohsen, Marwa

    2014-01-01

    Consumer perceptions and consumer motivations are complex, and whilst it is acknowledged within the literature that heterogeneity exists, homogeneous models dominate consumer behaviour research. The primary purpose of this paper is to explore the extent to which Arab women are a homogeneous group of consumers in regard to perceptions and motivations to consume luxury fashion goods. In particular, the paper seeks to present a critical review of luxury consumption frameworks. As part of the ...

  6. Variable valve timing in a homogenous charge compression ignition engine

    Science.gov (United States)

    Lawrence, Keith E.; Faletti, James J.; Funke, Steven J.; Maloney, Ronald P.

    2004-08-03

    The present invention relates generally to the field of homogenous charge compression ignition engines, in which fuel is injected when the cylinder piston is relatively close to the bottom dead center position for its compression stroke. The fuel mixes with air in the cylinder during the compression stroke to create a relatively lean homogeneous mixture that preferably ignites when the piston is relatively close to the top dead center position. However, if the ignition event occurs either earlier or later than desired, lowered performance, engine misfire, or even engine damage, can result. The present invention utilizes internal exhaust gas recirculation and/or compression ratio control to control the timing of ignition events and combustion duration in homogeneous charge compression ignition engines. Thus, at least one electro-hydraulic assist actuator is provided that is capable of mechanically engaging at least one cam actuated intake and/or exhaust valve.

  7. Method of the characteristics for calculation of VVER without homogenization

    Energy Technology Data Exchange (ETDEWEB)

    Suslov, I.R.; Komlev, O.G.; Novikova, N.N.; Zemskov, E.A.; Tormyshev, I.V.; Melnikov, K.G.; Sidorov, E.B. [Institute of Physics and Power Engineering, Obninsk (Russian Federation)

    2005-07-01

    The first stage of the development of characteristics code MCCG3D for calculation of the VVER-type reactor without homogenization is presented. The parallel version of the code for MPI was developed and tested on cluster PC with LINUX-OS. Further development of the MCCG3D code for design-level calculations with full-scale space-distributed feedbacks is discussed. For validation of the MCCG3D code we use the critical assembly VENUS-2. The geometrical models with and without homogenization have been used. With both models the MCCG3D results agree well with the experimental power distribution and with results generated by the other codes, but model without homogenization provides better results. The perturbation theory for MCCG3D code is developed and implemented in the module KEFSFGG. The calculations with KEFSFGG are in good agreement with direct calculations. (authors)

  8. Recovering a time-homogeneous stock price process from perpetual option prices

    OpenAIRE

    Ekström, Erik; Hobson, David

    2009-01-01

    It is well known how to determine the price of perpetual American options if the underlying stock price is a time-homogeneous diffusion. In the present paper we consider the inverse problem, that is, given prices of perpetual American options for different strikes, we show how to construct a time-homogeneous stock price model which reproduces the given option prices.
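
    As a concrete instance of the forward problem the abstract refers to, the perpetual American put under the simplest time-homogeneous diffusion, geometric Brownian motion, has a standard closed-form price. The sketch below implements only that textbook formula; it is not the paper's inverse construction:

```python
def perpetual_put(S, K, r, sigma):
    """Perpetual American put under risk-neutral GBM dS = r*S dt + sigma*S dW.
    Optimal exercise boundary: S_star = gamma*K/(gamma + 1) with gamma = 2r/sigma^2;
    for S > S_star the value is (K - S_star) * (S/S_star)**(-gamma)."""
    gamma = 2.0 * r / sigma ** 2
    S_star = gamma * K / (gamma + 1.0)
    if S <= S_star:
        return K - S                       # stopping region: exercise immediately
    return (K - S_star) * (S / S_star) ** (-gamma)

price = perpetual_put(S=100.0, K=100.0, r=0.05, sigma=0.2)   # ~12.32
```

    Given such prices over a range of strikes, the paper's inverse problem is to reconstruct the time-homogeneous dynamics that produced them; the forward map above is the ingredient being inverted.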

  9. Lipid peroxidation in liver homogenates. Effects of membrane lipid composition and irradiation

    International Nuclear Information System (INIS)

    Vaca, C.; Ringdahl, M.H.

    1984-01-01

    The rate of lipid peroxidation has been followed in whole liver homogenates from mice using the TBA method. Liver homogenates with different membrane fatty acid compositions were obtained from mice fed diets containing different sources of fat, i.e. sunflower seed oil (S), coconut oil (C) and hydrogenated lard (L). The yields of the TBA chromophore (TBA-c) were 4 times higher in the liver homogenates S compared to C and L after 4 hours of incubation at 37 °C. Irradiation of the liver homogenates before incubation inhibited the formation of lipid peroxidation products in a dose-dependent way. The catalytic capacity of the homogenates was investigated, followed as the autooxidation of cysteamine or modified by addition of the metal chelator EDTA. The rate of autooxidation of cysteamine, which is dependent on the presence of metal ions (Fe²⁺ or Cu²⁺), was decreased with increasing dose, thus indicating an alteration in the availability of metal catalysts in the system. The addition of Fe²⁺ to the system restored the lipid peroxidation yields in the irradiated systems, and the presence of EDTA inhibited the formation of lipid peroxidation products in all three dietary groups. It is suggested that irradiation alters the catalytic activity needed in the autooxidation processes of polyunsaturated fatty acids

  10. Infinite dimensional spherical analysis and harmonic analysis for groups acting on homogeneous trees

    DEFF Research Database (Denmark)

    Axelgaard, Emil

    In this thesis, we study groups of automorphisms for homogeneous trees of countable degree by using an inductive limit approach. The main focus is the thorough discussion of two Olshanski spherical pairs consisting of automorphism groups for a homogeneous tree and a homogeneous rooted tree, resp...... finite. Finally, we discuss conditionally positive definite functions on the groups and use the generalized Bochner-Godement theorem for Olshanski spherical pairs to prove Levy-Khinchine formulas for both of the considered pairs....

  11. Exploring cosmic homogeneity with the BOSS DR12 galaxy sample

    Energy Technology Data Exchange (ETDEWEB)

    Ntelis, Pierros; Hamilton, Jean-Christophe; Busca, Nicolas Guillermo; Aubourg, Eric [APC, Université Paris Diderot-Paris 7, CNRS/IN2P3, CEA, Observatoire de Paris, 10, rue A. Domon and L. Duquet, Paris (France); Goff, Jean-Marc Le; Burtin, Etienne; Laurent, Pierre; Rich, James; Bourboux, Hélion du Mas des; Delabrouille, Nathalie Palanque [CEA, Centre de Saclay, IRFU/SPP, F-91191 Gif-sur-Yvette (France); Tinker, Jeremy [Department of Physics and Center for Cosmology and Particle Physics, New York University, 726 Broadway, New York (United States); Bautista, Julian [Department of Physics and Astronomy, University of Utah, Salt Lake City, UT 84112 (United States); Delubac, Timothée [Laboratoire d' astrophysique, Ecole Polytechnique Fédérale de Lausanne (EPFL), Observatoire de Sauverny, CH-1290 Versoix (Switzerland); Eftekharzadeh, Sarah; Myers, Adam [Department of Physics and Astronomy, University of Wyoming, Laramie, WY 82071 (United States); Hogg, David W. [Center for Cosmology and Particle Physics, New York University, 4 Washington Place, Meyer Hall of Physics, New York, NY 10003 (United States); Vargas-Magaña, Mariana [Instituto de Física, Universidad Nacional Autónoma de México, Apdo. Postal 20-364, México (Mexico); Pâris, Isabelle [Aix Marseille Universite, CNRS, LAM (Laboratoire d' Astrophysique de Marseille) UMR 7326, 13388, Marseille (France); Petitjean, Patrick [Institut d' Astrophysique de Paris, CNRS-UPMC, UMR7095, 98bis bd Arago, Paris, 75014 France (France); Rossi, Graziano, E-mail: pntelis@apc.in2p3.fr, E-mail: jchamilton75@gmail.com [Department of Astronomy and Space Science, Sejong University, Seoul, 143-747 (Korea, Republic of); and others

    2017-06-01

    In this study, we probe the transition to cosmic homogeneity in the Large Scale Structure (LSS) of the Universe using the CMASS galaxy sample of the BOSS spectroscopic survey, which covers the largest effective volume to date, 3 h⁻³ Gpc³ at 0.43 ≤ z ≤ 0.7. We study the scaled counts-in-spheres, N(<r), and the fractal correlation dimension, D₂(r), to assess the homogeneity scale of the universe using a Landy and Szalay inspired estimator. Defining the scale of transition to homogeneity as the scale at which D₂(r) reaches 3 within 1%, i.e. D₂(r) > 2.97 for r > R_H, we find R_H = (63.3 ± 0.7) h⁻¹ Mpc, in agreement at the percentage level with the prediction of the ΛCDM model, R_H = 62.0 h⁻¹ Mpc. Thanks to the large cosmic depth of the survey, we investigate the redshift evolution of the transition to homogeneity scale and find agreement with the ΛCDM prediction. Finally, we find that D₂ is compatible with 3 at scales larger than 300 h⁻¹ Mpc in all redshift bins. These results consolidate the Cosmological Principle and represent a precise consistency test of the ΛCDM model.
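
    The fractal correlation dimension used here is D₂(r) = d ln N(<r) / d ln r, and the homogeneity scale R_H is the smallest scale where D₂(r) first reaches 2.97 (3 within 1%). A sketch of extracting R_H from tabulated counts-in-spheres, using a synthetic toy model rather than the BOSS measurements:

```python
import numpy as np

def correlation_dimension(r, N):
    """D2(r) = d ln N(<r) / d ln r, estimated from tabulated counts-in-spheres."""
    return np.gradient(np.log(N), np.log(r))

def homogeneity_scale(r, D2, threshold=2.97):
    """Smallest r at which D2 first satisfies the '3 within 1%' criterion."""
    above = np.where(D2 >= threshold)[0]
    return r[above[0]] if above.size else None

# Synthetic toy data: clustering at small r flowing into homogeneity
# (N(<r) ~ r^3) at large r -- not the survey's measurements.
r = np.logspace(0.5, 2.5, 200)             # ~3 to ~316 Mpc/h
D2_model = 3.0 - 1.5 * np.exp(-r / 25.0)   # approaches 3 from below
N = np.exp(np.cumsum(D2_model * np.gradient(np.log(r))))
RH = homogeneity_scale(r, correlation_dimension(r, N))
```

    With this toy form, D2 crosses 2.97 near r ≈ 25 ln(50) ≈ 98 Mpc/h; the same two functions would return the survey's R_H if fed the measured N(<r).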

  12. Topology of actions and homogeneous spaces

    International Nuclear Information System (INIS)

    Kozlov, Konstantin L

    2013-01-01

    Topologization of a group of homeomorphisms and its action provide additional possibilities for studying the topological space, the group of homeomorphisms, and their interconnections. The subject of the paper is the use of the property of d-openness of an action (introduced by Ancel under the name of weak micro-transitivity) in the study of spaces with various forms of homogeneity. It is proved that a d-open action of a Čech-complete group is open. A characterization of Polish SLH spaces using d-openness is given, and it is established that any separable metrizable SLH space has an SLH completion that is a Polish space. Furthermore, the completion is realized in coordination with the completion of the acting group with respect to the two-sided uniformity. A sufficient condition is given for extension of a d-open action to the completion of the space with respect to the maximal equiuniformity with preservation of d-openness. A result of van Mill is generalized, namely, it is proved that any homogeneous CDH metrizable compactum is the only G-compactification of the space of rational numbers for the action of some Polish group. Bibliography: 39 titles.

  13. Homogeneous SiGe crystal growth in microgravity by the travelling liquidus-zone method

    International Nuclear Information System (INIS)

    Kinoshita, K; Arai, Y; Inatomi, Y; Sakata, K; Takayanagi, M; Yoda, S; Miyata, H; Tanaka, R; Sone, T; Yoshikawa, J; Kihara, T; Shibayama, H; Kubota, Y; Shimaoka, T; Warashina, Y

    2011-01-01

    Homogeneous SiGe crystal growth experiments will be performed on board the ISS 'Kibo' using a gradient heating furnace (GHF). A new crystal growth method invented for growing homogeneous mixed crystals, named the 'travelling liquidus-zone (TLZ) method', is evaluated by the growth of Si0.5Ge0.5 crystals in space. We have already succeeded in growing homogeneous 2 mm diameter Si0.5Ge0.5 crystals on the ground, but large-diameter homogeneous crystals are difficult to grow due to convection in the melt. In microgravity, larger-diameter crystals can be grown while suppressing convection. Radial as well as axial concentration profiles in microgravity-grown crystals will be measured and compared with our two-dimensional TLZ growth model equation, and the compositional variation analyzed. The results are beneficial for growing large-diameter mixed crystals by the TLZ method on the ground. Here, we report on the principle of the TLZ method for homogeneous crystal growth, results of preparatory experiments on the ground, and the plan for the microgravity experiments.

  14. Cosmic Ray Hit Detection with Homogenous Structures

    Science.gov (United States)

    Smirnov, O. M.

    Cosmic ray (CR) hits can affect a significant number of pixels both on long-exposure ground-based CCD observations and on Space Telescope frames. Thus, methods of identifying the damaged pixels are an important part of the data preprocessing for practically any application. The paper presents an implementation of a CR hit detection algorithm based on a homogenous structure (also called a cellular automaton), a concept originating in artificial intelligence and discrete mathematics. Each pixel of the image is represented by a small automaton, which interacts with its neighbors and assumes a distinct state if it "decides" that a CR hit is present. On test data, the algorithm has shown a high detection rate (~0.7) and a low false alarm rate. A homogenous structure is extremely trainable, which can be very important for processing large batches of data obtained under similar conditions. Training and optimizing issues are discussed, as well as possible other applications of this concept to image processing.
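
    The automaton rule sketched in the abstract, in which each pixel assumes a 'hit' state by comparing itself with its neighbors and flagged pixels can recruit bright neighbors on later passes, might look as follows; the robust threshold and the two-pass growth rule are illustrative assumptions, not the paper's trained parameters:

```python
import numpy as np

def cr_hit_map(image, k=5.0, iterations=2):
    """Toy cellular automaton for cosmic-ray hit detection.
    A pixel enters the 'hit' state if it exceeds the median of its
    8-neighbourhood by k robust sigma; later iterations let flagged
    pixels recruit moderately bright neighbours (CR trail growing)."""
    img = image.astype(float)
    padded = np.pad(img, 1, mode="edge")
    # Stack the 8 neighbours of every pixel.
    shifts = [padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
              for dy in (0, 1, 2) for dx in (0, 1, 2) if (dy, dx) != (1, 1)]
    neigh = np.stack(shifts)
    med = np.median(neigh, axis=0)
    sigma = 1.4826 * np.median(np.abs(neigh - med), axis=0) + 1e-6  # MAD estimate
    hits = img > med + k * sigma
    for _ in range(iterations):
        # A moderately bright pixel adjacent to a hit joins the hit.
        spread = np.pad(hits, 1, mode="constant")
        near_hit = sum(spread[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                       for dy in (0, 1, 2) for dx in (0, 1, 2)) > hits
        hits |= near_hit & (img > med + 0.5 * k * sigma)
    return hits

# Flat background with a single bright CR spike.
frame = np.ones((16, 16)) * 100.0
frame[8, 8] = 5000.0
mask = cr_hit_map(frame)   # flags only the spiked pixel
```

    A trained version would replace the fixed k and growth factor with parameters fitted on labelled frames, which is where the "extremely trainable" property of the structure comes in.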

  15. Homogenized group cross sections by Monte Carlo

    International Nuclear Information System (INIS)

    Van Der Marck, S. C.; Kuijper, J. C.; Oppe, J.

    2006-01-01

    Homogenized group cross sections play a large role in making reactor calculations efficient. Because of this significance, many codes exist that can calculate these cross sections based on certain assumptions. However, for the application to the High Flux Reactor (HFR) in Petten, the Netherlands, the limitations of such codes imply that the core calculations would become less accurate when using homogenized group cross sections (HGCS). Therefore we developed a method to calculate HGCS based on a Monte Carlo program, for which we chose MCNP. The implementation involves an addition to MCNP, and a set of small executables to perform suitable averaging after the MCNP run(s) have completed. Here we briefly describe the details of the method, and we report on two tests we performed to show the accuracy of the method and its implementation. By now, this method is routinely used in preparation of the cycle-to-cycle core calculations for the HFR. (authors)
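
    The "suitable averaging" after the Monte Carlo run is, in the usual scheme, flux-volume weighting, which preserves region-integrated reaction rates. A minimal sketch with made-up two-group, three-region numbers (not HFR data, and not the actual MCNP tally post-processing):

```python
import numpy as np

def homogenize(sigma, flux, volume):
    """Flux-volume-weighted homogenized cross section per energy group.
    sigma[g, r]: cross section of region r in group g,
    flux[g, r]:  group flux in region r, volume[r]: region volume.
    sigma_hom[g] = sum_r(sigma*flux*V) / sum_r(flux*V), preserving reaction rates."""
    w = flux * volume                     # volume broadcast across groups
    return (sigma * w).sum(axis=1) / w.sum(axis=1)

# Two energy groups, three regions (fuel, clad, moderator) -- illustrative values.
sigma = np.array([[0.10, 0.02, 0.01],
                  [0.50, 0.05, 0.03]])
flux = np.array([[1.0, 0.8, 1.2],
                 [0.6, 0.7, 1.5]])
volume = np.array([1.0, 0.3, 2.0])
sigma_hom = homogenize(sigma, flux, volume)
```

    In the method described here, the flux and reaction-rate tallies entering such an average come from the MCNP run itself rather than from a deterministic lattice code.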

  16. Core homogenization method for pebble bed reactors

    International Nuclear Information System (INIS)

    Kulik, V.; Sanchez, R.

    2005-01-01

    This work presents a core homogenization scheme for treating a stochastic pebble bed loading in pebble bed reactors. The reactor core is decomposed into macro-domains that contain several pebble types characterized by different degrees of burnup. A stochastic description is introduced to account for pebble-to-pebble and pebble-to-helium interactions within a macro-domain as well as for interactions between macro-domains. Performance of the proposed method is tested for the PROTEUS and ASTRA critical reactor facilities. Numerical simulations accomplished with the APOLLO2 transport lattice code show good agreement with the experimental data for the PROTEUS reactor facility and with the TRIPOLI4 Monte Carlo simulations for the ASTRA reactor configuration. The difference between the proposed method and the traditional volume-averaged homogenization technique is negligible when only one type of fuel pebble is present in the system, but it grows rapidly with the level of pebble heterogeneity. (authors)

  17. Irregular Homogeneity Domains in Ternary Intermetallic Systems

    Directory of Open Access Journals (Sweden)

    Jean-Marc Joubert

    2015-12-01

    Full Text Available Ternary intermetallic A–B–C systems sometimes have unexpected behaviors. The present paper examines situations in which there is a tendency to simultaneously form the compounds ABx, ACx and BCx with the same crystal structure. This causes irregular shapes of the phase homogeneity domains and, from a structural point of view, a complete reversal of site occupancies for the B atom when crossing the homogeneity domain. This work reviews previous studies done in the systems Fe–Nb–Zr, Hf–Mo–Re, Hf–Re–W, Mo–Re–Zr, Re–W–Zr, Cr–Mn–Si, Cr–Mo–Re, and Mo–Ni–Re, involving the topologically close-packed Laves, χ and σ phases. These systems have been studied using ternary isothermal section determination, DFT calculations, and site occupancy measurement by joint X-ray and neutron diffraction Rietveld refinement. Conclusions are drawn concerning this phenomenon. The paper also reports new experimental or calculated data on the Co–Cr–Re and Fe–Nb–Zr systems.

  18. Genetic homogeneity of Fascioloides magna in Austria.

    Science.gov (United States)

    Husch, Christian; Sattmann, Helmut; Hörweg, Christoph; Ursprung, Josef; Walochnik, Julia

    2017-08-30

    The large American liver fluke, Fascioloides magna, is an economically relevant parasite of both domestic and wild ungulates. F. magna was repeatedly introduced into Europe, for the first time already in the 19th century. In Austria, a stable population of F. magna has established in the Danube floodplain forests southeast of Vienna. The aim of this study was to determine the genetic diversity of F. magna in Austria. A total of 26 individuals from various regions within the known area of distribution were investigated for their cytochrome oxidase subunit 1 (cox1) and nicotinamide dehydrogenase subunit 1 (nad1) gene haplotypes. Interestingly, all 26 individuals revealed one and the same haplotype, namely concatenated haplotype Ha5. This indicates a homogenous population of F. magna in Austria and may argue for a single introduction. Alternatively, genetic homogeneity might also be explained by a bottleneck effect and/or genetic drift. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Homogeneous CdTe quantum dots-carbon nanotubes heterostructures

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Kayo Oliveira [Grupo de Pesquisa em Química de Materiais – (GPQM), Departamento de Ciências Naturais, Universidade Federal de São João del-Rei, Campus Dom Bosco, Praça Dom Helvécio, 74, CEP 36301-160, São João del-Rei, MG (Brazil); Bettini, Jefferson [Laboratório Nacional de Nanotecnologia, Centro Nacional de Pesquisa em Energia e Materiais, CEP 13083-970, Campinas, SP (Brazil); Ferrari, Jefferson Luis [Grupo de Pesquisa em Química de Materiais – (GPQM), Departamento de Ciências Naturais, Universidade Federal de São João del-Rei, Campus Dom Bosco, Praça Dom Helvécio, 74, CEP 36301-160, São João del-Rei, MG (Brazil); Schiavon, Marco Antonio, E-mail: schiavon@ufsj.edu.br [Grupo de Pesquisa em Química de Materiais – (GPQM), Departamento de Ciências Naturais, Universidade Federal de São João del-Rei, Campus Dom Bosco, Praça Dom Helvécio, 74, CEP 36301-160, São João del-Rei, MG (Brazil)

    2015-01-15

    The development of homogeneous CdTe quantum dots-carbon nanotubes heterostructures based on electrostatic interactions has been investigated. We report a simple and reproducible non-covalent functionalization route that can be accomplished at room temperature, to prepare colloidal composites consisting of CdTe nanocrystals deposited onto multi-walled carbon nanotubes (MWCNTs) functionalized with a thin layer of polyelectrolytes by layer-by-layer technique. Specifically, physical adsorption of polyelectrolytes such as poly (4-styrene sulfonate) and poly (diallyldimethylammonium chloride) was used to deagglomerate and disperse MWCNTs, onto which we deposited CdTe quantum dots coated with mercaptopropionic acid (MPA), as surface ligand, via electrostatic interactions. Confirmation of the CdTe quantum dots/carbon nanotubes heterostructures was done by transmission and scanning electron microscopies (TEM and SEM), dynamic-light scattering (DLS) together with absorption, emission, Raman and infrared spectroscopies (UV–vis, PL, Raman and FT-IR). Almost complete quenching of the PL band of the CdTe quantum dots was observed after adsorption on the MWCNTs, presumably through efficient energy transfer process from photoexcited CdTe to MWCNTs. - Highlights: • Highly homogeneous CdTe-carbon nanotubes heterostructures were prepared. • Simple and reproducible non-covalent functionalization route. • CdTe nanocrystals homogeneously deposited onto multi-walled carbon nanotubes. • Efficient energy transfer process from photoexcited CdTe to MWCNTs.

  20. Homogeneous slowpoke reactor for the production of radio-isotope. A feasibility study

    International Nuclear Information System (INIS)

    Busatta, P.; Bonin, H.

    2005-01-01

    The purpose of this research is to study the feasibility of replacing the existing heterogeneous fuel core of the present SLOWPOKE-2 with a reservoir containing a homogeneous fuel for the production of Mo-99. The study looked at three items: using the MCNP 5 simulation code, develop a series of parameters required for a homogeneous fuel and evaluate the uranyl sulfate concentration of the aqueous solution fuel in order to keep a similar excess reactivity; verify whether the homogeneous reactor will retain its inherent safety attributes; and, with the new dimensions and geometry of the fuel core, observe whether natural convection will still effectively cool the reactor, using the modeling software FEMLAB. The MCNP 5 simulation code was validated against a simulation with the WIMS-AECL code. It was found that it is indeed feasible to modify the SLOWPOKE-2 reactor into a homogeneous reactor using a solution of uranyl sulfate and water. (author)

  2. Quantum groups and quantum homogeneous spaces

    International Nuclear Information System (INIS)

    Kulish, P.P.

    1994-01-01

    The usefulness of the R-matrix formalism and the reflection equations is demonstrated on examples of the quantum group covariant algebras (quantum homogeneous spaces): quantum Minkowski space-time, quantum sphere and super-sphere. The irreducible representations of some covariant algebras are constructed. The generalization of the reflection equation to super case is given and the existence of the quasiclassical limits is pointed out. (orig.)

  3. Fluoroscopic screen which is optically homogeneous

    International Nuclear Information System (INIS)

    1975-01-01

    A high efficiency fluoroscopic screen for X-ray examination consists of an optically homogeneous crystal plate of fluorescent material such as activated cesium iodide, supported on a transparent protective plate, with the edges of the assembly beveled and optically coupled to a light absorbing compound. The product is dressed to the desired thickness and provided with an X-ray-transparent light-opaque cover. (Auth.)

  4. Wave propagation phenomena in structured materials and problems of metamaterials homogenization

    DEFF Research Database (Denmark)

    Lavrinenko, Andrei

    2011-01-01

    One of the most convenient ways to describe metamaterials (MM) is to homogenize structured composites and assign them effective parameters (EPs), provided that they can be introduced. The most common way to determine EPs in the literature is to derive them from the reflection/transmission spectra......-processing to retrieve EPs. We demonstrate our approach on several characteristic examples and formulate constraints on the MMs homogenization....

  5. Extension theorems for homogenization on lattice structures

    Science.gov (United States)

    Miller, Robert E.

    1992-01-01

    When applying homogenization techniques to problems involving lattice structures, it is necessary to extend certain functions defined on a perforated domain to a simply connected domain. This paper provides general extension operators which preserve bounds on derivatives of order l. Only the special case of honeycomb structures is considered.

  6. Gravitational Metric Tensor Exterior to Rotating Homogeneous ...

    African Journals Online (AJOL)

    The covariant and contravariant metric tensors exterior to a homogeneous spherical body rotating uniformly about a common φ axis with constant angular velocity ω are constructed. The constructed metric tensors in this gravitational field have seven non-zero distinct components. The Lagrangian for this gravitational field is ...

  7. Time travel in the homogeneous Som-Raychaudhuri Universe

    International Nuclear Information System (INIS)

    Paiva, F.M.; Reboucas, M.J.; Teixeira, A.F.F.

    1987-01-01

    Properties of the rotating Som-Raychaudhuri homogeneous space-time are investigated: time-like and null geodesics, causality features, horizons and invariant characterization. An integral representation of its five isometries is also discussed. (author) [pt

  8. Jordan's algebra of a facially homogeneous autopolar cone

    International Nuclear Information System (INIS)

    Bellissard, Jean; Iochum, Bruno

    1979-01-01

    It is shown that a Jordan-Banach algebra with predual may be canonically associated with a facially homogeneous autopolar cone. This construction generalizes the case where a trace vector exists in the cone [fr

  9. Homogeneous magnetic relaxation in iron-yttrium garnets in the vicinity of a phase transition

    International Nuclear Information System (INIS)

    Luzyanin, I.D.; Khavronin, V.P.

    1977-01-01

    Results are presented of an experimental investigation of the dynamics of homogeneous magnetization during a phase transition of the second kind in iron-yttrium garnet (IYG) single crystals of various shapes. It is shown that homogeneous relaxation depends significantly both on the magnitude of 4πchisub(st) (chisub(st) is the static magnetic susceptibility) and on the relation between the variable field frequency (at which the investigation is carried out) and the characteristic energies. It is shown that, beginning from temperatures such that 4πchisub(st) approximately 1, the characteristic dipole interaction energy becomes frequency dependent; this indicates that in this case the Lorentz coupling between the dynamic susceptibility and the homogeneous relaxation time is invalid. This is a point of principle in investigations of homogeneous relaxation by radio-frequency techniques. The temperature dependence of the homogeneous relaxation time and of the static susceptibility is determined in the exchange region. It is found that the phase transition in IYG involves anomalous phenomena which manifest themselves in the release and absorption of heat by a sample and in the appearance of additional singularities in the temperature dependence of the homogeneous relaxation time.

  10. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    Science.gov (United States)

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for the homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess the homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for the development and production of pharmaceutical suspensions in the future. Copyright © 2011 Wiley-Liss, Inc.
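
    The record's "characteristic homogenization time" suggests fitting a saturation curve to the velocimetry signal. As a minimal sketch (the paper's heuristic model is not given here; the first-order form, the synthetic data, and all numbers below are assumptions), a time constant can be recovered by a log-linearized least-squares fit:

```python
import numpy as np

# Fit a first-order approach v(t) = v_inf - dv * exp(-t / tau) to
# ultrasound sound-velocity readings and extract the time constant tau.
t = np.array([0.0, 2.0, 4.0, 8.0, 16.0, 32.0])  # mixing time, min
v_inf, dv, tau_true = 1520.0, 12.0, 6.0         # synthetic "truth"
v = v_inf - dv * np.exp(-t / tau_true)          # sound velocity, m/s

# Log-linearize: ln(v_inf - v) = ln(dv) - t / tau, then least squares.
slope, intercept = np.polyfit(t, np.log(v_inf - v), 1)
tau_fit = -1.0 / slope
print(round(tau_fit, 3))  # recovers tau = 6.0 on noiseless data
```

    On real, noisy data one would fit v_inf as well (nonlinear least squares), but the log-linear trick suffices to illustrate how a single characteristic time is read off a monitored homogenization run.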

  11. Regional homogeneity of electoral space: comparative analysis (on the material of 100 national cases

    Directory of Open Access Journals (Sweden)

    A. O. Avksentiev

    2015-12-01

    Full Text Available In the article the author examines the dependence of electoral behavior on territorial belonging. The categories «regional homogeneity» and «electoral space» are conceptualized. It is argued that regional homogeneity is a characteristic of electoral space and can be quantified, and that a quantitative measure of the regional homogeneity of government support is directly connected with the risk of separatism, civil conflict, or a legitimacy crisis in deviant territories. A formula for the quantitative evaluation of regional homogeneity is proposed, based on a standard instrument of statistical analysis, the coefficient of variation. Possible directions of study using this index are defined, both for individual political subjects and for the political space as a whole (state, region, electoral district). Appropriate indexes are calculated for the Ukrainian electoral space (returns of the 1991–2015 elections) and for 100 other national cases. The dynamics of Ukraine's regional homogeneity are analyzed on the material of the 1991–2015 electoral statistics.
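
    The index described above, a coefficient of variation over regional returns, can be sketched in a few lines. The article's exact formula and scaling are not reproduced; the function name and the vote shares below are hypothetical:

```python
import statistics

def regional_homogeneity_index(shares):
    """Coefficient of variation of a vote share across regions.

    Lower values indicate a more homogeneous electoral space;
    this is the standard CV, not necessarily the article's exact index.
    """
    mean = statistics.fmean(shares)
    return statistics.pstdev(shares) / mean

# Hypothetical regional vote shares (fractions) for one party
uniform_country   = [0.30, 0.31, 0.29, 0.30, 0.30]
polarized_country = [0.05, 0.60, 0.10, 0.55, 0.20]

print(regional_homogeneity_index(uniform_country))    # ~0.02
print(regional_homogeneity_index(polarized_country))  # ~0.77
```

    Both countries give the party the same national mean (30%), but the second is far less homogeneous, which is exactly the distinction the index is meant to capture.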

  12. Investigation of methods for hydroclimatic data homogenization

    Science.gov (United States)

    Steirou, E.; Koutsoyiannis, D.

    2012-04-01

    We investigate the methods used for the adjustment of inhomogeneities of temperature time series covering the last 100 years. Based on a systematic study of the scientific literature, we classify and evaluate the observed inhomogeneities in historical and modern time series, as well as their adjustment methods. It turns out that these methods are mainly statistical, not well justified by experiments, and rarely supported by metadata. In many of the cases studied, the proposed corrections are not even statistically significant. From the global database GHCN-Monthly Version 2, we examine all stations containing both raw and adjusted data that satisfy certain criteria of continuity and distribution over the globe. In the United States of America, because of the large number of available stations, stations were chosen after a suitable sampling. In total we analyzed 181 stations globally. For these stations we calculated the differences between the adjusted and non-adjusted linear 100-year trends. It was found that in two thirds of the cases the homogenization procedure increased the positive or decreased the negative temperature trends. One of the most common homogenization methods, 'SNHT for single shifts', was applied to synthetic time series with selected statistical characteristics, occasionally with offsets. The method was satisfactory when applied to independent, normally distributed data, but not to data with long-term persistence. The above results cast some doubt on the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is between 0.4°C and 0.7°C, where these two values are the estimates derived from raw and adjusted data, respectively.
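
    The "SNHT for single shifts" test mentioned above can be sketched as follows. This is a textbook Alexandersson-style formulation applied to a synthetic series, not necessarily the authors' exact implementation, and real use requires tabulated critical values:

```python
import numpy as np

def snht(x):
    """Standard Normal Homogeneity Test statistic for a single shift.

    Returns (T0, k): the maximum test statistic and the index of the
    most likely break point in the standardized series.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    z = (x - x.mean()) / x.std(ddof=0)  # standardized anomalies
    best_t, best_k = -1.0, 0
    for k in range(1, n):               # candidate break after position k
        z1, z2 = z[:k].mean(), z[k:].mean()
        t = k * z1**2 + (n - k) * z2**2
        if t > best_t:
            best_t, best_k = t, k
    return best_t, best_k

# Synthetic record: a 2-degree shift inserted halfway through
rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(10, 1, 50), rng.normal(12, 1, 50)])
t0, k = snht(series)
print(t0, k)  # large T0, break detected near index 50
```

    A large T0 at some k flags an inhomogeneity there; as the record notes, the test behaves well for independent Gaussian data but is unreliable under long-term persistence.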

  13. Some variance reduction methods for numerical stochastic homogenization.

    Science.gov (United States)

    Blanc, X; Le Bris, C; Legoll, F

    2016-04-28

    We give an overview of a series of recent studies devoted to variance reduction techniques for numerical stochastic homogenization. Numerical homogenization requires that a set of problems is solved at the microscale, the so-called corrector problems. In a random environment, these problems are stochastic and therefore need to be repeatedly solved, for several configurations of the medium considered. An empirical average over all configurations is then performed using the Monte Carlo approach, so as to approximate the effective coefficients necessary to determine the macroscopic behaviour. Variance severely affects the accuracy and the cost of such computations. Variance reduction approaches, borrowed from other contexts in the engineering sciences, can be useful. Some of these variance reduction techniques are presented, studied and tested here. © 2016 The Author(s).
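
    The empirical-average-over-configurations step described above is where antithetic variates, one classical variance reduction technique, can help. A toy sketch follows: a 1D layered medium, whose effective coefficient is the harmonic mean, stands in for a real corrector-problem solve, and all values are illustrative:

```python
import numpy as np

def a_eff_1d(u, a_lo=1.0, a_hi=10.0):
    """Effective coefficient of a 1D layered medium: the harmonic mean.

    Each cell takes conductivity a_hi when its uniform variate u > 0.5,
    a toy stand-in for solving a corrector problem on one configuration.
    """
    a = np.where(u > 0.5, a_hi, a_lo)
    return 1.0 / np.mean(1.0 / a)

rng = np.random.default_rng(42)
n_real, n_cells = 2000, 64
u = rng.random((n_real, n_cells))

plain = np.array([a_eff_1d(ui) for ui in u])
# Antithetic variates: pair each realization u with its mirror 1 - u and
# average the two estimates; the paired estimates are negatively
# correlated because a_eff is monotone in each cell variable.
anti = np.array([(a_eff_1d(ui) + a_eff_1d(1.0 - ui)) / 2 for ui in u])

print(plain.mean(), plain.var())
print(anti.mean(), anti.var())  # same mean, smaller variance
```

    The antithetic estimator is unbiased for the same effective coefficient but needs fewer realizations for a given accuracy, which is the point of the variance reduction approaches surveyed in the record.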

  14. Non-Almost Periodicity of Parallel Transports for Homogeneous Connections

    International Nuclear Information System (INIS)

    Brunnemann, Johannes; Fleischhack, Christian

    2012-01-01

    Let A be the affine space of all connections in an SU(2) principal fibre bundle over ℝ³. The set of homogeneous isotropic connections forms a line l in A. We prove that the parallel transports for general, non-straight paths in the base manifold do not depend almost periodically on l. Consequently, the embedding l ↪ A does not continuously extend to an embedding l-bar ↪ A-bar of the respective compactifications. Here, the Bohr compactification l-bar corresponds to the configuration space of homogeneous isotropic loop quantum cosmology and A-bar to that of loop quantum gravity. Analogous results are given for the anisotropic case.

  15. A critical review of homogenization techniques in reactor lattices

    International Nuclear Information System (INIS)

    Benoist, P.

    1983-01-01

    The determination of the shape of the neutron flux in a whole reactor is, at the time being, a much too complex problem to be treated by transport theory. Since the earlier times of reactor theory, the necessity appeared to solve the problem in two steps. First the reactor is divided into zones, each of them forming a regular lattice. In each of these zones, homogenized parameters are determined by transport theory, in order to define an equivalent smeared medium. In a second step, these parameters are introduced in a diffusion theory scheme in order to treat the reactor as a whole. This is the homogenization procedure. 14 refs
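
    The homogenized parameters of the first step are conventionally obtained by flux-volume weighting of the zone cross sections. A minimal sketch of that recipe (the region data are invented for illustration, not taken from the paper):

```python
def homogenize_xs(volumes, fluxes, sigmas):
    """Flux-volume-weighted cross section: the standard recipe for
    smearing a lattice zone into an equivalent homogeneous medium."""
    num = sum(v * f * s for v, f, s in zip(volumes, fluxes, sigmas))
    den = sum(v * f for v, f in zip(volumes, fluxes))
    return num / den

# Hypothetical pin-cell regions: fuel, clad, moderator
volumes = [0.55, 0.10, 0.35]   # region volumes (cm^2 per unit height)
fluxes  = [1.00, 1.05, 1.20]   # relative scalar flux from transport
sigma_a = [0.30, 0.01, 0.02]   # absorption cross sections, 1/cm

print(homogenize_xs(volumes, fluxes, sigma_a))  # smeared Sigma_a, 1/cm
```

    The smeared cross section preserves the zone's total reaction rate per unit flux, which is what the subsequent whole-core diffusion calculation needs.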

  16. Early capillary flux homogenization in response to neural activation.

    Science.gov (United States)

    Lee, Jonghwan; Wu, Weicheng; Boas, David A

    2016-02-01

    This Brief Communication reports early homogenization of capillary network flow during somatosensory activation in the rat cerebral cortex. We used optical coherence tomography and statistical intensity variation analysis for tracing changes in the red blood cell flux over hundreds of capillaries nearly at the same time with 1-s resolution. We observed that while the mean capillary flux exhibited a typical increase during activation, the standard deviation of the capillary flux exhibited an early decrease that happened before the mean flux increase. This network-level data is consistent with the theoretical hypothesis that capillary flow homogenizes during activation to improve oxygen delivery. © The Author(s) 2015.

  18. Influence of dose distribution homogeneity on the tumor control probability in heavy-ion radiotherapy

    International Nuclear Information System (INIS)

    Wen Xiaoqiong; Li Qiang; Zhou Guangming; Li Wenjian; Wei Zengquan

    2001-01-01

    In order to estimate the influence of a non-uniform dose distribution on clinical treatment results, the influence of dose distribution homogeneity on the tumor control probability was investigated. Based on the formula deduced previously for the survival fraction of cells irradiated by a non-uniform heavy-ion irradiation field and the theory of tumor control probability, the tumor control probability was calculated for a tumor model exposed to dose distributions of different homogeneity. The results show that the tumor control probability corresponding to the same total dose decreases as the dose distribution homogeneity gets worse. In clinical treatment, the dose distribution homogeneity should be better than 95%
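
    The qualitative conclusion, that the same total dose controls the tumor less well when delivered inhomogeneously, can be reproduced with the standard Poisson TCP model. This sketch uses illustrative photon-like linear-quadratic parameters and clonogen numbers, not the heavy-ion survival formula of the study:

```python
import math

def tcp(voxel_doses, clonogens_per_voxel=1e6, alpha=0.35, beta=0.035):
    """Poisson tumour control probability for a voxelized dose map (Gy).

    Cell survival follows the linear-quadratic model; TCP is the
    probability that no clonogen survives anywhere in the tumour.
    """
    expected_survivors = sum(
        clonogens_per_voxel * math.exp(-alpha * d - beta * d * d)
        for d in voxel_doses)
    return math.exp(-expected_survivors)

uniform    = [18.0] * 10                                # perfectly flat
nonuniform = [16, 16, 17, 17, 18, 18, 19, 19, 20, 20]   # same mean dose

print(tcp(uniform))     # ~0.80
print(tcp(nonuniform))  # ~0.30: cold spots dominate the outcome
```

    Because survival rises steeply in under-dosed voxels, the cold spots of the non-uniform plan dominate, so TCP drops even though the mean dose is unchanged.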

  19. A consistent homogenization procedure to obtain few-group cell parameters

    International Nuclear Information System (INIS)

    Pierini, G.

    1979-01-01

    The criterion according to which one heterogeneous and one homogeneous cell are equivalent if they have the same boundary values of both the flux and the normal components of the current is used to radially homogenize an axially infinite cylindrical cell with azimuth-independent properties and a moderator adequately described by diffusion theory. The method, which leads to the definition of a full matrix of diffusion coefficients, provides a new and simple definition of the few-group cell parameters, which are nearly independent of the environment. (orig.) [de

  20. Homogenization on Multi-Materials’ Elements: Application to Printed Circuit Boards and Warpage Analysis

    Directory of Open Access Journals (Sweden)

    Araújo Manuel

    2016-01-01

    Full Text Available Multi-material domains are often found in industrial applications. Modelling them can be computationally very expensive due to meshing requirements. Assigning accurate properties to finite elements that span different materials is difficult. In this work, a new homogenization method that simplifies the computation of the homogenized Young modulus, Poisson ratio and thermal expansion coefficient is proposed, and applied to a composite-like material on a printed circuit board. The results show a good correspondence of properties between the homogenized domain and the simulation of the real geometry.
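
    Classical rule-of-mixtures (Voigt) and inverse rule-of-mixtures (Reuss) averages give quick bounds on such homogenized properties. This is not the authors' method, and the PCB material values below are illustrative assumptions:

```python
def voigt(fractions, values):
    """Volume-fraction-weighted (Voigt / rule-of-mixtures) average."""
    return sum(f * v for f, v in zip(fractions, values))

def reuss(fractions, values):
    """Inverse rule of mixtures (Reuss bound)."""
    return 1.0 / sum(f / v for f, v in zip(fractions, values))

# Hypothetical two-phase PCB stack: FR-4 dielectric and copper
frac = [0.85, 0.15]      # volume fractions
E    = [22e9, 110e9]     # Young's modulus, Pa
cte  = [14e-6, 17e-6]    # thermal expansion coefficient, 1/K

E_upper, E_lower = voigt(frac, E), reuss(frac, E)
print(E_upper, E_lower)  # homogenized modulus lies between the bounds
print(voigt(frac, cte))  # simple homogenized CTE estimate
```

    Any consistent homogenization of the stiffness must land between the Reuss and Voigt bounds, which makes them a cheap sanity check on a finite-element warpage model.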

  1. Optimization and characterization of high pressure homogenization produced chemically modified starch nanoparticles.

    Science.gov (United States)

    Ding, Yongbo; Kan, Jianquan

    2017-12-01

    Chemically modified starch (RS4) nanoparticles were synthesized through homogenization and water-in-oil mini-emulsion cross-linking. Homogenization was optimized with regard to z-average diameter by using a three-factor, three-level Box-Behnken design. Homogenization pressure (X1), oil/water ratio (X2), and surfactant amount (X3) were selected as independent variables, whereas z-average diameter was considered as the dependent variable. The following optimum preparation conditions were obtained to achieve the minimum average size of these nanoparticles: 50 MPa homogenization pressure, 10:1 oil/water ratio, and 2 g surfactant, for which the predicted z-average diameter was 303.6 nm. The physicochemical properties of these nanoparticles were also determined. Dynamic light scattering experiments revealed that the RS4 nanoparticles had a PdI of 0.380 and an average size of approximately 300 nm, very close to the predicted z-average diameter (303.6 nm). The absolute value of the zeta potential of the RS4 nanoparticles (39.7 mV) was higher than that of RS4 (32.4 mV), with strengthened swelling power. X-ray diffraction results revealed that homogenization induced a disruption in the crystalline structure of the RS4 nanoparticles, leading to an amorphous or low-crystallinity state. Stability analysis showed that the RS4 nanosuspensions (particle size) had good stability at 30 °C over 24 h.
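
    A three-factor, three-level Box-Behnken design like the one used here consists of the twelve ±1 edge midpoints plus centre points. A sketch in coded units (a textbook construction, not the authors' actual run order):

```python
from itertools import combinations

def box_behnken(n_factors, n_center=3):
    """Coded-unit Box-Behnken design: every pair of factors takes all
    four +/-1 combinations while the remaining factors sit at 0, plus
    replicated centre points."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * n_factors
                row[i], row[j] = a, b
                runs.append(row)
    runs += [[0] * n_factors] * n_center   # centre-point replicates
    return runs

design = box_behnken(3)
print(len(design))  # 15 runs for three factors (12 edge + 3 centre)
```

    Each coded row is then mapped to physical settings (e.g. -1/0/+1 of homogenization pressure, oil/water ratio, surfactant amount) and a quadratic response surface is fitted to the measured z-average diameters.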

  2. Physical applications of homogeneous balls

    CERN Document Server

    Scarr, Tzvi

    2005-01-01

    One of the mathematical challenges of modern physics lies in the development of new tools to efficiently describe different branches of physics within one mathematical framework. This text introduces precisely such a broad mathematical model, one that gives a clear geometric expression of the symmetry of physical laws and is entirely determined by that symmetry. The first three chapters discuss the occurrence of bounded symmetric domains (BSDs) or homogeneous balls and their algebraic structure in physics. The book further provides a discussion of how to obtain a triple algebraic structure ass

  3. Quantum-dot-based homogeneous time-resolved fluoroimmunoassay of alpha-fetoprotein

    Energy Technology Data Exchange (ETDEWEB)

    Chen Meijun; Wu Yingsong; Lin Guanfeng; Hou Jingyuan; Li Ming [Institute of Antibody Engineering, School of Biotechnology, Southern Medical University, Guangzhou, 510515 (China); Liu Tiancai, E-mail: liutc@smu.edu.cn [Institute of Antibody Engineering, School of Biotechnology, Southern Medical University, Guangzhou, 510515 (China)

    2012-09-05

    Highlights: • QD-based homogeneous time-resolved fluoroimmunoassay was developed to detect AFP. • The conjugates were prepared with QD-doped microspheres and an anti-AFP McAb. • The conjugates were prepared with LTCs and another anti-AFP McAb. • Excess amounts of conjugates were used for detecting AFP without rinsing. • The wedding of QPs and LTCs was suitable for HTRFIA to detect AFP. - Abstract: Quantum dots (QDs) with novel photoproperties are not widely used in clinical diagnosis, and homogeneous time-resolved fluorescence assays possess many advantages over current methods for alpha-fetoprotein (AFP) detection. A novel QD-based homogeneous time-resolved fluorescence assay was developed and used for detection of AFP, a primary marker for many cancers and diseases. QD-doped carboxyl-modified polystyrene microparticles (QPs) were prepared by doping oil-soluble QDs possessing a 605 nm emission peak. The antibody conjugates (QPs-E014) were prepared from QPs and an anti-AFP monoclonal antibody, and luminescent terbium chelates (LTCs) were prepared and conjugated to a second anti-AFP monoclonal antibody (LTCs-E010). In a double-antibody sandwich structure, QPs-E014 and LTCs-E010 were used for detection of AFP, serving as energy acceptor and donor, respectively, with an AFP bridge. The results demonstrated that the luminescence lifetime of these QPs was sufficiently long for use in a time-resolved fluoroassay, with the efficiency of time-resolved Förster resonance energy transfer (TR-FRET) at 67.3% and the spatial distance of the donor to acceptor calculated to be 66.1 Å. Signals from TR-FRET were found to be proportional to AFP concentrations. The resulting standard curve was log Y = 3.65786 + 0.43863·log X (R = 0.996), with Y the QPs fluorescence intensity and X the AFP concentration; the calculated sensitivity was 0

  4. High pressure homogenization to improve the stability of casein - hydroxypropyl cellulose aqueous systems.

    Science.gov (United States)

    Ye, Ran; Harte, Federico

    2014-03-01

    The effect of high pressure homogenization on the improvement of the stability of hydroxypropyl cellulose (HPC) and micellar casein was investigated. HPC of two molecular weights (80 and 1150 kDa) and micellar casein were mixed in water to a concentration leading to phase separation (0.45% w/v HPC and 3% w/v casein) and immediately subjected to high pressure homogenization ranging from 0 to 300 MPa, in 100 MPa increments. The various dispersions were evaluated for stability, particle size, turbidity, protein content, and viscosity over a period of two weeks, and by scanning transmission electron microscopy (STEM) at the end of the storage period. The stability of the casein-HPC complexes was enhanced with increasing homogenization pressure, especially for the complex containing high molecular weight HPC. The apparent particle size of the complexes was reduced from ~200 nm to ~130 nm when using 300 MPa, corresponding to a sharp decrease in absorbance compared with the non-homogenized controls. High pressure homogenization reduced the viscosity of the HPC-casein complexes regardless of the molecular weight of HPC, and STEM images revealed aggregates consistent with nano-scale protein-polysaccharide interactions.

  5. Design of Stirrer Impeller with Variable Operational Speed for a Food Waste Homogenizer

    Directory of Open Access Journals (Sweden)

    Idris A. Kayode

    2016-05-01

    Full Text Available A conceptualized impeller called KIA is designed for impact agitation of food waste in a homogenizer. A comparative analysis of the performance of KIA is made with three conventional impeller types, Rushton, Anchor, and Pitched Blade. Solid–liquid mixing of a moisture-rich food waste is simulated under various operational speeds, in order to compare the dispersions and thermal distributions at homogeneous slurry conditions. Using SolidWorks, the design of the impellers employs an Application Programming Interface (API), which acts as the canvas for creating a graphical user interface (GUI) for automation of its assembly. A parametric analysis of the homogenizer, at varying operational speeds, enables the estimation of the critical speed of the mixing shaft and of its deflection under numerous mixing conditions and impeller configurations. The numerical simulation of the moisture-rich food waste (approximated as a Newtonian carrot–orange soup) is performed with ANSYS CFX v.15.0. The velocity and temperature field distributions of the homogenizer for various impeller rotational speeds are analyzed. It is anticipated that the developed model will help in the selection of a suitable impeller for efficient mixing of food waste in the homogenizer.

  6. Nuclear-Thermal Analysis of Fully Ceramic Microencapsulated Fuel via Two-Temperature Homogenized Model

    International Nuclear Information System (INIS)

    Lee, Yoon Hee; Cho, Nam Zin

    2013-01-01

    The FCM fuel is based on a proven safety philosophy that has been utilized operationally in very high temperature reactors (VHTRs). However, the FCM fuel consists of TRISO particles randomly dispersed in a SiC matrix. The high heterogeneity in composition makes explicit thermal calculation of such a fuel difficult, so an appropriate homogenization model becomes essential. In this paper, we apply the two-temperature homogenized model to the thermal analysis of an FCM fuel. The model was recently proposed in order to provide more realistic temperature profiles in the fuel element in VHTRs. The two-temperature homogenized model was obtained by particle transport Monte Carlo calculation applied to the pellet region consisting of many coated particles uniformly dispersed in the SiC matrix. Since this model gives realistic temperature profiles in the pellet (providing fuel-kernel temperature and SiC matrix temperature distinctly), it can be used for more accurate neutronics evaluation such as Doppler temperature feedback. The transient thermal calculation may also be performed more realistically with temperature-dependent homogenized parameters in various scenarios

  7. Development of numerical techniques for the estimation, modeling and prediction of thermodynamic and structural properties of metallic systems with strong chemical ordering

    Science.gov (United States)

    Harvey, Jean-Philippe

    In this work, the possibility to calculate and evaluate with a high degree of precision the Gibbs energy of complex multiphase equilibria for which chemical ordering is explicitly and simultaneously considered in the thermodynamic description of solid (short range order and long range order) and liquid (short range order) metallic phases is studied. The cluster site approximation (CSA) and the cluster variation method (CVM) are implemented in a new minimization technique of the Gibbs energy of multicomponent and multiphase systems to describe the thermodynamic behaviour of metallic solid solutions showing strong chemical ordering. The modified quasichemical model in the pair approximation (MQMPA) is also implemented in the new minimization algorithm presented in this work to describe the thermodynamic behaviour of metallic liquid solutions. The constrained minimization technique implemented in this work consists of a sequential quadratic programming technique based on an exact Newton’s method (i.e. the use of exact second derivatives in the determination of the Hessian of the objective function) combined to a line search method to identify a direction of sufficient decrease of the merit function. The implementation of a new algorithm to perform the constrained minimization of the Gibbs energy is justified by the difficulty to identify, in specific cases, the correct multiphase assemblage of a system where the thermodynamic behaviour of the equilibrium phases is described by one of the previously quoted models using the FactSage software (ex.: solid_CSA+liquid_MQMPA; solid1_CSA+solid2_CSA). 
After a rigorous validation of the constrained Gibbs energy minimization algorithm using several assessed binary and ternary systems found in the literature, the CVM and CSA models used to describe the energetic behaviour of metallic solid solutions present in systems with key industrial applications, such as the Cu-Zr and Al-Zr systems, are parameterized using fully consistent thermodynamic and structural data generated from a Monte Carlo (MC) simulator also implemented in the framework of this project. In this MC simulator, the modified embedded atom model in the second nearest neighbour formalism (MEAM-2NN) is used to describe the cohesive energy of each studied structure. A new Al-Zr MEAM-2NN interatomic potential, needed to evaluate the cohesive energy of the condensed phases of this system, is presented in this work. The thermodynamic integration (TI) method implemented in the MC simulator allows the evaluation of the absolute Gibbs energy of the considered solid or liquid structures. The original implementation of the TI method allowed us to evaluate theoretically, for the first time, all the thermodynamic mixing contributions (i.e., mixing enthalpy and mixing entropy contributions) of a metallic liquid (Cu-Zr and Al-Zr) and of a solid solution (the face-centered cubic (FCC) Al-Zr solid solution) described by the MEAM-2NN. Thermodynamic and structural data obtained from MC and molecular dynamics simulations are then used to parameterize the CVM for the Al-Zr FCC solid solution and the MQMPA for the Al-Zr and Cu-Zr liquid phases, respectively. The extended thermodynamic study of these systems allows the introduction of a new type of configuration-dependent excess parameters in the definition of the thermodynamic function of solid solutions described by the CVM or the CSA. These parameters greatly improve the precision of these thermodynamic models, based on experimental evidence found in the literature. 
A new parameterization approach of the MQMPA model of metallic liquid solutions is presented throughout this work. In this new approach, calculated pair fractions obtained from MC/MD simulations are taken into account as well as configuration-independent volumetric relaxation effects (regular like excess parameters) in order to parameterize precisely the Gibbs energy function of metallic melts. The generation of a complete set of fully consistent thermodynamic, physical and structural data for solid, liquid, and stoichiometric compounds and the subsequent parameterization of their respective thermodynamic model lead to the first description of the complete Al-Zr phase diagram in the range of composition [0 ≤ XZr ≤ 5 / 9] based on theoretical and fully consistent thermodynamic properties. MC and MD simulations are performed for the Al-Zr system to define for the first time the precise thermodynamic behaviour of the amorphous phase for its entire range of composition. Finally, all the thermodynamic models for the liquid phase, the FCC solid solution and the amorphous phase are used to define conditions based on thermodynamic and volumetric considerations that favor the amorphization of Al-Zr alloys.

  8. Homogenization of food samples for gamma spectrometry using tetramethylammonium hydroxide and enzymatic digestion

    International Nuclear Information System (INIS)

    Kimi Nishikawa; Abdul Bari; Abdul Jabbar Khan; Xin Li; Traci Menia; Semkow, T.M.

    2017-01-01

    We have developed a method of food sample preparation for gamma spectrometry involving the use of tetramethylammonium hydroxide (TMAH) and/or enzymes such as α-amylase or cellulase for sample homogenization. We demonstrated the effectiveness of this method using food matrices spiked with ⁶⁰Co, ¹³¹I, ¹³⁴,¹³⁷Cs, and ²⁴¹Am radionuclides, homogenized with TMAH (mixed salad, parmesan cheese, and ground beef); enzymes (α-amylase for bread, and cellulase for baked beans); or α-amylase followed by TMAH (cheeseburgers). The procedures developed represent the best compromise between the degree of homogenization, accuracy, speed, and minimization of laboratory equipment contamination. Based on calculated sample biases and z-scores, our results suggest that homogenization using TMAH and enzymes would be a useful method of sample preparation for gamma spectrometry during radiological emergencies. (author)
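    The sample bias and z-score arithmetic mentioned above follows standard proficiency-testing conventions. A minimal sketch, assuming the common definitions (relative bias against the known spiked activity; z-score normalized by the evaluation standard deviation); the numbers and function names are illustrative:

```python
def relative_bias_percent(measured, reference):
    """Relative bias of a measured activity against the known spiked value, in %."""
    return 100.0 * (measured - reference) / reference

def z_score(measured, reference, sigma_eval):
    """z-score: deviation from the reference in units of the evaluation sigma.
    Conventionally |z| <= 2 is 'acceptable', 2 < |z| < 3 'warning', |z| >= 3 'action'."""
    return (measured - reference) / sigma_eval

print(relative_bias_percent(108.0, 100.0))  # 8.0 (percent)
print(z_score(108.0, 100.0, 5.0))           # 1.6
```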

  9. Homogenization of long fiber reinforced composites including fiber bending effects

    DEFF Research Database (Denmark)

    Poulios, Konstantinos; Niordson, Christian Frithiof

    2016-01-01

    This paper presents a homogenization method, which accounts for intrinsic size effects related to the fiber diameter in long fiber reinforced composite materials with two independent constitutive models for the matrix and fiber materials. A new choice of internal kinematic variables allows...... of the reinforcing fibers is captured by higher order strain terms, resulting in an accurate representation of the micro-mechanical behavior of the composite. Numerical examples show that the accuracy of the proposed model is very close to a non-homogenized finite-element model with an explicit discretization...

  10. Numerical solutions of a three-point boundary value problem with an ...

    African Journals Online (AJOL)

    Numerical solutions of a three-point boundary value problem with an integral condition for a third-order partial differential equation by using the Laplace transform method.

  11. MCNPX simulation of proton dose distribution in homogeneous and CT phantoms

    International Nuclear Information System (INIS)

    Lee, C.C.; Lee, Y.J.; Tung, C.J.; Cheng, H.W.; Chao, T.C.

    2014-01-01

    A dose simulation system was constructed based on the MCNPX Monte Carlo package to simulate proton dose distributions in homogeneous and CT phantoms. Conversion from the Hounsfield units of a patient CT image set to the material information necessary for Monte Carlo simulation is based on Schneider's approach. In order to validate this simulation system, an inter-comparison of depth-dose distributions obtained from the MCNPX, GEANT4 and FLUKA codes was performed for a 160 MeV monoenergetic proton beam incident normally on the surface of a homogeneous water phantom. For dose validation within the CT phantom, direct comparison with measurement is infeasible. Instead, this study took the approach of indirectly comparing the 50% ranges (R50%) along the central axis computed by our system to the NIST CSDA ranges for beams with 160 and 115 MeV energies. Comparison results within the homogeneous phantom show good agreement. Differences of simulated R50% among the three codes are less than 1 mm. For results within the CT phantom, the MCNPX-simulated water equivalent Req,50% are compatible with the CSDA water equivalent ranges from the NIST database, with differences of 0.7 and 4.1 mm for the 160 and 115 MeV beams, respectively. - Highlights: ► Proton dose simulation based on MCNPX 2.6.0 in homogeneous and CT phantoms. ► CT number (HU) conversion to electron density based on Schneider's approach. ► Good agreement among the MCNPX, GEANT4 and FLUKA codes in a homogeneous water phantom. ► Water equivalent R50% in CT phantoms are compatible with those of the NIST database
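    The 50% range R50% used in the comparison can be read off a depth-dose curve by linear interpolation on the distal side of the peak. A minimal sketch, with a synthetic Gaussian-peaked curve standing in for the Monte Carlo output (the stand-in curve has a known distal half-maximum depth, which makes the routine easy to check):

```python
import numpy as np

def distal_r50(depth, dose):
    """Depth at which the dose falls to 50% of its maximum, distal to the peak."""
    i_max = int(np.argmax(dose))
    half = 0.5 * dose[i_max]
    for i in range(i_max, len(dose) - 1):
        if dose[i] >= half > dose[i + 1]:
            # linear interpolation between the bracketing grid points
            f = (dose[i] - half) / (dose[i] - dose[i + 1])
            return depth[i] + f * (depth[i + 1] - depth[i])
    raise ValueError("dose never falls below 50% of maximum")

depth = np.linspace(0.0, 20.0, 2001)             # cm
dose = np.exp(-0.5 * ((depth - 16.0) / 0.5)**2)  # stand-in Bragg-peak-like curve
r50 = distal_r50(depth, dose)
print(r50)  # analytic distal half-maximum: 16 + 0.5*sqrt(2*ln 2) cm
```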

  12. Broken ergodicity in two-dimensional homogeneous magnetohydrodynamic turbulence

    International Nuclear Information System (INIS)

    Shebalin, John V.

    2010-01-01

    Two-dimensional (2D) homogeneous magnetohydrodynamic (MHD) turbulence has many of the same qualitative features as three-dimensional (3D) homogeneous MHD turbulence. These features include several ideal (i.e., nondissipative) invariants along with the phenomenon of broken ergodicity (defined as nonergodic behavior over a very long time). Broken ergodicity appears when certain modes act like random variables with mean values that are large compared to their standard deviations, indicating a coherent structure or dynamo. Recently, the origin of broken ergodicity in 3D MHD turbulence that is manifest in the lowest wavenumbers was found. Here, we study the origin of broken ergodicity in 2D MHD turbulence. It will be seen that broken ergodicity in ideal 2D MHD turbulence can be manifest in the lowest wavenumbers of a finite numerical model for certain initial conditions or in the highest wavenumbers for another set of initial conditions. The origins of broken ergodicity in an ideal 2D homogeneous MHD turbulence are found through an eigenanalysis of the covariance matrices of the probability density function and by an examination of the associated entropy functional. When the values of ideal invariants are kept fixed and grid size increases, it will be shown that the energy in a few large modes remains constant, while the energy in any other mode is inversely proportional to grid size. Also, as grid size increases, we find that broken ergodicity becomes manifest at more and more wavenumbers.

  13. Standard deviation index for stimulated Brillouin scattering suppression with different homogeneities.

    Science.gov (United States)

    Ran, Yang; Su, Rongtao; Ma, Pengfei; Wang, Xiaolin; Zhou, Pu; Si, Lei

    2016-05-10

    We present a new quantitative index of standard deviation to measure the homogeneity of spectral lines in a fiber amplifier system, so as to find the relation between the stimulated Brillouin scattering (SBS) threshold and the homogeneity of the corresponding spectral lines. A theoretical model is built and a simulation framework has been established to estimate the SBS threshold when input spectra with different homogeneities are set. In our experiment, by setting the phase modulation voltage to a constant value and the modulation frequency to different values, spectral lines with different homogeneities can be obtained. The experimental results show that the SBS threshold is negatively correlated with the standard deviation of the modulated spectrum, which is in good agreement with the theoretical results. When the phase modulation voltage is confined to 10 V and the modulation frequency is set to 80 MHz, the standard deviation of the modulated spectrum equals 0.0051, which is the lowest value in our experiment. Thus, at this setting, the highest SBS threshold is achieved. This standard deviation can be a good quantitative index for evaluating the power scaling potential of a fiber amplifier system, and thus a design guideline for better suppressing SBS.
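    The standard-deviation index itself reduces to normalizing the measured spectral line intensities and taking their standard deviation; smaller values indicate a flatter, more homogeneous spectrum and hence, per the study, a higher SBS threshold. A sketch with made-up line intensities (not the paper's measured spectra):

```python
import numpy as np

def homogeneity_index(line_intensities):
    """Standard deviation of normalized spectral line fractions; 0 = perfectly flat."""
    p = np.asarray(line_intensities, dtype=float)
    p = p / p.sum()                    # fractions summing to 1
    return float(np.std(p))

flat = homogeneity_index([1.0] * 5)            # perfectly homogeneous comb
uneven = homogeneity_index([5.0, 1, 1, 1, 1])  # one dominant line
print(flat, uneven)                            # the flatter spectrum scores lower
```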

  14. Conformity Index and Homogeneity Index of the Postoperative Whole Breast Radiotherapy.

    Science.gov (United States)

    Petrova, Deva; Smickovska, Snezana; Lazarevska, Emilija

    2017-10-15

    The treatment of breast cancer involves a multidisciplinary approach in which radiotherapy plays a key role. The conformity index and the homogeneity index are two analysis tools for a treatment plan using conformal radiotherapy. The purpose of this article is an analysis of these two parameters in the assessment of treatment plans in 58 patients undergoing postoperative radiotherapy of the whole breast. All 58 patients participating in the study had conservatively treated early-stage breast cancer. The treatment was performed using a standard fractionation regimen of 25 fractions up to a total dose of 50 Gy. Dose-volume histograms were generated for both plans, with and without segmental fields. The paired-samples t-test was used. The technique with segmental fields allowed a more homogeneous dose distribution when compared to the standard two-tangential-field technique. The HI values were 1.08 ± 0.01 and 1.09 ± 0.01 for the segmental-field and two-tangential-field techniques, respectively. The conformity and the homogeneity index are important tools in the analysis of treatment plans during radiation therapy in patients with early-stage breast cancer. Adding segmental fields in the administration of radiotherapy in patients with conservatively treated breast cancer can lead to improved dose homogeneity and conformity.
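    Definitions of these indices vary between protocols; one common convention, consistent with the HI values near 1.08 quoted above, is HI = Dmax/Dprescribed over the target, together with the RTOG conformity index CI = Vreference isodose/Vtarget. A hedged sketch with hypothetical doses and volumes (the study does not publish its per-voxel data):

```python
import numpy as np

def homogeneity_index(target_doses, d_prescribed):
    """HI = Dmax / Dprescribed over the target volume (1.0 is ideal)."""
    return float(np.max(target_doses) / d_prescribed)

def conformity_index_rtog(v_ref_isodose, v_target):
    """RTOG CI = volume enclosed by the reference isodose / target volume."""
    return v_ref_isodose / v_target

doses = np.array([49.2, 50.1, 51.8, 54.0, 50.6])  # hypothetical voxel doses (Gy)
hi = homogeneity_index(doses, 50.0)               # 54.0 / 50.0 = 1.08
ci = conformity_index_rtog(620.0, 560.0)          # hypothetical volumes (cm^3)
print(hi, ci)
```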

  15. Effective production of bioenergy from marine Chlorella sp. by high-pressure homogenization

    Directory of Open Access Journals (Sweden)

    Woon Yong Choi

    2016-01-01

    This study investigated the use of a high-pressure homogenization process for the production of high shear stress on Chlorella sp. cells in order to effectively degrade their cell walls. The high-pressure homogenization process was conducted using various pressure conditions in the range of 68.94–275.78 MPa with different numbers of repeated cycles. The optimal high-pressure homogenization pretreatment conditions were found to be two cycles at a pressure of 206.84 MPa, which provided an extraction yield of 20.35% (w/w) total cellular lipids. In addition, based on confocal microscopic images of Chlorella sp. cells stained with Nile red, the walls of Chlorella sp. cells were disrupted more effectively using this process than by conventional lipid-extraction processes. By using the by-product of Chlorella sp., 47.3% ethanol was obtained from Saccharomyces cerevisiae cultures. These results showed that the high-pressure homogenization process efficiently hydrolysed this marine resource for subsequent bioethanol production using only water.

  16. Homogenization of locally resonant acoustic metamaterials towards an emergent enriched continuum.

    Science.gov (United States)

    Sridhar, A; Kouznetsova, V G; Geers, M G D

    This contribution presents a novel homogenization technique for modeling heterogeneous materials with micro-inertia effects, such as locally resonant acoustic metamaterials. Linear elastodynamics is used to model the micro- and macroscale problems, and an extended first-order Computational Homogenization framework is used to establish the coupling. Craig-Bampton mode synthesis is then applied to solve and eliminate the microscale problem, resulting in a compact closed-form description of the microdynamics that accurately captures the local resonance phenomena. The resulting equations represent an enriched continuum in which additional kinematic degrees of freedom emerge to account for local resonance effects, which would otherwise be absent in a classical continuum. Such an approach retains the accuracy and robustness offered by a standard Computational Homogenization implementation, whereby the problem and the computational time are reduced to the on-line solution of one scale only.
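    The Craig-Bampton reduction invoked above can be sketched on a generic structural model. The example below is a minimal illustration of the reduction itself, not of the paper's homogenization framework: a fixed-free spring-mass chain is reduced to a few fixed-interface modes plus one interface DOF, and the lowest eigenfrequency is checked against the full model.

```python
import numpy as np
from scipy.linalg import eigh

n = 10                                   # DOFs of a fixed-free spring-mass chain
K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
K[-1, -1] = 1.0                          # free end
M = np.eye(n)

i = np.arange(n - 1)                     # interior DOFs
b = np.array([n - 1])                    # boundary (interface) DOF
Kii, Kib = K[np.ix_(i, i)], K[np.ix_(i, b)]
Mii = M[np.ix_(i, i)]

psi = -np.linalg.solve(Kii, Kib)         # static constraint modes
m_keep = 4                               # retained fixed-interface modes
w2, phi = eigh(Kii, Mii)                 # fixed-interface normal modes
phi = phi[:, :m_keep]

# Craig-Bampton transformation u = T @ [eta; u_b]
T = np.zeros((n, m_keep + 1))
T[:n - 1, :m_keep] = phi
T[:n - 1, m_keep:] = psi
T[n - 1, m_keep] = 1.0

Kr, Mr = T.T @ K @ T, T.T @ M @ T        # reduced stiffness and mass
w2_full = eigh(K, M, eigvals_only=True)[0]
w2_red = eigh(Kr, Mr, eigvals_only=True)[0]
print(w2_full, w2_red)                   # lowest eigenvalues nearly identical
```

    The reduction is exact for static interface loading and very accurate for the lowest modes, which is why it can "solve and eliminate" the microscale problem at little cost.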

  17. Homogeneous Catalysis with Metal Complexes Fundamentals and Applications

    CERN Document Server

    Duca, Gheorghe

    2012-01-01

    This book on homogeneous catalysis with metal complexes deals with the description of redox reactions of metal complexes in the liquid phase (in polar solvents, mainly water, and less often in nonpolar solvents). The exceptional importance of redox processes in chemical systems, in the reactions occurring in living organisms, in environmental processes (atmosphere, water, soil), and in industrial technologies (especially the food-processing industries) is discussed. The practical aspects of the established regularities are explained in detail for solving specific practical tasks in various fields of industrial chemistry, biochemistry, medicine, analytical chemistry and ecological chemistry. The main scope of the book is the survey and systematization of the latest advances in homogeneous catalysis with metal complexes. It gives an overview of the research results and practical experience accumulated by the author during the last decade.

  18. On the time-homogeneous Ornstein-Uhlenbeck process in the foreign exchange rates

    Science.gov (United States)

    da Fonseca, Regina C. B.; Matsushita, Raul Y.; de Castro, Márcio T.; Figueiredo, Annibal

    2015-10-01

    Since Gaussianity and stationarity assumptions cannot be fulfilled by financial data, the time-homogeneous Ornstein-Uhlenbeck (THOU) process was introduced as a candidate model to describe time series of financial returns [1]. It is an Ornstein-Uhlenbeck (OU) process in which these assumptions are replaced by linearity and time-homogeneity. We employ the OU and THOU processes to analyze daily foreign exchange rates against the US dollar. We confirm that the OU process does not fit the data, while in most cases the patterns of the first four cumulants of the data can be described by the THOU process. However, there are some exceptions in which the data do not follow the linearity or time-homogeneity assumptions.

  19. A facile approach to manufacturing non-ionic surfactant nanodispersions using proniosome technology and high-pressure homogenization.

    Science.gov (United States)

    Najlah, Mohammad; Hidayat, Kanar; Omer, Huner K; Mwesigwa, Enosh; Ahmed, Waqar; AlObaidy, Kais G; Phoenix, David A; Elhissi, Abdelbary

    2015-03-01

    In this study, a niosome nanodispersion was manufactured using high-pressure homogenization following the hydration of proniosomes. Using beclometasone dipropionate (BDP) as a model drug, the characteristics of the homogenized niosomes were compared with vesicles prepared via the conventional approach of probe sonication. Particle size, zeta potential, and drug entrapment efficiency were similar for both size-reduction mechanisms. However, high-pressure homogenization was much more efficient than sonication in terms of homogenization output rate and avoidance of sample contamination, offering a greater potential for large-scale manufacturing of niosome nanodispersions. For example, high-pressure homogenization was capable of producing small niosomes (209 nm) in a short, single size-reduction step (6 min), compared with the time-consuming process of sonication (237 nm in >18 min); the corresponding BDP entrapment efficiencies were 29.65 ± 4.04% and 36.4 ± 2.8%, respectively. In addition, the output rate of high-pressure homogenization was 10 ml/min, compared with 0.83 ml/min for the sonication protocol. In conclusion, a facile, applicable, and highly efficient approach for preparing niosome nanodispersions has been established using proniosome technology and high-pressure homogenization.

  20. Distributions asymptotically homogeneous along the trajectories determined by one-parameter groups

    International Nuclear Information System (INIS)

    Drozhzhinov, Yurii N; Zav'yalov, Boris I

    2012-01-01

    We give a complete description of distributions that are asymptotically homogeneous (including the case of critical index of the asymptotic scale) along the trajectories determined by continuous multiplicative one-parameter transformation groups such that the real parts of all eigenvalues of the infinitesimal matrix are positive. To do this, we introduce and study special spaces of distributions. As an application of our results, we describe distributions that are homogeneous along such groups.

  1. WHAMP - waves in homogeneous, anisotropic, multicomponent plasmas

    International Nuclear Information System (INIS)

    Roennmark, K.

    1982-06-01

    In this report, a computer program which solves the dispersion relation of waves in a magnetized plasma is described. The dielectric tensor is derived using the kinetic theory of homogeneous plasmas with Maxwellian velocity distribution. Up to six different plasma components can be included in this version of the program, and each component is specified by its density, temperature, particle mass, anisotropy and drift velocity along the magnetic field. The program is thus applicable to a very wide class of plasmas, and the method should in general be useful whenever a homogeneous magnetized plasma can be approximated by a linear combination of Maxwellian components. The general theory underlying the program is outlined. It is shown that by introducing a Pade approximant for the plasma dispersion function Z, the infinite sums of modified Bessel functions which appear in the dielectric tensor may be reduced to a summable form. The Pade approximant is derived and the accuracy of the approximation is also discussed. The subroutines making up the program are described. (Author)
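    The plasma dispersion function Z that the report approximates with a Padé form can be evaluated exactly through the Faddeeva function w, via Z(ζ) = i√π w(ζ); this is a convenient reference when checking the accuracy of any Padé fit. A sketch using SciPy's `wofz`, with the identity Z′(ζ) = −2(1 + ζZ(ζ)) as a self-check:

```python
import numpy as np
from scipy.special import wofz

def Z(zeta):
    """Plasma dispersion function via the Faddeeva function: Z = i*sqrt(pi)*w(zeta)."""
    return 1j * np.sqrt(np.pi) * wofz(zeta)

def Z_prime(zeta):
    """Exact derivative identity Z'(zeta) = -2*(1 + zeta*Z(zeta))."""
    return -2.0 * (1.0 + zeta * Z(zeta))

z0 = Z(0.0)
print(z0)  # i*sqrt(pi), approximately 1.7725j

# finite-difference check of the derivative identity at zeta = 0.7
h = 1e-6
fd = (Z(0.7 + h) - Z(0.7 - h)) / (2 * h)
print(abs(fd - Z_prime(0.7)))  # negligibly small
```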

  2. Unified double- and single-sided homogeneous Green's function representations

    Science.gov (United States)

    Wapenaar, Kees; van der Neut, Joost; Slob, Evert

    2016-06-01

    In wave theory, the homogeneous Green's function consists of the impulse response to a point source, minus its time-reversal. It can be represented by a closed boundary integral. In many practical situations, the closed boundary integral needs to be approximated by an open boundary integral because the medium of interest is often accessible from one side only. The inherent approximations are acceptable as long as the effects of multiple scattering are negligible. However, in case of strongly inhomogeneous media, the effects of multiple scattering can be severe. We derive double- and single-sided homogeneous Green's function representations. The single-sided representation applies to situations where the medium can be accessed from one side only. It correctly handles multiple scattering. It employs a focusing function instead of the backward propagating Green's function in the classical (double-sided) representation. When reflection measurements are available at the accessible boundary of the medium, the focusing function can be retrieved from these measurements. Throughout the paper, we use a unified notation which applies to acoustic, quantum-mechanical, electromagnetic and elastodynamic waves. We foresee many interesting applications of the unified single-sided homogeneous Green's function representation in holographic imaging and inverse scattering, time-reversed wave field propagation and interferometric Green's function retrieval.

  3. Homogeneous dose distribution of electrons obtained from linear accelerators equipped with beam scanners

    International Nuclear Information System (INIS)

    Borchardt, D.; Cwiekala, M.; Schroeder, U.G.

    1985-01-01

    A homogeneous distribution of electrons used for therapeutic purposes and obtained from accelerators is achieved by means of Potter-Bucky diaphragms or by repeated, staggered, sawtooth-shaped sweeping movements of the electron beam (scanning) over the radiation field. The repetition of the scanning process (number of scans) can result in long measurement times for achieving a sufficiently homogeneous, dosimetrically adequate distribution of the electrons. This ''time problem'' makes it imperative to achieve good homogeneity while keeping the number of scans as low as possible. To address this, the scanning movement of the electron beam is simulated by a computer programme and the dependence of the homogeneity of the irradiation field on the number of scans is investigated. Since changing the ratio of the two deflection rates exercises a significant influence, it is mandatory in dosimetry to pay close attention to strict observance of the deflection rates. (orig.) [de
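    The homogeneity-versus-scan trade-off described above can be reproduced qualitatively by superposing Gaussian beam spots at the pitch set by the scan pattern and measuring the ripple of the summed profile. A hedged sketch with generic Gaussian spots (not the authors' accelerator model or their simulation programme):

```python
import numpy as np

def field_ripple(pitch, sigma=1.0, n_spots=41):
    """Peak-to-peak ripple (relative to the mean) of overlapping Gaussian spots."""
    x = np.linspace(-5.0, 5.0, 2001)             # central region of the field
    centers = (np.arange(n_spots) - n_spots // 2) * pitch
    profile = sum(np.exp(-0.5 * ((x - c) / sigma)**2) for c in centers)
    return (profile.max() - profile.min()) / profile.mean()

coarse = field_ripple(pitch=2.0)   # few, widely spaced scan lines
fine = field_ripple(pitch=0.5)     # denser scanning of the same field
print(coarse, fine)                # ripple drops sharply with finer pitch
```

    The ripple falls roughly as exp(−2π²σ²/p²) with spot pitch p, which is why a small increase in the number of scans buys a large gain in homogeneity.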

  4. Two-dimensional arbitrarily shaped acoustic cloaks composed of homogeneous parts

    Science.gov (United States)

    Li, Qi; Vipperman, Jeffrey S.

    2017-10-01

    Acoustic cloaking is an important application of acoustic metamaterials. Although the topic has received much attention, there are a number of areas where contributions are needed. In this paper, a design method for producing acoustic cloaks with arbitrary shapes that are composed of homogeneous parts is presented. The cloak is divided into sections, each of which, in turn, is further divided into two parts, followed by the application of transformation acoustics to derive the required properties for cloaking. With the proposed mapping relations, the properties of each part of the cloak are anisotropic but homogeneous, which can be realized using two alternating layers of homogeneous and isotropic materials. A hexagonal and an irregular cloak are presented as design examples. The full wave simulations using COMSOL Multiphysics finite element software show that the cloaks function well at reducing reflections and shadows. The variation of the cloak properties is investigated as a function of three important geometric parameters used in the transformations. A balance can be found between cloaking performance and materials properties that are physically realizable.

  5. Electromagnetic Radiation in a Uniformly Moving, Homogeneous Medium

    DEFF Research Database (Denmark)

    Johannsen, Günther

    1972-01-01

    A new method of treating radiation problems in a uniformly moving, homogeneous medium is presented. A certain transformation technique in connection with the four-dimensional Green's function method makes it possible to elaborate the Green's functions of the governing differential equations...

  6. Influence of Interspecific Competition and Landscape Structure on Spatial Homogenization of Avian Assemblages

    Science.gov (United States)

    Robertson, Oliver J.; McAlpine, Clive; House, Alan; Maron, Martine

    2013-01-01

    Human-induced biotic homogenization resulting from landscape change and increased competition from widespread generalists or ‘winners’, is widely recognized as a global threat to biodiversity. However, it remains unclear what aspects of landscape structure influence homogenization. This paper tests the importance of interspecific competition and landscape structure, for the spatial homogeneity of avian assemblages within a fragmented agricultural landscape of eastern Australia. We used field observations of the density of 128 diurnal bird species to calculate taxonomic and functional similarity among assemblages. We then examined whether taxonomic and functional similarity varied with patch type, the extent of woodland habitat, land-use intensity, habitat subdivision, and the presence of Manorina colonies (a competitive genus of honeyeaters). We found the presence of a Manorina colony was the most significant factor positively influencing both taxonomic and functional similarity of bird assemblages. Competition from members of this widespread genus of native honeyeater, rather than landscape structure, was the main cause of both taxonomic and functional homogenization. These species have not recently expanded their range, but rather have increased in density in response to agricultural landscape change. The negative impacts of Manorina honeyeaters on assemblage similarity were most pronounced in landscapes of moderate land-use intensity. We conclude that in these human-modified landscapes, increased competition from dominant native species, or ‘winners’, can result in homogeneous avian assemblages and the loss of specialist species. These interacting processes make biotic homogenization resulting from land-use change a global threat to biodiversity in modified agro-ecosystems. PMID:23724136
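    The taxonomic similarity among assemblages used in such analyses is computed from site-by-species density vectors; one widely used choice is Bray-Curtis similarity. A sketch with illustrative densities (the study does not specify its metric, so this is purely an example of the kind of calculation involved):

```python
import numpy as np

def bray_curtis_similarity(x, y):
    """1 - Bray-Curtis dissimilarity between two density vectors (1 = identical)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return 1.0 - np.abs(x - y).sum() / (x + y).sum()

site_a = [12, 0, 3, 5]   # densities of four species at site A
site_b = [12, 0, 3, 5]   # identical assemblage
site_c = [0, 8, 0, 0]    # no shared species
print(bray_curtis_similarity(site_a, site_b))  # 1.0
print(bray_curtis_similarity(site_a, site_c))  # 0.0
```

    Homogenization shows up as a rise in the average pairwise similarity across sites.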

  7. Improvement of the homogeneity of atomized particles dispersed in high uranium density research reactor fuels

    International Nuclear Information System (INIS)

    Kim, Chang-Kyu; Kim, Ki-Hwan; Park, Jong-Man; Lee, Yoon-Sang; Lee, Don-Bae; Sohn, Woong-Hee; Hong, Soon-Hyung

    1998-01-01

    A study on improving the homogeneous dispersion of atomized spherical particles in fuel meats has been performed in connection with the development of high uranium density fuel. In comparing various mixing methods, better homogeneity of the mixture was obtained, in decreasing order, with the Spex mill, the V-shape tumbler mixer, and the off-axis rotating drum mixer. The Spex mill required some laborious work because of its small capacity per batch. Through optimizing the rotating speed parameter for the V-shape tumbler mixer, almost the same homogeneity as with the Spex mill could be obtained. The homogeneity of the fuel meats appeared to improve through extrusion. All extruded fuel meats with 50 volume % U₃Si powder had fairly smooth surfaces. The homogeneity of fuel meats prepared with the V-shape tumbler mixer appeared fairly good on micrographs. (author)

  8. Do Algorithms Homogenize Students' Achievements in Secondary School Better than Teachers' Tracking Decisions?

    Science.gov (United States)

    Klapproth, Florian

    2015-01-01

    Two objectives guided this research. First, this study examined how well teachers' tracking decisions contribute to the homogenization of their students' achievements. Second, the study explored whether teachers' tracking decisions would be outperformed in homogenizing the students' achievements by statistical models of tracking decisions. These…

  9. Microsegregation and homogenization in U-Nb alloy

    International Nuclear Information System (INIS)

    Leal, J. Fernando; Nogueira, R.A.; Ambrozio Filho, F.

    1987-01-01

    Microsegregation results for U-4 wt% Nb alloys cast in a nonconsumable-electrode arc furnace are presented. The microsegregation is studied qualitatively by optical microscopy and quantitatively by electron microprobe. The degree of homogenization has been measured after 800 °C heat treatments. The times required for homogenization of the alloys are also discussed. (author) [pt

  10. Sampling and Homogenization Strategies Significantly Influence the Detection of Foodborne Pathogens in Meat.

    Science.gov (United States)

    Rohde, Alexander; Hammerl, Jens Andre; Appel, Bernd; Dieckmann, Ralf; Al Dahouk, Sascha

    2015-01-01

    Efficient preparation of food samples, comprising sampling and homogenization, for microbiological testing is an essential, yet largely neglected, component of foodstuff control. Salmonella enterica spiked chicken breasts were used as a surface contamination model, whereas salami and meat paste acted as models of inner-matrix contamination. A systematic comparison of different homogenization approaches, namely, stomaching, sonication, and milling by FastPrep-24 or SpeedMill, revealed that for surface contamination a broad range of sample pretreatment steps is applicable and loss of culturability due to the homogenization procedure is marginal. In contrast, for inner-matrix contamination long treatments up to 8 min are required, and only FastPrep-24, as a large-volume milling device, produced consistently good recovery rates. In addition, sampling of different regions of the spiked sausages showed that pathogens are not necessarily homogeneously distributed throughout the entire matrix. Instead, in meat paste the core region contained considerably more pathogens than the rim, whereas in the salamis the distribution was more even, with an increased concentration within the intermediate region of the sausages. Our results indicate that sampling and homogenization, as integral parts of food microbiology and monitoring, deserve more attention to further improve food safety.

  11. Fault Diagnosis of Supervision and Homogenization Distance Based on Local Linear Embedding Algorithm

    Directory of Open Access Journals (Sweden)

    Guangbin Wang

    2015-01-01

    In view of the problems of the uneven distribution of real fault samples and of the dimension-reduction behaviour of the locally linear embedding (LLE) algorithm, which is easily affected by neighboring points, an improved local linear embedding algorithm based on homogenization distance (HLLE) is developed. The method makes the overall distribution of sample points tend toward homogenization and reduces the influence of neighboring points by using the homogenization distance instead of the traditional Euclidean distance. This helps to choose effective neighboring points for constructing the weight matrix for dimension reduction. Because the fault recognition improvement of HLLE is limited and unstable, the paper further proposes a new local linear embedding algorithm of supervision and homogenization distance (SHLLE) by adding a supervised learning mechanism. On the basis of the homogenization distance, supervised learning adds the category information of sample points so that sample points of the same category are gathered and sample points of heterogeneous categories are scattered. It effectively improves the performance of fault diagnosis while maintaining stability. A comparison of the methods mentioned above was made by simulation experiments with rotor system fault diagnosis, and the results show that the SHLLE algorithm has superior fault recognition performance.

  12. Influence of Homogenization on Microstructural Response and Mechanical Property of Al-Cu-Mn Alloy.

    Science.gov (United States)

    Wang, Jian; Lu, Yalin; Zhou, Dongshuai; Sun, Lingyan; Li, Renxing; Xu, Wenting

    2018-05-29

    The evolution of the microstructures and properties of large direct-chill (DC)-cast Al-Cu-Mn alloy ingots during homogenization was investigated. The results revealed that the Al-Cu-Mn alloy ingots had severe microsegregation; the main secondary phase was Al₂Cu, with minimal Al₇Cu₂Fe phase. Numerous primary eutectic phases existed at the grain boundaries, and the main elements were segregated at the interfaces along the interdendritic region. During homogenization, the grain boundaries became discontinuous, residual phases were effectively dissolved into the matrix, and the segregation degree of all elements was reduced dramatically. In addition, the homogenized alloys exhibited improved microstructures with finer grain size, a higher number density of dislocation networks, a higher density of uniformly distributed θ' or θ phase (Al₂Cu), and a higher volume fraction of high-angle grain boundaries compared to the nonhomogenized samples. After the optimal homogenization scheme (535 °C for 10 h), the measured strength and elongation values were about 24 MPa, 20.5 MPa, and 1.3% higher, respectively, than those of the specimen without homogenization treatment.

  13. Microstructural evolution during the homogenization heat treatment of 6XXX and 7XXX aluminum alloys

    Science.gov (United States)

    Priya, Pikee

    Homogenization heat treatment of as-cast billets is an important step in the processing of aluminum extrusions. Microstructural evolution during homogenization involves elimination of the eutectic morphology by spheroidization of the interdendritic phases, minimization of the microsegregation across the grains through diffusion, dissolution of the low-melting phases, which enhances the surface finish of the extrusions, and precipitation of nano-sized dispersoids (for Cr-, Zr-, Mn-, and Sc-containing alloys), which inhibit grain boundary motion to prevent recrystallization. Post-homogenization cooling reprecipitates some of the phases, changing the flow stress required for subsequent extrusion. These precipitates, however, are deleterious to the mechanical properties of the alloy and also hamper age-hardenability, and are hence dissolved during solution heat treatment. Microstructural development during homogenization and subsequent cooling occurs both at the length scale of the Secondary Dendrite Arm Spacing (SDAS), in micrometers, and at that of the dispersoids, in nanometers. Numerical tools to simulate microstructural development at both length scales have been developed and validated against experiments. These tools provide easy and convenient means to study the process. A Cellular Automaton-Finite Volume-based model for the evolution of interdendritic phases is coupled with a Particle Size Distribution-based model for the precipitation of dispersoids across the grain. This comprehensive model has been used to study the effect of temperature, composition, as-cast microstructure, and cooling rate during post-homogenization quenching on microstructural evolution. The numerical study has been complemented by experiments involving scanning electron microscopy, energy-dispersive spectroscopy, X-ray diffraction and differential scanning calorimetry, and good agreement with the numerical results has been found. The current work aims to study the microstructural evolution during
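    The interdendritic diffusion step of such homogenization models is often idealized as the decay of a sinusoidal composition profile of wavelength λ (of the order of the SDAS), whose amplitude falls as exp(−4π²Dt/λ²). A sketch, in nondimensional units with illustrative parameters (not the thesis model), that checks an explicit finite-difference solution against that analytic factor:

```python
import numpy as np

# sinusoidal segregation profile of wavelength L across one dendrite arm spacing
nx, L, D, t_end = 200, 1.0, 1.0, 0.01
dx = L / nx
dt = 0.25 * dx**2 / D                        # stable explicit time step
x = np.arange(nx) * dx
c = 1.0 + 0.2 * np.sin(2 * np.pi * x / L)    # mean 1, initial amplitude 0.2

steps = int(round(t_end / dt))
for _ in range(steps):                       # FTCS diffusion, periodic boundaries
    c = c + D * dt / dx**2 * (np.roll(c, 1) - 2 * c + np.roll(c, -1))

amp_num = 0.5 * (c.max() - c.min()) / 0.2    # residual segregation index delta
amp_exact = np.exp(-4 * np.pi**2 * D * t_end / L**2)
print(amp_num, amp_exact)
```

    Inverting the analytic factor gives the classic homogenization-time estimate t = (λ²/4π²D) ln(1/δ) for a target residual segregation δ.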

  14. More than Just Convenient: The Scientific Merits of Homogeneous Convenience Samples

    Science.gov (United States)

    Jager, Justin; Putnick, Diane L.; Bornstein, Marc H.

    2017-01-01

    Despite their disadvantaged generalizability relative to probability samples, non-probability convenience samples are the standard within developmental science, and likely will remain so because probability samples are cost-prohibitive and most available probability samples are ill-suited to examining developmental questions. In lieu of focusing on how to eliminate or sharply reduce reliance on convenience samples within developmental science, here we propose how to augment their advantages when it comes to understanding population effects as well as subpopulation differences. Although all convenience samples have less clear generalizability than probability samples, we argue that homogeneous convenience samples have clearer generalizability than conventional convenience samples. Therefore, when researchers are limited to convenience samples, they should consider homogeneous convenience samples as a positive alternative to conventional (or heterogeneous) convenience samples. We discuss future directions as well as potential obstacles to expanding the use of homogeneous convenience samples in developmental science. PMID:28475254

  15. Micro segregation and homogenization treatments of uranium-niobium alloys (U-Nb)

    International Nuclear Information System (INIS)

    Leal, Jose Fernando

    1988-01-01

    In the following sections, microsegregation results for U-3.6 wt% Nb and U-6.1 wt% Nb alloys cast in a non-consumable electrode arc furnace are presented. The microsegregation is studied qualitatively by optical microscopy and quantitatively by electron microprobe. The degree of homogenization has been measured after heat treatments at 800 and 850 deg C in a tubular resistive furnace. The microstructures after heat treatment are quantitatively analysed to check the effects on the as-cast structures, mainly the variation of solute along the dendrite arm spacing. Some solidification phenomena are then discussed with reference to theoretical models of dendritic solidification, including microstructure and microsegregation. The experimental results are compared with theory on the basis of the initial and residual microsegregation after the homogenization treatments. The times required for homogenization of the alloys are also discussed as a function of the microsegregation in the as-cast structures and the treatment temperatures. (author)
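    The time scale of such homogenization treatments can be sketched with the classical sinusoidal-microsegregation model, in which the residual amplitude decays as delta/delta0 = exp(-4*pi^2*D*t/l^2) for an interdendritic spacing l. The diffusivities and spacing below are illustrative assumptions, not measured U-Nb data:

```python
import math

# Hedged sketch: time for the residual microsegregation amplitude to fall
# to a target fraction, inverting delta/delta0 = exp(-4*pi^2*D*t/l^2).
# D and the dendrite arm spacing are assumed values for illustration only.

def homogenization_time(D, spacing, target=0.05):
    """Time [s] for the segregation amplitude to reach `target` of its initial value."""
    return -math.log(target) * spacing ** 2 / (4.0 * math.pi ** 2 * D)

for D in (1e-15, 5e-15):                               # m^2/s, assumed diffusivities
    hours = homogenization_time(D, 20e-6) / 3600.0     # 20 um arm spacing, assumed
    print(round(hours, 1), "h")
```

    As expected, raising the treatment temperature (larger D) shortens the required hold roughly in proportion to 1/D, which mirrors the qualitative discussion in the record.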

  16. [Methods for enzymatic determination of triglycerides in liver homogenates].

    Science.gov (United States)

    Höhn, H; Gartzke, J; Burck, D

    1987-10-01

    An enzymatic method is described for the determination of triacylglycerols in liver homogenate. In contrast to usual methods, higher reliability and selectivity are achieved by omitting the extraction step.

  17. Heterotic strings on homogeneous spaces

    International Nuclear Information System (INIS)

    Israel, D.; Kounnas, C.; Orlando, D.; Petropoulos, P.M.

    2005-01-01

    We construct heterotic string backgrounds corresponding to families of homogeneous spaces as exact conformal field theories. They contain left cosets of compact groups by their maximal tori, supported by NS-NS 2-forms and gauge field fluxes. We give the general formalism and modular-invariant partition functions, then we consider some examples such as SU(2)/U(1) ≃ S^2 (already described in a previous paper) and the SU(3)/U(1)^2 flag space. As an application we construct new supersymmetric string vacua with magnetic fluxes and a linear dilaton. (Abstract Copyright [2005], Wiley Periodicals, Inc.)

  18. Lambda-Cyhalothrin Nanosuspension Prepared by the Melt Emulsification-High Pressure Homogenization Method

    OpenAIRE

    Pan, Zhenzhong; Cui, Bo; Zeng, Zhanghua; Feng, Lei; Liu, Guoqiang; Cui, Haixin; Pan, Hongyu

    2015-01-01

    The nanosuspension of 5% lambda-cyhalothrin with 0.2% surfactants was prepared by the melt emulsification-high pressure homogenization method. The surfactant composition, content, and homogenization process were optimized. The anionic surfactant (1-dodecanesulfonic acid sodium salt) and polymeric surfactant (maleic rosin-polyoxypropylene-polyoxyethylene ether sulfonate), screened from 12 types of commercially common surfactants, were used to prepare lambda-cyhalothrin nanosuspension with ...

  19. Subspace identification of distributed clusters of homogeneous systems

    NARCIS (Netherlands)

    Yu, C.; Verhaegen, M.H.G.

    2017-01-01

    This note studies the identification of a network comprised of interconnected clusters of LTI systems. Each cluster consists of homogeneous dynamical systems, and its interconnections with the rest of the network are unmeasurable. A subspace identification method is proposed for identifying a single

  20. Efficiency measures for a non-homogeneous group of family farmers

    Directory of Open Access Journals (Sweden)

    Eliane Gonçalves Gomes

    2012-12-01

    DEA models assume the homogeneity of the units under evaluation (DMUs). However, in some cases the DMUs use different production technologies, and they should then be evaluated separately. In this paper we evaluate the efficiency of family farmers from the Brazilian Eastern Amazon, who use different agricultural production systems. We propose an alternative algorithm to assess global efficiency that takes the non-homogeneity into account. The results show that the farmers who use the classical technology are more efficient, from a purely economic point of view, than those considered "environmentally friendly".

  1. A homogeneous catalogue of quasar candidates found with slitless spectroscopy

    International Nuclear Information System (INIS)

    Beauchemin, M.; Borra, E.F.; Edwards, G.

    1990-01-01

    This paper gives a list of all quasar candidates obtained from an automated computer search performed on 11 grens plates. The main characteristics of the survey are described, along with the latest improvements in the selection techniques. Particular attention has been paid to understanding and quantifying selection effects, which allows the construction of homogeneous samples having well-understood characteristics. The noteworthy aspect of our homogenization process is the correction that we apply to our probability classes in order to take into account the signal-to-noise differences, at a given magnitude, among plates of different limiting magnitudes. (author)

  2. The Transition to Large-scale Cosmic Homogeneity in the WiggleZ Dark Energy Survey

    Science.gov (United States)

    Scrimgeour, Morag; Davis, T.; Blake, C.; James, B.; Poole, G. B.; Staveley-Smith, L.; Dark Energy Survey, WiggleZ

    2013-01-01

    The most fundamental assumption of the standard cosmological model (ΛCDM) is that the universe is homogeneous on large scales. This is clearly not true on small scales, where clusters and voids exist, and some studies seem to suggest that galaxies follow a fractal distribution up to very large scales (200 h-1 Mpc or more), whereas the ΛCDM model predicts a transition to homogeneity at scales of ~100 h-1 Mpc. Any cosmological measurements made below the scale of homogeneity (such as the power spectrum) could be misleading, so it is crucial to measure the scale of homogeneity in the Universe. We have used the WiggleZ Dark Energy Survey to make the largest-volume measurement to date of the transition to homogeneity in the galaxy distribution. WiggleZ is a UV-selected spectroscopic survey of ~200,000 luminous blue galaxies up to z=1, made with the Anglo-Australian Telescope. We have corrected for survey incompleteness using random catalogues that account for the various survey selection criteria, and tested the robustness of our results using a suite of fractal mock catalogues. The large volume and depth of WiggleZ allow us to probe the transition of the galaxy distribution to homogeneity on large scales and over several epochs, and to see whether this is consistent with the ΛCDM prediction.

  3. Fourier-Accelerated Nodal Solvers (FANS) for homogenization problems

    Science.gov (United States)

    Leuschner, Matthias; Fritzen, Felix

    2017-11-01

    Fourier-based homogenization schemes are useful to analyze heterogeneous microstructures represented by 2D or 3D image data. These iterative schemes involve discrete periodic convolutions with global ansatz functions (mostly fundamental solutions). The convolutions are efficiently computed using the fast Fourier transform. FANS operates on nodal variables on regular grids and converges to finite element solutions. Compared to established Fourier-based methods, the number of convolutions is reduced by FANS. Additionally, fast iterations are possible by assembling the stiffness matrix. Due to the related memory requirement, the method is best suited for medium-sized problems. A comparative study involving established Fourier-based homogenization schemes is conducted for a thermal benchmark problem with a closed-form solution. Detailed technical and algorithmic descriptions are given for all methods considered in the comparison. Furthermore, many numerical examples focusing on convergence properties for both thermal and mechanical problems, including also plasticity, are presented.
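    The fixed-point structure shared by such Fourier-based schemes can be illustrated in one dimension, where the thermal "basic scheme" converges to a field with uniform flux and the effective conductivity reduces to the harmonic mean. This is a minimal sketch with a naive O(N^2) DFT, not the FANS algorithm itself:

```python
import cmath

def dft(x):
    """Naive O(N^2) discrete Fourier transform (stand-in for an FFT)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def effective_conductivity(k, E=1.0, iters=200):
    """1D thermal basic scheme: iterate on the temperature-gradient field e(x)
    with prescribed mean E until the flux k(x)*e(x) is (numerically) uniform."""
    N = len(k)
    k0 = 0.5 * (min(k) + max(k))   # reference medium
    e = [E] * N                    # initial guess: uniform gradient
    for _ in range(iters):
        tau = [(k[n] - k0) * e[n] for n in range(N)]   # polarization field
        T = dft(tau)
        # Green operator in Fourier space: divide by k0; the zero mode fixes <e> = E
        Ehat = [complex(E * N) if xi == 0 else -T[xi] / k0 for xi in range(N)]
        e = [v.real for v in idft(Ehat)]
    # converged field satisfies k(x)*e(x) = const = k_eff * E
    return sum(k[n] * e[n] for n in range(N)) / (N * E)

k = [1.0] * 16 + [10.0] * 16   # two-phase laminate
print(round(effective_conductivity(k), 4))   # harmonic mean 20/11 ≈ 1.8182
```

    For the two-phase laminate the iteration recovers the closed-form harmonic mean 20/11, the kind of benchmark with a known solution mentioned in the abstract.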

  4. Producing a lycopene nanodispersion: Formulation development and the effects of high pressure homogenization.

    Science.gov (United States)

    Shariffa, Y N; Tan, T B; Uthumporn, U; Abas, F; Mirhosseini, H; Nehdi, I A; Wang, Y-H; Tan, C P

    2017-11-01

    The aim of this study was to develop formulations to produce lycopene nanodispersions and to investigate the effects of the homogenization pressure on the physicochemical properties of the lycopene nanodispersion. The samples were prepared by using the emulsification-evaporation technique. The best formulation was achieved by dispersing an organic phase (0.3% w/v lycopene dissolved in dichloromethane) in an aqueous phase (0.3% w/v Tween 20 dissolved in deionized water) at a ratio of 1:9 by using a homogenization process. Increasing the homogenization pressure up to 500 bar reduced the particle size and lycopene concentration significantly (p < 0.05). Further increases in homogenization pressure (700-900 bar) resulted in larger particle sizes with high dispersibility. The zeta potential and turbidity of the lycopene nanodispersion were significantly influenced by the homogenization pressure. The results from this study provide useful information for producing small-sized lycopene nanodispersions with a narrow PDI and good stability for application in beverage products. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Homogenization of some radiative heat transfer models: application to gas-cooled reactor cores

    International Nuclear Information System (INIS)

    El Ganaoui, K.

    2006-09-01

    In the context of homogenization theory, we treat heat transfer problems involving boundary conditions that are unusual from the homogenization standpoint. These problems are defined in a solid periodic perforated domain, where two scales (macroscopic and microscopic) are to be taken into account, and describe heat transfer by conduction in the solid and by radiation on the wall of each hole. Two kinds of radiation are considered: radiation in an infinite medium (a non-linear problem) and radiation in a cavity with grey-diffuse walls (a non-linear and non-local problem). The derived homogenized models are conduction problems with an effective conductivity which depends on the radiation considered. We thus introduce a framework (homogenization and validation) based on mathematical justification using the two-scale convergence method and on numerical validation by simulations using the computer code CAST3M. This study, performed for gas-cooled reactor cores, can be extended to other perforated domains involving the considered heat transfer phenomena. (author)
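    For context, the generic form of such an effective conduction model in periodic homogenization can be sketched as follows (a standard textbook statement with notation assumed here, not the report's exact radiative corrector):

```latex
% cell problem on the unit cell Y, with corrector \chi_j (Y-periodic):
-\,\nabla_y \!\cdot\! \big( k(y)\,( e_j + \nabla_y \chi_j ) \big) = 0 \quad \text{in } Y,
% effective (homogenized) conductivity tensor:
k^{\mathrm{hom}}_{ij} = \frac{1}{|Y|} \int_Y k(y) \Big( \delta_{ij}
      + \frac{\partial \chi_j}{\partial y_i} \Big) \, \mathrm{d}y .
```

    In the radiative cases treated in the record, the cell problem, and hence the effective conductivity, additionally depends on the type of radiation modeled on the hole walls.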

  6. Method to study the effect of blend flowability on the homogeneity of acetaminophen.

    Science.gov (United States)

    Llusá, Marcos; Pingali, Kalyana; Muzzio, Fernando J

    2013-02-01

    Excipient selection is key to product development because it affects the processability and physical properties of a formulation, which ultimately affect the quality attributes of the pharmaceutical product. The objective here is to study how the flowability of lubricated formulations affects acetaminophen (APAP) homogeneity. The formulations studied contain one of two types of cellulose (Avicel 102 or Ceolus KG-802), one of three grades of Mallinckrodt APAP (fine, semi-fine, or micronized), lactose (Fast-Flo) and magnesium stearate. These components are mixed in a 300-liter bin blender. Blend flowability is assessed with the Gravitational Displacement Rheometer, and APAP homogeneity with off-line NIR. Excluding blends dominated by segregation, there is a trend between APAP homogeneity and blend flow index. Blend flowability is affected by the type of microcrystalline cellulose and by the APAP grade. The preliminary results suggest that the methodology used in this paper is adequate for studying the effect of blend flow index on APAP homogeneity.

  7. Homogenization of a storage and/or disposal site in an underground damage or fractured medium

    International Nuclear Information System (INIS)

    Khvoenkova, N.

    2007-07-01

    The aim of this work was to model the flow and transport of a radionuclide in fractured rock. In order to be able to simulate these phenomena numerically in an industrial context, the homogenization method was applied. The theoretical study consisted in 1) determining a microscopic model of the fractured medium and 2) homogenizing that microscopic model. Two media were studied: a granitic medium and a calcareous medium. From the available experimental data, six possible microscopic models were deduced for each type of medium, depending on the choice of the fracturing (thin or thick) and on the relation between the porosities and the delay coefficients. The homogenization revealed three types of pollutant exchange between the fractures and the porous blocks: 1) instantaneous exchange for which the presence of the porous blocks has no influence on the global behaviour of the system; 2) instantaneous exchange for which the porous blocks absorb a non-negligible quantity of pollutant, the influence being determined only by the fracture system; and 3) non-instantaneous exchange. These homogenized models were studied numerically (with resolution by the Cast3M code). Simulation of the homogenized models gave results similar to those of the direct models. Moreover, the study of the homogenized diffusion tensor showed that the homogenized model takes into account the dispersion produced by the fracture system. From all these results, it can be concluded that homogenized models make it possible to estimate the contamination risk in fractured rock over long times. (O.M.)

  8. Effect of homogenization techniques on reducing the size of microcapsules and the survival of probiotic bacteria therein.

    Science.gov (United States)

    Ding, W K; Shah, N P

    2009-08-01

    This study investigated 2 different homogenization techniques for reducing the size of calcium alginate beads during the microencapsulation process of 8 probiotic bacteria strains, namely, Lactobacillus rhamnosus, L. salivarius, L. plantarum, L. acidophilus, L. paracasei, Bifidobacterium longum, B. lactis type Bi-04, and B. lactis type Bi-07. Two different homogenization devices were used, namely, an Ultra-Turrax benchtop homogenizer and a Microfluidics microfluidizer. Various settings of the homogenization equipment were studied, such as the number of passes, speed (rpm), duration (min), and pressure (psi). The traditional mixing method using a magnetic stirrer was used as a control. The size of the microcapsules resulting from each homogenization technique and setting was measured using a light microscope and a stage micrometer. The smallest capsules, measuring 31.2 microm, were created with the microfluidizer using 26 passes at 1200 psi for 40 min. The greatest loss in viability, of 3.21 log CFU/mL, was observed when using the Ultra-Turrax benchtop homogenizer at a speed of 1300 rpm for 5 min. Overall, both homogenization techniques reduced capsule sizes; however, homogenization settings at high rpm also greatly reduced the viability of the probiotic organisms.

  9. Homogenization of the critically spectral equation in neutron transport

    International Nuclear Information System (INIS)

    Allaire, G.; Paris-6 Univ., 75; Bal, G.

    1998-01-01

    We address the homogenization of an eigenvalue problem for the neutron transport equation in a periodic heterogeneous domain, modeling the criticality study of nuclear reactor cores. We prove that the neutron flux, corresponding to the first and unique positive eigenvector, can be factorized into the product of two terms, up to a remainder which goes strongly to zero with the period. One term is the first eigenvector of the transport equation in the periodicity cell. The other term is the first eigenvector of a diffusion equation in the homogenized domain. Furthermore, the corresponding eigenvalue gives a second-order corrector for the eigenvalue of the heterogeneous transport problem. This result justifies and improves the engineering procedure used in practice for nuclear reactor core computations. (author)
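    The factorization result can be written schematically as (notation assumed here for illustration):

```latex
\phi_\varepsilon(x) \;=\; \psi\!\Big(\tfrac{x}{\varepsilon}\Big)\, u(x) \;+\; r_\varepsilon(x),
\qquad r_\varepsilon \to 0 \ \text{strongly as } \varepsilon \to 0,
```

    where ψ is the first eigenvector of the cell transport problem, u is the first eigenvector of the homogenized diffusion problem, and the associated eigenvalues combine to approximate the heterogeneous eigenvalue to second order in the period ε.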

  10. Homogenization of the critically spectral equation in neutron transport

    Energy Technology Data Exchange (ETDEWEB)

    Allaire, G. [CEA Saclay, 91 - Gif-sur-Yvette (France). Dept. de Mecanique et de Technologie]|[Paris-6 Univ., 75 (France). Lab. d' Analyse Numerique; Bal, G. [Electricite de France (EDF), 92 - Clamart (France). Direction des Etudes et Recherches

    1998-07-01

    We address the homogenization of an eigenvalue problem for the neutron transport equation in a periodic heterogeneous domain, modeling the criticality study of nuclear reactor cores. We prove that the neutron flux, corresponding to the first and unique positive eigenvector, can be factorized into the product of two terms, up to a remainder which goes strongly to zero with the period. One term is the first eigenvector of the transport equation in the periodicity cell. The other term is the first eigenvector of a diffusion equation in the homogenized domain. Furthermore, the corresponding eigenvalue gives a second-order corrector for the eigenvalue of the heterogeneous transport problem. This result justifies and improves the engineering procedure used in practice for nuclear reactor core computations. (author)

  11. Determination of trace elements in dried sea-plant homogenate (SP-M-1) and in dried copepod homogenate (MA-A-1) by means of neutron activation analysis

    International Nuclear Information System (INIS)

    Tjioe, P.S.; Goeij, J.J.M. de; Bruin, M. de.

    1977-07-01

    Two marine environmental reference materials were investigated in an intercalibration study of trace elements. According to the specifications from the International Laboratory of Marine Radioactivity at Monaco, two samples, a sea-plant homogenate and a copepod homogenate, were analysed by neutron activation analysis. The results of the trace-element analyses were based on dry weight; the moisture content was measured on separate aliquots. For the intercalibration study the following elements were listed as elements of primary interest: mercury, cadmium, lead, manganese, zinc, copper, chromium, silver, iron and nickel. Of the 14 elements normally analysed with the routine destructive method, the element gold could not be measured in the two marine samples. This was due to the high residual bromine-82 activity in fraction 13, which contains the gold that distills over. With the nondestructive method, nine elements could be assessed, of which only three coincide with the 14 elements of the destructive method. A survey of all measured (trace) elements is presented. The 20 (trace) elements measured in the sea-plant homogenate and in the copepod homogenate comprise 8 of the 10 trace elements of primary interest, all 5 trace elements of secondary interest (arsenic, cobalt, antimony, selenium and vanadium), and 5 additional (trace) elements. The trace-element determination in both marine biological materials via the destructive procedure was carried out in twelvefold; in the nondestructive procedure, quadruple measurements were performed. For all trace-element levels analysed, an average value was calculated.

  12. A Test for Parameter Homogeneity in CO2 Panel EKC Estimations

    International Nuclear Information System (INIS)

    Dijkgraaf, E.; Vollebergh, H.R.J.

    2005-01-01

    This paper casts doubt on empirical results based on panel estimations of an 'inverted-U' relationship between per capita GDP and pollution. Using a new dataset for OECD countries on carbon dioxide emissions for the period 1960-1997, we find that the crucial assumption of homogeneity across countries is problematic. Decisively rejected are model specifications that feature even weaker homogeneity assumptions than are commonly used. Furthermore, our results challenge the existence of an overall Environmental Kuznets Curve for carbon dioxide emissions

  13. Bridging heterogeneous and homogeneous catalysis concepts, strategies, and applications

    CERN Document Server

    Li, Can

    2014-01-01

    This unique handbook fills the gap in the market for an up-to-date work that links both homogeneous catalysis applied to organic reactions and catalytic reactions on surfaces of heterogeneous catalysts.

  14. Properties of subvisible cirrus clouds formed by homogeneous freezing

    Directory of Open Access Journals (Sweden)

    B. Kärcher

    2002-01-01

    Number concentrations and mean sizes of ice crystals, and derived microphysical and optical properties of subvisible cirrus clouds (SVCs) formed by homogeneous freezing of supercooled aerosols, are investigated as a function of temperature and updraft speed of adiabatically ascending air parcels. The properties of such clouds are insensitive to variations of the aerosol number and size distribution. Based on criteria constraining the optical extinction, sedimentation time, and existence time of SVCs, longer-lived (>10 min) clouds, capable of exerting a measurable radiative or chemical impact, are generated within a narrow range of updraft speeds below 1-2 cm s-1 at temperatures below about 215 K, with concentrations of ice crystals not exceeding 0.1 cm-3. The clouds do not reach an equilibrium state because the ice crystals sediment out of the formation layer, typically before the supersaturation is removed. Two important conclusions emerge from this work. First, the above characteristics of SVCs may provide an explanation for why SVCs are more common in the cold tropical than in the warmer midlatitude tropopause region. Second, it seems likely that a limited number of effective heterogeneous freezing nuclei that nucleate ice below the homogeneous freezing threshold can control the formation and properties of SVCs, although homogeneous freezing nuclei are far more abundant.

  15. Formulae and Bounds connected to Optimal Design and Homogenization of Partial Differential Operators and Integral Functionals

    Energy Technology Data Exchange (ETDEWEB)

    Lukkassen, D.

    1996-12-31

    When partial differential equations are set up to model physical processes in strongly heterogeneous materials, effective parameters for heat transfer, electric conductivity etc. are usually required. Averaging methods often lead to convergence problems and in homogenization theory one is therefore led to study how certain integral functionals behave asymptotically. This mathematical doctoral thesis discusses (1) means and bounds connected to homogenization of integral functionals, (2) reiterated homogenization of integral functionals, (3) bounds and homogenization of some particular partial differential operators, (4) applications and further results. 154 refs., 11 figs., 8 tabs.

  16. Study of microinstabilities due to an anisotropic velocity distribution function of the particles of a homogeneous plasma; Etude des microinstabilities liees a l'anisotropie de la fonction de distribution des vitesses des particules d'un plasma homogene

    Energy Technology Data Exchange (ETDEWEB)

    Hennion, F [Commissariat a l' Energie Atomique, Fontenay-aux-Roses (France). Centre d' Etudes Nucleaires

    1966-06-01

    A study is made of instabilities in a plasma with an ion velocity distribution function of the form f_0i = [1 / (2π α⊥i α∥i)] δ(v⊥ − α⊥i) exp(−v∥² / α∥i²), where v⊥ and v∥ are the velocity components perpendicular and parallel to the field and α⊥i, α∥i are the corresponding characteristic speeds (notation reconstructed from the garbled record). The plasma is assumed to have finite dimensions limited by infinitely conductive boundary surfaces. A theoretical and numerical analysis of marginal stability locates the regions of stability as a function of several parameters: plasma length, ion anisotropy (τ) and electron temperature (T_e). A limiting plasma length is found, below which the plasma is stable regardless of its density. For the parameters of the injection experiment M.M.I.I. at Fontenay-aux-Roses, it is found that the type of instability studied here should not occur. (author)

  17. Homogeneous optical cloak constructed with uniform layered structures

    DEFF Research Database (Denmark)

    Zhang, Jingjing; Liu, Liu; Luo, Yu

    2011-01-01

    , the majority of the invisibility cloaks reported so far have a spatially varying refractive index which requires complicated design processes. Besides, the size of the hidden object is usually small relative to that of the cloak device. Here we report the experimental realization of a homogeneous invisibility...

  18. Characterization of herbal powder blends homogeneity using near-infrared spectroscopy

    Directory of Open Access Journals (Sweden)

    Wenlong Li

    2014-11-01

    Homogeneity of a powder blend is essential to obtain uniform contents in tablets and capsules. Near-infrared (NIR) spectroscopy with a fiber-optic probe was used as an on-line technique for monitoring the homogeneity of a pharmaceutical blend during the blending process, instead of traditional techniques such as high-performance liquid chromatography (HPLC). In this paper, NIRS with a SabIR diffuse-reflectance fiber-optic probe was used to monitor the blending of coptis powder and lactose (excipient) at different contents, and qualitative methods, such as similarity, moving-block standard deviation and mean square, were applied to the collected spectra after pretreatment by multiplicative signal correction (MSC) and second derivative. The correlation spectrum was used for wavelength selection. Four different coptis samples were blended with lactose separately to validate the proposed method, and the blending process of the "liu wei di huang" pill was also simulated in bottles to verify the method on multiple herbal blends. The overall results suggest that NIRS is a simple, effective and noninvasive technique that can be successfully applied to the determination of homogeneity in herbal blends.
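    The moving-block standard deviation end-point criterion mentioned above can be sketched as follows; the spectra here are synthetic two-wavelength stand-ins, not real NIR data:

```python
import math

# Hedged sketch of moving-block standard deviation (MBSD): for each block of
# consecutive spectra, compute the across-spectra standard deviation at each
# wavelength and average over wavelengths; the blend is taken as homogeneous
# once the MBSD trace levels off at a low plateau.

def mbsd(spectra, block=3):
    """spectra: time-ordered list of spectra (each a list of absorbances)."""
    out = []
    for i in range(len(spectra) - block + 1):
        blk = spectra[i:i + block]
        sds = []
        for wl in range(len(blk[0])):
            vals = [s[wl] for s in blk]
            m = sum(vals) / block
            sds.append(math.sqrt(sum((v - m) ** 2 for v in vals) / (block - 1)))
        out.append(sum(sds) / len(sds))
    return out

# early spectra vary strongly (poor mixing); later ones converge
spectra = [[0.9, 0.1], [0.2, 0.8], [0.6, 0.4],
           [0.50, 0.50], [0.51, 0.49], [0.50, 0.50]]
trace = mbsd(spectra)
print(trace[0] > trace[-1])  # True: MBSD falls as the blend homogenizes
```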

  19. Application of INAA complementary gamma ray photopeaks to homogeneity study of candidate reference materials

    International Nuclear Information System (INIS)

    Moreira, Edson G.; Vasconcellos, Marina B.A.; Lima, Ana P.S.; Catharino, Marilia G.M.; Maihara, Vera A.; Saiki, Mitiko

    2009-01-01

    Characterization and certification of reference materials, RMs, is a complex task involving many steps. One of them is homogeneity testing, to assure that key property values will not vary among RM bottles. Good precision is the most important figure of merit of an analytical technique for its use in the homogeneity testing of candidate RMs. Due to its inherent characteristics, Instrumental Neutron Activation Analysis, INAA, is an analytical technique of choice for homogeneity testing: problems with sample digestion and contamination from reagents are not an issue, as solid samples are analyzed directly. For element determination via INAA, the activity of a suitable gamma-ray decay photopeak for an element is chosen and compared to the activity of a standard of the element. An interesting possibility is the use of complementary gamma-ray photopeaks (for the elements that present them) to confirm the homogeneity test results for an element. In this study, the use of the complementary gamma-ray photopeaks of the 110mAg, 82Br, 60Co, 134Cs, 152Eu, 59Fe, 140La, 233Pa (for Th determination), 46Sc and 75Se radionuclides was investigated in the between-bottle homogeneity study of a mussel candidate RM under preparation at IPEN - CNEN/SP. Although some photopeaks led to biased element content results, the use of complementary gamma-ray photopeaks proved helpful in supporting the homogeneity study conclusions for new RMs. (author)

  20. Process to produce homogenized reactor fuels

    International Nuclear Information System (INIS)

    Hart, P.E.; Daniel, J.L.; Brite, D.W.

    1980-01-01

    The fuels consist of a mixture of PuO2 and UO2. In order to increase the homogeneity of mechanically mixed fuels, the pellets are sintered in a hydrogen atmosphere with a sufficiently low oxygen potential. This results in a reduction of Pu+4 to Pu+3. The reduction process releases water vapor, which increases the pressure within the PuO2 particles and causes PuO2 to be pressed into the uranium oxide structure. (DG) [de]

  1. The design of coils for the production of high homogeneous fields; Calcul des bobinages pour la production de champs magnetiques tres homogenes

    Energy Technology Data Exchange (ETDEWEB)

    Desportes, H [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1964-07-01

    The discovery of type II superconductors has considerably increased the possibilities of air-core coils, in particular with regard to the production of highly homogeneous fields. The design of such magnets calls for elaborate calculations which, in practice, can only be carried out on computers. The present report describes a complete set of programs for calculating, in the case of cylindrical systems, the magnetic field components at any point, the lines of flux, the forces, and the self- and mutual inductances, as well as for designing compensated coils for the production of highly homogeneous fields. These programs have been employed for the calculation of two magnets, which are described in detail. (author)
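    A minimal example of the kind of on-axis field computation such programs perform is the Helmholtz pair, a classic compensated arrangement whose second-order field variation cancels. This is a sketch from the standard circular-loop formula, not the report's programs:

```python
import math

# Hedged sketch: on-axis field of a Helmholtz pair -- two coaxial loops of
# radius R separated by R -- from B(z) = mu0*I*R^2 / (2*(R^2 + z^2)^(3/2)).
# The pair cancels the second-order term, leaving a highly homogeneous
# central field; a single loop of the same radius is shown for contrast.

MU0 = 4e-7 * math.pi  # vacuum permeability [T m/A]

def loop_Bz(z, R=1.0, I=1.0):
    """Axial field of one circular loop centred at z = 0."""
    return MU0 * I * R ** 2 / (2.0 * (R ** 2 + z ** 2) ** 1.5)

def helmholtz_Bz(z, R=1.0, I=1.0):
    """Axial field of loops at z = +R/2 and z = -R/2."""
    return loop_Bz(z - R / 2.0, R, I) + loop_Bz(z + R / 2.0, R, I)

B0 = helmholtz_Bz(0.0)
dev_pair = abs(helmholtz_Bz(0.1) / B0 - 1.0)       # fourth-order flat: ~1e-4
dev_loop = abs(loop_Bz(0.1) / loop_Bz(0.0) - 1.0)  # second-order: ~1.5e-2
print(dev_pair < 1e-3 < dev_loop)  # True
```

    At one tenth of the coil radius off centre, the pair's field deviates by about a hundredth of what a single loop's does, which is the homogenization effect that compensated windings generalize.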

  2. Self-formed waterfall plunge pools in homogeneous rock

    Science.gov (United States)

    Scheingross, Joel S.; Lo, Daniel Y.; Lamb, Michael P.

    2017-01-01

    Waterfalls are ubiquitous, and their upstream propagation can set the pace of landscape evolution, yet no experimental studies have examined waterfall plunge pool erosion in homogeneous rock. We performed laboratory experiments, using synthetic foam as a bedrock simulant, to produce self-formed waterfall plunge pools via particle impact abrasion. Plunge pool vertical incision exceeded lateral erosion by approximately tenfold until pools deepened to the point that the supplied sediment could not be evacuated and deposition armored the pool bedrock floor. Lateral erosion of plunge pool sidewalls continued after sediment deposition, but primarily at the downstream pool wall, which might lead to undermining of the plunge pool lip, sediment evacuation, and continued vertical pool floor incision in natural streams. Undercutting of the upstream pool wall was absent, and our results suggest that vertical drilling of successive plunge pools is a more efficient waterfall retreat mechanism than the classic model of headwall undercutting and collapse in homogeneous rock.

  3. Osteoarthritic cartilage is more homogeneous than healthy cartilage

    DEFF Research Database (Denmark)

    Qazi, Arish A; Dam, Erik B; Nielsen, Mads

    2007-01-01

    it evolves as a consequence of disease and can thereby be used as a progression biomarker. MATERIALS AND METHODS: A total of 283 right and left knees from 159 subjects aged 21 to 81 years were scanned using a Turbo 3D T1 sequence on a 0.18-T MRI Esaote scanner. The medial compartment of the tibial cartilage...... sheet was segmented using a fully automatic voxel classification scheme based on supervised learning. From the segmented cartilage sheet, homogeneity was quantified by measuring the entropy of the distribution of signal intensities inside the compartment. Each knee was examined by radiography...... of the region was evaluated by testing for overfitting. Three different regularization techniques were evaluated for reducing overfitting errors. RESULTS: The P values for separating the different groups based on cartilage homogeneity were 2 x 10(-5) (KL 0 versus KL 1) and 1 x 10(-7) (KL 0 versus KL >0). Using...
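The homogeneity measure described here, entropy of the signal-intensity distribution inside the segmented compartment, can be sketched as follows (a generic illustration; the bin count and intensity range are arbitrary assumptions, not the paper's settings):

```python
import numpy as np

def homogeneity_entropy(intensities, bins=64, lo=0.0, hi=255.0):
    """Shannon entropy (bits) of the voxel-intensity histogram inside a
    segmented compartment; lower entropy means a more homogeneous tissue."""
    hist, _ = np.histogram(intensities, bins=bins, range=(lo, hi))
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins (0 * log2(0) := 0)
    return float(-(p * np.log2(p)).sum())

# Synthetic "tissue" intensities: a narrow spread (homogeneous) versus a
# broad spread (heterogeneous).
rng = np.random.default_rng(0)
narrow = rng.normal(100.0, 2.0, 10_000)
broad = rng.normal(100.0, 20.0, 10_000)
```

A narrower intensity distribution concentrates into fewer histogram bins and yields lower entropy, matching the study's finding that osteoarthritic (more homogeneous) cartilage scores lower.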

  4. Oscillatory Dynamics of One-Dimensional Homogeneous Granular Chains

    Science.gov (United States)

    Starosvetsky, Yuli; Jayaprakash, K. R.; Hasan, Md. Arif; Vakakis, Alexander F.

    The acoustics of homogeneous granular chains has been studied extensively, both numerically and experimentally, in the references cited in the previous chapters. This chapter focuses on the oscillatory behavior of finite-dimensional homogeneous granular chains. It is well known that normal vibration modes are the building blocks of the vibrations of linear systems due to the applicability of the principle of superposition. On the other hand, nonlinear theory is deprived of such a general superposition principle (although special cases of nonlinear superposition do exist), but nonlinear normal modes (NNMs) still play an important role in the forced and resonance dynamics of these systems. In their basic definition [1], NNMs were defined as time-periodic nonlinear oscillations of discrete or continuous dynamical systems in which all coordinates (degrees of freedom) oscillate in unison with the same frequency; further extensions of this definition have been considered to account for NNMs of systems with internal resonances [2]...

  5. A homogeneous cooling scheme investigation for high power slab laser

    Science.gov (United States)

    He, Jianguo; Lin, Weiran; Fan, Zhongwei; Chen, Yanzhong; Ge, Wenqi; Yu, Jin; Liu, Hao; Mo, Zeqiang; Fan, Lianwen; Jia, Dan

    2017-10-01

    Forced convective heat transfer, with its advantages of reliability and durability, is widely used for cooling the laser gain medium. However, a temperature gradient along the flow direction always appears. In this paper, a novel cooling configuration based on longitudinal forced convective heat transfer is presented. In comparison with two other configurations, it shows more efficient heat transfer and a more homogeneous temperature distribution. Investigation of the flow rate reveals that the higher the flow rate, the better the cooling performance. Furthermore, the simulation results at a 20 L/min flow rate show an adequate temperature level and temperature homogeneity while maintaining a lower hydrostatic pressure in the flow path.

  6. Highly transparent films from carboxymethylated microfibrillated cellulose: The effect of multiple homogenization steps on key properties

    DEFF Research Database (Denmark)

    Siró, Istvan; Plackett, David; Hedenqvist, M.

    2011-01-01

    We produced microfibrillated cellulose by passing carboxymethylated sulfite-softwood-dissolving pulp with a relatively low hemicellulose content (4.5%) through a high-shear homogenizer. The resulting gel was subjected to as many as three additional homogenization steps and then used to prepare...... solvent-cast films. The optical, mechanical, and oxygen-barrier properties of these films were determined. A reduction in the quantity and appearance of large fiber fragments and fiber aggregates in the films as a function of increasing homogenization was illustrated with optical microscopy, atomic force...... microscopy, and scanning electron microscopy. Film opacity decreased with increasing homogenization, and the use of three additional homogenization steps after initial gel production resulted in highly transparent films. The oxygen permeability of the films was not significantly influenced by the degree...

  7. MO-F-CAMPUS-I-02: Accuracy in Converting the Average Breast Dose Into the Mean Glandular Dose (MGD) Using the F-Factor in Cone Beam Breast CT- a Monte Carlo Study Using Homogeneous and Quasi-Homogeneous Phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Lai, C; Zhong, Y; Wang, T; Shaw, C [UT MD Anderson Cancer Center, Houston, TX (United States)

    2015-06-15

    Purpose: To investigate the accuracy of estimating the mean glandular dose (MGD) for homogeneous breast phantoms by converting from the average breast dose using the F-factor in cone beam breast CT. Methods: EGSnrc-based Monte Carlo codes were used to estimate the MGDs. Hemi-ellipsoids 13 cm in diameter and 10 cm high were used to simulate pendant-geometry breasts. Two different types of hemi-ellipsoidal models were employed: voxels in quasi-homogeneous phantoms were designated as either adipose or glandular tissue, while voxels in homogeneous phantoms were designated as a mixture of adipose and glandular tissues. Breast compositions of 25% and 50% volume glandular fraction (VGF), defined as the ratio of glandular tissue voxels to entire breast voxels in the quasi-homogeneous phantoms, were studied. These VGFs were converted into glandular fractions by weight and used to construct the corresponding homogeneous phantoms. 80 kVp x-rays with a mean energy of 47 keV were used in the simulation. A total of 10^9 photons were used to image the phantoms, and the energies deposited in the phantom voxels were tallied. Breast doses in homogeneous phantoms were averaged over all voxels and then used to calculate the MGDs using the F-factors evaluated at the mean energy of the x-rays. The MGDs for quasi-homogeneous phantoms were computed directly by averaging the doses over all glandular tissue voxels. The MGDs estimated for the two types of phantoms were normalized to the free-in-air dose at the iso-center and compared. Results: The normalized MGDs were 0.756 and 0.732 mGy/mGy for the 25% and 50% VGF homogeneous breasts and 0.761 and 0.733 mGy/mGy for the corresponding quasi-homogeneous breasts, respectively. The MGDs estimated for the two types of phantoms agreed within 1% in this study. Conclusion: MGDs for homogeneous breast models may be adequately estimated by converting from the average breast dose using the F-factor.
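The conversion steps the study evaluates can be sketched generically; the tissue densities below are nominal literature-style assumptions, and the F-factor value is purely illustrative, not the paper's data:

```python
def vgf_to_weight_fraction(vgf, rho_gland=1.04, rho_adip=0.93):
    """Convert a volume glandular fraction to a glandular fraction by
    weight, using assumed glandular/adipose densities (g/cm^3)."""
    m_gland = vgf * rho_gland
    m_adip = (1.0 - vgf) * rho_adip
    return m_gland / (m_gland + m_adip)

def mean_glandular_dose(avg_breast_dose, f_factor):
    """MGD estimated by scaling the average breast dose with an F-factor
    evaluated at the mean beam energy (the conversion tested in the study)."""
    return f_factor * avg_breast_dose

# Example: a 50% VGF breast corresponds to a slightly higher fraction by weight.
w50 = vgf_to_weight_fraction(0.50)
```

The point of the study is that the second function, applied to the voxel-averaged dose of a homogeneous phantom, reproduces the directly tallied glandular-voxel average to within about 1%.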

  9. Effect of homogenization and pasteurization on the structure and stability of whey protein in milk.

    Science.gov (United States)

    Qi, Phoebe X; Ren, Daxi; Xiao, Yingping; Tomasula, Peggy M

    2015-05-01

    The effect of homogenization alone or in combination with high-temperature, short-time (HTST) pasteurization or UHT processing on the whey fraction of milk was investigated using highly sensitive spectroscopic techniques. In pilot plant trials, 1-L quantities of whole milk were homogenized in a 2-stage homogenizer at 35°C (6.9 MPa/10.3 MPa) and, along with skim milk, were subjected to HTST pasteurization (72°C for 15 s) or UHT processing (135°C for 2 s). Other whole milk samples were processed using homogenization followed by either HTST pasteurization or UHT processing. The processed skim and whole milk samples were centrifuged further to remove fat, acidified to pH 4.6 to isolate the corresponding whey fractions, and centrifuged again. The whey fractions were then purified by dialysis and investigated using circular dichroism, Fourier transform infrared, and Trp intrinsic fluorescence spectroscopy. Results demonstrated that homogenization combined with UHT processing of milk caused not only changes in protein composition but also significant secondary structural loss, particularly in the amounts of apparent antiparallel β-sheet and α-helix, as well as diminished tertiary structural contact. Homogenization, whether alone or followed by HTST pasteurization, caused neither appreciable chemical changes nor notable reduction of secondary structure, but disruption of the tertiary structural environment of the whey proteins in whole milk was evident from both near-UV circular dichroism and Trp intrinsic fluorescence. In-depth structural stability analyses revealed that even though processing of milk caused little impairment of secondary structural stability, the tertiary structural stability of whey protein was altered significantly. The following order was derived from these studies: raw whole>HTST, homogenized, homogenized and pasteurized>skimmed and pasteurized, and skimmed UHT>homogenized

  10. Homogeneous Nucleation Rate Measurements in Supersaturated Water Vapor

    Czech Academy of Sciences Publication Activity Database

    Brus, David; Ždímal, Vladimír; Smolík, Jiří

    2008-01-01

    Roč. 129, č. 17 (2008), 174501-1-174501-8 ISSN 0021-9606 R&D Projects: GA ČR GA101/05/2214 Institutional research plan: CEZ:AV0Z40720504 Keywords: homogeneous nucleation * water * diffusion chamber Subject RIV: CF - Physical; Theoretical Chemistry Impact factor: 3.149, year: 2008

  11. Continuous energy Monte Carlo method based homogenization multi-group constants calculation

    International Nuclear Information System (INIS)

    Li Mancang; Wang Kan; Yao Dong

    2012-01-01

    The efficiency of the standard two-step reactor physics calculation relies on the accuracy of the multi-group constants from the assembly-level homogenization process. In contrast to the traditional deterministic methods, generating the homogenized cross sections via the Monte Carlo method overcomes the difficulties in geometry and treats energy as a continuum, thus providing more accurate parameters. Besides, the same code and data bank can be used for a wide range of applications, making Monte Carlo codes versatile tools for homogenization. As the first stage in realizing Monte Carlo based lattice homogenization, the track-length scheme is used as the foundation of cross-section generation, which is straightforward. The scattering matrix and Legendre components, however, require special techniques; the Scattering Event method was proposed to solve this problem. There are no continuous-energy counterparts in the Monte Carlo calculation for neutron diffusion coefficients, so P1 cross sections were used to calculate the diffusion coefficients for diffusion reactor simulator codes. B_N theory is applied to take the leakage effect into account when an infinite lattice of identical symmetric motives is assumed. The MCMC code was developed and applied to four assembly configurations to assess its accuracy and applicability. At the core level, a PWR prototype core was examined. The results show that the Monte Carlo based multi-group constants behave well on average. The method could be applied to nuclear reactor cores with complicated configurations to gain higher accuracy. (authors)
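The flux-weighted group collapse underlying such homogenization can be sketched schematically; the 1/E spectrum below stands in for the code's track-length flux tallies and is an illustrative assumption, not MCMC's actual machinery:

```python
import numpy as np

def collapse_to_groups(energies, flux, sigma, group_edges):
    """Flux-weighted multigroup constants on a pointwise energy grid:
    sigma_g = sum(phi * sigma) / sum(phi) over the points in each group,
    mimicking a track-length flux tally."""
    out = []
    for lo, hi in zip(group_edges[:-1], group_edges[1:]):
        mask = (energies >= lo) & (energies < hi)
        out.append(float(np.sum(flux[mask] * sigma[mask]) / np.sum(flux[mask])))
    return np.array(out)

energies = np.linspace(1e-3, 10.0, 1000)   # MeV grid (illustrative)
flux = 1.0 / energies                      # schematic slowing-down spectrum
sigma = np.full_like(energies, 2.0)        # constant cross section (barns)
two_group = collapse_to_groups(energies, flux, sigma, [1e-3, 1.0, 10.0])
```

A constant cross section must collapse to itself in every group, which is a quick sanity check for any implementation of this weighting.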

  12. Systematic assembly homogenization and local flux reconstruction for nodal method calculations of fast reactor power distributions

    International Nuclear Information System (INIS)

    Dorning, J.J.

    1991-01-01

    A simultaneous pin lattice cell and fuel bundle homogenization theory has been developed for use with nodal diffusion calculations of practical reactors. The theoretical development of the homogenization theory, which is based on multiple-scales asymptotic expansion methods carried out through fourth order in a small parameter, starts from the transport equation and systematically yields: a cell-homogenized bundle diffusion equation with self-consistent expressions for the cell-homogenized cross sections and diffusion tensor elements; and a bundle-homogenized global reactor diffusion equation with self-consistent expressions for the bundle-homogenized cross sections and diffusion tensor elements. The continuity of the angular flux at cell and bundle interfaces also systematically yields jump conditions for the scalar flux, or so-called flux discontinuity factors, on the cell and bundle interfaces in terms of the two adjacent cell or bundle eigenfunctions. The expressions required for the reconstruction of the angular flux, or the 'de-homogenization' theory, were obtained as an integral part of the development; hence the leading order transport theory angular flux is easily reconstructed throughout the reactor, including the regions in the interior of the fuel bundles or computational nodes and in the interiors of the pin lattice cells. The theoretical development shows that the exact transport theory angular flux, obtained to first order from the whole-reactor nodal diffusion calculations done using the homogenized nuclear data and discontinuity factors, is a product of three computed quantities: a 'cell shape function'; a 'bundle shape function'; and a 'global shape function'. 10 refs

  13. Properties of lotus seed starch-glycerin monostearin complexes formed by high pressure homogenization.

    Science.gov (United States)

    Chen, Bingyan; Zeng, Shaoxiao; Zeng, Hongliang; Guo, Zebin; Zhang, Yi; Zheng, Baodong

    2017-07-01

    Starch-lipid complexes were prepared from lotus seed starch (LS) and glycerin monostearate (GMS) via a high pressure homogenization (HPH) process, and the effect of HPH on the physicochemical properties of the LS-GMS complexes was investigated. The results of Fourier transform infrared spectroscopy and complex index analysis showed that LS-GMS complexes were formed at 40 MPa by HPH, and the complex index increased with increasing homogenization pressure. Scanning electron microscopy showed that the LS-GMS complexes presented a more nest-like structure with increasing homogenization pressure. X-ray diffraction and differential scanning calorimetry results revealed that a V-type crystalline polymorph was formed between LS and GMS, with higher homogenization pressure producing an increasingly stable complex. The LS-GMS complex inhibited starch granule swelling, solubility, and pasting development, which further reduced peak and breakdown viscosity. During storage, LS-GMS complexes prepared at 70-100 MPa had higher Avrami exponent values and lower recrystallization rates compared with native starch, which suggests a lower retrogradation tendency. Copyright © 2017 Elsevier Ltd. All rights reserved.
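The Avrami analysis behind the reported exponents and recrystallization rates can be sketched generically; the k and n values below are arbitrary illustrative numbers, not the paper's fitted parameters:

```python
import math

def avrami_fraction(t, k, n):
    """Avrami model: recrystallized fraction X(t) = 1 - exp(-k * t**n)."""
    return 1.0 - math.exp(-k * t ** n)

def avrami_exponent(times, fractions):
    """Recover the Avrami exponent n as the least-squares slope of the
    double-log linearization ln(-ln(1 - X)) = ln k + n * ln t."""
    xs = [math.log(t) for t in times]
    ys = [math.log(-math.log(1.0 - x)) for x in fractions]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic kinetics generated with k = 0.01, n = 2.5; the fit should
# recover the exponent used to generate the data.
times = [1.0, 2.0, 4.0, 8.0, 16.0]
fracs = [avrami_fraction(t, k=0.01, n=2.5) for t in times]
```

In retrogradation studies the rate constant k plays the role of the recrystallization rate, so "higher n, lower k" is exactly what the abstract reports for the 70-100 MPa complexes.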

  14. Analysis of the permeability characteristics along rough-walled fractures using a homogenization method

    International Nuclear Information System (INIS)

    Chae, Byung Gon; Choi, Jung Hae; Ichikawa, Yasuaki; Seo, Yong Seok

    2012-01-01

    To compute a permeability coefficient along a rough fracture that takes into account the fracture geometry, this study performed detailed measurements of fracture roughness using a confocal laser scanning microscope, a quantitative analysis of roughness using spectral analysis, and a homogenization analysis to calculate the permeability coefficient on the micro- and macro-scale. The homogenization analysis is a type of perturbation theory that characterizes the behavior of microscopically inhomogeneous material with a periodic boundary condition in the microstructure. Therefore, it is possible to analyze accurate permeability characteristics that reflect the local effect of the fracture geometry. The permeability coefficients calculated using the homogenization analysis for each rough fracture model exhibit an irregular distribution and do not follow the relationship of the cubic law. This distribution suggests that the permeability characteristics strongly depend on the geometric conditions of the fractures, such as the roughness and the aperture variation. The homogenization analysis may allow us to produce more accurate results than are possible with the preexisting equations for calculating permeability.
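The cubic law that the homogenized coefficients are compared against is the parallel-plate idealization of a fracture; a compact statement (SI units assumed):

```python
def cubic_law_permeability(aperture):
    """Parallel-plate (cubic-law) equivalent permeability k = b**2 / 12,
    with aperture b in metres and k in m^2."""
    return aperture ** 2 / 12.0

def cubic_law_flow(aperture, width, dp_dx, viscosity):
    """Volumetric flow Q = (w * b**3 / (12 * mu)) * dp/dx for a smooth
    fracture of width w; the b**3 dependence gives the law its name."""
    return width * aperture ** 3 * dp_dx / (12.0 * viscosity)
```

Doubling the aperture multiplies the flow by eight under this idealization; the study's point is that real rough fractures deviate from this scaling, which is why a geometry-aware homogenization is needed.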

  15. Homogenization of Doppler broadening in spin-noise spectroscopy

    Science.gov (United States)

    Petrov, M. Yu.; Ryzhov, I. I.; Smirnov, D. S.; Belyaev, L. Yu.; Potekhin, R. A.; Glazov, M. M.; Kulyasov, V. N.; Kozlov, G. G.; Aleksandrov, E. B.; Zapasskii, V. S.

    2018-03-01

    Spin-noise spectroscopy, being a nonperturbative linear optics tool, is still reputed to reveal a number of capabilities specific to nonlinear optics techniques. The effect of Doppler broadening homogenization discovered in this work essentially widens these unique properties of spin-noise spectroscopy. We investigate the spin noise of a classical system, cesium vapor with an admixture of buffer gas, by measuring the spin-induced Faraday rotation fluctuations in the region of the D2 line. The line, under our experimental conditions, is strongly inhomogeneously broadened due to the Doppler effect. Despite that, the optical spectrum of the spin-noise power has the shape typical of a homogeneously broadened line, with a dip at the line center. This fact is in stark contrast with the results of previous studies of inhomogeneous quantum dot ensembles and Doppler broadened atomic systems. In addition, two-color spin-noise measurements have shown, in a highly spectacular way, that fluctuations of the Faraday rotation within the line are either correlated or anticorrelated depending on whether the two wavelengths lie on the same side or on different sides of the resonance. The experimental data are interpreted within the framework of a theoretical model which takes into account both the kinetics and the spin dynamics of Cs atoms. It is shown that the unexpected behavior of the Faraday rotation noise spectra and the effective homogenization of the optical transition in the spin-noise measurements are related to the smallness of the momentum relaxation time of the atoms compared with their spin-relaxation time. Our findings demonstrate the abilities of spin-noise spectroscopy for studying dynamic properties of inhomogeneously broadened ensembles of randomly moving spins.

  16. Field homogeneity improvement of maglev NdFeB magnetic rails from joints.

    Science.gov (United States)

    Li, Y J; Dai, Q; Deng, C Y; Sun, R X; Zheng, J; Chen, Z; Sun, Y; Wang, H; Yuan, Z D; Fang, C; Deng, Z G

    2016-01-01

    An ideal magnetic rail should provide a homogeneous magnetic field along the longitudinal direction to guarantee the reliable friction-free operation of high temperature superconducting (HTS) maglev vehicles. In reality, however, magnetic field inhomogeneity may occur for many reasons, of which the joint gap is the most direct. Because any magnetic rail consists of many permanent magnet segments, joint gaps inevitably exist between adjacent segments and influence the longitudinal magnetic field homogeneity above the rail. To improve the running performance of maglev systems, two new rail joints are proposed based on the normal rail joint, named the mitered rail joint and the overlapped rail joint. It is found that the overlapped rail joint is more effective at providing a homogeneous magnetic field, and further structural optimization has been done to ensure that maglev vehicles operate as stably as possible when passing over the joint gaps. The results show that the overlapped rail joint with optimal parameters can significantly reduce the magnetic field inhomogeneity compared with the other two rail joints. In addition, an appropriate gap is suggested that balances the thermal expansion of the magnets against field homogeneity, which is considered a valuable reference for the future design of magnetic rails.

  17. Evaluation of homogeneity and dose conformity in IMRT planning in prostate radiotherapy

    International Nuclear Information System (INIS)

    Lopes, Juliane S.; Leidens, Matheus; Estacio, Daniela R.; Razera, Ricardo A.Z.; Streck, Elaine E.; Silva, Ana M.M. da

    2015-01-01

    The goal of this study was to evaluate the homogeneity and conformity of the dose distributions of prostate cancer radiation therapy plans using IMRT. Data from 34 treatment plans of Hospital Sao Lucas of PUCRS, where those plans were executed, were retrospectively analyzed. All of them were delivered with 6 MV X-rays from a CLINAC IX linear accelerator, and the prescription doses varied between 60 and 74 Gy. The homogeneity and conformity indices of the dose distributions of these plans were analyzed, with comparisons drawn against the traditional radiation therapy planning technique, 3D-CRT. The results showed no correlation between the prescribed dose and the homogeneity and conformity indices, indicating that IMRT works very well even for higher doses. Furthermore, the results were compared with the recommendations of ICRU 83, and the indices were found to be very close to the ideal values: 82.4% of the cases showed a difference below 5% from the ideal value for the conformity index, and 88.2% showed a difference below 10% for the homogeneity index. In conclusion, these results confirm the quality of the analyzed IMRT radiation therapy plans for prostate cancer. (author)
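The ICRU 83 homogeneity index, and a simple conformity index, can be computed directly from the target dose distribution; a minimal sketch with illustrative dose values (the RTOG-style conformity ratio below is one common choice, not necessarily the one used in the study):

```python
import numpy as np

def homogeneity_index(target_doses):
    """ICRU 83 homogeneity index HI = (D2% - D98%) / D50%, where Dx% is
    the dose received by at least x% of the target volume. HI = 0 means a
    perfectly uniform target dose."""
    d2 = np.percentile(target_doses, 98)   # D2% = 98th dose percentile
    d98 = np.percentile(target_doses, 2)   # D98% = 2nd dose percentile
    d50 = np.percentile(target_doses, 50)  # median target dose
    return (d2 - d98) / d50

def conformity_index(prescription_isodose_volume, target_volume):
    """RTOG-style conformity index: volume enclosed by the prescription
    isodose over the planning target volume; 1.0 is ideal."""
    return prescription_isodose_volume / target_volume

# A plan prescribed 70 Gy whose target doses spread between 66 and 74 Gy.
hi_val = homogeneity_index(np.linspace(66.0, 74.0, 1001))
```

A uniform dose array gives HI exactly zero, and tighter dose spreads drive HI toward the ideal values the study reports.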

  18. A computational analysis on homogeneous-heterogeneous mechanism in Carreau fluid flow

    Science.gov (United States)

    Khan, Imad; Rehman, Khalil Ur; Malik, M. Y.; Shafquatullah

    2018-03-01

    In this article, magnetohydrodynamic Carreau fluid flow towards a stretching cylinder is considered in the presence of homogeneous-heterogeneous reaction effects. The flow model is formulated on theoretical grounds. For the numerical solution, a shooting method with a Runge-Kutta algorithm is executed. The outcomes are provided through graphs. It is observed that the Carreau fluid concentration declines with increasing values of the homogeneous-heterogeneous reaction parameters in both the shear-thinning and shear-thickening cases. The present work is verified through comparison with existing literature in a limiting sense.
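The shooting method named above can be illustrated on a toy linear boundary-value problem (a generic sketch, not the paper's Carreau-fluid equations): guess the unknown initial slope, integrate with Runge-Kutta, and iterate until the far boundary condition is met.

```python
import math

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def integrate(slope, f, t0, t1, y0, steps=200):
    """Integrate from t0 to t1 with initial state (y0, slope); return y(t1)."""
    h = (t1 - t0) / steps
    t, y = t0, [y0, slope]
    for _ in range(steps):
        y = rk4_step(f, t, y, h)
        t += h
    return y[0]

def shoot(f, target, t0, t1, y0, s0=0.0, s1=1.0, tol=1e-10):
    """Secant iteration on the unknown initial slope until the far-end
    boundary value matches `target`."""
    r0 = integrate(s0, f, t0, t1, y0) - target
    r1 = integrate(s1, f, t0, t1, y0) - target
    while abs(r1) > tol:
        s0, s1 = s1, s1 - r1 * (s1 - s0) / (r1 - r0)
        r0, r1 = r1, integrate(s1, f, t0, t1, y0) - target
    return s1

# Toy BVP: y'' = -y with y(0) = 0, y(1) = sin(1); exact slope y'(0) = 1.
def ode(t, y):
    return [y[1], -y[0]]

slope = shoot(ode, math.sin(1.0), 0.0, 1.0, 0.0)
```

For the nonlinear similarity equations of boundary-layer problems the same loop applies, with the far boundary imposed at a large but finite value of the similarity variable.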

  19. Optimal truss and frame design from projected homogenization-based topology optimization

    DEFF Research Database (Denmark)

    Larsen, S. D.; Sigmund, O.; Groen, J. P.

    2018-01-01

    In this article, we propose a novel method to obtain a near-optimal frame structure, based on the solution of a homogenization-based topology optimization model. The presented approach exploits the equivalence between Michell’s problem of least-weight trusses and a compliance minimization problem...... using optimal rank-2 laminates in the low volume fraction limit. In a fully automated procedure, a discrete structure is extracted from the homogenization-based continuum model. This near-optimal structure is post-optimized as a frame, where the bending stiffness is continuously decreased, to allow...

  20. On micro-meso relations homogenizing electrical properties of transversely cracked laminated composites

    KAUST Repository

    Lubineau, Gilles

    2013-11-01

    A practical way to track the development of transverse cracking in a laminated composite is to monitor the change in its electrical resistance. Yet the relations between transverse cracking and the global change in resistivity are still unclear, which makes it difficult to interpret these non-destructive testing results. Here, we introduce a homogenization process that defines at the meso scale an equivalent homogeneous ply that is energetically equivalent to the cracked one. It is shown that this equivalent ply depends mainly on the cracking level and can be considered independent of the rest of the laminated structure. The direct consequence is that the meso scale is a pertinent one at which to perform the homogenization. Non-destructive electrical measurements can then be considered a reliable technique for accessing meso-scale damage indicators. © 2013 Elsevier Ltd.
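As background to such electrical homogenization (not the paper's energetic-equivalence model), the elementary parallel and series bounds on the effective conductivity of a two-phase layer look like this:

```python
def voigt_conductivity(f1, s1, s2):
    """Parallel (Voigt) bound: volume-weighted arithmetic mean of the
    phase conductivities; current flows along the layers."""
    return f1 * s1 + (1.0 - f1) * s2

def reuss_conductivity(f1, s1, s2):
    """Series (Reuss) bound: harmonic mean; current crosses the layers,
    so the less conductive phase dominates."""
    return 1.0 / (f1 / s1 + (1.0 - f1) / s2)
```

Any homogenized ply conductivity must fall between these two bounds, which is a useful sanity check when building an energetically equivalent ply model.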

  1. Multifractal spectra in homogeneous shear flow

    Science.gov (United States)

    Deane, A. E.; Keefe, L. R.

    1988-01-01

    Employing numerical simulations of 3-D homogeneous shear flow, the associated multifractal spectra of the energy dissipation, scalar dissipation, and vorticity fields were calculated. The results for 128^3 simulations of this flow, and those obtained in recent experiments that analyzed 1- and 2-D intersections of atmospheric and laboratory flows, are in some agreement. A two-scale Cantor set model of the energy cascade process, which describes the experimental results from 1-D intersections quite well, describes the 3-D results only marginally.
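A two-scale Cantor measure of the kind used to model the cascade has a closed-form multifractal spectrum via a Legendre transform. A minimal numerical sketch for the equal-ratio (binomial) case, with an illustrative weight p = 0.7 rather than any fitted cascade parameter:

```python
import math

def tau(q, p=0.7):
    """Mass exponent of the binomial (two-scale, equal-ratio) Cantor
    measure with weights p and 1 - p on the two halves of the interval:
    tau(q) = -log2(p**q + (1 - p)**q)."""
    return -math.log2(p ** q + (1.0 - p) ** q)

def f_alpha(q, p=0.7, dq=1e-6):
    """Legendre transform of tau: alpha = d tau / d q (central difference),
    f(alpha) = q * alpha - tau(q); returns (alpha, f)."""
    alpha = (tau(q + dq, p) - tau(q - dq, p)) / (2.0 * dq)
    return alpha, q * alpha - tau(q, p)

alpha0, f0 = f_alpha(0.0)   # peak of the spectrum: f = dimension of support
```

At q = 0 the spectrum peaks at the dimension of the support (1 for a measure filling the unit interval), and sweeping q traces out the full f(alpha) curve that such cascade models are fitted against.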

  2. Homogenization of steady-state creep of porous metals using three-dimensional microstructural reconstructions

    DEFF Research Database (Denmark)

    Kwok, Kawai; Boccaccini, Dino; Persson, Åsa Helen

    2016-01-01

    The effective steady-state creep response of porous metals is studied by numerical homogenization and analytical modeling in this paper. The numerical homogenization is based on finite element models of three-dimensional microstructures directly reconstructed from tomographic images. The effects ...... model, and closely matched by the Gibson-Ashby compression and the Ramakrishnan-Arunchalam creep models. [All rights reserved Elsevier]....
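The Gibson-Ashby comparison mentioned above rests on power-law scaling of effective properties with relative density; a generic sketch (the exponent and prefactor are illustrative placeholders, not the paper's fitted creep parameters):

```python
def relative_density(porosity):
    """Relative density of a porous solid: rho / rho_s = 1 - porosity."""
    return 1.0 - porosity

def gibson_ashby_property(rel_density, exponent=2.0, prefactor=1.0):
    """Generic Gibson-Ashby scaling law: an effective property normalized
    by its dense-solid value, P / P_s = C * (rho / rho_s)**m."""
    return prefactor * rel_density ** exponent
```

The steep power-law dependence is the key point: modest increases in porosity produce large drops in effective stiffness or creep resistance, which is what the tomography-based homogenization quantifies.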

  3. Uniqueness in the inverse boundary value problem for piecewise homogeneous anisotropic elasticity

    OpenAIRE

    Cârstea, Cătălin I.; Honda, Naofumi; Nakamura, Gen

    2016-01-01

    Consider a three dimensional piecewise homogeneous anisotropic elastic medium $\Omega$ which is a bounded domain consisting of a finite number of bounded subdomains $D_\alpha$, with each $D_\alpha$ a homogeneous elastic medium. One typical example is a finite element model with elements with curvilinear interfaces for an anisotropic elastic medium. Assuming the $D_\alpha$ are known and Lipschitz, we are concerned with the uniqueness in the inverse boundary value problem of identifying the ani...

  4. A numerical homogenization method for heterogeneous, anisotropic elastic media based on multiscale theory

    KAUST Repository

    Gao, Kai

    2015-06-05

    The development of reliable methods for upscaling fine-scale models of elastic media has long been an important topic for rock physics and applied seismology. Several effective medium theories have been developed to provide elastic parameters for materials such as finely layered media or randomly oriented or aligned fractures. In such cases, the analytic solutions for upscaled properties can be used for accurate prediction of wave propagation. However, such theories cannot be applied directly to homogenize elastic media with more complex, arbitrary spatial heterogeneity. Therefore, we have proposed a numerical homogenization algorithm based on multiscale finite-element methods for simulating elastic wave propagation in heterogeneous, anisotropic elastic media. Specifically, our method used multiscale basis functions obtained from a local linear elasticity problem with appropriately defined boundary conditions. Homogenized, effective medium parameters were then computed using these basis functions, and the approach applied a numerical discretization that was similar to the rotated staggered-grid finite-difference scheme. Comparisons of the results from our method and from conventional, analytical approaches for finely layered media showed that the homogenization reliably estimated elastic parameters for this simple geometry. Additional tests examined anisotropic models with arbitrary spatial heterogeneity in which the average size of the heterogeneities ranged from several centimeters to several meters, and the ratio between the dominant wavelength and the average size of the arbitrary heterogeneities ranged from 10 to 100. Comparisons to finite-difference simulations proved that the numerical homogenization was equally accurate for these complex cases.
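The analytic solution for finely layered media referenced here is the Backus (1962) average, which replaces a stack of thin isotropic layers by an effective VTI medium. A compact sketch (equal-thickness weighting assumed by default):

```python
import numpy as np

def backus_average(lam, mu, weights=None):
    """Backus effective VTI stiffnesses (c11, c13, c33, c44, c66) for a
    stack of thin isotropic layers with Lame parameters lam, mu,
    thickness-weighted by `weights` (uniform if omitted)."""
    lam = np.asarray(lam, dtype=float)
    mu = np.asarray(mu, dtype=float)
    w = (np.full(lam.size, 1.0 / lam.size) if weights is None
         else np.asarray(weights, dtype=float))
    avg = lambda x: float(np.sum(w * x))
    c33 = 1.0 / avg(1.0 / (lam + 2 * mu))
    c44 = 1.0 / avg(1.0 / mu)
    c13 = avg(lam / (lam + 2 * mu)) * c33
    c11 = (avg(4 * mu * (lam + mu) / (lam + 2 * mu))
           + avg(lam / (lam + 2 * mu)) ** 2 * c33)
    c66 = avg(mu)
    return {"c11": c11, "c13": c13, "c33": c33, "c44": c44, "c66": c66}
```

Identical layers must recover the isotropic stiffnesses, and heterogeneous stacks give c11 >= c33 (horizontally stiffer), which are the standard checks against which a numerical homogenization like the one in this paper is validated.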

  5. Desertification, salinization, and biotic homogenization in a dryland river ecosystem

    Science.gov (United States)

    Miyazono, S.; Patino, Reynaldo; Taylor, C.M.

    2015-01-01

    This study determined long-term changes in fish assemblages, river discharge, salinity, and local precipitation, and examined hydrological drivers of biotic homogenization in a dryland river ecosystem, the Trans-Pecos region of the Rio Grande/Rio Bravo del Norte (USA/Mexico). Historical (1977-1989) and current (2010-2011) fish assemblages were analyzed by rarefaction analysis (species richness), nonmetric multidimensional scaling (composition/variability), multiresponse permutation procedures (composition), and paired t-test (variability). Trends in hydrological conditions (1970s-2010s) were examined by Kendall tau and quantile regression, and associations between streamflow and specific conductance (salinity) by generalized linear models. Since the 1970s, species richness and variability of fish assemblages decreased in the Rio Grande below the confluence with the Rio Conchos (Mexico), a major tributary, but not above it. There was increased representation of species tolerant of lower flow and higher salinity, making fish communities below the confluence taxonomically and functionally more similar to those above it. Unlike findings elsewhere, this biotic homogenization was due primarily to changes in the relative abundances of native species. While Rio Conchos discharge was > 2-fold higher than Rio Grande discharge above their confluence, Rio Conchos discharge decreased during the study period, causing Rio Grande discharge below the confluence to decrease as well. Rio Conchos salinity is lower than Rio Grande salinity above their confluence and, as Rio Conchos discharge decreased, Rio Grande salinity below the confluence increased (reduced dilution). Trends in discharge did not correspond to trends in precipitation except at extreme-high (90th quantile) levels. In conclusion, decreasing discharge from the Rio Conchos has led to decreasing flow and increasing salinity in the Rio Grande below the confluence. This spatially uneven desertification and

  6. Homogenization Effect on Nanostructure and Conductivity of Polyaniline Nanofibre Synthesis by Mini-Emulsion Polymerization Technique

    Science.gov (United States)

    Mohammad, M.; Kamarudin, S.; Mohamed, N. H.; Asim, N.; Sopian, K.

    2017-12-01

    Nanofibre polyaniline (n-PANI) was synthesized by a mini-emulsion polymerization technique between aniline monomer and ammonium persulfate as an oxidant using a homogenizer. The synthesis was performed by optimizing the mixing speed from 10,000 to 30,000 rpm and the reaction time between 0.5 and 24 hours at a fixed monomer-to-oxidant molar ratio of 4:1. An attempt has been made to investigate how the homogenizer speed affects the size and conductivity of n-PANI. The formation of the n-PANI chain was confirmed by Fourier transform infrared spectroscopy (FTIR). The X-ray diffraction (XRD) spectra revealed the crystalline nature of PANI. Hall effect measurements indicated that the electrical conductivity of n-PANI increased with homogenizer speed, from 5.2 to 17.5 Scm-1. The morphological characterization of n-PANI by scanning electron microscopy (SEM) shows the size of n-PANI decreasing from 50-60 nm to 20-30 nm with increasing homogenizer speed. This study indicates that the optimum homogenizer speed plays a role in reducing the nanostructure size and thus increasing the electrical conductivity of n-PANI.

  7. Probing cosmology with the homogeneity scale of the Universe through large scale structure surveys

    International Nuclear Information System (INIS)

    Ntelis, Pierros

    2017-01-01

    This thesis presents my contribution to the measurement of the homogeneity scale using galaxies, together with the cosmological interpretation of the results. In physics, any model is characterized by a set of principles. Most models in cosmology are based on the Cosmological Principle, which states that the universe is statistically homogeneous and isotropic on large scales. Today, this principle is considered to be true since it is respected by those cosmological models that accurately describe the observations. However, while the isotropy of the universe is now confirmed by many experiments, this is not the case for homogeneity. To study cosmic homogeneity, we propose not only to test a model but to test directly one of the postulates of modern cosmology. Since the 1998 measurements of cosmic distances using type Ia supernovae, we have known that the universe is in a phase of accelerated expansion. This phenomenon can be explained by the addition of an unknown energy component, which is called dark energy. Since dark energy is responsible for the expansion of the universe, we can study this mysterious fluid by measuring the rate of expansion of the universe. The universe has imprinted in its matter distribution a standard ruler, the Baryon Acoustic Oscillation (BAO) scale. By measuring this scale at different times during the evolution of our universe, it is possible to measure the rate of expansion of the universe and thus characterize the dark energy. Alternatively, we can use the homogeneity scale to study dark energy. Studying the homogeneity and the BAO scale requires the statistical study of the matter distribution of the universe on large scales, greater than tens of megaparsecs. Galaxies and quasars form in the vast overdensities of matter and are very luminous: these sources trace the distribution of matter. 
By measuring the emission spectra of these sources using large spectroscopic surveys, such as BOSS and eBOSS, we can measure their positions

  8. Homogenization Kinetics of a Nickel-based Superalloy Produced by Powder Bed Fusion Laser Sintering.

    Science.gov (United States)

    Zhang, Fan; Levine, Lyle E; Allen, Andrew J; Campbell, Carelyn E; Lass, Eric A; Cheruvathur, Sudha; Stoudt, Mark R; Williams, Maureen E; Idell, Yaakov

    2017-04-01

    Additively manufactured (AM) metal components often exhibit fine dendritic microstructures and elemental segregation due to the initial rapid solidification and subsequent melting and cooling during the build process, which without homogenization would adversely affect materials performance. In this letter, we report in situ observation of the homogenization kinetics of an AM nickel-based superalloy using synchrotron small angle X-ray scattering. The identified kinetic time scale is in good agreement with thermodynamic diffusion simulation predictions using microstructural dimensions acquired by ex situ scanning electron microscopy. These findings could serve as a recipe for predicting, observing, and validating homogenization treatments in AM materials.
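The kinetic time scale discussed here is often estimated, to first order, from the decay of a single sinusoidal harmonic of the interdendritic segregation profile. The sketch below uses that textbook single-harmonic model with illustrative numbers; it is not the paper's diffusion simulations or measured data:

```python
import math

def homogenization_time(arm_spacing, diffusivity, residual=0.01):
    """Time for a sinusoidal composition profile of wavelength `arm_spacing`
    to decay to fraction `residual` of its initial amplitude, from
    delta(t)/delta0 = exp(-4*pi**2 * D * t / L**2) (single-harmonic model)."""
    return -arm_spacing**2 * math.log(residual) / (4 * math.pi**2 * diffusivity)

# Illustrative values only: 1 um secondary dendrite arm spacing and
# D = 1e-16 m^2/s for a slowly diffusing substitutional solute
t = homogenization_time(1e-6, 1e-16)
hours = t / 3600.0
```

Because the time scales with the square of the arm spacing, the fine dendritic microstructures typical of AM parts homogenize far faster than coarse cast structures at the same temperature, which is consistent with the short treatments the letter reports.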

  9. Photo-electret effects in homogenous semiconductors

    International Nuclear Information System (INIS)

    Nabiev, G.A.

    2004-01-01

    This work demonstrates the possibility of, and develops the theory for, a photo-electret state in semiconductors with a Dember mechanism of photo-voltage generation. A photo-electret of this type can be created, unlike the traditional kind, without an external field, as a result of illumination alone; the polar factor in this case is the difference between electron and hole mobilities. A multilayered structure is considered, with homogeneous photoactive micro-regions separated by layers that hinder the equalization of carrier concentrations. The homogeneous photoactive regions are assumed to contain deep trapping levels. The addition of the elementary photo-voltages of the separate micro photocells produces an anomalously large photo-voltage (APV effect); note that the Dember photo-voltage in a single micro photocell is ≤kT/q. From the derived expressions, in the practically important special case where quasi-equilibrium between the valence band and the trapping levels is established in a time much smaller than the free-hole lifetime, the photo-voltage is found to relax. Comparison of these expressions with the laws of photo-voltage decay in p-n-junction structures shows their identity; the difference lies only in the absolute values of the photo-voltage. During illumination, an excess concentration of charge carriers is created in the semiconductor and some of the carriers remain at the deep levels; when the light is switched off, the carriers located at these levels are gradually released

  10. Constructing Bridges between Computational Tools in Heterogeneous and Homogeneous Catalysis

    KAUST Repository

    Falivene, Laura; Kozlov, Sergey M.; Cavallo, Luigi

    2018-01-01

    Better catalysts are needed to address numerous challenges faced by humanity. In this perspective, we review concepts and tools in theoretical and computational chemistry that can help to accelerate the rational design of homogeneous and heterogeneous catalysts. In particular, we focus on the following three topics: 1) identification of key intermediates and transition states in a reaction using the energetic span model, 2) disentanglement of factors influencing the relative stability of the key species using energy decomposition analysis and the activation strain model, and 3) discovery of new catalysts using volcano relationships. To facilitate wider use of these techniques across different areas, we illustrate their potentials and pitfalls when applied to the study of homogeneous and heterogeneous catalysts.

  11. Constructing Bridges between Computational Tools in Heterogeneous and Homogeneous Catalysis

    KAUST Repository

    Falivene, Laura

    2018-05-08

    Better catalysts are needed to address numerous challenges faced by humanity. In this perspective, we review concepts and tools in theoretical and computational chemistry that can help to accelerate the rational design of homogeneous and heterogeneous catalysts. In particular, we focus on the following three topics: 1) identification of key intermediates and transition states in a reaction using the energetic span model, 2) disentanglement of factors influencing the relative stability of the key species using energy decomposition analysis and the activation strain model, and 3) discovery of new catalysts using volcano relationships. To facilitate wider use of these techniques across different areas, we illustrate their potentials and pitfalls when applied to the study of homogeneous and heterogeneous catalysts.

  12. A comparison of non-homogeneous Markov regression models with application to Alzheimer’s disease progression

    Science.gov (United States)

    Hubbard, R. A.; Zhou, X.H.

    2011-01-01

    Markov regression models are useful tools for estimating the impact of risk factors on rates of transition between multiple disease states. Alzheimer’s disease (AD) is an example of a multi-state disease process in which great interest lies in identifying risk factors for transition. In this context, non-homogeneous models are required because transition rates change as subjects age. In this report we propose a non-homogeneous Markov regression model that allows for reversible and recurrent disease states, transitions among multiple states between observations, and unequally spaced observation times. We conducted simulation studies to demonstrate performance of estimators for covariate effects from this model and compare performance with alternative models when the underlying non-homogeneous process was correctly specified and under model misspecification. In simulation studies, we found that covariate effects were biased if non-homogeneity of the disease process was not accounted for. However, estimates from non-homogeneous models were robust to misspecification of the form of the non-homogeneity. We used our model to estimate risk factors for transition to mild cognitive impairment (MCI) and AD in a longitudinal study of subjects included in the National Alzheimer’s Coordinating Center’s Uniform Data Set. Using our model, we found that subjects with MCI affecting multiple cognitive domains were significantly less likely to revert to normal cognition. PMID:22419833
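A minimal sketch of a non-homogeneous (age-dependent) continuous-time Markov model of the kind described, with a reversible MCI state and an absorbing AD state: transition probabilities over an interval are built as a product of short piecewise-constant steps. The generator, rates, and ages below are hypothetical illustrations, not estimates from the Uniform Data Set:

```python
import numpy as np

def expm(A, terms=30):
    """Matrix exponential via truncated Taylor series (adequate for small Q*dt)."""
    out = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        out = out + term
    return out

def transition_probability(q_of_age, age0, t, n_steps=100):
    """P(state at age0 -> state at age0+t) for a non-homogeneous chain with
    age-dependent generator q_of_age, approximated by a time-ordered product
    of matrix exponentials over short piecewise-constant intervals."""
    dt = t / n_steps
    P = np.eye(q_of_age(age0).shape[0])
    for i in range(n_steps):
        P = P @ expm(q_of_age(age0 + (i + 0.5) * dt) * dt)
    return P

# States: 0 = normal cognition, 1 = MCI, 2 = AD (absorbing).
# Hypothetical rates that grow with age, in the spirit of the model only.
def q_of_age(age):
    r01 = 0.02 * np.exp(0.05 * (age - 65))   # normal -> MCI
    r10 = 0.01                               # MCI -> normal (reversion)
    r12 = 0.05 * np.exp(0.05 * (age - 65))   # MCI -> AD
    return np.array([[-r01, r01, 0.0],
                     [r10, -(r10 + r12), r12],
                     [0.0, 0.0, 0.0]])

P = transition_probability(q_of_age, age0=70.0, t=5.0)
# Each row of P sums to 1; the AD row stays in the absorbing state
```

Fitting a homogeneous model to data generated this way would average the age-varying rates over the observation window, which is one way to picture the bias the simulation studies report.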

  13. Solidification Segregation and Homogenization Behavior of 1Cr-1.25Mo-0.25V Steel Ingot

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong-Bae [Dae-gu Mechatronics and Materials Institute, Daegu (Korea, Republic of); Na, Young-Sang; Seo, Seong-Moon [Korea Institute of Materials Science, Changwon (Korea, Republic of); Lee, Je-Hyun [Changwon National University, Changwon (Korea, Republic of)

    2016-09-15

    As a first step toward optimizing the homogenization heat treatment following high-temperature upset forging, the solidification segregation and the homogenization behaviors of solute elements were quantitatively analyzed for a 1Cr-1.25Mo-0.25V steel ingot by electron probe micro-analysis (EPMA). A random sampling approach, designed to generate continuous compositional profiles of each solute element, was employed to clarify the segregation and homogenization behaviors. In addition, the casting of a lab-scale ingot and of a 16-ton 1Cr-1.25Mo-0.25V steel ingot was simulated using the finite element method in three dimensions to understand the effect of ingot size on the microsegregation and its reduction during the homogenization heat treatment. It was found that the microsegregation in a large-sized ingot was significantly reduced by the promotion of solid-state diffusion due to the extremely low cooling rate. On the other hand, from the homogenization point of view, increasing the ingot size causes a dramatic increase in the dendrite arm spacing, and hence the homogenization of microsegregation in a large-sized ingot appears to be practically difficult.

  14. Unified double- and single-sided homogeneous Green’s function representations

    Science.gov (United States)

    van der Neut, Joost; Slob, Evert

    2016-01-01

    In wave theory, the homogeneous Green’s function consists of the impulse response to a point source, minus its time-reversal. It can be represented by a closed boundary integral. In many practical situations, the closed boundary integral needs to be approximated by an open boundary integral because the medium of interest is often accessible from one side only. The inherent approximations are acceptable as long as the effects of multiple scattering are negligible. However, in case of strongly inhomogeneous media, the effects of multiple scattering can be severe. We derive double- and single-sided homogeneous Green’s function representations. The single-sided representation applies to situations where the medium can be accessed from one side only. It correctly handles multiple scattering. It employs a focusing function instead of the backward propagating Green’s function in the classical (double-sided) representation. When reflection measurements are available at the accessible boundary of the medium, the focusing function can be retrieved from these measurements. Throughout the paper, we use a unified notation which applies to acoustic, quantum-mechanical, electromagnetic and elastodynamic waves. We foresee many interesting applications of the unified single-sided homogeneous Green’s function representation in holographic imaging and inverse scattering, time-reversed wave field propagation and interferometric Green’s function retrieval. PMID:27436983
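The opening definition can be written compactly as below; this is a sketch of the standard notation, and the sign convention varies between the wave phenomena that the paper's unified notation covers:

```latex
G_h(\mathbf{x},\mathbf{x}_A,t) = G(\mathbf{x},\mathbf{x}_A,t) - G(\mathbf{x},\mathbf{x}_A,-t),
\qquad
\hat{G}_h(\mathbf{x},\mathbf{x}_A,\omega) = \hat{G} - \hat{G}^{*} = 2\mathrm{i}\,\Im\{\hat{G}\} .
```

The classical (double-sided) representation then expresses $G_h$ between two interior points as an integral over a closed boundary enclosing them, which is exactly the integral that single-sided acquisition forces one to truncate.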

  15. Engineered CHO cells for production of diverse, homogeneous glycoproteins

    DEFF Research Database (Denmark)

    Yang, Zhang; Wang, Shengjun; Halim, Adnan

    2015-01-01

    Production of glycoprotein therapeutics in Chinese hamster ovary (CHO) cells is limited by the cells' generic capacity for N-glycosylation, and production of glycoproteins with desirable homogeneous glycoforms remains a challenge. We conducted a comprehensive knockout screen of glycosyltransferas...

  16. Multigrid Finite Element Method in Calculation of 3D Homogeneous and Composite Solids

    Directory of Open Access Journals (Sweden)

    A.D. Matveev

    2016-12-01

    Full Text Available In the present paper, a method of multigrid finite elements to calculate elastic three-dimensional homogeneous and composite solids under static loading has been suggested. The method has been developed based on the finite element method algorithms using homogeneous and composite three-dimensional multigrid finite elements (MFE. The procedures for construction of MFE of both rectangular parallelepiped and complex shapes have been shown. The advantages of MFE are that they take into account, following the rules of the microapproach, heterogeneous and microhomogeneous structures of the bodies, describe the three-dimensional stress-strain state (without any simplifying hypotheses in homogeneous and composite solids, as well as generate small dimensional discrete models and numerical solutions with a high accuracy.

  17. Controllable synthesis of nickel bicarbonate nanocrystals with high homogeneity for a high-performance supercapacitor

    Science.gov (United States)

    Gu, Jianmin; Liu, Xin; Wang, Zhuang; Bian, Zhenpan; Jin, Cuihong; Sun, Xiao; Yin, Baipeng; Wu, Tianhui; Wang, Lin; Tang, Shoufeng; Wang, Hongchao; Gao, Faming

    2017-08-01

    The electrochemical performance of supercapacitors might be associated with the homogeneous structure of the electrode materials. However, the relationship between the degree of uniformity of the electrode materials and the electrochemical performance of the supercapacitor is not clear. Herein, we synthesize two types of nickel bicarbonate nanocrystals with different degrees of uniformity to investigate this relationship. As the electroactive material, the nickel bicarbonate nanocrystals with a homogeneous structure could provide a larger space and offer more exposed atoms for the electrochemical reaction than the nanocrystals with a heterogeneous structure. The homogeneous nickel bicarbonate nanocrystals exhibit better electrochemical performance and show excellent specific capacitance (1596 F g-1 at 2 A g-1 and 1260 F g-1 at 30 A g-1), which is approximately twice that of the heterogeneous nickel bicarbonate nanocrystals. The cycling stability of the homogeneous nanocrystals (˜80%) is also higher than that of the heterogeneous ones (˜61%) at a high current density of 5 A g-1.

  18. Homogenate extraction of crude polysaccharides from Lentinus edodes and evaluation of the antioxidant activity

    Directory of Open Access Journals (Sweden)

    Leqin KE

    2016-01-01

    Full Text Available Crude polysaccharides of Lentinus edodes were extracted using a homogenate method. Factors affecting the yield of crude polysaccharides were investigated and optimized by response surface methodology. The homogenate extraction method was compared with the traditional heating extraction method, and the antioxidant activity of the crude polysaccharides from Lentinus edodes was evaluated. Results showed that the optimal conditions of homogenate extraction were as follows: solvent pH, 10; liquid-to-solid ratio, 30:1 (mL:g); extraction time, 66 s; number of extractions, 1. Under these conditions, the yield of crude polysaccharides was (13.2 ± 0.9)%, which was 29.82% higher than that of traditional heating extraction. Crude polysaccharides of Lentinus edodes had good DPPH scavenging activity. Compared with the traditional heating extraction, the homogenate extraction had notable advantages, including good extraction yield, short extraction time and low extraction temperature. It is an efficient way to extract crude polysaccharides from Lentinus edodes.

  19. Layer potentials, Kac's problem, and refined Hardy inequality on homogeneous Carnot groups

    OpenAIRE

    Ruzhansky, Michael; Suragan, Durvudkhan

    2017-01-01

    We propose the analogues of boundary layer potentials for the sub-Laplacian on homogeneous Carnot groups/stratified Lie groups and prove continuity results for them. In particular, we show continuity of the single layer potential and establish the Plemelj type jump relations for the double layer potential. We prove sub-Laplacian adapted versions of the Stokes theorem as well as of Green's first and second formulae on homogeneous Carnot groups. Several applications to boundary value problems a...

  20. Tensor harmonic analysis on homogenous space

    International Nuclear Information System (INIS)

    Wrobel, G.

    1997-01-01

    The Hilbert space of tensor functions on a homogeneous space with a compact stability group is considered. The functions are decomposed into a sum of tensor plane waves (defined in the text), the components of which are transformed by irreducible representations of the appropriate transformation group. The orthogonality relation and the completeness relation for tensor plane waves are found. The decomposition constitutes a unitary transformation, which allows one to obtain the Parseval equality. The Fourier components can be calculated by means of the Fourier transformation, the form of which is given explicitly. (author)

  1. Microsegregation of heat and homogenization treatments in uranium-niobium alloys (U-Nb)

    International Nuclear Information System (INIS)

    Leal, J.F.

    1988-01-01

    In the following sections, microsegregation results for U-3.6 wt% Nb and U-6.1 wt% Nb alloys cast in a nonconsumable-electrode arc furnace are presented. The microsegregation is studied qualitatively by optical microscopy and quantitatively by electron microprobe. The degree of homogenization has been measured after heat treatments at 800 and 850 °C in a tubular resistive furnace. The microstructures after the heat treatments are quantitatively analysed to check the effects on the casting structures, mainly the variations in solute along the dendrite arm spacing. Some solidification phenomena are then discussed with reference to theoretical models of dendritic solidification, including microstructure and microsegregation. The experimental results are compared to theory on the basis of the initial and residual microsegregation after the homogenization treatments. The times required for homogenization of the alloys are also discussed as a function of the microsegregation of the casting structures and the temperatures of the treatments. (author) [pt

  2. Homogenization and Optimal Control S. Kesavan The Institute of ...

    Indian Academy of Sciences (India)

    Homogenization permits us to study the global behaviour of heterogeneous bodies with a lot of heterogeneities whose dimensions are small compared to those of the body. • It describes the macroscopic behaviour of systems with a fine microstructure.

  3. Technical Note: Homogeneity of Gafchromic EBT2 film

    International Nuclear Information System (INIS)

    Hartmann, Bernadette; Martisikova, Maria; Jaekel, Oliver

    2010-01-01

    Purpose: The self-developing Gafchromic EBT film is a radiochromic film, widely used for relative photon dosimetry. Recently, the manufacturer has replaced the well-investigated EBT film by the new Gafchromic EBT2 film. It has the same sensitive component and, in addition, it contains a yellow marker dye in order to protect the film against ambient light exposure and to serve as a base for corrections of small differences in film response. Furthermore, the configuration of the film layers as well as the binder material have been changed in comparison to the EBT film. When investigating the properties of EBT2 film, all characteristics were found to be similar to those of EBT film, except for the film response homogeneity. Thus, in this article special focus was put on examining the homogeneity of EBT2 film. Methods: A scan protocol established for EBT film and published previously was used. The uniformity of the film coloration was investigated for unirradiated and irradiated EBT2 film sheets. The dose response of EBT2 film was measured and the influence of film inhomogeneities on dose determination was evaluated. Results: Inhomogeneities in pixel values of up to ±3.7% within one film were detected. The relative inhomogeneities were found to be approximately independent of the dose. Nonuniformities of the film response lead to uncertainties in dose determination of ±8.7% at 1 Gy. When using net optical densities for dose calibration, uncertainties in dose determination amount to more than ±6%. Conclusions: EBT2 films from the lot investigated in this study show response inhomogeneities, which lead to uncertainties in dose determination exceeding the commonly accepted tolerance levels. It is important to test further EBT2 lots regarding homogeneity before using the film in clinical routine.

  4. Autoregressive Processes in Homogenization of GNSS Tropospheric Data

    Science.gov (United States)

    Klos, A.; Bogusz, J.; Teferle, F. N.; Bock, O.; Pottiaux, E.; Van Malderen, R.

    2016-12-01

    Offsets due to changes in hardware equipment or other artificial events are all addressed in the homogenization of tropospheric data estimated in the processing of Global Navigation Satellite System (GNSS) observables. This task aims at identifying the exact epochs of offsets and estimating their magnitudes, since offsets may artificially under- or over-estimate the trend, and its uncertainty, delivered from tropospheric data and used in climate studies. In this research, we analysed a common data set of differences of Integrated Water Vapour (IWV) from GPS and ERA-Interim (1995-2010) provided for a homogenization group working within the ES1206 COST Action GNSS4SWEC. We analysed the daily IWV records of GPS and ERA-Interim in terms of trend, seasonal terms and noise model with Maximum Likelihood Estimation in the Hector software, and found that these data have the character of an autoregressive (AR) process. Based on this analysis, we performed Monte Carlo simulations of 25-year-long data with two different noise types, white noise as well as a combination of white and autoregressive noise, and also added a few strictly defined offsets. This synthetic data set, of exactly the same character as the IWV differences between GPS and ERA-Interim, was then subjected to manual and automatic/statistical homogenization. We performed blind tests and detected possible epochs of offsets manually. We found that the simulated offsets were easily detected in the series with white noise, and no influence of the seasonal signal was noticed. The autoregressive series were much more problematic when offsets had to be determined: we found a few epochs for which no offset had been simulated, mainly due to the strong autocorrelation of the data, which introduces an artificial trend. Because of the regime-like behaviour of AR processes, it is difficult for statistical methods to properly detect the epochs of offsets, as previously reported by climatologists.
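Why autoregressive noise hampers offset detection can be illustrated with a toy simulation: an AR(1) series with a single step offset, tested with a naive white-noise mean-shift statistic. All parameters below (phi, sigma, offset size, epoch) are made up for illustration and are not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1_with_offset(n=9000, phi=0.6, sigma=1.0, offset=1.5, epoch=4500):
    """Daily IWV-difference-like series: AR(1) noise plus one step offset."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    x[epoch:] += offset
    return x

def mean_shift_statistic(x, k):
    """Two-sample t-like statistic for a step at epoch k, using the
    white-noise formula. With AR(1) noise the effective sample size shrinks
    by roughly (1 - phi) / (1 + phi), so this statistic is overconfident,
    which is one reason offsets are harder to locate in autocorrelated data."""
    a, b = x[:k], x[k:]
    s = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    return (b.mean() - a.mean()) / s

x = simulate_ar1_with_offset()
stat = mean_shift_statistic(x, 4500)
```

Scanning `mean_shift_statistic` over all candidate epochs of a pure AR(1) series (no offset at all) also produces large excursions, mimicking the spurious change points the blind tests found.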

  5. Homogeneity of Moral Judgment? Apprentices Solving Business Conflicts.

    Science.gov (United States)

    Beck, Klaus; Heinrichs, Karin; Minnameier, Gerhard; Parche-Kawik, Kirsten

    In an ongoing longitudinal study that started in 1994, the moral development of business apprentices is being studied. The focal point of this project is a critical analysis of L. Kohlberg's thesis of homogeneity, according to which people should judge every moral issue from the point of view of their "modal" stage (the most frequently…

  6. A characterization of Markovian homogeneous multicomponent Gaussian fields

    International Nuclear Information System (INIS)

    Ekhaguere, G.O.S.

    1980-01-01

    Necessary and sufficient conditions are given for a certain class of homogeneous multicomponent Gaussian generalized stochastic fields to possess a Markov property equivalent to Nelson's. The class of Markov fields so characterized has as a subclass the class of Markov fields which lead, by Nelson's Reconstruction Theorem, to some covariant (free) quantum fields. (orig.)

  7. Multiscale models in mechano and tumor biology modeling, homogenization, and applications

    CERN Document Server

    Penta, Raimondo; Lang, Jens

    2017-01-01

    This book presents and discusses the state of the art and future perspectives in mathematical modeling and homogenization techniques with the focus on addressing key physiological issues in the context of multiphase healthy and malignant biological materials. The highly interdisciplinary content brings together contributions from scientists with complementary areas of expertise, such as pure and applied mathematicians, engineers, and biophysicists. The book also features the lecture notes from a half-day introductory course on asymptotic homogenization. These notes are suitable for undergraduate mathematics or physics students, while the other chapters are aimed at graduate students and researchers.

  8. Primary healthcare solo practices: homogeneous or heterogeneous?

    Science.gov (United States)

    Pineault, Raynald; Borgès Da Silva, Roxane; Provost, Sylvie; Beaulieu, Marie-Dominique; Boivin, Antoine; Couture, Audrey; Prud'homme, Alexandre

    2014-01-01

    Introduction. Solo practices have generally been viewed as forming a homogeneous group. However, they may differ on many characteristics. The objective of this paper is to identify different forms of solo practice and to determine the extent to which they are associated with patient experience of care. Methods. Two surveys were carried out in two regions of Quebec in 2010: a telephone survey of 9180 respondents from the general population and a postal survey of 606 primary healthcare (PHC) practices. Data from the two surveys were linked through the respondent's usual source of care. A taxonomy of solo practices was constructed (n = 213), using cluster analysis techniques. Bivariate and multilevel analyses were used to determine the relationship of the taxonomy with patient experience of care. Results. Four models were derived from the taxonomy. Practices in the "resourceful networked" model contrast with those of the "resourceless isolated" model to the extent that the experience of care reported by their patients is more favorable. Conclusion. Solo practice is not a homogeneous group. The four models identified have different organizational features and their patients' experience of care also differs. Some models seem to offer a better organizational potential in the context of current reforms.

  9. Homogenization Theory for the Prediction of Obstructed Solute Diffusivity in Macromolecular Solutions.

    Science.gov (United States)

    Donovan, Preston; Chehreghanianzabi, Yasaman; Rathinam, Muruhan; Zustiak, Silviya Petrova

    2016-01-01

    The study of diffusion in macromolecular solutions is important in many biomedical applications such as separations, drug delivery, and cell encapsulation, and key for many biological processes such as protein assembly and interstitial transport. Not surprisingly, multiple models for the a-priori prediction of diffusion in macromolecular environments have been proposed. However, most models include parameters that are not readily measurable, are specific to the polymer-solute-solvent system, or are fitted and do not have a physical meaning. Here, for the first time, we develop a homogenization theory framework for the prediction of effective solute diffusivity in macromolecular environments based on physical parameters that are easily measurable and not specific to the macromolecule-solute-solvent system. Homogenization theory is useful for situations where knowledge of fine-scale parameters is used to predict bulk system behavior. As a first approximation, we focus on a model where the solute is subjected to obstructed diffusion via stationary spherical obstacles. We find that the homogenization theory results agree well with computationally more expensive Monte Carlo simulations. Moreover, the homogenization theory agrees with effective diffusivities of a solute in dilute and semi-dilute polymer solutions measured using fluorescence correlation spectroscopy. Lastly, we provide a mathematical formula for the effective diffusivity in terms of a non-dimensional and easily measurable geometric system parameter.

  10. Homogenization Theory for the Prediction of Obstructed Solute Diffusivity in Macromolecular Solutions.

    Directory of Open Access Journals (Sweden)

    Preston Donovan

    Full Text Available The study of diffusion in macromolecular solutions is important in many biomedical applications such as separations, drug delivery, and cell encapsulation, and key for many biological processes such as protein assembly and interstitial transport. Not surprisingly, multiple models for the a-priori prediction of diffusion in macromolecular environments have been proposed. However, most models include parameters that are not readily measurable, are specific to the polymer-solute-solvent system, or are fitted and do not have a physical meaning. Here, for the first time, we develop a homogenization theory framework for the prediction of effective solute diffusivity in macromolecular environments based on physical parameters that are easily measurable and not specific to the macromolecule-solute-solvent system. Homogenization theory is useful for situations where knowledge of fine-scale parameters is used to predict bulk system behavior. As a first approximation, we focus on a model where the solute is subjected to obstructed diffusion via stationary spherical obstacles. We find that the homogenization theory results agree well with computationally more expensive Monte Carlo simulations. Moreover, the homogenization theory agrees with effective diffusivities of a solute in dilute and semi-dilute polymer solutions measured using fluorescence correlation spectroscopy. Lastly, we provide a mathematical formula for the effective diffusivity in terms of a non-dimensional and easily measurable geometric system parameter.

  11. Heat pipes for temperature homogenization: A literature review

    International Nuclear Information System (INIS)

    Blet, Nicolas; Lips, Stéphane; Sartre, Valérie

    2017-01-01

    Highlights: • This paper is a review based on more than sixty references. • The review is sorted into various application fields. • Quantitative values of thermal gradients are compared with and without heat pipes. • Specificities of the reviewed heat pipes are compared with other heat pipe functions. - Abstract: Heat pipes offer high effective heat transfer in a purely passive way. Beyond heat transfer, specific functions such as temperature homogenization can also be achieved. In this paper, a literature review is carried out to survey existing heat pipe systems whose main aim is the reduction of temperature gradients. The review, which gathers more than sixty references, is sorted into various application fields, such as the thermal management of electronics, storage vessels or satellites, for which temperature-uniformity requirements differ in isothermal surface area, temperature range or the targeted precision of temperature flattening. A summary of heat pipe characteristics for this temperature-homogenization function is then given to identify their specificities compared with other applications of heat pipes.

  12. Homogeneous instantons in bigravity

    International Nuclear Information System (INIS)

    Zhang, Ying-li; Sasaki, Misao; Yeom, Dong-han

    2015-01-01

    We study homogeneous gravitational instantons, conventionally called the Hawking-Moss (HM) instantons, in bigravity theory. The HM instantons describe the amplitude of quantum tunneling from a false vacuum to the true vacuum. Corrections to General Relativity (GR) are found in a closed form. Using the result, we discuss the following two issues: reduction to the de Rham-Gabadadze-Tolley (dRGT) massive gravity and the possibility of preference for a large e-folding number in the context of the Hartle-Hawking (HH) no-boundary proposal. In particular, concerning the dRGT limit, it is found that the tunneling through the so-called self-accelerating branch is exponentially suppressed relative to the normal branch, and the probability becomes zero in the dRGT limit. As far as HM instantons are concerned, this could imply that the reduction from bigravity to the dRGT massive gravity is ill-defined.

  13. Impact of homogenization of pasteurized human milk on gastric digestion in the preterm infant: A randomized controlled trial.

    Science.gov (United States)

    de Oliveira, Samira C; Bellanger, Amandine; Ménard, Olivia; Pladys, Patrick; Le Gouar, Yann; Henry, Gwénaële; Dirson, Emelyne; Rousseau, Florence; Carrière, Frédéric; Dupont, Didier; Bourlieu, Claire; Deglaire, Amélie

    2017-08-01

    It has been suggested that homogenization of Holder-pasteurized human milk (PHM) could improve fat absorption and weight gain in preterm infants, but the impact on PHM digestive kinetics has never been studied. Our objective was to determine the impact of PHM homogenization on gastric digestion in preterm infants. In a randomized controlled trial, eight hospitalized tube-fed preterm infants were their own control to compare the gastric digestion of PHM and of homogenized PHM (PHHM). PHM was obtained from donors and, for half of it, was homogenized by ultrasonication. Over a six-day sequence, gastric aspirates were collected twice a day, before and 35, 60 or 90 min after the start of PHM or PHHM ingestion. The impact of homogenization on PHM digestive kinetics and disintegration was tested using a general linear mixed model. Results were expressed as means ± SD. Homogenization led to a six-fold increase in the specific surface area, increased the gastric lipolysis level and enhanced the proteolysis of serum albumin. In summary, homogenization of PHM increased the gastric lipolysis level. This could be a potential strategy to improve fat absorption, and thus growth and development, in infants fed with PHM; however, its gastrointestinal tolerance needs to be investigated further. This trial was registered at clinicaltrials.gov as NCT02112331. Copyright © 2017 European Society for Clinical Nutrition and Metabolism. Published by Elsevier Ltd. All rights reserved.

  14. Mitoxantrone removal by electrochemical method: A comparison of homogenous and heterogenous catalytic reactions

    Directory of Open Access Journals (Sweden)

    Abbas Jafarizad

    2017-08-01

    Full Text Available Background: Mitoxantrone (MXT) is a drug for cancer therapy and a pharmaceutical hazardous to the environment, which must be removed from contaminated waste streams. In this work, the removal of MXT by the electro-Fenton process over heterogeneous and homogeneous catalysts is reported. Methods: The effects of the operational conditions (reaction medium pH, catalyst concentration and applied current intensity) were studied. The electrodes used were carbon cloth without any processing (CC; homogeneous process), graphene oxide-coated carbon cloth (GO/CC; homogeneous process) and Fe3O4@GO nanocomposite-coated carbon cloth (Fe3O4@GO/CC; heterogeneous process). The characteristic properties of the electrodes were determined by atomic force microscopy (AFM), field emission scanning electron microscopy (FE-SEM) and cathode polarization. MXT concentrations were determined using an ultraviolet-visible (UV-Vis) spectrophotometer. Results: In the homogeneous reaction, a high concentration of Fe catalyst (>0.2 mM) decreased the MXT degradation rate. The results showed that the Fe3O4@GO/CC electrode provided the largest contact surface. The optimum operational conditions were pH 3.0 and a current intensity of 450 mA, which resulted in the highest removal efficiency (96.9%) over the Fe3O4@GO/CC electrode in the heterogeneous process, compared with the other two electrodes in the homogeneous process. The kinetics of MXT degradation followed a pseudo-first-order reaction. Conclusion: The results confirmed the high potential of the developed method to purify wastewaters contaminated by MXT.

  15. A Test for Parameter Homogeneity in CO2 Panel EKC Estimations

    Energy Technology Data Exchange (ETDEWEB)

    Dijkgraaf, E. [Erasmus University Rotterdam and SEOR, Rotterdam (Netherlands); Vollebergh, H.R.J. [Department of Economics, Erasmus University Rotterdam, PO Box 1738, 3000 DR Rotterdam (Netherlands)

    2005-10-15

    This paper casts doubt on empirical results based on panel estimations of an 'inverted-U' relationship between per capita GDP and pollution. Using a new dataset for OECD countries on carbon dioxide emissions for the period 1960-1997, we find that the crucial assumption of homogeneity across countries is problematic. Decisively rejected are model specifications that feature even weaker homogeneity assumptions than are commonly used. Furthermore, our results challenge the existence of an overall Environmental Kuznets Curve for carbon dioxide emissions.

  16. CO2-assisted high pressure homogenization: a solvent-free process for polymeric microspheres and drug-polymer composites.

    Science.gov (United States)

    Kluge, Johannes; Mazzotti, Marco

    2012-10-15

    The study explores the enabling role of near-critical CO2 as a reversible plasticizer in the high pressure homogenization of polymer particles, aiming at their comminution as well as at the formation of drug-polymer composites. First, the effect of near-critical CO2 on the homogenization of aqueous suspensions of poly(lactic-co-glycolic acid) (PLGA) was investigated. Applying a pressure drop of 900 bar and up to 150 passes across the homogenizer, it was found that particles processed in the presence of CO2 were generally of microspherical morphology and at all times significantly smaller than those obtained in the absence of a plasticizer. The smallest particles, exhibiting a median x50 of 1.3 μm, were obtained by adding a small quantity of ethyl acetate, which exerts an additional plasticizing effect on PLGA during the homogenization step. Further, the study concerns the possibility of forming drug-polymer composites through simultaneous high pressure homogenization of the two relevant solids, and particularly the effect of near-critical CO2 on this process. Therefore, PLGA was homogenized together with crystalline S-ketoprofen (S-KET), a non-steroidal anti-inflammatory drug, at a drug-to-polymer ratio of 1:10, a pressure drop of 900 bar and up to 150 passes across the homogenizer. When the process was carried out in the presence of CO2, an impregnation efficiency of 91% was reached, corresponding to 8.3 wt.% of S-KET in PLGA; moreover, the composite particles were of microspherical morphology and significantly smaller than those obtained in the absence of CO2. The formation of drug-polymer composites through simultaneous homogenization of the two materials is thus greatly enhanced by the presence of CO2, which increases the efficiency of both homogenization and impregnation. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. Some low homogenization pressures improve certain probiotic characteristics of yogurt culture bacteria and Lactobacillus acidophilus LA-K.

    Science.gov (United States)

    Muramalla, T; Aryana, K J

    2011-08-01

    Lactobacillus delbrueckii ssp. bulgaricus, Streptococcus salivarius ssp. thermophilus, and Lactobacillus acidophilus are dairy cultures widely used in the manufacture of cultured dairy products. Commonly used homogenization pressures in the dairy industry are 13.80 MPa or less. It is not known whether low homogenization pressures can stimulate bacteria to improve their probiotic characteristics. Objectives were to determine the effect of homogenization at 0, 3.45, 6.90, 10.34, and 13.80 MPa on acid tolerance, bile tolerance, protease activity, and growth of L. delbrueckii ssp. bulgaricus LB-12, S. salivarius ssp. thermophilus ST-M5, and L. acidophilus LA-K. The cultures were individually inoculated in cool autoclaved skim milk (4°C) and homogenized for 5 continuous passes. Growth and bile tolerance of samples were determined hourly for 10h of incubation. Acid tolerance was determined every 20 min for 120 min of incubation. Protease activity was determined at 0, 12, and 24h of incubation. All homogenization pressures studied improved acid tolerance of L. delbrueckii ssp. bulgaricus LB-12 but had no beneficial effect on protease activity and had negative effects on growth and bile tolerance. A pressure of 6.90 MPa improved acid tolerance, bile tolerance, and protease activity of S. salivarius ssp. thermophilus ST-M5, but none of the homogenization pressures studied had an effect on its growth. Homogenization pressures of 13.80 and 6.90 MPa improved acid tolerance and bile tolerance, respectively, of L. acidophilus LA-K but had no effect on protease activity and its growth. Some low homogenization pressures positively influenced some characteristics of yogurt culture bacteria and L. acidophilus LA-K. Culture pretreatment with some low homogenization pressures can be recommended for improvement of certain probiotic characteristics. Copyright © 2011 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  18. Homogenization of a thermo-diffusion system with Smoluchowski interactions

    NARCIS (Netherlands)

    Krehel, O.; Aiki, T.; Muntean, A.

    2014-01-01

    We study the solvability and homogenization of a thermal-diffusion reaction problem posed in a periodically perforated domain. The system describes the motion of populations of hot colloidal particles interacting together via Smoluchowski production terms. The upscaled system is obtained via two-scale convergence.

  19. Homogenization of monthly precipitation time series in Croatia

    Czech Academy of Sciences Publication Activity Database

    Zahradníček, Pavel; Rasol, D.; Cindric, K.; Štěpánek, Petr

    2014-01-01

    Roč. 34, č. 14 (2014), s. 3671-3682 ISSN 0899-8418 R&D Projects: GA MŠk(CZ) EE2.3.20.0248; GA MŠk(CZ) EE2.4.31.0056 Institutional support: RVO:67179843 Keywords : homogenization * Croatia * precipitation * inhomogeneities * break points Subject RIV: DG - Athmosphere Sciences, Meteorology Impact factor: 3.157, year: 2014

  20. DNA Dynamics Studied Using the Homogeneous Balance Method

    International Nuclear Information System (INIS)

    Zayed, E. M. E.; Arnous, A. H.

    2012-01-01

    We employ the homogeneous balance method to construct the traveling waves of the nonlinear vibrational dynamics modeling of DNA. Some new explicit forms of traveling waves are given. It is shown that this method provides us with a powerful mathematical tool for solving nonlinear evolution equations in mathematical physics. Strengths and weaknesses of the proposed method are discussed. (general)

  1. Notes on a class of homogeneous space-times

    International Nuclear Information System (INIS)

    Calvao, M.O.; Reboucas, M.J.; Teixeira, A.F.F.; Silva Junior, W.M.

    1987-01-01

    The breakdown of causality in homogeneous Goedel-type space-time manifolds is examined. An extension of the Reboucas-Tiomno (RT) study is made. The existence of noncausal curves is also investigated under two different conditions on the energy-momentum tensor. An integral representation of the infinitesimal generators of isometries is obtained, extending previous works on the RT geometry. (Author) [pt

  2. Effective inactivation of Saccharomyces cerevisiae in minimally processed Makgeolli using low-pressure homogenization-based pasteurization.

    Science.gov (United States)

    Bak, Jin Seop

    2015-01-01

    In order to address the limitations associated with the inefficient pasteurization platform used to make Makgeolli, such as the presence of turbid colloidal dispersions in suspension, commercially available Makgeolli was minimally processed using a low-pressure homogenization-based pasteurization (LHBP) process. This continuous process demonstrates that promptly reducing the exposure time to excessive heat using either large molecules or insoluble particles can dramatically improve internal quality and decrease irreversible damage. Specifically, optimal homogenization increased concomitantly with physical parameters such as colloidal stability (65.0% of maximum and below 25-μm particles) following two repetitions at 25.0 MPa. However, biochemical parameters such as microbial population, acidity, and the presence of fermentable sugars rarely affected Makgeolli quality. Remarkably, there was a 4.5-log reduction in the number of Saccharomyces cerevisiae target cells at 53.5°C for 70 sec in optimally homogenized Makgeolli. This value was higher than the 37.7% measured from traditionally pasteurized Makgeolli. In contrast to the analytical similarity among homogenized Makgeollis, our objective quality evaluation demonstrated significant differences between pasteurized (or unpasteurized) Makgeolli and LHBP-treated Makgeolli. Keywords: low-pressure homogenization-based pasteurization; Makgeolli; minimal processing-preservation; Saccharomyces cerevisiae; suspension stability.

  3. Microstructure evolution during homogenization of Al–Mn–Fe–Si alloys: Modeling and experimental results

    International Nuclear Information System (INIS)

    Du, Q.; Poole, W.J.; Wells, M.A.; Parson, N.C.

    2013-01-01

    Microstructure evolution during the homogenization heat treatment of Al–Mn–Fe–Si, or AA3xxx, alloys has been investigated using a combination of modeling and experimental studies. The model is fully coupled to CALculation PHAse Diagram (CALPHAD) software and explicitly takes into account the two different length scales for diffusion encountered in modeling the homogenization process. The model is able to predict the evolution of all the important microstructural features during homogenization, including the inhomogeneous spatial distribution of dispersoids and of alloying elements in solution, the dispersoid number density and size distribution, and the type and fraction of intergranular constituent particles. Experiments were conducted using four direct chill (DC) cast AA3xxx alloys subjected to various homogenization treatments. The resulting microstructures were characterized using a range of techniques, including optical and electron microscopy, electron microprobe analysis, field emission gun scanning electron microscopy, and electrical resistivity measurements. The model predictions have been compared with the experimental measurements to validate the model. Further, it has been demonstrated that the validated model is able to predict the effects of alloying elements (e.g. Si and Mn) on microstructure evolution. It is concluded that the model provides a time- and cost-effective tool for optimizing and designing industrial AA3xxx alloy chemistries and homogenization heat treatments.

  4. Modification of enzymes by use of high-pressure homogenization.

    Science.gov (United States)

    Dos Santos Aguilar, Jessika Gonçalves; Cristianini, Marcelo; Sato, Helia Harumi

    2018-07-01

    High-pressure processing is a relatively new technology that can modify various molecules. High-pressure homogenization (HPH) has been used in several studies on protein modification, especially of enzymes used or found in food, from animal, plant or microbial sources. According to the literature, enzymatic activity can be modulated under pressure, causing inactivation, stabilization or activation of the enzymes, which, depending on the point of view, can be very useful. Homogenization can generate changes in the structure of the enzyme by modifying various chemical bonds (mainly weak bonds), causing different levels of denaturation and, consequently, affecting the catalytic activity. This review aims to describe the various alterations in enzymes due to HPH treatment, to show the influence of high pressure on proteins, and to report the effects of HPH on the enzymatic activity of different enzymes employed in the food industry and in research. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. A Correlated Random Effects Model for Non-homogeneous Markov Processes with Nonignorable Missingness.

    Science.gov (United States)

    Chen, Baojiang; Zhou, Xiao-Hua

    2013-05-01

    Life history data arising in clusters with prespecified assessment time points for patients often feature incomplete data, since patients may choose to visit the clinic based on their needs. Markov process models provide a useful tool for describing disease progression for life history data. The literature mainly focuses on time-homogeneous processes. In this paper we develop methods to deal with non-homogeneous Markov processes with incomplete clustered life history data. A correlated random effects model is developed to deal with the nonignorable missingness, and a time transformation is employed to address the non-homogeneity in the transition model. Maximum likelihood estimation based on the Monte Carlo EM algorithm is advocated for parameter estimation. Simulation studies demonstrate that the proposed method works well in many situations. We also apply this method to an Alzheimer's disease study.
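The time-transformation idea above can be sketched by sampling sojourn times under a power-law (Weibull-type) transition intensity. This is a minimal illustration, not the authors' model: the intensity form lambda(t) = lam0*gamma*t**(gamma-1) and the function name are assumptions.

```python
import random

def sample_sojourn(lam0, gamma, rng=None):
    """Sample a sojourn time for a transition with cumulative hazard
    H(t) = lam0 * t**gamma (gamma = 1 recovers the time-homogeneous,
    exponential case) by inverting H against a unit-exponential draw."""
    rng = rng or random.Random()
    e = rng.expovariate(1.0)          # E ~ Exp(1)
    return (e / lam0) ** (1.0 / gamma)  # t = H^{-1}(E)
```

Setting gamma = 1 collapses the transformation, so the non-homogeneous model contains the usual homogeneous Markov process as a special case.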

  6. Homogeneity study of fixed-point continuous marine environmental and meteorological data: a review

    Science.gov (United States)

    Yang, Jinkun; Yang, Yang; Miao, Qingsheng; Dong, Mingmei; Wan, Fangfang

    2018-02-01

    The principle of inhomogeneity and the classification of homogeneity test methods are briefly described, and several common inhomogeneity detection methods and their relative merits are discussed in detail. Based on applications of the different homogeneity methods to surface meteorological data and marine environmental data, the present status and progress of the field are then reviewed. At present, homogeneity research on radiosonde and surface meteorological data is mature both in China and abroad, and its research and application to marine environmental data deserve equal attention. Applying a variety of test and correction methods, combined with a multi-method testing system, will make the results more reasonable and scientific, and can provide accurate first-hand information for coastal climate change research.
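One widely used detection method of the kind surveyed here is Alexandersson's standard normal homogeneity test (SNHT). As a hedged sketch (the review's own list of methods is not reproduced), its single-shift statistic can be computed as:

```python
import numpy as np

def snht(series):
    """Return (T_max, k) for Alexandersson's single-shift SNHT statistic:
    T(k) = k*z1**2 + (n-k)*z2**2, where z1 and z2 are the mean
    standardized anomalies before and after a candidate break at k."""
    x = np.asarray(series, dtype=float)
    n = x.size
    z = (x - x.mean()) / x.std()
    T = np.array([k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
                  for k in range(1, n)])
    return float(T.max()), int(T.argmax()) + 1
```

A large T_max flags a likely inhomogeneity and k estimates the break position; in practice T_max is compared against tabulated critical values that depend on the series length.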

  7. The Effect of Homogenization on the Corrosion Behavior of Al-Mg Alloy

    Science.gov (United States)

    Li, Yin; Hung, Yuanchun; Du, Zhiyong; Xiao, Zhengbing; Jia, Guangze

    2018-04-01

    The effect of homogenization on the corrosion behavior of 5083-O aluminum alloy is presented in this paper. Intergranular corrosion and exfoliation corrosion tests were used to characterize the corrosion behavior of the alloy. The variations in the morphology, the type and distribution of the precipitates, and the dislocation configurations in the samples after homogenization were evaluated using optical microscopy (OM), scanning electron microscopy (SEM), and transmission electron microscopy (TEM). The effects of the highly active grain boundary character distribution and of the types of constituent particles on corrosion are discussed on the basis of the experimental observations. The results indicated that the corrosion behavior of the 5083-O alloy is closely related to the microstructure obtained by the heat treatment. Homogenization carried out after casting had the optimal effect on the overall corrosion resistance of the material. All samples, nevertheless, could satisfy the corrosion-resistance requirements for marine applications.

  8. Substrate specificity and pH dependence of homogeneous wheat germ acid phosphatase.

    Science.gov (United States)

    Van Etten, R L; Waymack, P P

    1991-08-01

    The broad substrate specificity of a homogeneous isoenzyme of wheat germ acid phosphatase (WGAP) was extensively investigated by chromatographic, electrophoretic, NMR, and kinetic procedures. WGAP exhibited no divalent metal ion requirement and was unaffected upon incubation with EDTA or o-phenanthroline. A comparison of two catalytically homogeneous isoenzymes revealed little difference in substrate specificity. The specificity of WGAP was established by determining the Michaelis constants for a wide variety of substrates. p-Nitrophenyl phosphate, pyrophosphate, tripolyphosphate, and ATP were preferred substrates while lesser activities were seen toward sugar phosphates, trimetaphosphate, phosphoproteins, and (much less) phosphodiesters. An extensive table of Km and Vmax values is given. The pathway for the hydrolysis of trimetaphosphate was examined by colorimetric and 31P NMR methods and it was found that linear tripolyphosphate is not a free intermediate in the enzymatic reaction. In contrast to literature reports, homogeneous wheat germ acid phosphatase exhibits no measurable carboxylesterase activity, nor does it hydrolyze phenyl phosphonothioate esters or phytic acid at significant rates.
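The Km and Vmax values tabulated in the paper enter the standard Michaelis-Menten rate law. As a generic reminder of how such constants are used (no values from the paper's table are reproduced here):

```python
def michaelis_menten_rate(vmax, km, s):
    """Initial reaction velocity v = Vmax*[S]/(Km + [S]).
    At [S] = Km the velocity is exactly half of Vmax, and as
    [S] grows the velocity saturates toward Vmax."""
    return vmax * s / (km + s)
```

A substrate with a lower Km reaches half-maximal velocity at a lower concentration, which is one way the "preferred substrates" in the abstract can be ranked.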

  9. II. MORE THAN JUST CONVENIENT: THE SCIENTIFIC MERITS OF HOMOGENEOUS CONVENIENCE SAMPLES.

    Science.gov (United States)

    Jager, Justin; Putnick, Diane L; Bornstein, Marc H

    2017-06-01

    Despite their disadvantaged generalizability relative to probability samples, nonprobability convenience samples are the standard within developmental science, and likely will remain so because probability samples are cost-prohibitive and most available probability samples are ill-suited to examine developmental questions. In lieu of focusing on how to eliminate or sharply reduce reliance on convenience samples within developmental science, here we propose how to augment their advantages when it comes to understanding population effects as well as subpopulation differences. Although all convenience samples have less clear generalizability than probability samples, we argue that homogeneous convenience samples have clearer generalizability relative to conventional convenience samples. Therefore, when researchers are limited to convenience samples, they should consider homogeneous convenience samples as a positive alternative to conventional (or heterogeneous) convenience samples. We discuss future directions as well as potential obstacles to expanding the use of homogeneous convenience samples in developmental science. © 2017 The Society for Research in Child Development, Inc.

  10. Improvement of the field homogeneity with a permanent magnet assembly for MRI

    International Nuclear Information System (INIS)

    Sakurai, H.; Aoki, M.; Miyamoto, T.

    1990-01-01

    In the last few years, MRI (Magnetic Resonance Imaging) has become one of the most powerful and important radiological diagnostic methods. For this application, a strong and uniform magnetic field is required in the area where the patient is examined. The requirement for a high order of homogeneity is increasing with the rapid progress of tomographic technology. On the other hand, cost reduction for the magnet is also strongly demanded. As reported in our previous paper, we developed and mass-produced a permanent-magnet type system using high-energy Nd-Fe-B material. This paper presents a newly developed 15-plane measuring method, replacing a 7-plane method, to evaluate the field homogeneity precisely. By using this analytical method together with a linear programming method, a pole piece with a new shape has been developed. In consequence, homogeneity was improved by a factor of two and the magnet weight was reduced by 10% compared with the formerly developed pole piece. (author)

  11. Comparison of cell homogenization methods considering interaction effect between fuel cells and control rod cells

    International Nuclear Information System (INIS)

    Takeda, T.; Uto, N.

    1988-01-01

    Several methods to determine cell-averaged group cross sections and anisotropic diffusion coefficients that consider the interaction effect between core fuel cells and control rods or control rod followers have been compared, in order to discuss the physical meaning involved in cell homogenization. The cell homogenization methods considered are the commonly used flux-weighting method, the reaction-rate preservation method and the reactivity preservation method. These homogenization methods have been applied to control rod worth calculations in 1-D slab cores to investigate their applicability. (author). 6 refs, 2 figs, 9 tabs
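The flux-weighting method mentioned above can be sketched in a few lines: the homogenized cross section is the flux-volume average over the cell regions, which preserves the cell-integrated reaction rate for the given flux. This is a minimal one-group sketch; the reaction-rate and reactivity preservation variants of the abstract add further constraints not shown here.

```python
def flux_weighted_xs(sigma, flux, volume):
    """Flux-volume-weighted homogenized cross section for one energy group:
    sum(sigma_i * phi_i * V_i) / sum(phi_i * V_i) over the cell regions."""
    num = sum(s * f * v for s, f, v in zip(sigma, flux, volume))
    den = sum(f * v for f, v in zip(flux, volume))
    return num / den
```

With a uniform flux this reduces to the plain volume average; with a depressed flux in an absorber region (as near a control rod), the weighting correctly reduces that region's contribution.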

  12. Comparison of ensemble post-processing approaches, based on empirical and dynamical error modelisation of rainfall-runoff model forecasts

    Science.gov (United States)

    Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.

    2012-04-01

    In the context of a national energy company (EDF: Electricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve the human expertise of hydrological forecasts, which is essential to synthesize available information coming from different meteorological and hydrological models and from human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), in which substantial human expertise is involved. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similarly to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (the empirical approach) is based on a statistical modeling of the empirical errors of perfect forecasts, using streamflow sub-samples stratified by quantile class and lead time. The second method (the dynamical approach) is based on streamflow sub-samples stratified by quantile class, streamflow variation and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches ensure a good post-processing of the hydrological ensemble, allowing a clear improvement in the reliability, skill and sharpness of the ensemble forecasts.
The comparison of the empirical and dynamical approaches shows the limits of the empirical approach which is not able to take into account hydrological
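The empirical approach described above, dressing a deterministic forecast with model-error quantiles pooled by streamflow class, can be sketched as follows. This is a hedged illustration only: the three-class boundaries, member count and error pooling are assumptions for the sketch, not EDF's operational settings.

```python
import numpy as np

def dress_forecast(past_fc, past_obs, new_fc, n_members=11):
    """Dress a deterministic streamflow forecast with empirical error
    quantiles drawn from past forecasts of the same magnitude class."""
    past_fc = np.asarray(past_fc, dtype=float)
    errors = np.asarray(past_obs, dtype=float) - past_fc
    bins = np.quantile(past_fc, [1 / 3, 2 / 3])   # three flow classes
    cls = np.digitize(past_fc, bins)
    target = np.digitize([new_fc], bins)[0]
    pool = errors[cls == target]
    if pool.size == 0:                            # fall back to all errors
        pool = errors
    qs = np.quantile(pool, np.linspace(0.05, 0.95, n_members))
    return new_fc + qs
```

In an operational chain this stratification would be repeated per lead time; the dynamical variant in the abstract additionally conditions the error pool on the recent streamflow variation.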

  13. Comparison of different homogenization approaches for elastic–viscoplastic materials

    International Nuclear Information System (INIS)

    Mercier, S; Molinari, A; Berbenni, S; Berveiller, M

    2012-01-01

    Homogenization of linear viscoelastic and non-linear viscoplastic composite materials is considered in this paper. First, we compare two homogenization schemes based on the Mori–Tanaka method coupled with the additive interaction (AI) law proposed by Molinari et al (1997 Mech. Mater. 26 43–62) or coupled with a concentration law based on translated fields (TF) originally proposed for the self-consistent scheme by Paquin et al (1999 Arch. Appl. Mech. 69 14–35). These methods are also evaluated against (i) full-field calculations of the literature based on the finite element method and on fast Fourier transform, (ii) available analytical exact solutions obtained in linear viscoelasticity and (iii) homogenization methods based on variational approaches. Developments of the AI model are obtained for linear and non-linear material responses while results for the TF method are shown for the linear case. Various configurations are considered: spherical inclusions, aligned fibers, hard and soft inclusions, large material contrasts between phases, volume-preserving versus dilatant anelastic flow, non-monotonic loading. The agreement between the AI and TF methods is excellent and the correlation with full field calculations is in general of quite good quality (with some exceptions for non-linear composites with a large volume fraction of very soft inclusions for which a discrepancy of about 15% was found for macroscopic stress). Description of the material behavior with internal variables can be accounted for with the AI and TF approaches and therefore complex loadings can be easily handled in contrast with most hereditary approaches. (paper)
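The Mori-Tanaka scheme underlying both compared methods can be illustrated for the simplest elastic case: the effective bulk modulus of a matrix containing spherical inclusions. This is a sketch of the classical estimate only; the paper's AI and TF couplings for elastic-viscoplastic behavior are not reproduced.

```python
def mori_tanaka_bulk(k_m, g_m, k_i, f):
    """Mori-Tanaka estimate of the effective bulk modulus of a matrix
    (bulk k_m, shear g_m) containing a volume fraction f of spherical
    inclusions (bulk k_i); the 4*g_m/3 term comes from the Eshelby
    solution for a sphere."""
    denom = 1.0 + (1.0 - f) * (k_i - k_m) / (k_m + 4.0 * g_m / 3.0)
    return k_m + f * (k_i - k_m) / denom
```

The estimate interpolates between the two phases, returning the matrix modulus at f = 0 and the inclusion modulus at f = 1, which is a quick sanity check for any homogenization scheme.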

  14. Linear-logarithmic converter of a multi-channel selector-analyser type SA40 for automatic tracing; Convertisseur lineaire logarithmique pour le trace automatique de spectres d'un selecteur SA40

    Energy Technology Data Exchange (ETDEWEB)

    Desmaretz, M; Espanel, P; Ferlicci, R; Feyt, J

    1967-11-01

The converter described in this note was built to plot, in semi-logarithmic coordinates, the spectra stored in the memory of a type SA40 selector. Starting from the digital information appearing at the selector's parallel output, it must perform several functions: command the address advance of the selector; decode the digital information and convert it into analogue voltages; perform the linear-to-logarithmic transformation for the register; and send a start order to the servo-motors of the plotting table. (authors)
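The linear-to-logarithmic step can be sketched as a mapping from a channel count to a plotter voltage, with one fixed voltage span per decade of counts. The full-scale voltage and decade count below are illustrative assumptions, not SA40 hardware specifications.

```python
import math

def lin_to_log_voltage(count, full_scale_volts=10.0, decades=4):
    """Map a channel count (>= 1) onto 0..full_scale_volts,
    allotting an equal voltage span to each decade of counts."""
    count = max(count, 1)  # counts of 0 or 1 map to the bottom of the scale
    v = full_scale_volts * math.log10(count) / decades
    return min(v, full_scale_volts)  # clip overflows at full scale
```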

  15. Scar Homogenization Versus Limited-Substrate Ablation in Patients With Nonischemic Cardiomyopathy and Ventricular Tachycardia.

    Science.gov (United States)

    Gökoğlan, Yalçın; Mohanty, Sanghamitra; Gianni, Carola; Santangeli, Pasquale; Trivedi, Chintan; Güneş, Mahmut F; Bai, Rong; Al-Ahmad, Amin; Gallinghouse, G Joseph; Horton, Rodney; Hranitzky, Patrick M; Sanchez, Javier E; Beheiry, Salwa; Hongo, Richard; Lakkireddy, Dhanunjaya; Reddy, Madhu; Schweikert, Robert A; Dello Russo, Antonio; Casella, Michela; Tondo, Claudio; Burkhardt, J David; Themistoclakis, Sakis; Di Biase, Luigi; Natale, Andrea

    2016-11-01

Scar homogenization improves long-term ventricular arrhythmia-free survival compared with standard limited-substrate ablation in patients with post-infarction ventricular tachycardia (VT). Whether such benefit extends to patients with nonischemic cardiomyopathy and scar-related VT is unclear. The aim of this study was to assess the long-term efficacy of an endoepicardial scar homogenization approach compared with standard ablation in this population. Consecutive patients with dilated nonischemic cardiomyopathy (n = 93), scar-related VTs, and evidence of low-voltage regions on electroanatomic mapping on the basis of pre-defined bipolar voltage criteria underwent either scar homogenization or standard ablation (p = 0.01). During a mean follow-up period of 14 ± 2 months, single-procedure success rates were 63.9% after scar homogenization and 38.6% after standard ablation (p = 0.031). After multivariate analysis, scar homogenization and left ventricular ejection fraction were predictors of long-term success. During follow-up, the rehospitalization rate was significantly lower in the scar homogenization group (p = 0.035). In patients with dilated nonischemic cardiomyopathy, scar-related VT, and evidence of low-voltage regions on electroanatomic mapping, endoepicardial homogenization of the scar significantly increased freedom from any recurrent ventricular arrhythmia compared with a standard limited-substrate ablation. However, the success rate with this approach appeared to be lower than previously reported with ischemic cardiomyopathy, presumably because of the septal and midmyocardial distribution of the scar in some patients. Copyright © 2016 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  16. Homogenization of Portuguese long-term temperature data series: Lisbon, Coimbra and Porto

    Directory of Open Access Journals (Sweden)

    A. L. Morozova

    2012-12-01

Three long-term temperature data series measured in Portugal were studied to detect and correct non-climatic homogeneity breaks; the corrected series are now available for future studies of climate variability.

Series of monthly minimum (Tmin) and maximum (Tmax) temperatures measured in the three Portuguese meteorological stations of Lisbon (from 1856 to 2008), Coimbra (from 1865 to 2005) and Porto (from 1888 to 2001) were studied to detect and correct non-climatic breaks. These series, together with monthly series of average temperature (Taver) and temperature range (DTR) derived from them, were tested in order to detect breaks, using firstly metadata, secondly a visual analysis, and thirdly four widely used homogeneity tests: the von Neumann ratio test, the Buishand test, the standard normal homogeneity test, and the Pettitt test. The homogeneity tests were used in absolute mode (using the temperature series themselves) and in relative mode (using as reference series either sea-surface temperature anomaly series from HadISST2.0.0.0 close to the Portuguese coast or already corrected temperature series). We considered the Tmin, Tmax and DTR series the most informative for the detection of breaks, because Tmin and Tmax can respond differently to changes in the position of a thermometer or other changes in the instrument's environment; the Taver series were used mainly as control.

    The homogeneity tests showed strong inhomogeneity of the original data series, which could have both internal climatic and non-climatic origins. Breaks that were identified by the last three mentioned homogeneity tests were compared with available metadata containing data such as instrument changes, changes in station location and environment, observation procedures, etc. Significant breaks (significance 95% or more that coincided with known dates of
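Of the four absolute tests named above, the von Neumann ratio is the simplest to sketch: N = Σ(x_{i+1} − x_i)² / Σ(x_i − x̄)² is close to 2 for a homogeneous, uncorrelated series and drops well below 2 when a break is present. The series in the checks below are synthetic, not the Portuguese records.

```python
def von_neumann_ratio(x):
    """Von Neumann ratio: mean squared successive difference over variance.
    Values near 2 indicate homogeneity; a step change pushes it toward 0."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[i + 1] - x[i]) ** 2 for i in range(n - 1))
    den = sum((xi - mean) ** 2 for xi in x)
    return num / den
```

A series with a single step change (a classic inhomogeneity) gives a ratio far below 2, while a rapidly alternating series gives a ratio above 2.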

  17. Transient computational homogenization for heterogeneous materials under dynamic excitation

    NARCIS (Netherlands)

    Pham, N.K.H.; Kouznetsova, V.; Geers, M.G.D.

    2013-01-01

    This paper presents a novel transient computational homogenization procedure that is suitable for the modelling of the evolution in space and in time of materials with non-steady state microstructure, such as metamaterials. This transient scheme is an extension of the classical (first-order)

  18. Bounds for nonlinear composites via iterated homogenization

    Science.gov (United States)

    Ponte Castañeda, P.

    2012-09-01

    Improved estimates of the Hashin-Shtrikman-Willis type are generated for the class of nonlinear composites consisting of two well-ordered, isotropic phases distributed randomly with prescribed two-point correlations, as determined by the H-measure of the microstructure. For this purpose, a novel strategy for generating bounds has been developed utilizing iterated homogenization. The general idea is to make use of bounds that may be available for composite materials in the limit when the concentration of one of the phases (say phase 1) is small. It then follows from the theory of iterated homogenization that it is possible, under certain conditions, to obtain bounds for more general values of the concentration, by gradually adding small amounts of phase 1 in incremental fashion, and sequentially using the available dilute-concentration estimate, up to the final (finite) value of the concentration (of phase 1). Such an approach can also be useful when available bounds are expected to be tighter for certain ranges of the phase volume fractions. This is the case, for example, for the "linear comparison" bounds for porous viscoplastic materials, which are known to be comparatively tighter for large values of the porosity. In this case, the new bounds obtained by the above-mentioned "iterated" procedure can be shown to be much improved relative to the earlier "linear comparison" bounds, especially at low values of the porosity and high triaxialities. Consistent with the way in which they have been derived, the new estimates are, strictly, bounds only for the class of multi-scale, nonlinear composites consisting of two well-ordered, isotropic phases that are distributed with prescribed H-measure at each stage in the incremental process. 
However, given the facts that the H-measure of the sequential microstructures is conserved (so that the final microstructures can be shown to have the same H-measure), and that H-measures are insensitive to length scales, it is conjectured
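The incremental idea can be illustrated with the classical differential (iterated-dilute) scheme, here for the scalar conductivity of a composite with spherical inclusions rather than the paper's viscoplastic setting: small amounts of the inclusion phase are added step by step, re-homogenizing with the dilute (Maxwell) estimate after each addition. Phase values in the checks are illustrative.

```python
def differential_scheme(sigma_m, sigma_i, f_target, steps=10000):
    """Iterate the dilute estimate for spherical inclusions in small
    increments of volume fraction up to f_target (forward Euler in f)."""
    sigma, f = sigma_m, 0.0
    df = f_target / steps
    for _ in range(steps):
        # dilute increment: adding df of inclusion phase replaces
        # df/(1-f) of the current homogenized medium
        dilute = 3.0 * sigma * (sigma_i - sigma) / (sigma_i + 2.0 * sigma)
        sigma += dilute * df / (1.0 - f)
        f += df
    return sigma
```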

  19. Homogeneous spectral spanning of terahertz semiconductor lasers with radio frequency modulation.

    Science.gov (United States)

    Wan, W J; Li, H; Zhou, T; Cao, J C

    2017-03-08

    Homogeneous broadband and electrically pumped semiconductor radiation sources emitting in the terahertz regime are highly desirable for various applications, including spectroscopy, chemical sensing, and gas identification. In the frequency range between 1 and 5 THz, unipolar quantum cascade lasers employing electron inter-subband transitions in multiple-quantum-well structures are the most powerful semiconductor light sources. However, these devices are normally characterized by either a narrow emission spectrum due to the narrow gain bandwidth of the inter-subband optical transitions or an inhomogeneous broad terahertz spectrum from lasers with heterogeneous stacks of active regions. Here, we report the demonstration of homogeneous spectral spanning of long-cavity terahertz semiconductor quantum cascade lasers based on a bound-to-continuum and resonant phonon design under radio frequency modulation. At a single drive current, the terahertz spectrum under radio frequency modulation continuously spans 330 GHz (~8% of the central frequency), which is the record for single plasmon waveguide terahertz lasers with a bound-to-continuum design. The homogeneous broadband terahertz sources can be used for spectroscopic applications, i.e., GaAs etalon transmission measurement and ammonia gas identification.

  20. Bounded energy states in homogeneous turbulent shear flow: An alternative view

    Science.gov (United States)

    Bernard, Peter S.; Speziale, Charles G.

    1990-01-01

The equilibrium structure of homogeneous turbulent shear flow is investigated from a theoretical standpoint. Existing turbulence models, in apparent agreement with physical and numerical experiments, predict an unbounded exponential time growth of the turbulent kinetic energy and dissipation rate; only the anisotropy tensor and turbulent time scale reach a structural equilibrium. It is shown that if vortex stretching is accounted for in the dissipation rate transport equation, then there can exist equilibrium solutions, with bounded energy states, where the turbulence production is balanced by its dissipation. Illustrative calculations are presented for a k-epsilon model modified to account for vortex stretching. The calculations indicate an initial exponential time growth of the turbulent kinetic energy and dissipation rate for elapsed times that are as large as those considered in any of the previously conducted physical or numerical experiments on homogeneous shear flow. However, vortex stretching eventually takes over and forces a production-equals-dissipation equilibrium with bounded energy states. The validity of this result is further supported by an independent theoretical argument. It is concluded that the generally accepted structural equilibrium for homogeneous shear flow with unbounded component energies is in need of re-examination.
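The baseline behaviour described above (unbounded exponential growth of k and ε under existing models) can be reproduced in a few lines. This is a hedged sketch of the standard k-epsilon model in homogeneous shear with the conventional constants; the paper's vortex-stretching modification is not implemented, and the initial state is an illustrative choice with production exceeding dissipation.

```python
def k_eps_homogeneous_shear(S=1.0, k0=1.0, eps0=0.09, t_end=20.0, dt=1e-3):
    """Forward-Euler integration of the standard k-epsilon model under a
    uniform shear rate S; returns the history of turbulent kinetic energy k."""
    C_mu, C_e1, C_e2 = 0.09, 1.44, 1.92
    k, eps = k0, eps0
    history = [k]
    for _ in range(int(t_end / dt)):
        P = C_mu * (k * k / eps) * S * S      # production = nu_t * S^2
        k += (P - eps) * dt
        eps += (eps / k) * (C_e1 * P - C_e2 * eps) * dt
        history.append(k)
    return history
```

With these constants the production-to-dissipation ratio settles at (C_e2 − 1)/(C_e1 − 1) ≈ 2.1 > 1, so k grows without bound, which is exactly the behaviour the paper's vortex-stretching term is designed to arrest.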

  1. Influence of the homogenization pressure on the ice cream mix quality

    Directory of Open Access Journals (Sweden)

    Iva Murgić

    2008-08-01

In this paper the suitability of different homogenization pressures for the appearance and quality of ice cream mix was determined. The ice cream mixes were taken from the ageing tank and, depending on the source of fat in the mix (butter, vegetable fat or cream), were homogenized under different pressures. Afterwards, fat globule size was determined with a microscope fitted with a scale. Pressures that reduce the fat globule size to 1-2 μm without clumping were characterized as adequate for a specific type and portion of fat in the ice cream mixture. The higher the fat content of the mixture, the lower the pressure should be. The optimal pressure for ice cream mixture containing 2% vegetable fat was 200 bars, for 6% it was 190-200 bars, and for 8% it was 170 bars. For mixtures containing butter, the optimal pressure was 190-200 bars at 8%, 150 bars at 10%, and 135 bars at 12%. For mixtures containing cream, the optimal pressure was 200 bars at 8%, 190 bars at 10%, 125 bars at 12% and 90 bars at 14%.

  2. Directed Thermal Diffusions through Metamaterial Source Illusion with Homogeneous Natural Media

    Directory of Open Access Journals (Sweden)

    Guoqiang Xu

    2018-04-01

Owing to the use of transformation optics, many research and development achievements have expanded the applications of illusion devices into thermal fields. However, most current studies on thermal illusions used to reshape thermal fields depend on certain pre-designed geometric profiles with complicated conductivity configurations. In this paper, we propose a methodology for designing a new class of thermal source illusion devices that achieve directed thermal diffusion with natural homogeneous media. The use of space rotations in the linear transformation processes allows the directed thermal diffusion to be independent of the geometric profiles, and the use of natural homogeneous media improves feasibility. Four schemes, with fewer types of homogeneous media filling the functional regions, are demonstrated in transient states. The expected performance is observed in each scheme and is analyzed by comparing the thermal distribution characteristics and the illusion effectiveness on the measured lines. The findings suggest applications in the development of directed diffusion with minimal thermal loss, for use in novel "multi-beam" thermal generation, thermal lenses, solar receivers, and waveguides.

  3. The evaporative vector: Homogeneous systems

    International Nuclear Information System (INIS)

    Klots, C.E.

    1987-05-01

Molecular beams of van der Waals molecules are the subject of much current research. Among the methods used to form these beams, three (sputtering, laser ablation, and the sonic nozzle expansion of neat gases) yield what are now recognized to be "warm clusters." They contain enough internal energy to undergo a number of first-order processes, in particular evaporation. Because of this evaporation and its attendant cooling, the properties of such clusters are time-dependent. The states of matter which can be arrived at via an evaporative vector on a typical laboratory time-scale are discussed. Topics include (1) temperatures, (2) metastability, (3) phase transitions, (4) kinetic energies of fragmentation, and (5) the expression of magical properties, all for evaporating homogeneous clusters

  4. Sizing of type B package tie-downs on the basis of criteria related to hypothetical road transport accident conditions

    International Nuclear Information System (INIS)

    Phalippou, C.

    1986-01-01

The aim is to guarantee the intactness of the type B package containment system under hypothetical road accident conditions. Experiments performed in France have led to analytical studies taking into account: (a) the head-on collision, which is modelled as a uniform deceleration of 35 g; (b) the side-on collision, which is modelled as a colliding object 3 times heavier than the package with an impact at 31.9 km/h. In the first case, the adopted criterion is the retention of the package on the vehicle by the strength of the stowing members (tie-downs and chocks). In the second case, the adopted criterion is the deliberate breaking of the tie-downs in order to leave the package containment system undamaged; it is therefore assumed that no chock acts against lateral impacts. Analytical and abacus methods have been developed for sizing the strength of the stowing members with respect to the two above criteria [fr
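A minimal numerical reading of the head-on criterion: with the crash modelled as a uniform 35 g deceleration, the longitudinal load the stowing members must carry is simply F = m × 35g. The package mass in the usage check is hypothetical.

```python
G_ACCEL = 9.81      # m/s^2
DECEL_FACTOR = 35   # head-on collision criterion from the study

def tie_down_load_kN(package_mass_kg):
    """Longitudinal force, in kN, that tie-downs and chocks must resist
    under a uniform 35 g deceleration."""
    return package_mass_kg * DECEL_FACTOR * G_ACCEL / 1000.0
```

For a hypothetical 10-tonne package this gives roughly 3.4 MN of longitudinal load to be shared among the stowing members.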

  5. Simulation of pressurized water reactor in accidental state

    International Nuclear Information System (INIS)

    Chakir, E.

    1994-01-01

The aim of this work is to develop the 1300 MWe 4-loop PWR simulator called 'SATRAPE', whose adopted physical modelling allows a simplified neutronic calculation and focuses essentially on the reactor thermal-hydraulic behaviour in the case of the following accidents: Loss of Coolant Accident (LOCA); Steam Generator Tube Failure (SGTF); Steam Line Break (SLB). In the case of a LOCA or SLB accident, this modelling enables the calculation of the pressure and the temperature in the containment building, as well as the dose release rate in the containment in the case of a LOCA. The adopted models are relatively simple so as to allow an explicit solution. SATRAPE provides two graphical interfaces: one is used to issue commands, while the other visualizes the principal state variables of the installation. The results obtained show very good consistency with the scenarios commonly envisaged for the accidents considered. 33 refs., 52 figs., 1 tab. (author)

  6. Homogenization and isotropization of an inflationary cosmological model

    International Nuclear Information System (INIS)

    Barrow, J.D.; Groen, Oe.; Oslo Univ.

    1986-01-01

    A member of the class of anisotropic and inhomogeneous cosmological models constructed by Wainwright and Goode is investigated. It is shown to describe a universe containing a scalar field which is minimally coupled to gravitation and a positive cosmological constant. It is shown that this cosmological model evolves exponentially rapidly towards the homogeneous and isotropic de Sitter universe model. (orig.)

  7. Gauge freedom in perfect fluid spatially homogeneous spacetimes

    International Nuclear Information System (INIS)

    Jantzen, R.T.

    1983-01-01

    The class of reference systems compatible with the symmetry of a spatially homogeneous perfect fluid spacetime is discussed together with the associated class of symmetry adapted comoving ADM frames (or computational frames). The fluid equations of motion are related to the four functions on the space of fluid flow lines discovered by Taub and which characterize an isentropic flow. (Auth.)

  8. Desertification, salinization, and biotic homogenization in a dryland river ecosystem.

    Science.gov (United States)

    Miyazono, Seiji; Patiño, Reynaldo; Taylor, Christopher M

    2015-04-01

This study determined long-term changes in fish assemblages, river discharge, salinity, and local precipitation, and examined hydrological drivers of biotic homogenization in a dryland river ecosystem, the Trans-Pecos region of the Rio Grande/Rio Bravo del Norte (USA/Mexico). Historical (1977-1989) and current (2010-2011) fish assemblages were analyzed by rarefaction analysis (species richness), nonmetric multidimensional scaling (composition/variability), multiresponse permutation procedures (composition), and paired t-test (variability). Trends in hydrological conditions (1970s-2010s) were examined by Kendall tau and quantile regression, and associations between streamflow and specific conductance (salinity) by generalized linear models. Since the 1970s, species richness and variability of fish assemblages decreased in the Rio Grande below the confluence with the Rio Conchos (Mexico), a major tributary, but not above it. There was increased representation of lower-flow/higher-salinity tolerant species, thus making fish communities below the confluence taxonomically and functionally more similar to those above it. Unlike findings elsewhere, this biotic homogenization was due primarily to changes in the relative abundances of native species. While Rio Conchos discharge was >2-fold higher than Rio Grande discharge above their confluence, Rio Conchos discharge decreased during the study period, causing Rio Grande discharge below the confluence to also decrease. Rio Conchos salinity is lower than Rio Grande salinity above their confluence and, as Rio Conchos discharge decreased, it caused Rio Grande salinity below the confluence to increase (reduced dilution). Trends in discharge did not correspond to trends in precipitation except at extreme-high (90th quantile) levels. In conclusion, decreasing discharge from the Rio Conchos has led to decreasing flow and increasing salinity in the Rio Grande below the confluence. This spatially uneven desertification and
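A sketch of the Kendall tau statistic used for the hydrological trend tests above: tau counts concordant minus discordant pairs over all pairs, giving −1 for a strictly decreasing series (such as a declining discharge record) and +1 for a strictly increasing one. The implementation below ignores ties, and the series in the checks are synthetic.

```python
def kendall_tau(x):
    """Kendall tau of a series against time (Mann-Kendall style,
    no tie correction): (concordant - discordant) / n(n-1)/2."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            if x[j] > x[i]:
                concordant += 1
            elif x[j] < x[i]:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)
```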

  9. Assessing the homogenization of urban land management with an application to US residential lawn care

    Science.gov (United States)

    Polsky, Colin; Grove, J. Morgan; Knudson, Chris; Groffman, Peter M.; Bettez, Neil; Cavender-Bares, Jeannine; Hall, Sharon J.; Heffernan, James B.; Hobbie, Sarah E.; Larson, Kelli L.; Morse, Jennifer L.; Neill, Christopher; Nelson, Kristen C.; Ogden, Laura A.; O’Neil-Dunne, Jarlath; Pataki, Diane E.; Roy Chowdhury, Rinku; Steele, Meredith K.

    2014-01-01

Changes in land use, land cover, and land management present some of the greatest potential global environmental challenges of the 21st century. Urbanization, one of the principal drivers of these transformations, is commonly thought to be generating land changes that are increasingly similar. An implication of this multiscale homogenization hypothesis is that the ecosystem structure and function and human behaviors associated with urbanization should be more similar in certain kinds of urbanized locations across biogeophysical gradients than across urbanization gradients in places with similar biogeophysical characteristics. This paper introduces an analytical framework for testing this hypothesis, and applies the framework to the case of residential lawn care. This set of land management behaviors is often assumed, not demonstrated, to exhibit homogeneity. Multivariate analyses are conducted on telephone survey responses from a geographically stratified random sample of homeowners (n = 9,480), equally distributed across six US metropolitan areas. Two behaviors are examined: lawn fertilizing and irrigating. Limited support for strong homogenization is found at two scales (i.e., multi- and single-city; 2 of 36 cases), but significant support is found for homogenization at only one scale (22 cases) or at neither scale (12 cases). These results suggest that US lawn care behaviors are more differentiated in practice than in theory. Thus, even if the biophysical outcomes of urbanization are homogenizing, managing the associated sustainability implications may require a multiscale, differentiated approach because the underlying social practices appear relatively varied. The analytical approach introduced here should also be productive for other facets of urban-ecological homogenization. PMID:24616515

  10. Environmental Kuznets Curves for CO2 : Heterogeneity Versus Homogeneity

    NARCIS (Netherlands)

    Vollebergh, H.R.J.; Dijkgraaf, E.; Melenberg, B.

    2005-01-01

    We explore the emissions income relationship for CO2 in OECD countries using various modelling strategies.Even for this relatively homogeneous sample, we find that the inverted-U-shaped curve is quite sensitive to the degree of heterogeneity included in the panel estimations.This finding is robust,

  11. Nano-catalysts: Bridging the gap between homogeneous and heterogeneous catalysis

    Science.gov (United States)

Functionalized nanoparticles have emerged as sustainable alternatives to conventional materials, as robust, high-surface-area heterogeneous catalyst supports. We envisioned a catalyst system that can bridge homogeneous and heterogeneous systems. Postsynthetic surface modifica...

  12. Using homogenization, sonication and thermo-sonication to inactivate fungi

    Science.gov (United States)

    Bevilacqua, Antonio; Sinigaglia, Milena; Corbo, Maria Rosaria

    2016-01-01

Ultrasound (US), Thermo-sonication (TS) and High Pressure Homogenization (HPH) were studied as tools to inactivate the spores of Penicillium spp. and Mucor spp. inoculated in distilled water. For US, the power ranged from 40% to 100%, pulse from 2 to 10 s, and duration of the treatment from 2 to 10 min. TS was performed combining US (40–80% of power, for 8 min and pulse of 2 s) with a thermal treatment (50, 55 and 60°C at 4, 8 and 12 min). Homogenization was done at 30–150 MPa, applied one, two or three times. Power was the most important factor in determining the antifungal effect of US and TS towards the conidia of Penicillium spp.; on the other hand, in US treatments Mucor spp. was also affected by pulse and time. HPH exerted a significant antifungal effect only if the highest pressures were applied two or three times. PMID:27375964

  13. Homogenization techniques for population dynamics in strongly heterogeneous landscapes.

    Science.gov (United States)

    Yurk, Brian P; Cobbold, Christina A

    2018-12-01

An important problem in spatial ecology is to understand how population-scale patterns emerge from individual-level birth, death, and movement processes. These processes, which depend on local landscape characteristics, vary spatially and may exhibit sharp transitions through behavioural responses to habitat edges, leading to discontinuous population densities. Such systems can be modelled using reaction-diffusion equations with interface conditions that capture local behaviour at patch boundaries. In this work we develop a novel homogenization technique to approximate the large-scale dynamics of the system. We illustrate our approach, which also generalizes to multiple species, with an example of logistic growth within a periodic environment. We find that population persistence and the large-scale population carrying capacity are influenced by patch residence times that depend on patch preference, as well as by movement rates in adjacent patches. The forms of the homogenized coefficients yield key theoretical insights into how large-scale dynamics arise from the small-scale features.
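As a minimal illustration of how small-scale structure enters a homogenized coefficient (ignoring the interface conditions treated in the paper), classical 1-D homogenization over a periodic landscape gives an effective diffusivity equal to the width-weighted harmonic mean of the patch diffusivities:

```python
def effective_diffusivity(widths, Ds):
    """Harmonic (resistances-in-series) average of patch diffusivities
    over one period of a 1-D periodic landscape."""
    total = sum(widths)
    resistance = sum(w / D for w, D in zip(widths, Ds))
    return total / resistance
```

Because the harmonic mean is dominated by the slow patches, a single low-diffusivity patch drags the large-scale coefficient well below the arithmetic average, which is the kind of small-scale-to-large-scale insight the homogenized coefficients provide.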

  14. Parametric dependence of two-plasmon decay in homogeneous plasma

    International Nuclear Information System (INIS)

    Dimitrijevic, Dejan R

    2010-01-01

A hydrodynamic model of two-plasmon decay in a homogeneous plasma slab near the quarter-critical density is constructed in order to improve our understanding of the spatio-temporal evolution of the daughter electron plasma waves in the course of the instability. The scaling of the amplitudes of the participating waves with laser and plasma parameters is investigated. The secondary coupling of two daughter electron plasma waves with an ion-acoustic wave is assumed to be the principal mechanism of saturation of the instability. The impact of the inherently nonresonant nature of this secondary coupling on the development of two-plasmon decay is investigated and shown to significantly influence the electron plasma wave dynamics. Its inclusion leads to nonuniformity of the spatial profile of the instability and causes a burst-like pattern of instability development, which should result in burst-like hot-electron production in homogeneous plasma.

  15. Neutron transport equation - indications on homogenization and neutron diffusion

    International Nuclear Information System (INIS)

    Argaud, J.P.

    1992-06-01

In PWR nuclear reactors, the practical study of the neutrons in the core uses the diffusion equation to describe the problem. On the other hand, the most accurate way to describe these neutrons is to use the Boltzmann equation, or neutron transport equation. In this paper, we give some theoretical indications on how to obtain a diffusion equation from the general transport equation under some simplifying hypotheses. The work is organised as follows: (a) the most general formulations of the transport equation are presented: the integro-differential equation and the integral equation; (b) the theoretical approximation of this Boltzmann equation by a diffusion equation is introduced by way of asymptotic developments; (c) practical homogenization methods for the transport equation are then presented. In particular, the relationships with some general and useful methods in neutronics are shown, and some homogenization methods in energy and space are indicated. Many other points of view and complements are detailed in the text and the remarks
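The passage from transport to diffusion sketched in step (b) can be summarized by the standard P1 closure, written here in conventional one-speed notation (not necessarily the notation of the paper):

```latex
% P1 (diffusion) closure of the one-speed transport equation:
% the angular flux is truncated at first order in angle,
\psi(\vec{r},\hat{\Omega}) \approx
  \frac{1}{4\pi}\left[\phi(\vec{r}) + 3\,\hat{\Omega}\cdot\vec{J}(\vec{r})\right],
% which yields Fick's law and the diffusion equation:
\vec{J} = -D\,\nabla\phi, \qquad D = \frac{1}{3\Sigma_{tr}},
\qquad -\nabla\cdot D\,\nabla\phi + \Sigma_a\,\phi = S .
```

Here φ is the scalar flux, J the net current, Σ_tr and Σ_a the transport and absorption cross sections, and S the source; homogenization then replaces the heterogeneous cross sections with cell-averaged equivalents.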

  16. Averaging principle for second-order approximation of heterogeneous models with homogeneous models.

    Science.gov (United States)

    Fibich, Gadi; Gavious, Arieh; Solan, Eilon

    2012-11-27

    Typically, models with a heterogeneous property are considerably harder to analyze than the corresponding homogeneous models, in which the heterogeneous property is replaced by its average value. In this study we show that any outcome of a heterogeneous model that satisfies the two properties of differentiability and symmetry is O(ε(2)) equivalent to the outcome of the corresponding homogeneous model, where ε is the level of heterogeneity. We then use this averaging principle to obtain new results in queuing theory, game theory (auctions), and social networks (marketing).

  17. Averaging principle for second-order approximation of heterogeneous models with homogeneous models

    Science.gov (United States)

    Fibich, Gadi; Gavious, Arieh; Solan, Eilon

    2012-01-01

    Typically, models with a heterogeneous property are considerably harder to analyze than the corresponding homogeneous models, in which the heterogeneous property is replaced by its average value. In this study we show that any outcome of a heterogeneous model that satisfies the two properties of differentiability and symmetry is O(ɛ2) equivalent to the outcome of the corresponding homogeneous model, where ɛ is the level of heterogeneity. We then use this averaging principle to obtain new results in queuing theory, game theory (auctions), and social networks (marketing). PMID:23150569
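The principle can be checked numerically: take any smooth outcome that is symmetric in the agents' parameters, perturb the parameters by ±ε around their mean, and the gap to the homogeneous outcome shrinks as ε² (halving ε quarters the gap). The outcome function below is an arbitrary illustrative choice satisfying differentiability and symmetry, not one from the paper.

```python
def outcome(a, b):
    """A smooth outcome, symmetric in the two parameters (a, b)."""
    return a * b / (a + b)

def heterogeneity_gap(mu, eps):
    """Gap between the heterogeneous outcome (parameters mu +/- eps)
    and the homogeneous outcome (both parameters at the mean mu)."""
    return abs(outcome(mu + eps, mu - eps) - outcome(mu, mu))
```

For this choice the gap is exactly ε²/(2μ), so doubling the heterogeneity level quadruples the deviation from the homogeneous model, matching the O(ε²) statement.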

  18. A homogeneous focusing system for diode lasers and its applications in metal surface modification

    Science.gov (United States)

    Wang, Fei; Zhong, Lijing; Tang, Xiahui; Xu, Chengwen; Wan, Chenhao

    2018-06-01

High power diode lasers are applied in many different areas, including surface modification, welding and cutting, and are an important technical trend in the laser processing of metals. This paper aims to analyze the impact of the shape and homogeneity of the focal spot of the diode laser on surface modification. A focusing system using triplet lenses for a direct-output diode laser, which can be used to eliminate coma aberrations, is studied. A rectangular stripe with an aspect ratio from 8:1 to 25:1 is obtained, in which the power is homogeneously distributed along the fast axis; the power is 1117.6 W and the peak power intensity is 1.1587 × 10^6 W/cm^2. This paper also presents a homogeneous focusing system using a Fresnel lens, in which the incident beam size is 40 × 40 mm^2, the focal length is 380 mm, and the dimension of the obtained focal spot is 2 × 10 mm^2. When the divergence angle of the incident light is in the range of 12.5-20 mrad and the pitch is 1 mm, the homogeneity obtained in the focal spot is optimal (about 95.22%). Experimental results show that the measured focal spot size is 2.04 × 10.39 mm^2. This research presents a novel design of homogeneous focusing systems for high power diode lasers.

  19. Investigation of the homogeneity of methacrylate allergens in commercially available patch test preparations

    DEFF Research Database (Denmark)

    Mose, Kristian Fredløv; Andersen, Klaus Ejner; Christensen, Lars Porskjaer

    2013-01-01

The homogeneity of methacrylates in commercial patch test preparations has not yet been investigated. Inhomogeneous patch test preparations may give rise to false-negative or false-positive patch test results in patients suspected of having methacrylate allergy.

  20. For the criticality of water reflected homogeneous arrays and heterogeneous reactor fuel elements

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Hj; Rabitsch, H; Schuerrer, F [Technische Univ., Graz (Austria). Inst. fuer Theoretische Physik und Reaktorphysik

    1980-01-01

The smallest critical masses for fuel elements of research reactors with medium and high enrichment are calculated. The results agree closely with the known critical masses of power reactors with low enrichment. The comparison of the critical masses of reactor fuel elements and homogenized uranium dioxide-water systems reveals the influence of the homogeneity and of the cladding on the criticality. A coefficient of heterogeneity is suggested which takes these influences into consideration.