WorldWideScience

Sample records for fully developed homogeneous

  1. Three-dimensional single-channel thermal analysis of fully ceramic microencapsulated fuel via two-temperature homogenized model

    International Nuclear Information System (INIS)

    Lee, Yoonhee; Cho, Nam Zin

    2014-01-01

    Highlights: • Two-temperature homogenized model is applied to thermal analysis of fully ceramic microencapsulated (FCM) fuel. • Based on the results of Monte Carlo calculation, homogenized parameters are obtained. • A 2-D FEM/1-D FDM hybrid method for the model is used to obtain 3-D temperature profiles. • The model provides the fuel-kernel and SiC matrix temperatures separately. • Compared to UO2 fuel, the FCM fuel shows ∼560 K lower maximum temperatures at steady and transient states. - Abstract: The fully ceramic microencapsulated (FCM) fuel, one of the accident tolerant fuel (ATF) concepts, consists of TRISO particles randomly dispersed in a SiC matrix. This high heterogeneity in composition makes explicit thermal calculation of such a fuel difficult. For thermal analysis of a fuel element of very high temperature reactors (VHTRs), which has a configuration similar to FCM fuel, a two-temperature homogenized model was recently proposed by the authors. The model was developed using the particle transport Monte Carlo method for heat conduction problems. It gives more realistic temperature profiles and provides the fuel-kernel and graphite temperatures separately. In this paper, we apply the two-temperature homogenized model to three-dimensional single-channel thermal analysis of the FCM fuel element for steady and transient states using a 2-D FEM/1-D FDM hybrid method. In the analyses, we assume for simplicity that the power distribution is uniform in the radial direction at steady state and follows a cosine shape in the axial direction. As transient scenarios, we consider (i) a coolant inlet temperature transient, (ii) an inlet mass flow rate transient, and (iii) a power transient. The results of the analyses are compared to those of conventional UO2 fuel having the same geometric dimensions and operating conditions.
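
    For reference, a schematic two-temperature homogenized heat-conduction model of the kind applied above can be written as follows; the homogenized conductivities k_f and k_m, the heat source q'''_f, and the kernel-matrix exchange coefficient h are generic placeholders, and the authors' exact formulation may differ.

      \[ (\rho c_p)_f\,\frac{\partial T_f}{\partial t} = \nabla\cdot(k_f \nabla T_f) + q'''_f - h\,(T_f - T_m), \qquad
         (\rho c_p)_m\,\frac{\partial T_m}{\partial t} = \nabla\cdot(k_m \nabla T_m) + h\,(T_f - T_m), \]

      where T_f and T_m denote the homogenized fuel-kernel and SiC-matrix temperatures; the homogenized parameters are the quantities fitted to the Monte Carlo heat-conduction results mentioned in the highlights.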

  2. 76 FR 36176 - Fully Developed Claim (Fully Developed Claims-Applications for Compensation, Pension, DIC, Death...

    Science.gov (United States)

    2011-06-21

    ... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0747] Fully Developed Claim (Fully Developed Claims--Applications for Compensation, Pension, DIC, Death Pension, and/or Accrued Benefits); Correction AGENCY: Veterans Benefits Administration, Department of Veterans Affairs. ACTION: Notice; correction...

  3. Developments towards a fully automated AMS system

    International Nuclear Information System (INIS)

    Steier, P.; Puchegger, S.; Golser, R.; Kutschera, W.; Priller, A.; Rom, W.; Wallner, A.; Wild, E.

    2000-01-01

    The possibilities of computer-assisted and automated accelerator mass spectrometry (AMS) measurements were explored. The goal of these efforts is to develop fully automated procedures for 'routine' measurements at the Vienna Environmental Research Accelerator (VERA), a dedicated 3-MV Pelletron tandem AMS facility. As a new tool for automatic tuning of the ion optics we developed a multi-dimensional optimization algorithm robust to noise, which was applied for 14C and 10Be. The actual isotope ratio measurements are performed in a fully automated fashion and do not require the presence of an operator. Incoming data are evaluated online and the results can be accessed via Internet. The system was used for 14C, 10Be, 26Al and 129I measurements.

  4. Nuclear-Thermal Analysis of Fully Ceramic Microencapsulated Fuel via Two-Temperature Homogenized Model

    International Nuclear Information System (INIS)

    Lee, Yoon Hee; Cho, Nam Zin

    2013-01-01

    The FCM fuel is based on a proven safety philosophy that has been utilized operationally in very high temperature reactors (VHTRs). However, the FCM fuel consists of TRISO particles randomly dispersed in a SiC matrix. This high heterogeneity in composition leads to difficulty in explicit thermal calculation of such a fuel. Therefore, an appropriate homogenization model becomes essential. In this paper, we apply the two-temperature homogenized model to thermal analysis of an FCM fuel. The model was recently proposed in order to provide more realistic temperature profiles in the fuel elements of VHTRs. The two-temperature homogenized model was obtained by particle transport Monte Carlo calculation applied to the pellet region consisting of many coated particles uniformly dispersed in a SiC matrix. Since this model gives realistic temperature profiles in the pellet (providing the fuel-kernel temperature and the SiC matrix temperature distinctly), it can be used for more accurate neutronics evaluation such as Doppler temperature feedback. The transient thermal calculation may also be performed more realistically with temperature-dependent homogenized parameters in various scenarios.

  5. Conjecture on the critical frontier of the fully anisotropic homogeneous quenched bond-mixed Potts ferromagnet in square lattice

    International Nuclear Information System (INIS)

    Tsallis, C.

    1980-01-01

    It is conjectured that a logarithmic equation provides a very accurate approximation of the as yet unknown critical frontier of a fully anisotropic homogeneous quenched bond-mixed q-state Potts ferromagnet on the square lattice, where the random coupling constant J is distributed according to the laws P(J) and P'(J) for 'horizontal' and 'vertical' bonds, respectively. Such an equation contains as particular cases a great number of exact results as well as a few recent conjectures (which are definitively only approximate). (Author)

  6. Experimental optimization of a direct injection homogeneous charge compression ignition gasoline engine using split injections with fully automated microgenetic algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Canakci, M. [Kocaeli Univ., Izmit (Turkey); Reitz, R.D. [Wisconsin Univ., Dept. of Mechanical Engineering, Madison, WI (United States)

    2003-03-01

    Homogeneous charge compression ignition (HCCI) is receiving attention as a new low-emission engine concept. Little is known about the optimal operating conditions for this engine operation mode. Combustion under homogeneous, low equivalence ratio conditions results in modest-temperature combustion products, containing very low concentrations of NOx and particulate matter (PM), as well as providing high thermal efficiency. However, this combustion mode can produce higher HC and CO emissions than those of conventional engines. An electronically controlled Caterpillar single-cylinder oil test engine (SCOTE), originally designed for heavy-duty diesel applications, was converted to an HCCI direct injection (DI) gasoline engine. The engine features an electronically controlled low-pressure direct injection gasoline (DI-G) injector with a 60 deg spray angle that is capable of multiple injections. The use of double injection was explored for emission control, and the engine was optimized using fully automated experiments and a microgenetic algorithm optimization code. The variables changed during the optimization include the intake air temperature, the start of injection timing, and the split injection parameters (per cent mass of fuel in each injection, dwell between the pulses). The engine performance and emissions were determined at 700 r/min with a constant fuel flowrate at 10 MPa fuel injection pressure. The results show that significant emissions reductions are possible with the use of optimal injection strategies. (Author)
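
    As an illustration of the kind of fully automated optimization loop described above, a minimal micro-genetic-algorithm sketch in Python follows; the merit function, parameter names, and ranges are hypothetical stand-ins and are not the actual SCOTE test conditions or measured emissions.

      # Minimal micro-genetic algorithm (small population, crossover + elitism,
      # restart instead of mutation). The merit function below is a hypothetical
      # placeholder for "high efficiency, low NOx/HC/CO", not engine data.
      import numpy as np

      rng = np.random.default_rng(0)
      # decision variables: intake air T [K], SOI [deg BTDC], fuel split [-], dwell [deg]
      bounds = np.array([[320.0, 450.0], [20.0, 120.0], [0.1, 0.9], [5.0, 60.0]])

      def merit(x):
          """Hypothetical merit to maximize (stands in for an engine test result)."""
          t_in, soi, split, dwell = x
          return -(0.002 * (t_in - 400.0) ** 2 + 0.001 * (soi - 60.0) ** 2
                   + 5.0 * (split - 0.5) ** 2 + 0.01 * (dwell - 25.0) ** 2)

      def micro_ga(pop_size=5, generations=200):
          pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(pop_size, len(bounds)))
          for _ in range(generations):
              fitness = np.array([merit(ind) for ind in pop])
              elite = pop[fitness.argmax()].copy()          # keep the best individual
              children = []
              for _ in range(pop_size - 1):                 # uniform crossover of random parents
                  a, b = pop[rng.choice(pop_size, 2, replace=False)]
                  mask = rng.random(len(bounds)) < 0.5
                  children.append(np.where(mask, a, b))
              pop = np.vstack([elite, children])
              if np.ptp(pop, axis=0).max() < 1e-3:          # population converged: restart around the elite
                  pop[1:] = rng.uniform(bounds[:, 0], bounds[:, 1],
                                        size=(pop_size - 1, len(bounds)))
          return pop[np.argmax([merit(ind) for ind in pop])]

      best_settings = micro_ga()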

  7. Fully Adaptive Radar Modeling and Simulation Development

    Science.gov (United States)

    2017-04-01

    The fully adaptive radar (FAR) modeling and simulation architecture is coded in MATLAB using an object-oriented programming (OOP) approach. It includes a FAR engine to control the operation of the perception-action (PA) cycle and supports development and testing on simulated, previously collected, and real-time streaming data.

  8. Fully developed turbulence via Feigenbaum's period-doubling bifurcations

    International Nuclear Information System (INIS)

    Duong-van, M.

    1987-08-01

    Since its publication in 1978, Feigenbaum's predictions of the onset of turbulence via period-doubling bifurcations have been thoroughly borne out experimentally. In this paper, Feigenbaum's theory is extended into the regime in which we expect to see fully developed turbulence. We develop a method of averaging that imposes correlations in the fluctuating system generated by this map. With this averaging method, the field variable is obtained by coarse-graining, while microscopic fluctuations are preserved in all averaging scales. Fully developed turbulence will be shown to be a result of microscopic fluctuations with proper averaging. Furthermore, this model preserves Feigenbaum's results on the physics of bifurcations at the onset of turbulence while yielding additional physics both at the onset of turbulence and in the fully developed turbulence regime
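
    A generic Python sketch of the two ingredients mentioned above, a period-doubling map iterated past the Feigenbaum accumulation point and a coarse-grained average that retains the microscopic fluctuations, is given below; it uses the logistic map for concreteness and is not the author's specific averaging prescription.

      # Logistic map beyond the period-doubling accumulation point, then block
      # averaging: the block means play the role of a coarse-grained "field",
      # while the in-block fluctuations are kept rather than discarded.
      import numpy as np

      r, n, window = 4.0, 20000, 64
      x = np.empty(n)
      x[0] = 0.3
      for i in range(1, n):
          x[i] = r * x[i - 1] * (1.0 - x[i - 1])     # x -> r x (1 - x)

      blocks = x[: n - n % window].reshape(-1, window)
      field = blocks.mean(axis=1)                    # coarse-grained variable
      fluct = blocks - field[:, None]                # retained microscopic fluctuations
      print(field.var(), fluct.var())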

  9. Development of a fully injectable calcium phosphate cement

    Indian Academy of Sciences (India)

    Permanent link: https://www.ias.ac.in/article/fulltext/boms/026/04/0415-0422. Keywords: calcium phosphate cements; hydroxyapatite; bioceramics; bone substitute; orthopedic; dental. Abstract: A study on the development of a fully injectable calcium phosphate cement for orthopedic and dental applications is presented.

  10. Fully depleted back-illuminated p-channel CCD development

    Energy Technology Data Exchange (ETDEWEB)

    Bebek, Chris J.; Bercovitz, John H.; Groom, Donald E.; Holland, Stephen E.; Kadel, Richard W.; Karcher, Armin; Kolbe, William F.; Oluseyi, Hakeem M.; Palaio, Nicholas P.; Prasad, Val; Turko, Bojan T.; Wang, Guobin

    2003-07-08

    An overview of CCD development efforts at Lawrence Berkeley National Laboratory is presented. Operation of fully-depleted, back-illuminated CCDs fabricated on high-resistivity silicon is described, along with results on the use of such CCDs at ground-based observatories. Radiation damage and point-spread function measurements are described, as well as a discussion of CCD fabrication technologies.

  11. Steady- and transient-state analysis of fully ceramic microencapsulated fuel with randomly dispersed tristructural isotropic particles via two-temperature homogenized model-I: Theory and method

    International Nuclear Information System (INIS)

    Lee, Yoon Hee; Cho, Bum Hee; Cho, Nam Zin

    2016-01-01

    As a type of accident-tolerant fuel, fully ceramic microencapsulated (FCM) fuel was proposed after the Fukushima accident in Japan. The FCM fuel consists of tristructural isotropic particles randomly dispersed in a silicon carbide (SiC) matrix. For a fuel element with such high heterogeneity, we have proposed a two-temperature homogenized model using the particle transport Monte Carlo method for the heat conduction problem. This model distinguishes between fuel-kernel and SiC matrix temperatures. Moreover, the obtained temperature profiles are more realistic than those of other models. In Part I of the paper, homogenized parameters for the FCM fuel in which tristructural isotropic particles are randomly dispersed in the fine lattice stochastic structure are obtained by (1) matching steady-state analytic solutions of the model with the results of particle transport Monte Carlo method for heat conduction problems, and (2) preserving total enthalpies in fuel kernels and SiC matrix. The homogenized parameters have two desirable properties: (1) they are insensitive to boundary conditions such as coolant bulk temperatures and thickness of cladding, and (2) they are independent of operating power density. By performing the Monte Carlo calculations with the temperature-dependent thermal properties of the constituent materials of the FCM fuel, temperature-dependent homogenized parameters are obtained

  12. Non-homogeneous harmonic analysis: 16 years of development

    Science.gov (United States)

    Volberg, A. L.; Èiderman, V. Ya

    2013-12-01

    This survey contains results and methods in the theory of singular integrals, a theory which has been developing dramatically in the last 15-20 years. The central (although not the only) topic of the paper is the connection between the analytic properties of integrals and operators with Calderón-Zygmund kernels and the geometric properties of the measures. The history is traced of the classical Painlevé problem of describing removable singularities of bounded analytic functions, which has provided a strong incentive for the development of this branch of harmonic analysis. The progress of recent decades has largely been based on the creation of an apparatus for dealing with non-homogeneous measures, and much attention is devoted to this apparatus here. Several open questions are stated, first and foremost in the multidimensional case, where the method of curvature of a measure is not available. Bibliography: 128 titles.
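
    For context, the "curvature of a measure" method mentioned above is built on the Menger curvature c(x,y,z), the reciprocal of the radius of the circle passing through three points; the curvature of a measure μ is then commonly defined (normalizations may differ from the survey's) as

      \[ c^2(\mu) = \iiint c(x,y,z)^2 \, d\mu(x)\, d\mu(y)\, d\mu(z). \]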

  13. Fully-developed heat transfer in annuli with viscous dissipation

    Energy Technology Data Exchange (ETDEWEB)

    Coelho, P.M. [Universidade do Porto, Porto (Portugal). Centro de Estudos de Fenomenos de Transporte, DEMEGI, Faculdade de Engenharia; Pinho, F.T. [Universidade do Porto, Porto (Portugal). Centro de Estudos de Fenomenos de Transporte, Faculdade de Engenharia

    2006-09-15

    For Newtonian concentric annular flows analytical solutions are obtained under imposed asymmetric constant wall heat fluxes as well as under imposed asymmetric constant wall temperatures, taking into account viscous dissipation and for fluid dynamic and thermally fully-developed conditions. Results for the special case of the heat flux ratio for identical wall temperatures and the critical Brinkman numbers marking changes of sign in wall heat fluxes are also derived. Equations are presented for the Nusselt numbers at the inner and outer walls, bulk temperature and normalised temperature distribution as a function of all relevant non-dimensional numbers. Given the complexity of the derived equations, simpler exact expressions are presented for the Nusselt numbers for ease of use, with their coefficients given in tables as a function of the radius ratio. (author)
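
    For orientation, commonly used definitions of the dimensionless groups involved are collected below; the paper's exact normalizations (in particular the reference scale in the Brinkman number) may differ.

      \[ Nu_i = \frac{h_i D_h}{k}, \qquad D_h = 2\,(R_o - R_i), \qquad
         Br = \frac{\mu\,\bar{u}^2}{q''_w D_h}\ \text{(imposed wall heat flux)}, \qquad
         Br = \frac{\mu\,\bar{u}^2}{k\,\Delta T_w}\ \text{(imposed wall temperatures)}, \]

      with ΔT_w a characteristic wall-to-bulk temperature difference.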

  14. Torsion effect on fully developed flow in a helical pipe

    Science.gov (United States)

    Kao, Hsiao C.

    1987-01-01

    Two techniques, a series expansion method of perturbed Poiseuille flow valid for low Dean numbers and a solution of the complete Navier-Stokes equation applicable to intermediate Dean values, are used to investigate the torsion effect on the fully developed laminar flow in a helical pipe of constant circular cross section. For the secondary flow patterns, the results show that the presence of torsion can produce a significant effect if the ratio of the curvature to the torsion is of order unity. The secondary flow is distorted in these cases. It is noted that the torsion effect is, however, usually small, and that the secondary flow has the usual pattern of a pair of counter-rotating vortices of nearly equal strength.
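
    For reference, for a circular helix of coil radius R_c and reduced pitch p (pitch divided by 2π), the curvature, torsion, and the commonly used Dean number are (definitions assumed here; the paper's normalization may differ)

      \[ \kappa = \frac{R_c}{R_c^2 + p^2}, \qquad \tau = \frac{p}{R_c^2 + p^2}, \qquad
         \frac{\kappa}{\tau} = \frac{R_c}{p}, \qquad De = Re\,\sqrt{a\,\kappa}, \]

      with a the pipe radius; the statement above is that the torsion effect becomes significant when κ/τ is of order unity.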

  15. Steady- and transient-state analyses of fully ceramic microencapsulated fuel loaded reactor core via two-temperature homogenized thermal-conductivity model

    International Nuclear Information System (INIS)

    Lee, Yoonhee; Cho, Nam Zin

    2015-01-01

    Highlights: • A fully ceramic microencapsulated fuel-loaded core is analyzed via a two-temperature homogenized thermal-conductivity model. • The model is compared to harmonic- and volumetric-average thermal conductivity models. • The three thermal analysis models show ∼100 pcm differences in the keff eigenvalue. • The three thermal analysis models show more than 70 K differences in the maximum temperature. • The maximum power in a control rod ejection accident differs by more than a factor of 3 among the models. - Abstract: Fully ceramic microencapsulated (FCM) fuel, a type of accident-tolerant fuel (ATF), consists of TRISO particles randomly dispersed in a SiC matrix. In this study, for a thermal analysis of the FCM fuel with such high heterogeneity, a two-temperature homogenized thermal-conductivity model was applied by the authors. This model provides separate temperatures for the fuel kernels and the SiC matrix. It also provides more realistic temperature profiles than those of harmonic- and volumetric-average thermal conductivity models, which are used for thermal analysis of fuel elements in VHTRs having a composition similar to the FCM fuel but are unable to provide the fuel-kernel and graphite matrix temperatures separately. In this study, coupled with a neutron diffusion model, an FCM fuel-loaded reactor core is analyzed via the two-temperature homogenized thermal-conductivity model at steady and transient states. The results are compared to those from harmonic- and volumetric-average thermal conductivity models, i.e., we compare keff eigenvalues, power distributions, and temperature profiles in the hottest single channel at steady state. At transient state, we compare total powers, reactivity, and maximum temperatures in the hottest single channel obtained by the different thermal analysis models. The different thermal analysis models and the availability of fuel-kernel temperatures in the two-temperature homogenized thermal

  16. Development of Low Energy Gap and Fully Regioregular Polythienylenevinylene Derivative

    Directory of Open Access Journals (Sweden)

    Tanya M. S. David

    2014-01-01

    Low-energy-gap and fully regioregular conjugated polymers find wide use in solar energy conversion applications. This paper first briefly reviews this type of polymer and then reports the synthesis and characterization of a specific new example: a low-energy-gap, fully regioregular, terminal-functionalized, and processable conjugated polymer, poly(3-dodecyloxy-2,5-thienylene vinylene), or PDDTV. The polymer exhibited an optical energy gap of 1.46 eV based on the UV-vis-NIR absorption spectrum. The electrochemically measured highest occupied molecular orbital (HOMO) level is −4.79 eV, resulting in a lowest unoccupied molecular orbital (LUMO) level of −3.33 eV based on the optical energy gap. The polymer was synthesized via Horner-Emmons condensation and is fairly soluble in common organic solvents such as tetrahydrofuran and chloroform with gentle heating. DSC showed two endothermic peaks at 67°C and 227°C that can be attributed to transitions between crystalline and liquid states. The polymer is thermally stable up to about 300°C. This polymer appears very promising for cost-effective solar cell applications.
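
    The LUMO level quoted above follows directly from the optical gap and the electrochemically measured HOMO level:

      \[ E_{\mathrm{LUMO}} \approx E_{\mathrm{HOMO}} + E_g^{\mathrm{opt}} = -4.79~\mathrm{eV} + 1.46~\mathrm{eV} = -3.33~\mathrm{eV}. \]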

  17. Preliminary development of thermal nuclear cell homogenization code

    International Nuclear Information System (INIS)

    Su'ud, Z.; Shafii, M. A.; Yudha, S. P.; Waris, A.; Rijal, K.

    2012-01-01

    Nuclear fuel cell homogenization for thermal reactors usually includes three main parts: the fast-energy resonance part, which usually adopts the narrow resonance (NR) approximation to treat the resonances; the low (intermediate) energy region, in which the resonances cannot be treated accurately using the NR approximation and therefore an intermediate resonance treatment should be used; and the thermal (very low) energy region, in which thermal effects must be treated properly. In this study the application of the intermediate resonance approximation to low-energy nuclear resonances is discussed. The method is iteration based. As a sample, the method is applied to the low-lying resonance of U-235, and the result is presented and discussed.

  18. Fully developed MHD turbulence near critical magnetic Reynolds number

    International Nuclear Information System (INIS)

    Leorat, J.; Pouquet, A.; Frisch, U.

    1981-01-01

    Liquid-sodium-cooled breeder reactors may soon be operating at magnetic Reynolds numbers R^M where magnetic fields can be self-excited by a dynamo mechanism. Such flows have kinetic Reynolds numbers R^V of the order of 10^7 and are therefore highly turbulent. The behaviour of MHD turbulence with high R^V and low magnetic Prandtl numbers is investigated, using the eddy-damped quasi-normal Markovian closure applied to the MHD equations. For simplicity the study is restricted to homogeneous and isotropic turbulence, but includes helicity. A critical magnetic Reynolds number R_c^M of the order of a few tens (non-helical case) is obtained, above which magnetic energy is present. R_c^M is practically independent of R^V (in the range 40 to 10^6) and can be considerably decreased by the presence of helicity. No attempt is made to obtain quantitative estimates for a breeder reactor, but some of the possible consequences of exceeding R_c^M, such as decreased turbulent heat transport, are discussed. (author)
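
    For reference, with U and L a characteristic velocity and length, ν the kinematic viscosity, and η = 1/(μ₀σ) the magnetic diffusivity, the dimensionless numbers used above are conventionally

      \[ R^V = \frac{U L}{\nu}, \qquad R^M = \frac{U L}{\eta}, \qquad
         P_m = \frac{\nu}{\eta} = \frac{R^M}{R^V}, \]

      with P_m ≪ 1 for liquid metals such as sodium, which is why R^M can remain modest even when R^V is of order 10^7.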

  19. Not Fully Developed Turbulence in the Dow Jones Index

    Science.gov (United States)

    Trincado, Estrella; Vindel, Jose María

    2013-08-01

    The shape of the curves relating the scaling exponents of the structure functions to the order of these functions is shown to distinguish the Dow Jones index from other stock market indices. We conclude from the shape differences that the information-loss rate for the Dow Jones index is reduced at smaller time scales, while it grows for other indices. This anomaly is due to the construction of the index, in particular to its dependence on a single market parameter: price. Prices are subject to turbulence bursts, which act against full development of turbulence.
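
    For context, the scaling exponents referred to above are conventionally defined through the structure functions of the signal x(t) (notation assumed here, not taken from the paper):

      \[ S_q(\tau) = \bigl\langle |x(t+\tau) - x(t)|^{q} \bigr\rangle \sim \tau^{\zeta(q)}; \]

      a monofractal signal gives a linear ζ(q) = qH, whereas a concave (nonlinear) ζ(q) signals multifractality, and it is the shape of ζ(q) versus q that distinguishes the Dow Jones index in this study.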

  20. Fully depleted CMOS pixel sensor development and potential applications

    Energy Technology Data Exchange (ETDEWEB)

    Baudot, J.; Kachel, M. [Universite de Strasbourg, IPHC, 23 rue du Loess 67037 Strasbourg (France); CNRS, UMR7178, 67037 Strasbourg (France)

    2015-07-01

    low noise figure. In particular, an energy resolution of about 400 eV for 5 keV X-rays was obtained for single pixels. The prototypes were then exposed to gradually increased fluences of neutrons, from 10^13 to 5×10^14 n_eq/cm^2. Again, laboratory tests made it possible to evaluate the persistence of the signal-over-noise ratio on the different pixels implemented. Currently our development mostly targets the detection of soft X-rays, with the ambition to develop a pixel sensor matching the counting rates achievable with hybrid pixel sensors, but with extended sensitivity to low energies and a finer pixel pitch of about 25 × 25 μm^2. The original readout architecture proposed relies on a two-tier chip. The first tier consists of a sensor with a modest dynamic range in order to ensure the low-noise performance required for this sensitivity. The interconnected second-tier chip enhances the readout speed by introducing massive parallelization. The performance reachable with this strategy, which combines counting and integration, is detailed. (authors)

  1. Steady- and transient-state analysis of fully ceramic microencapsulated fuel with randomly dispersed tristructural isotropic particles via two-temperature homogenized model-II: Applications by coupling with COREDAX

    International Nuclear Information System (INIS)

    Lee, Yoon Hee; Cho, Bum Hee; Cho, Nam Zin

    2016-01-01

    In Part I of this paper, the two-temperature homogenized model for the fully ceramic microencapsulated fuel, in which tristructural isotropic particles are randomly dispersed in a fine lattice stochastic structure, was discussed. In this model, the fuel-kernel and silicon carbide matrix temperatures are distinguished. Moreover, the obtained temperature profiles are more realistic than those obtained using other models. Using the temperature-dependent thermal conductivities of uranium nitride and the silicon carbide matrix, temperature-dependent homogenized parameters were obtained. In Part II of the paper, coupled with the COREDAX code, a reactor core loaded by fully ceramic microencapsulated fuel in which tristructural isotropic particles are randomly dispersed in the fine lattice stochastic structure is analyzed via a two-temperature homogenized model at steady and transient states. The results are compared with those from harmonic- and volumetric-average thermal conductivity models; i.e., we compare keff eigenvalues, power distributions, and temperature profiles in the hottest single channel at a steady state. At transient states, we compare total power, average energy deposition, and maximum temperatures in the hottest single channel obtained by the different thermal analysis models. The different thermal analysis models and the availability of fuel-kernel temperatures in the two-temperature homogenized model for Doppler temperature feedback lead to significant differences

  2. Multifractal aspects of the scaling laws in fully developed compressible turbulence

    International Nuclear Information System (INIS)

    Shivamoggi, B.K.

    1995-01-01

    In this paper, multifractal aspects of the scaling laws in fully developed compressible turbulence are considered. Compressibility effects on known results of incompressible turbulence are pointed out. copyright 1995 Academic Press, Inc

  3. Steady- and Transient-State Analyses of Fully Ceramic Microencapsulated Fuel with Randomly Dispersed Tristructural Isotropic Particles via Two-Temperature Homogenized Model—I: Theory and Method

    Directory of Open Access Journals (Sweden)

    Yoonhee Lee

    2016-06-01

    As a type of accident-tolerant fuel, fully ceramic microencapsulated (FCM) fuel was proposed after the Fukushima accident in Japan. The FCM fuel consists of tristructural isotropic particles randomly dispersed in a silicon carbide (SiC) matrix. For a fuel element with such high heterogeneity, we have proposed a two-temperature homogenized model using the particle transport Monte Carlo method for the heat conduction problem. This model distinguishes between fuel-kernel and SiC matrix temperatures. Moreover, the obtained temperature profiles are more realistic than those of other models. In Part I of the paper, homogenized parameters for the FCM fuel in which tristructural isotropic particles are randomly dispersed in the fine lattice stochastic structure are obtained by (1) matching steady-state analytic solutions of the model with the results of the particle transport Monte Carlo method for heat conduction problems, and (2) preserving total enthalpies in fuel kernels and SiC matrix. The homogenized parameters have two desirable properties: (1) they are insensitive to boundary conditions such as coolant bulk temperatures and thickness of cladding, and (2) they are independent of operating power density. By performing the Monte Carlo calculations with the temperature-dependent thermal properties of the constituent materials of the FCM fuel, temperature-dependent homogenized parameters are obtained.

  4. Development of dynamic explicit crystallographic homogenization finite element analysis code to assess sheet metal formability

    International Nuclear Information System (INIS)

    Nakamura, Yasunori; Tam, Nguyen Ngoc; Ohata, Tomiso; Morita, Kiminori; Nakamachi, Eiji

    2004-01-01

    The crystallographic texture evolution induced by plastic deformation in the sheet metal forming process has a great influence on its formability. In the present study, a dynamic explicit finite element (FE) analysis code is newly developed by introducing a crystallographic homogenization method to estimate the formability of polycrystalline sheet metal, such as extreme thinning and 'earing'. This code can predict the plastic-deformation-induced texture evolution at the micro scale and the plastic anisotropy at the macro scale simultaneously. This multi-scale analysis couples the microscopic crystal plasticity inhomogeneous deformation with the macroscopic continuum deformation. In this homogenization process, the stress at the macro scale is defined as the volume average of the stresses of the corresponding microscopic crystal aggregations, satisfying the equation of motion and the compatibility condition in the micro-scale 'unit cell', where periodicity of deformation is imposed. This homogenization algorithm is implemented in the conventional dynamic explicit finite element code by employing the updated Lagrangian formulation and a rate-type elastic/viscoplastic constitutive equation. First, it has been confirmed through texture evolution analyses for typical deformation modes that Taylor's 'constant strain homogenization algorithm' yields extreme concentration toward the preferred crystal orientations compared with our homogenization scheme. Second, we study the effects of plastic anisotropy on 'earing' in the hemispherical cup deep drawing process of a pure ferrite phase sheet metal. Comparison of the analytical results with those based on Taylor's assumption leads to the conclusion that the newly developed dynamic explicit crystallographic homogenization FEM gives a more reasonable prediction of plastic-deformation-induced texture evolution and plastic anisotropy at the macro scale.
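
    The macro-scale stress definition described above can be written compactly as a volume average over the periodic unit cell Y (notation assumed):

      \[ \Sigma_{ij} = \frac{1}{|Y|} \int_Y \sigma_{ij}(y)\, dY, \]

      where σ_ij is the microscopic crystal-plasticity stress field satisfying equilibrium and compatibility in the unit cell.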

  5. Fully-developed conjugate heat transfer in porous media with uniform heating

    NARCIS (Netherlands)

    Lopez Penha, D.J.; Stolz, S.; Kuerten, Johannes G.M.; Nordlund, M.; Kuczaj, Arkadiusz K.; Geurts, Bernardus J.

    2012-01-01

    We propose a computational method for approximating the heat transfer coefficient of fully-developed flow in porous media. For a representative elementary volume of the porous medium we develop a transport model subject to periodic boundary conditions that describes incompressible fluid flow through

  6. Progress report on a fully automatic Gas Tungsten Arc Welding (GTAW) system development

    Energy Technology Data Exchange (ETDEWEB)

    Daumeyer, G.J. III

    1994-12-01

    A plan to develop a fully automatic gas tungsten arc welding (GTAW) system that will utilize a vision-sensing computer (which will provide in-process feedback control) is presently in work. Evaluations of different technological aspects and system design requirements continue. This report summarizes major activities in the plan's successful progress. The technological feasibility of producing the fully automated GTAW system has been proven. The goal of this process development project is to provide a production-ready system within the shortest reasonable time frame.

  7. Hydrodynamics of piston-driven laminar pulsating flow: Part 2. Fully developed flow

    International Nuclear Information System (INIS)

    Aygun, Cemalettin; Aydin, Orhan

    2014-01-01

    Highlights: • The piston-driven laminar pulsating flow in a pipe is studied. • Fully developed flow is examined analytically, numerically and experimentally. • An increase in F results in an increase in the amplitude of the centerline velocity. • The character of the radial velocity profiles critically depends on both the frequency and the phase angle. • Near/off-wall flow reversals are observed for F = 105, 226 and 402. - Abstract: Piston-driven pulsating flow is a specific type of pressure-driven pulsating flow. In this study, piston-driven laminar pulsating flow in a pipe is studied. The study consists of two parts: developing flow and fully developed flow. In this part, hydrodynamically fully developed flow is examined analytically, numerically and experimentally. A constant value of the time-averaged Reynolds number is considered, Re = 1000. In the theoretical studies, both analytical and numerical, an inlet velocity profile representing the experimental case, i.e., the piston-driven flow, is assumed. In the experiments, in the hydrodynamically fully developed region, the radial velocity distribution and the pressure drop are obtained using a hot-wire anemometer and a pressure transmitter, respectively. The effect of pulsation frequency on the friction coefficient as well as on the velocity profiles is obtained. A good agreement is observed among the analytical, numerical and experimental results.

  8. Multifractal scaling at the Kolmogorov microscale in fully developed compressible turbulence

    International Nuclear Information System (INIS)

    Shivamoggi, B.K.

    1995-01-01

    In this paper, some aspects of multifractal scaling at the Kolmogorov microscale in fully developed compressible turbulence are considered. These considerations, on the one hand, provide an insight into the mechanism of compressible turbulence, and on the other hand enable one to determine the robustness of some known results in incompressible turbulence. copyright 1995 Academic Press, Inc

  9. A technique for sexing fully developed embryos and early-instar larvae of the gypsy moth

    Science.gov (United States)

    Gilbert Levesque

    1963-01-01

    Because variation in sex ratio is an important factor in the population dynamics of the gypsy moth (Porthetria dispar), it is necessary to have some means of determining the ratio of males to females in a population at the beginning of the larval period as well as in the later stages. For determining the sex of fully developed embryos and early-...

  10. Research and Development of Fully Automatic Alien Smoke Stack and Packaging System

    Science.gov (United States)

    Yang, Xudong; Ge, Qingkuan; Peng, Tao; Zuo, Ping; Dong, Weifu

    2017-12-01

    To address the low efficiency of manual sorting and packaging at current tobacco distribution centers, a safe, efficient, and fully automatic alien smoke (irregularly shaped cigarette) stacking and packaging system was developed. The system adopts PLC control technology, servo control technology, robot technology, image recognition technology, and human-computer interaction technology. The characteristics, principles, control process, and key technologies of the system are discussed in detail. Installation and commissioning show that the fully automatic alien smoke stacking and packaging system performs well and meets the requirements for handling shaped cigarettes.

  11. Iron Oxide Nanoparticle-Based Magnetic Ink Development for Fully Printed Tunable Radio-Frequency Devices

    KAUST Repository

    Vaseem, Mohammad

    2018-01-30

    The field of printed electronics is still in its infancy and most of the reported work is based on commercially available nanoparticle-based metallic inks. Although fully printed devices that employ dielectric/semiconductor inks have recently been reported, there is a dearth of functional inks that can demonstrate controllable devices. The lack of availability of functional inks is a barrier to the widespread use of fully printed devices. For radio-frequency electronics, magnetic materials have many uses in reconfigurable components but rely on expensive and rigid ferrite materials. A suitable magnetic ink can facilitate the realization of fully printed, magnetically controlled, tunable devices. This report presents the development of an iron oxide nanoparticle-based magnetic ink. First, a tunable inductor is fully printed using iron oxide nanoparticle-based magnetic ink. Furthermore, iron oxide nanoparticles are functionalized with oleic acid to make them compatible with a UV-curable SU8 solution. Functionalized iron oxide nanoparticles are successfully embedded in the SU8 matrix to make a magnetic substrate. The as-fabricated substrate is characterized for its magnetostatic and microwave properties. A frequency tunable printed patch antenna is demonstrated using the magnetic and in-house silver-organo-complex inks. This is a step toward low-cost, fully printed, controllable electronic components.

  12. Producing a lycopene nanodispersion: Formulation development and the effects of high pressure homogenization.

    Science.gov (United States)

    Shariffa, Y N; Tan, T B; Uthumporn, U; Abas, F; Mirhosseini, H; Nehdi, I A; Wang, Y-H; Tan, C P

    2017-11-01

    The aim of this study was to develop formulations to produce lycopene nanodispersions and to investigate the effects of the homogenization pressure on the physicochemical properties of the lycopene nanodispersion. The samples were prepared using the emulsification-evaporation technique. The best formulation was achieved by dispersing an organic phase (0.3% w/v lycopene dissolved in dichloromethane) in an aqueous phase (0.3% w/v Tween 20 dissolved in deionized water) at a ratio of 1:9 using a homogenization process. Increasing the homogenization pressure up to 500 bar reduced the particle size and lycopene concentration significantly (p < 0.05), whereas higher homogenization pressures (700-900 bar) resulted in large particle sizes with high dispersibility. The zeta potential and turbidity of the lycopene nanodispersion were significantly influenced by the homogenization pressure. The results from this study provide useful information for producing small-sized lycopene nanodispersions with a narrow PDI and good stability for application in beverage products. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Development of Computer Program for Analysis of Irregular Non Homogenous Radiation Shielding

    International Nuclear Information System (INIS)

    Bang Rozali; Nina Kusumah; Hendro Tjahjono; Darlis

    2003-01-01

    A computer program for radiation shielding analysis has been developed to calculate radiation attenuation in non-homogeneous radiation shielding with irregular geometry. By specifying the radiation source strength, the geometrical shape of the radiation source, and the location, dimensions, and geometrical shape of the radiation shielding, the radiation level at a point at a certain position from the radiation source can be calculated. Using the computer program, radiation distribution analysis results can be obtained for several analytical points simultaneously. (author)
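
    A standard point-kernel form consistent with this description (the code's actual formulation is not given in the abstract) is the attenuated inverse-square law for an isotropic point source of strength S viewed through shield layers of thickness t_i and attenuation coefficient μ_i,

      \[ \phi(P) = \frac{S\,B}{4\pi r^2}\,\exp\Bigl(-\sum_i \mu_i t_i\Bigr), \]

      where r is the source-to-detector distance and B a build-up factor; an extended source is handled by integrating this kernel over the source volume.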

  14. Temperature fluctuations in fully-developed turbulent channel flow with heated upper wall

    Science.gov (United States)

    Bahri, Carla; Mueller, Michael; Hultmark, Marcus

    2013-11-01

    The interactions and scaling differences between the velocity field and the temperature field in a wall-bounded turbulent flow are investigated. In particular, a fully developed turbulent channel flow perturbed by a step change in the wall temperature is considered, with a focus on the details of the developing thermal boundary layer. For this specific study, temperature acts as a passive scalar, having no dynamical effect on the flow. A combination of experimental investigation and direct numerical simulation (DNS) is presented. Velocity and temperature data are acquired with high accuracy; the flow is allowed to reach a fully developed state before encountering a heated upper wall at constant temperature. The experimental data are compared with DNS data, where simulations of the same configuration are conducted.

  15. Flow Reversal of Fully-Developed Mixed MHD Convection in Vertical Channels

    International Nuclear Information System (INIS)

    Saleh, H.; Hashim, I.

    2010-01-01

    The present analysis is concerned with flow reversal phenomena of the fully-developed laminar combined free and forced MHD convection in a vertical parallel-plate channel. The effect of viscous dissipation is taken into account. Flow reversal adjacent to the cold (or hot) wall is found to exist within the channel as Gr/Re is above (or below) a threshold value. Parameter zones for the occurrence of reversed flow are presented. (fundamental areas of phenomenology(including applications))

  16. Computational Investigation on Fully Developed Periodic Laminar Flow Structure in Baffled Circular Tube with Various BR

    Directory of Open Access Journals (Sweden)

    Withada Jedsadaratanachai

    2014-01-01

    This paper presents a 3D numerical analysis of fully developed periodic laminar flow in a circular tube fitted with 45° inclined baffles in an inline arrangement. The computations are based on a finite volume method, and the SIMPLE algorithm has been implemented. The characteristics of the fluid flow are presented for Reynolds numbers Re = 100-1000, based on the hydraulic diameter (D) of the tube. The angled baffles were repeatedly inserted at the middle of the test tube in an inline arrangement to generate vortex flows over the tested tube. Effects of different Reynolds numbers and blockage ratios (b/D, BR), with a single pitch ratio of 1, on the flow structure in the tested tube were emphasized. The flows in the baffled tube become periodic at x/D ≈ 2-3 and reach fully developed periodic flow profiles at x/D ≈ 6-7, depending on Re, BR and the transverse plane position. The computational results reveal that the higher the BR and the closer the position of the turbulators, the sooner the fully developed periodic flow profiles are attained.

  17. Investigation of the required length for fully developed pipe flow with drag-reducing polymer solutions

    Science.gov (United States)

    Farsiani, Yasaman; Elbing, Brian

    2015-11-01

    Adding trace amounts of long-chain polymers into a liquid flow is known to reduce skin friction drag by up to 80%. While polymer drag reduction (PDR) has been successfully implemented in internal flows, diffusion and degradation have limited its external flow applications. A weakness in many previous PDR studies is that there was no characterization of the polymer being injected into the turbulent boundary layer, which can be accomplished by testing a sample in a pressure-drop tube. An implicit assumption in polymer characterization is that the flow is fully developed at the differential pressure measurement. While available data in the literature show that the entry length to achieve fully developed flow increases with polymeric solutions, it is unclear how long an entry length is required to achieve fully developed flow in non-Newtonian turbulent flows. In the present study, the pressure drop is measured across a 1.05 m long section of a 1.04 cm inner diameter pipe. The differential pressure is measured with a pressure transducer for different entry lengths, flow conditions and polymer solution properties. This presentation will present preliminary data on the required entrance length as well as characterization of the polymer solution and an estimate of the mean molecular weight.
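
    For comparison, the commonly quoted Newtonian development-length estimates are listed below; they are given only as baselines, since, as noted above, drag-reducing polymer solutions require longer entry lengths.

      \[ \frac{L_e}{D} \approx 0.06\,Re \ \ \text{(laminar)}, \qquad
         \frac{L_e}{D} \approx 4.4\,Re^{1/6} \ \ \text{(turbulent)}. \]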

  18. Development of a fully automated online mixing system for SAXS protein structure analysis

    DEFF Research Database (Denmark)

    Nielsen, Søren Skou; Arleth, Lise

    2010-01-01

    This thesis presents the development of an automated high-throughput mixing and exposure system for Small-Angle Scattering analysis on a synchrotron using polymer microfluidics. Software and hardware for both automated mixing, exposure control on a beamline and automated data reduction ... and preliminary analysis is presented. Three mixing systems that have been the corner stones of the development process are presented, including a fully functioning high-throughput microfluidic system that is able to produce and expose 36 mixed samples per hour using 30 μL of sample volume. The system is tested...

  19. Request for Information from entities interested in commercializing Laboratory-developed homogeneous catalyst technology

    Energy Technology Data Exchange (ETDEWEB)

    Intrator, Miranda Huang [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-25

    Many industrial catalysts used for homogeneous hydrogenation and dehydrogenation of unsaturated substrates are derived from metal complexes that include (air-sensitive) ligands that are often expensive and difficult to synthesize. In particular, catalysts used for many hydrogenations are based on phosphorus-containing ligands (notably PNP pincer systems). These ligands are often difficult to make, are costly, are constrained to having two carbon atoms in the ligand backbone, and are susceptible to oxidation at phosphorus, making their use somewhat complicated. Los Alamos researchers have recently developed a new and novel set of ligands based on an NNS (ENENES) skeleton (i.e., no phosphorus donors, just nitrogen and sulfur).

  20. Developing a multi-physics solver in APOLLO3 and applications to cross section homogenization

    International Nuclear Information System (INIS)

    Dugan, Kevin-James

    2016-01-01

    Multi-physics coupling is becoming of large interest in the nuclear engineering and computational science fields. The ability to obtain accurate solutions to realistic models is important to the design and licensing of novel reactor designs, especially in design basis accident situations. The physical models involved in calculating accident behavior in nuclear reactors include neutron transport, thermal conduction/convection, thermo-mechanics in fuel and support structure, and fuel stoichiometry, among others. However, this thesis focuses on the coupling between two models, neutron transport and thermal conduction/convection. The goal of this thesis is to develop a multi-physics solver for simulating accidents in nuclear reactors. The focus is both on the simulation environment and on the data treatment used in such simulations. This work discusses the development of a multi-physics framework based around the Jacobian-Free Newton-Krylov (JFNK) method. The framework includes linear and nonlinear solvers, along with interfaces to existing numerical codes that solve neutron transport and thermal hydraulics models (APOLLO3 and MCTH, respectively) through the computation of residuals. A new formulation for the neutron transport residual is explored, which reduces the solution size and search space by a large factor; instead of the residual being based on the angular flux, it is based on the fission source. The question of whether using a fundamental mode distribution of the neutron flux for cross section homogenization is sufficiently accurate during fast transients is also explored. It is shown that in an infinite homogeneous medium, using homogenized cross sections produced with a fundamental mode flux yields results that differ significantly from a reference solution. The error is remedied by using an alternative weighting flux taken from a time-dependent calculation: either a time-integrated flux or an asymptotic solution. The time-integrated flux comes from the multi-physics solution of the
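
    A minimal Jacobian-free Newton-Krylov (JFNK) sketch in Python is shown below for a toy one-group diffusion/heat-conduction pair; the physics, discretization, and parameter values are illustrative assumptions only and are not the APOLLO3/MCTH coupling or the fission-source residual formulation developed in the thesis.

      # JFNK on a toy coupled problem: scipy's newton_krylov only needs residual
      # evaluations; Jacobian-vector products are approximated by finite
      # differences, so the coupled Jacobian is never formed explicitly.
      import numpy as np
      from scipy.optimize import newton_krylov

      n = 50                     # cells of a toy 1-D slab
      dx = 1.0 / n

      def residual(u):
          """Concatenated residuals F(u) = 0 for a flux phi and a temperature T."""
          phi, T = u[:n], u[n:]
          lap = lambda f: (np.roll(f, -1) - 2.0 * f + np.roll(f, 1)) / dx**2
          D, sig_a0, gamma, nu_sig_f, src = 1.0, 0.8, 1.0e-3, 0.9, 1.0
          # toy one-group diffusion with a temperature-feedback absorption term
          r_phi = D * lap(phi) - (sig_a0 + gamma * T) * phi + nu_sig_f * phi + src
          # toy heat conduction with fission heating proportional to phi
          k_cond, q_fis = 2.0, 5.0
          r_T = k_cond * lap(T) + q_fis * phi
          r_phi[[0, -1]] = phi[[0, -1]]              # crude Dirichlet boundaries
          r_T[[0, -1]] = T[[0, -1]]
          return np.concatenate([r_phi, r_T])

      sol = newton_krylov(residual, np.zeros(2 * n), method='lgmres')
      phi, T = sol[:n], sol[n:]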

  1. Analysis of thermal dispersion in an array of parallel plates with fully-developed laminar flow

    International Nuclear Information System (INIS)

    Xu Jiaying; Lu Tianjian; Hodson, Howard P.; Fleck, Norman A.

    2010-01-01

    The effect of thermal dispersion upon heat transfer across a periodic array of parallel plates is studied. Three basic heat transfer problems are addressed, each for steady, fully-developed, laminar fluid flow: (a) transient heat transfer due to an arbitrary initial temperature distribution within the fluid, (b) steady heat transfer with constant heat flux on all plate surfaces, and (c) steady heat transfer with constant wall temperatures. For problems (a) and (b), the effective thermal dispersivity scales with the Peclet number Pe according to 1 + C·Pe^2, where the coefficient C is independent of Pe. For problem (c) the coefficient C is a function of Pe.
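
    The 1 + C·Pe^2 dependence quoted above has the same form as the classical Taylor-Aris result for dispersion in a circular tube, reproduced here only to illustrate the scaling; the coefficient for a parallel-plate array and for problems (a)-(c) differs.

      \[ \frac{D_{\mathrm{eff}}}{D_m} = 1 + \frac{Pe^2}{48}, \qquad Pe = \frac{\bar{u}\,a}{D_m}, \]

      with a the tube radius, ū the mean velocity, and D_m the molecular diffusivity.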

  2. Fully developed laminar flow of two immiscible liquids through horizontal pipes: a variational approach

    Energy Technology Data Exchange (ETDEWEB)

    Kurban, Adib Paulo Abdalla [PETROBRAS, Rio de Janeiro (Brazil). Centro de Pesquisas; Bannwart, Antonio Carlos [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Mecanica

    1990-12-31

    The fully developed laminar flow of two immiscible liquids with different viscosities and densities through a horizontal round pipe is studied. The interface between the fluids as well as their flow fields are determined by the use of a variational principle, the so-called viscous dissipation principle. The results predicted in this paper are in agreement with the physical observation (e.g. Southern and Ballman) that the more viscous fluid is totally or partially encapsulated by the less viscous one. (author) 8 refs., 4 figs.

  3. The effect of ultrasound on arterial blood flow: 1. Steady fully developed flow

    International Nuclear Information System (INIS)

    Bestman, A.R.

    1990-12-01

    The paper models the effects of ultrasound heating of the tissues and the resultant perturbation of blood flow in the arteries and veins. It is assumed that the blood vessel is rigid and the undisturbed flow is fully developed. Acoustical perturbation of this Poiseuille flow is considered for the general three-dimensional flow with heat transfer in an infinitely long pipe. Closed-form analytical solutions are obtained to the problem. It is discovered that the effects of the ultrasound heating are concentrated at the walls of the blood vessels. (author). 4 refs

  4. Fully developed laminar flow of two immiscible liquids through horizontal pipes: a variational approach

    Energy Technology Data Exchange (ETDEWEB)

    Kurban, Adib Paulo Abdalla [PETROBRAS, Rio de Janeiro (Brazil). Centro de Pesquisas; Bannwart, Antonio Carlos [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Mecanica

    1991-12-31

    The fully developed laminar flow of two immiscible liquids with different viscosities and densities through a horizontal round pipe is studied. The interface between the fluids as well as their flow fields are determined by the use of a variational principle, the so-called viscous dissipation principle. The results predicted in this paper are in agreement with the physical observation (e.g. Southern and Ballman) that the more viscous fluid is totally or partially encapsulated by the less viscous one. (author) 8 refs., 4 figs.

  5. Development and application of a fully implicit fluid dynamics code for multiphase flow

    International Nuclear Information System (INIS)

    Morii, Tadashi; Ogawa, Yumi

    1996-01-01

    Multiphase flow frequently occurs during the progression of severe core damage accidents in nuclear reactors. The CHAMPAGNE code has been developed to analyze the thermohydraulic behavior of multiphase and multicomponent fluid, which requires for its characterization more than one set of velocities, temperatures, masses per unit volume, and so forth at each location in the calculation domain. Calculations of multiphase flow often show physical and numerical instability. The numerical stabilization obtained by the upwind differencing and fully implicit techniques gives a convergent solution more easily than other techniques. Several results calculated by the CHAMPAGNE code are explained.

  6. Osborne Reynolds pipe flow: direct numerical simulation from laminar to fully-developed turbulence

    Science.gov (United States)

    Adrian, R. J.; Wu, X.; Moin, P.; Baltzer, J. R.

    2014-11-01

    Osborne Reynolds' pipe experiment marked the onset of modern viscous flow research, yet the detailed mechanism carrying the laminar state to fully-developed turbulence has been quite elusive, despite notable progress related to dynamic edge-state theory. Here, we continue our direct numerical simulation study of this problem using a 250R-long, spatially-developing pipe configuration with various Reynolds numbers, inflow disturbances, and inlet base flow states. For the inlet base flow, both the fully-developed laminar profile and the uniform plug profile are considered. Inlet disturbances consist of rings of turbulence of different width and radial location. In all six cases examined so far, energy norms show exponential growth with axial distance until transition, after an initial decay near the inlet. Skin friction overshoots Moody's correlation in most, but not all, of the cases. Another common theme is that lambda vortices amplified out of susceptible elements in the inlet disturbances trigger rapidly growing hairpin packets at random locations and times, after which infant turbulent spots appear. Mature turbulent spots in pipe transition are actually tight concentrations of hairpin packets, looking like a hairpin forest. The plug flow inlet profile requires much stronger disturbances to transition than the parabolic profile.

  7. Development of Molecular Catalysts to Bridge the Gap between Heterogeneous and Homogeneous Catalysts

    Science.gov (United States)

    Ye, Rong

    Catalysts, whether heterogeneous, homogeneous, or enzymatic, are composed of nanometer-sized inorganic and/or organic components. They share molecular factors including charge, coordination, interatomic distance, bonding, and orientation of catalytically active atoms. By controlling the governing catalytic components and molecular factors, catalytic processes of a multichannel and multiproduct nature could be run on all three catalytic platforms to create unique end-products. Unifying the fields of catalysis is the key to achieving the goal of 100% selectivity in catalysis. Recyclable catalysts, especially those that display selective reactivity, are vital for the development of sustainable chemical processes. Among available catalyst platforms, heterogeneous catalysts are particularly well-disposed toward separation from the reaction mixture via filtration methods, which renders them readily recyclable. Furthermore, heterogeneous catalysts offer numerous handles - some without homogeneous analogues - for performance and selectivity optimization. These handles include nanoparticle size, pore profile of porous supports, surface ligands and interface with oxide supports, and flow rate through a solid catalyst bed. Despite these available handles, however, conventional heterogeneous catalysts are themselves often structurally heterogeneous compared to homogeneous catalysts, which complicates efforts to optimize and expand the scope of their reactivity and selectivity. Ongoing efforts are aimed at addressing the above challenge by heterogenizing homogeneous catalysts, which can be defined as the modification of homogeneous catalysts to render them in a separable (solid) phase from the starting materials and products. Specifically, we grow small nanoclusters in dendrimers, a class of uniform polymers with the connectivity of fractal trees and generally radial symmetry. Thanks to their dense multivalency, shape persistence and structural uniformity, dendrimers have proven to

  8. A look-up table for fully developed film-boiling heat transfer

    International Nuclear Information System (INIS)

    Groeneveld, D.C.; Leung, L.K.H.; Vasic, A.Z.; Guo, Y.J.; Cheng, S.C.

    2003-01-01

    An improved look-up table for film-boiling heat-transfer coefficients has been derived for steam-water flow inside vertical tubes. Compared to earlier versions of the look-up table, the following improvements were made: - The database has been expanded significantly. The present database contains 77,234 film-boiling data points obtained from 36 sources. - The upper limit of the thermodynamic quality range was increased from 1.2 to 2.0. The wider range was needed as non-equilibrium effects at low flows can extend well beyond the point where the thermodynamic quality equals unity. - The surface heat flux has been replaced by the surface temperature as an independent parameter. - The new look-up table is based only on fully developed film-boiling data. - The table entries at flow conditions for which no data are available are based on the best of five different film-boiling prediction methods. The new film-boiling look-up table predicts the database for fully developed film-boiling data with an overall rms error in heat-transfer coefficient of 10.56% and an average error of 1.71%. A comparison of the prediction accuracy of the look-up table with other leading film-boiling prediction methods shows that the look-up table results in a significant improvement in prediction accuracy.
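
    A sketch of how such a look-up table is typically interrogated, multilinear interpolation over a regular grid of the independent parameters, is given below in Python; the grid axes and the constant placeholder values are assumptions for illustration only, not entries of the actual table.

      # Placeholder 4-D grid in (pressure, mass flux, thermodynamic quality, wall
      # temperature) with multilinear interpolation of the heat-transfer coefficient.
      import numpy as np
      from scipy.interpolate import RegularGridInterpolator

      p = np.array([0.1e6, 1.0e6, 5.0e6, 10.0e6])        # pressure [Pa]
      g = np.array([50.0, 500.0, 2000.0, 5000.0])        # mass flux [kg/(m^2 s)]
      x_th = np.linspace(-0.2, 2.0, 12)                  # thermodynamic quality [-]
      t_w = np.linspace(400.0, 1200.0, 9)                # wall temperature [K]

      # constant placeholder heat-transfer coefficients [W/(m^2 K)] on the grid
      htc_table = np.full((p.size, g.size, x_th.size, t_w.size), 300.0)

      htc = RegularGridInterpolator((p, g, x_th, t_w), htc_table,
                                    bounds_error=False, fill_value=None)
      print(htc([7.0e6, 1000.0, 0.8, 900.0]))            # value at one state point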

  9. Non-homogeneous harmonic analysis: 16 years of development

    International Nuclear Information System (INIS)

    Volberg, A L; Èiderman, V Ya

    2013-01-01

    This survey contains results and methods in the theory of singular integrals, a theory which has been developing dramatically in the last 15-20 years. The central (although not the only) topic of the paper is the connection between the analytic properties of integrals and operators with Calderón-Zygmund kernels and the geometric properties of the measures. The history is traced of the classical Painlevé problem of describing removable singularities of bounded analytic functions, which has provided a strong incentive for the development of this branch of harmonic analysis. The progress of recent decades has largely been based on the creation of an apparatus for dealing with non-homogeneous measures, and much attention is devoted to this apparatus here. Several open questions are stated, first and foremost in the multidimensional case, where the method of curvature of a measure is not available. Bibliography: 128 titles

  10. Osborne Reynolds pipe flow: Direct simulation from laminar through gradual transition to fully developed turbulence.

    Science.gov (United States)

    Wu, Xiaohua; Moin, Parviz; Adrian, Ronald J; Baltzer, Jon R

    2015-06-30

    The precise dynamics of breakdown in pipe transition is a century-old unresolved problem in fluid mechanics. We demonstrate that the abruptness and mysteriousness attributed to the Osborne Reynolds pipe transition can be partially resolved with a spatially developing direct simulation that carries a weakly but finitely perturbed laminar inflow through a gradual rather than abrupt transition, arriving at the fully developed turbulent state. Our results with this approach show that, during transition, the energy norms of such inlet perturbations grow exponentially rather than algebraically with axial distance. When the inlet disturbance is located in the core region, helical vortex filaments evolve into large-scale reverse hairpin vortices. The interaction of these reverse hairpins among themselves or with the near-wall flow when they descend to the surface from the core produces small-scale hairpin packets, which leads to breakdown. When the inlet disturbance is near the wall, a quasi-spanwise structure is stretched into a Lambda vortex and develops into a large-scale hairpin vortex. Small-scale hairpin packets emerge near the tip region of the large-scale hairpin vortex, and subsequently grow into a turbulent spot, which is itself a local concentration of small-scale hairpin vortices. This vortex dynamics is broadly analogous to that in the boundary layer bypass transition and in the secondary instability and breakdown stage of natural transition, suggesting the possibility of a partial unification. Under parabolic base flow the friction factor overshoots Moody's correlation. Plug base flow requires a stronger inlet disturbance for transition. The accuracy of the results is demonstrated by comparison with analytical solutions before breakdown, and with fully developed turbulence measurements after the completion of transition.
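
    The exponential (rather than algebraic) growth reported above is the kind of behaviour one can check with a simple log-linear fit of the perturbation energy norm against axial distance; the data below are synthetic stand-ins, not the simulation's.

        import numpy as np

        x = np.linspace(0.0, 50.0, 11)          # axial distance, in pipe radii (hypothetical)
        E = 1e-6 * np.exp(0.18 * x)             # synthetic energy-norm samples

        # A straight line in (x, log E) indicates exponential growth; the slope
        # is the spatial growth rate.
        growth_rate, log_E0 = np.polyfit(x, np.log(E), 1)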

  11. Numerical analysis of liquid metal MHD flows through circular pipes based on a fully developed modeling

    International Nuclear Information System (INIS)

    Zhang, Xiujie; Pan, Chuanjie; Xu, Zengyu

    2013-01-01

    Highlights: ► A 2D MHD code based on a fully developed modeling is developed and validated against Samad's analytical results. ► Results for the MHD effect of liquid metal flowing through circular pipes at high Hartmann numbers are given. ► An M-type velocity profile is observed for MHD circular pipe flow under high wall conductance ratio conditions. ► Non-uniform wall electrical conductivity leads to a high jet velocity in the Roberts layers. -- Abstract: Magnetohydrodynamic (MHD) laminar flows through circular pipes are studied in this paper by numerical simulation under conditions of Hartmann numbers from 18 to 10000. The code is developed based on a fully developed modeling and validated against Samad's analytical solution and Chang's asymptotic results. After the code validation, the numerical simulation is extended to high Hartmann numbers for MHD circular pipe flows with conducting walls, and numerical results such as the velocity distribution and the MHD pressure gradient are obtained. A typical M-type velocity profile is observed, but there is no velocity jet as large as that of MHD rectangular duct flows, even under the conditions of high Hartmann numbers and large wall conductance ratio. The over-speed region in the Roberts layers becomes smaller as the Hartmann number increases. When the Hartmann number is fixed and the wall conductance ratio is varied, the dimensionless velocity profiles pass through one point, in agreement with Samad's results; the location of the maximum of the velocity jet is the same, and the wall conductance ratio affects only the magnitude of the jet. When the Roberts walls are treated as insulating and the Hartmann walls as conducting for circular pipe MHD flows, there is a large velocity jet, as in the MHD rectangular duct flows of Hunt's case 2
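
    For orientation, the classical one-dimensional Hartmann solution (insulating walls) already shows the flat core and thin boundary layers of thickness of order 1/Ha that such computations must resolve; the snippet below evaluates that textbook profile and is not the 2D circular-pipe code of the cited work.

        import numpy as np

        def hartmann_profile(y, Ha):
            # u/u_centre = (cosh(Ha) - cosh(Ha*y)) / (cosh(Ha) - 1),  y in [-1, 1]
            return (np.cosh(Ha) - np.cosh(Ha * y)) / (np.cosh(Ha) - 1.0)

        y = np.linspace(-1.0, 1.0, 201)
        for Ha in (18, 100, 500):               # for very large Ha a numerically
            u = hartmann_profile(y, Ha)         # stable form would be needed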

  12. Development of a Fully-Automated Monte Carlo Burnup Code Monteburns

    International Nuclear Information System (INIS)

    Poston, D.I.; Trellue, H.R.

    1999-01-01

    Several computer codes have been developed to perform nuclear burnup calculations over the past few decades. In addition, because of advances in computer technology, it recently has become more desirable to use Monte Carlo techniques for such problems. Monte Carlo techniques generally offer two distinct advantages over discrete ordinate methods: (1) the use of continuous energy cross sections and (2) the ability to model detailed, complex, three-dimensional (3-D) geometries. These advantages allow more accurate burnup results to be obtained, provided that the user possesses the required computing power (which is required for discrete ordinate methods as well). Several linkage codes have been written that combine a Monte Carlo N-particle transport code (such as MCNP(TM)) with a radioactive decay and burnup code. This paper describes one such code that was written at Los Alamos National Laboratory: monteburns. Monteburns links MCNP with the isotope generation and depletion code ORIGEN2. The basis for the development of monteburns was the need for a fully automated code that could perform accurate burnup (and other) calculations for any 3-D system (accelerator-driven or a full reactor core). Before the initial development of monteburns, a list of desired attributes was made and is given below: - The code should be fully automated (that is, after the input is set up, no further user interaction is required). - The code should allow for the irradiation of several materials concurrently (each material is evaluated collectively in MCNP and burned separately in ORIGEN2). - The code should allow the transfer of materials (shuffling) between regions in MCNP. - The code should allow any materials to be added or removed before, during, or after each step in an automated fashion. - The code should not require the user to provide input for ORIGEN2 and should have minimal MCNP input file requirements (other than a working MCNP deck). - The code should be relatively easy to use
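
    The automation described here is essentially a transport/depletion coupling loop. The runnable sketch below mimics that loop with trivial one-nuclide stand-ins for the transport and depletion steps; it illustrates the coupling idea only and is not MCNP, ORIGEN2 or monteburns itself.

        import numpy as np

        def fake_transport(n_atoms):
            # Stand-in for a transport run: flux scales with remaining fuel (illustrative).
            return 1.0e14 * n_atoms / 1.0e24               # n/cm^2/s

        def fake_depletion(n_atoms, flux, sigma, days):
            # Stand-in for a depletion run: simple burnout dN/dt = -sigma*flux*N.
            return n_atoms * np.exp(-sigma * flux * days * 86400.0)

        n_u235  = 1.0e24                                   # hypothetical inventory, atoms
        sigma_a = 600e-24                                  # illustrative one-group cross section, cm^2
        for step_days in (50, 50, 50):                     # irradiation schedule
            flux = fake_transport(n_u235)                  # "collective" transport evaluation
            n_u235 = fake_depletion(n_u235, flux, sigma_a, step_days)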

  13. Investigation of dissipation elements in a fully developed turbulent channel flow by tomographic particle-image velocimetry

    Science.gov (United States)

    Schäfer, L.; Dierksheide, U.; Klaas, M.; Schröder, W.

    2011-03-01

    A new method to describe statistical information from passive scalar fields has been proposed by Wang and Peters ["The length-scale distribution function of the distance between extremal points in passive scalar turbulence," J. Fluid Mech. 554, 457 (2006)]. They used direct numerical simulations (DNS) of homogeneous shear flow to introduce the innovative concept. This novel method determines the local minimum and maximum points of the fluctuating scalar field via gradient trajectories, starting from every grid point in the direction of the steepest ascending and descending scalar gradients. Relying on gradient trajectories, a dissipation element is defined as the region of all the grid points, the trajectories of which share the same pair of maximum and minimum points. The procedure has also been successfully applied to various DNS fields of homogeneous shear turbulence using the three velocity components and the kinetic energy as scalar fields [L. Wang and N. Peters, "Length-scale distribution functions and conditional means for various fields in turbulence," J. Fluid Mech. 608, 113 (2008)]. In this spirit, dissipation elements are, for the first time, determined from experimental data of a fully developed turbulent channel flow. The dissipation elements are deduced from the gradients of the instantaneous fluctuation of the three velocity components u', v', and w' and the instantaneous kinetic energy k', respectively. The measurements are conducted at a Reynolds number of 1.7×10⁴ based on the channel half-height δ and the bulk velocity U. The required three-dimensional velocity data are obtained investigating a 17.75×17.75×6 mm³ (0.355δ×0.355δ×0.12δ) test volume using tomographic particle-image velocimetry. Detection and analysis of dissipation elements from the experimental velocity data are discussed in detail. The statistical results are compared to the DNS data from Wang and Peters ["The length-scale distribution function of the distance between
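
    The grouping step at the heart of the dissipation-element concept can be illustrated on a small 2D field: from every grid point walk to the nearest local maximum and minimum along the discrete steepest ascent/descent, then collect the points that share the same extremal pair. This is only a schematic 2D discretisation, not the continuous 3D gradient-trajectory procedure of the cited studies.

        import numpy as np

        def _walk(field, i, j, sign):
            # Discrete steepest ascent (sign=+1) or descent (sign=-1) to a local extremum.
            ni, nj = field.shape
            while True:
                best, bi, bj = sign * field[i, j], i, j
                for di in (-1, 0, 1):
                    for dj in (-1, 0, 1):
                        a, b = i + di, j + dj
                        if 0 <= a < ni and 0 <= b < nj and sign * field[a, b] > best:
                            best, bi, bj = sign * field[a, b], a, b
                if (bi, bj) == (i, j):
                    return i, j
                i, j = bi, bj

        def dissipation_elements(field):
            labels, element_id = {}, np.zeros(field.shape, dtype=int)
            for i in range(field.shape[0]):
                for j in range(field.shape[1]):
                    pair = (_walk(field, i, j, +1), _walk(field, i, j, -1))
                    element_id[i, j] = labels.setdefault(pair, len(labels))
            return element_id

        k = np.random.default_rng(0).normal(size=(64, 64))   # stand-in fluctuating field
        elements = dissipation_elements(k)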

  14. Development of a Rapid Insulin Assay by Homogenous Time-Resolved Fluorescence.

    Directory of Open Access Journals (Sweden)

    Zachary J Farino

    Full Text Available Direct measurement of insulin is critical for basic and clinical studies of insulin secretion. However, current methods are expensive and time-consuming. We developed an insulin assay based on homogenous time-resolved fluorescence that is significantly more rapid and cost-effective than current commonly used approaches. This assay was applied effectively to an insulin secreting cell line, INS-1E cells, as well as pancreatic islets, allowing us to validate the assay by elucidating mechanisms by which dopamine regulates insulin release. We found that dopamine functioned as a significant negative modulator of glucose-stimulated insulin secretion. Further, we showed that bromocriptine, a known dopamine D2/D3 receptor agonist and newly approved drug used for treatment of type II diabetes mellitus, also decreased glucose-stimulated insulin secretion in islets to levels comparable to those caused by dopamine treatment.

  15. Development of Fully Automated Low-Cost Immunoassay System for Research Applications.

    Science.gov (United States)

    Wang, Guochun; Das, Champak; Ledden, Bradley; Sun, Qian; Nguyen, Chien

    2017-10-01

    Enzyme-linked immunosorbent assay (ELISA) automation for routine operation in a small research environment would be very attractive. A portable fully automated low-cost immunoassay system was designed, developed, and evaluated with several protein analytes. It features disposable capillary columns as the reaction sites and uses real-time calibration for improved accuracy. It reduces the overall assay time to less than 75 min with the ability of easy adaptation of new testing targets. The running cost is extremely low due to the nature of automation, as well as reduced material requirements. Details about system configuration, components selection, disposable fabrication, system assembly, and operation are reported. The performance of the system was initially established with a rabbit immunoglobulin G (IgG) assay, and an example of assay adaptation with an interleukin 6 (IL6) assay is shown. This system is ideal for research use, but could work for broader testing applications with further optimization.
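
    The real-time calibration mentioned above is, in many immunoassays, a standard-curve fit; a common (though here only assumed) choice is the four-parameter logistic model, sketched below with made-up standards.

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(conc, a, b, c, d):
            # a: zero-dose response, d: infinite-dose response, c: mid-point, b: slope
            return d + (a - d) / (1.0 + (conc / c) ** b)

        std_conc   = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])        # ng/mL, hypothetical
        std_signal = np.array([0.05, 0.14, 0.40, 0.90, 1.60, 2.10])    # arbitrary units

        popt, _ = curve_fit(four_pl, std_conc, std_signal,
                            p0=[0.05, 1.0, 3.0, 2.2], maxfev=10000)

        def signal_to_conc(y, a, b, c, d):
            # Inverse of the 4PL curve, used to read unknowns off the calibration.
            return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

        unknown = signal_to_conc(1.2, *popt)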

  16. DNS of fully developed turbulent heat transfer of a viscoelastic drag-reducing flow

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Bo [Department of Oil and Gas Storage and Transportation Engineering, China University of Petroleum, Beijing 102249 (China); Kawaguchi, Yasuo [Department of Mechanical Engineering, Faculty of Science and Technology, Tokyo University of Science, 2641 Yamazaki, Noda, Chiba 278-8510 (Japan)

    2005-10-01

    A direct numerical simulation (DNS) of turbulent heat transfer in a channel flow with a Giesekus model was carried out to investigate turbulent heat transfer mechanism of a viscoelastic drag-reducing flow by additives. The configuration was a fully-developed turbulent channel flow with uniform heat flux imposed on both the walls. The temperature was considered as a passive scalar with the effect of buoyancy force neglected. The Reynolds number based on the friction velocity and half the channel height was 150. Statistical quantities such as root-mean-square temperature fluctuations, turbulent heat fluxes and turbulent Prandtl number were obtained and compared with those of a Newtonian fluid flow. Budget terms of the temperature variance and turbulent heat fluxes were also presented. (author)
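
    The turbulent Prandtl number quoted among those statistics is formed from the Reynolds shear stress, the wall-normal turbulent heat flux and the mean gradients, Pr_t = (-<u'v'>/(dU/dy)) / (-<v'T'>/(dT/dy)); the profiles below are synthetic placeholders for DNS data.

        import numpy as np

        y      = np.linspace(0.05, 0.9, 35)      # wall-normal coordinate (placeholder)
        U_mean = 20.0 * y * (2.0 - y)            # synthetic mean velocity
        T_mean = 5.0 * y * (2.0 - y)             # synthetic mean temperature
        uv     = -0.8 * y * (1.0 - y)            # synthetic <u'v'>
        vT     = -0.2 * y * (1.0 - y)            # synthetic <v'T'>

        nu_t    = -uv / np.gradient(U_mean, y)   # eddy viscosity
        alpha_t = -vT / np.gradient(T_mean, y)   # eddy diffusivity
        Pr_t    = nu_t / alpha_t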

  17. Development of absolute intensity monitors and homogeneous irradiation system for synchrotron radiation at SPring-8

    International Nuclear Information System (INIS)

    Nariyama, N.; Taniguchi, S.; Kishi, N.; Ohnishi, S.; Odano, N.

    2003-01-01

    At SPring-8, an 8-GeV synchrotron radiation facility in Japan, high-intensity monoenergetic photon beams of up to a hundred keV are available. For biological and medical research, an absolute intensity monitor and a homogeneous irradiation system are necessary. In this study, two sizes of parallel-plate free-air ionization chambers were developed as intensity monitors: one with a plate distance of 8.5 cm for high-energy photons up to 190 keV and one with a plate distance of 0.42 cm for low-energy, high-intensity photons from an undulator. The dimensions of the larger chamber were determined by considering electron loss using the Monte Carlo electron-photon transport code EGS4. For homogeneous irradiation, a rotational linear-scanning method was proposed. An experiment was carried out at bending-magnet beamlines BL20B2 and 38B1 for high-energy photons. The dose measured with the chamber was compared to that of another free-air ionization chamber with 5-cm plate separation and a Si-PIN photodiode, which had been calibrated with a calorimeter. Agreement was confirmed from 30 keV up to 150 keV with a discrepancy of less than 3%, independent of the beam size, intensity and injection position into the chamber. For intense photons from the undulator, saturation measurements were made using the smaller chamber at BL46XU. Saturation was confirmed at 3 kV for 15- and 20-keV photons up to 10¹² photons/s. Using the larger chamber, fixed and Xθ-scanning irradiation methods were compared by irradiating thermoluminescent dosimeters at BL20B2, which can provide a broad beam. As a function of the estimated exposure, all the signal data points fell on the same straight line. Consequently, the Xθ-scanning method was found to be effective

  18. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    Science.gov (United States)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at PF macromolecular crystallography beamlines; BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments; sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes for PAM and SPACE are developed as part of the TPRP.

  19. Multi-scale viscosity model of turbulence for fully-developed channel flows

    International Nuclear Information System (INIS)

    Kriventsev, V.; Yamaguchi, A.; Ninokata, H.

    2001-01-01

    A Multi-Scale Viscosity (MSV) model is proposed for the estimation of the Reynolds stresses in turbulent fully-developed flow in a straight channel of arbitrary shape. We assume that flow in an ''ideal'' channel is always stable, i.e. laminar, and that turbulence is a developing process of external perturbations caused by wall roughness and other factors. We also assume that real flows are always affected by perturbations of every scale smaller than the size of the channel, and that turbulence is generated in the form of an internal, or ''turbulent'', viscosity increase that preserves the stability of the ''disturbed'' flow. The main idea of MSV can be expressed in the following phenomenological rule: a local deformation of axial velocity can generate turbulence with an intensity that keeps the value of the local turbulent Reynolds number below some critical value. Here, the local turbulent Reynolds number is defined as the product of the value of the axial velocity deformation for a given scale and the generic length of this scale, divided by the accumulated value of the laminar and turbulent viscosity of the lower scales. In MSV, the only empirical parameter is the critical Reynolds number, which is estimated to be around 100. For the largest scale, which is the hydraulic diameter of the channel, the local turbulent Reynolds number represents the regular Reynolds number. Thus, the value Re=100 corresponds to conditions in which turbulent flow can appear in the case of a ''significant'' (comparable with the size of the channel) velocity disturbance in the boundary and/or initial conditions for velocity. Of course, most real flows in channels with relatively smooth walls remain laminar at this small Reynolds number because of the absence of such ''significant'' perturbations. The MSV model has been applied to fully-developed turbulent flows in straight channels such as a circular tube and an annular channel. Friction factors and velocity profiles predicted with MSV are in very good agreement with numerous experimental data. Position of
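
    One loose, illustrative reading of that phenomenological rule in code: march through scales from small to large and, wherever the scale Reynolds number (velocity deformation times scale length over the accumulated viscosity) would exceed Re_crit ~ 100, raise the local turbulent viscosity just enough to cap it. The window peak-to-peak value used as the ''velocity deformation'' and the dyadic scales are this sketch's own assumptions, not the actual MSV formulation.

        import numpy as np

        def msv_viscosity(u, dy, nu_lam, re_crit=100.0):
            n = u.size
            nu = np.full(n, nu_lam)                   # accumulated laminar + turbulent viscosity
            scale = 2
            while scale <= n:
                length = scale * dy
                for i in range(n - scale):
                    du = np.ptp(u[i:i + scale])       # velocity deformation on this scale
                    if du * length / nu[i:i + scale].min() > re_crit:
                        nu[i:i + scale] = np.maximum(nu[i:i + scale], du * length / re_crit)
                scale *= 2
            return nu

        y = np.linspace(0.0, 1.0, 129)
        u = 1.0 - (1.0 - y) ** 2                      # illustrative channel half-profile
        nu_eff = msv_viscosity(u, y[1] - y[0], nu_lam=1e-4)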

  20. Development of a Computer Program for the Integrated Control of the Fuel Homogeneity Measurement System

    International Nuclear Information System (INIS)

    Shin, H. S.; Jang, J. W.; Lee, Y. H.; Oh, S. J.; Park, H. D.; Kim, C. K.

    2005-11-01

    The computer program was developed based on Visual C++ and is equipped with a user-friendly input/output (I/O) interface and a display function for the measuring conditions. The program consists of three parts: the port communication, the PLC (Programmable Logic Controller) control and the MCA (Multi Channel Analyzer) control parts. The communication between the CPU of the PLC module box and the computer is of the asynchronous RS-232 type, and the thread method is adopted in the development of the first part of the program. The PLC-related part has been developed so that the data communication between the PLC CPU and the computer is harmonized with the unique commands already defined in the PLC. The measuring space and time intervals, the start and end ROI (region of interest) values, and the allowable error limit are input for each measurement in this program. Finally, the MCA control part has been developed using Canberra's programming library, which contains several files, including the header files in which the C++ variables and functions are declared according to the MCA functions. A performance test has been carried out by applying the developed computer program to the homogeneity measurement system. The gamma counts at 28 measuring points along a fuel rod of 700 mm in length were measured for 50 s at each point. It was revealed that the measurement results are better than the previous ones in terms of measurement accuracy, and a saving in measurement time was achieved. It was concluded that the gamma measurement system can be improved by equipping it with the developed control program

  1. Development of a fully automated software system for rapid analysis/processing of the falling weight deflectometer data.

    Science.gov (United States)

    2009-02-01

    The Office of Special Investigations at Iowa Department of Transportation (DOT) collects FWD data on regular basis to evaluate pavement structural conditions. The primary objective of this study was to develop a fully-automated software system for ra...

  2. Development of a fully automated adaptive unsharp masking technique in digital chest radiograph

    International Nuclear Information System (INIS)

    Abe, Katsumi; Katsuragawa, Shigehiko; Sasaki, Yasuo

    1991-01-01

    We are developing a fully automated adaptive unsharp masking technique with various parameters depending on regional image features of a digital chest radiograph. A chest radiograph includes various regions such as lung fields, retrocardiac area and spine in which their texture patterns and optical densities are extremely different. Therefore, it is necessary to enhance image contrast of each region by each optimum parameter. First, we investigated optimum weighting factors and mask sizes of unsharp masking technique in a digital chest radiograph. Then, a chest radiograph is automatically divided into three segments, one for the lung field, one for the retrocardiac area, and one for the spine, by using histogram analysis of pixel values. Finally, high frequency components of the lung field and retrocardiac area are selectively enhanced with a small mask size and mild weighting factors which are previously determined as optimum parameters. In addition, low frequency components of the spine are enhanced with a large mask size and adequate weighting factors. This processed image shows excellent depiction of the lung field, retrocardiac area and spine simultaneously with optimum contrast. Our image processing technique may be useful for diagnosis of chest radiographs. (author)
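
    The processing chain described above (segment the radiograph, then sharpen each region with its own mask size and weighting factor) can be sketched compactly; the thresholds and parameter values here are illustrative stand-ins rather than the study's optimum settings, and simple value thresholds replace the histogram analysis.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def adaptive_unsharp(img):
            img = img.astype(float)
            regions = np.digitize(img, (0.35, 0.7))               # 0, 1, 2: three crude regions
            params = {0: (9, 1.0), 1: (9, 0.7), 2: (31, 2.0)}     # (mask size, weight), illustrative
            out = img.copy()
            for label, (size, weight) in params.items():
                blurred = uniform_filter(img, size=size)
                mask = regions == label
                out[mask] = img[mask] + weight * (img[mask] - blurred[mask])
            return out

        chest = np.random.default_rng(1).random((256, 256))       # stand-in for a radiograph
        enhanced = adaptive_unsharp(chest)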

  3. Lattice Boltzmann simulations of heat transfer in fully developed periodic incompressible flows

    Science.gov (United States)

    Wang, Zimeng; Shang, Helen; Zhang, Junfeng

    2017-06-01

    Flow and heat transfer in periodic structures are of great interest for many applications. In this paper, we carefully examine the periodic features of fully developed periodic incompressible thermal flows, and incorporate them in the lattice Boltzmann method (LBM) for flow and heat transfer simulations. Two numerical approaches, the distribution modification (DM) approach and the source term (ST) approach, are proposed; both can be used for periodic thermal flows with constant wall temperature (CWT) and surface heat flux boundary conditions. However, the DM approach might be more efficient, especially for CWT systems, since the ST approach requires calculations of the streamwise temperature gradient at all lattice nodes. Several example simulations are conducted, including flows through flat and wavy channels and flows through a square array of circular cylinders. Results are compared to analytical solutions, previous studies, and our own LBM calculations using different simulation techniques (i.e., the one-module simulation vs. the two-module simulation, and the DM approach vs. the ST approach) with good agreement. These simple yet representative simulations demonstrate the accuracy and usefulness of our proposed LBM methods for future thermal periodic flow simulations.

  4. Statistical symmetry restoration in fully developed turbulence: Renormalization group analysis of two models

    Science.gov (United States)

    Antonov, N. V.; Gulitskiy, N. M.; Kostenko, M. M.; Malyshev, A. V.

    2018-03-01

    In this paper we consider the model of incompressible fluid described by the stochastic Navier-Stokes equation with finite correlation time of a random force. Inertial-range asymptotic behavior of fully developed turbulence is studied by means of the field theoretic renormalization group within the one-loop approximation. It is corroborated that regardless of the values of model parameters and initial data the inertial-range behavior of the model is described by the limiting case of vanishing correlation time. This indicates that the Galilean symmetry of the model violated by the "colored" random force is restored in the inertial range. This regime corresponds to the only nontrivial fixed point of the renormalization group equation. The stability of this point depends on the relation between the exponents in the energy spectrum E ∝ k^(1-y) and the dispersion law ω ∝ k^(2-η). The second analyzed problem is the passive advection of a scalar field by this velocity ensemble. Correlation functions of the scalar field exhibit anomalous scaling behavior in the inertial-convective range. We demonstrate that in accordance with Kolmogorov's hypothesis of the local symmetry restoration the main contribution to the operator product expansion is given by the isotropic operator, while anisotropic terms should be considered only as corrections.

  5. Fully developed liquid-metal flow in multiple rectangular ducts in a strong uniform magnetic field

    International Nuclear Information System (INIS)

    Molokov, S.

    1993-01-01

    Fully developed liquid-metal flow in a straight rectangular duct with thin conducting walls is investigated. The duct is divided into a number of rectangular channels by electrically conducting dividing walls. A strong uniform magnetic field is applied parallel to the outer side walls and dividing walls and perpendicular to the top and the bottom walls. The analysis of the flow is performed by means of matched asymptotics at large values of the Hartmann number M. The asymptotic solution obtained is valid for arbitrary wall conductance ratio of the side walls and dividing walls, provided the top and bottom walls are much better conductors than the Hartmann layers. The influence of the Hartmann number, wall conductance ratio, number of channels and duct geometry on pressure losses and flow distribution is investigated. If the Hartmann number is high, the volume flux is carried by the core, occupying the bulk of the fluid and by thin layers with thickness of order M^(-1/2). In some of the layers, however, the flow is reversed. As the number of channels increases the flow in the channels close to the centre approaches a Hartmann-type flow with no jets at the side walls. Estimation of pressure-drop increase in radial ducts of a self-cooled liquid-metal blanket with respect to flow in a single duct with walls of the same wall conductance ratio gives an upper limit of 30%. (author). 13 refs., 10 figs., 1 tab

  6. Effect of free-air nuclei on fully developed individual bubble cavitation

    International Nuclear Information System (INIS)

    Danel, F.; Lecoffre, Y.

    1976-01-01

    Fully developed individual-bubble cavitation was studied. The nuclei population and the pressure distribution at the boundary of a cavitating converging-diverging test section were measured. It was shown that some cavitation tests can only yield valid results if the free air content of the water is known. During the initial stages of bubble growth the wall pressure in the cavitation region is lower than the vapor pressure. The wall pressure rises later. For a given cavitation number and flow velocity, the pressure distribution depends on the number of expanding bubbles on the hydrofoil. The minimum pressure coefficient depends only on the cavitation number, the flow velocity and the number of expanding bubbles present. Bubbles generate pressure pulses at the wall; the combined effect of all such pulses is to shift the wall pressure away from the value that would be obtained at the same cavitation number if no cavitation were present. The greater the number of expanding bubbles, the more the wall pressure tends to approach the vapor pressure. An important result of the work is to pin-point the free air contents of a water tunnel which lead to correct scaling of cavitation flows [fr

  7. The development of a fully automated radioimmunoassay instrument - micromedic systems concept 4

    International Nuclear Information System (INIS)

    Painter, K.

    1977-01-01

    The fully automatic RIA system Concept 4 by Micromedic is described in detail. The system uses antibody-coated test tubes to take up the samples. It has a maximum capacity of 200 tubes including standards and control tubes. Its advantages are, in particular, high flow rate, reproducibility, and fully automatic testing, i.e. low personnel requirements. Its disadvantages are difficulties in protein assays. (ORU) [de

  8. Ethnic-homogenization processes in the most developed region of Serbia, the multicultural Vojvodina

    Directory of Open Access Journals (Sweden)

    Zsuzsa M. Császár

    2012-02-01

    created on 1st December 1918, the power in Belgrade started to settle Serbs from the underdeveloped southern parts of Serbia. The homes of the Hungarians and Germans who fled, mostly in Banat, in the last months of World War II were taken over by Serbs from Lika, Bosnia, Montenegro and Kosovo. In this period the absolute Serb majority took shape. Further migrants moved to Vojvodina in connection with the political changes from the 1990s until today. Owing to demographic decline, emigration and the immigration of Serbs, the shrinking Hungarian population has become a minority in several settlements. In this study we analyse in full the changes of the ethnic spatial structure, particularly focusing on the prospects and the questions of the minority existence of Hungarians. Keywords: ethnic homogenization, Autonomous Province of Vojvodina, Hungarians outside the borders of Hungary, multiethnic and multiconfessional region.

  9. Development and evaluation of fully automated demand response in large facilities

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Sezgen, Osman; Watson, David S.; Motegi, Naoya; Shockman, Christine; ten Hope, Laurie

    2004-03-30

    This report describes the results of a research project to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability, manage electricity costs, and ensure that customers receive signals that encourage load reduction during times when the electric grid is near its capacity. The two main drivers for widespread demand responsiveness are the prevention of future electricity crises and the reduction of electricity prices. Additional goals for price responsiveness include equity through cost of service pricing, and customer control of electricity usage and bills. The technology developed and evaluated in this report could be used to support numerous forms of DR programs and tariffs. For the purpose of this report, we have defined three levels of Demand Response automation. Manual Demand Response involves manually turning off lights or equipment; this can be a labor-intensive approach. Semi-Automated Response involves the use of building energy management control systems for load shedding, where a preprogrammed load shedding strategy is initiated by facilities staff. Fully-Automated Demand Response is initiated at a building or facility through receipt of an external communications signal--facility staff set up a pre-programmed load shedding strategy which is automatically initiated by the system without the need for human intervention. We have defined this approach to be Auto-DR. An important concept in Auto-DR is that a facility manager is able to ''opt out'' or ''override'' an individual DR event if it occurs at a time when the reduction in end-use services is not desirable. This project sought to improve the feasibility and nature of Auto-DR strategies in large facilities. The research focused on technology development, testing
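
    The fully-automated path described above reduces, on the facility side, to a small signal-driven loop with an operator override; the sketch below is purely conceptual, with placeholder signal and control functions rather than any real Auto-DR protocol.

        PRICE_THRESHOLD = 0.30              # $/kWh, illustrative trigger level
        operator_opt_out = False            # facility manager can override an event

        def get_price_signal():
            return 0.42                     # placeholder for the external communications signal

        def shed_load(level):
            print(f"initiating pre-programmed shed strategy, level {level}")

        def auto_dr_step():
            if get_price_signal() > PRICE_THRESHOLD and not operator_opt_out:
                shed_load(level=2)          # e.g. raise zone setpoints, dim lighting

        auto_dr_step()                      # in practice this would run on a schedule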

  10. Large Eddy Simulation of Turbulence Modification and Particle Dispersion in a Fully-Developed Pipe Flow

    Science.gov (United States)

    Rani, Sarma; Pratap Vanka, Surya

    1999-11-01

    A LES study of the modification of turbulence in a fully-developed turbulent pipe flow by dispersed heavy particles at Re_τ = 360 is presented. A 64 (radial) x 64 (azimuthal) x 128 (axial) grid has been used. An Eulerian-Lagrangian approach has been used for treating the continuous and the dispersed phases respectively. The particle equation of motion included only the drag force. Three different LES models are used in the continuous fluid simulation: (i) A “No-Model” LES (coarse-grid DNS) (ii) Smagorinsky’s model and (iii) Schumann’s model . The motivation behind employing the Schumann’s model is to study the impact of sub-grid-scale fluctuations on the particle motion and their (SGS fluctuations) modulation, in turn, by the particles. The effect of particles on fluid turbulence is investigated by tracking 100000 particles of different diameters. Our studies confirm the preferential concentration of particles in the near wall region. It is observed that the inclusion of two-way coupling reduces the preferential concentration of particles. In addition, it was found that two-way coupling attenuates the fluid turbulence. However, we expect the above trends to differ depending upon the particle diameter, volumetric and mass fractions. The effect of SGS fluctuations on the particle dispersion and turbulence modulation is also being investigated. Other relevant statistics for the continuous and the dispersed phases are collected for the cases of one-way and two-way coupling. These statistics are compared to study the modulation of turbulence by the particles.
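
    The drag-only particle equation of motion used above, dv/dt = (u_f - v)/tau_p, integrates straightforwardly; the explicit-Euler sketch below uses a synthetic fluid velocity sample and illustrative parameter values.

        import numpy as np

        tau_p, dt = 5.0e-3, 1.0e-4                  # particle response time and time step, s
        v = np.zeros(3)                             # particle velocity
        x = np.zeros(3)                             # particle position
        rng = np.random.default_rng(2)
        for _ in range(1000):
            u_f = np.array([1.0, 0.0, 0.0]) + 0.05 * rng.normal(size=3)   # sampled fluid velocity
            v += dt * (u_f - v) / tau_p             # drag-only equation of motion
            x += dt * v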

  11. Development of a phantom to test fully automated breast density software – A work in progress

    International Nuclear Information System (INIS)

    Waade, G.G.; Hofvind, S.; Thompson, J.D.; Highnam, R.; Hogg, P.

    2017-01-01

    Objectives: Mammographic density (MD) is an independent risk factor for breast cancer and may have a future role for stratified screening. Automated software can estimate MD but the relationship between breast thickness reduction and MD is not fully understood. Our aim is to develop a deformable breast phantom to assess automated density software and the impact of breast thickness reduction on MD. Methods: Several different configurations of poly vinyl alcohol (PVAL) phantoms were created. Three methods were used to estimate their density. Raw image data of mammographic images were processed using Volpara to estimate volumetric breast density (VBD%); Hounsfield units (HU) were measured on CT images; and physical density (g/cm 3 ) was calculated using a formula involving mass and volume. Phantom volume versus contact area and phantom volume versus phantom thickness was compared to values of real breasts. Results: Volpara recognized all deformable phantoms as female breasts. However, reducing the phantom thickness caused a change in phantom density and the phantoms were not able to tolerate same level of compression and thickness reduction experienced by female breasts during mammography. Conclusion: Our results are promising as all phantoms resulted in valid data for automated breast density measurement. Further work should be conducted on PVAL and other materials to produce deformable phantoms that mimic female breast structure and density with the ability of being compressed to the same level as female breasts. Advances in knowledge: We are the first group to have produced deformable phantoms that are recognized as breasts by Volpara software. - Highlights: • Several phantoms of different configurations were created. • Three methods to assess phantom density were implemented. • All phantoms were identified as breasts by the Volpara software. • Reducing phantom thickness caused a change in phantom density.

  12. Analysis of the two-fluid model in fully-developed two-phase flow

    International Nuclear Information System (INIS)

    Azpitarte, Osvaldo Enrique

    2003-01-01

    The two-fluid model is analysed and applied to solve vertical fully-developed bubbly two-phase flows, both in laminar and turbulent conditions. The laminar model is reduced to two differential equations to solve the gas fraction (ε_G) and the velocity (υ_L). For the turbulent condition, a k-ε model for low Reynolds number is implemented, resulting in a set of differential equations to solve the four variables (ε_G, υ_L, k and ε) along the whole radial domain (including the laminar sublayer). For the laminar condition, the system is initially reduced to a single non-dimensional ordinary differential equation (ODE) to solve ε_G in the central region of the duct, without considering the effect of the wall. The equation is solved using Mathematica. Analysing the solutions it can be concluded that an exact compensation of the applied pressure gradient with the hydrostatic force ρ_eff·g occurs (ρ_eff: effective density of the mixture). This compensation implies that the value of ε_G at the center of the duct only depends on the applied pressure gradient (the dependency is linear), and that the ε_G and υ_L profiles are necessarily flat. The complete problem is dealt with numerically through the implementation of a finite element code. The effect of the walls is included via a model of wall force. When the code is applied to a laminar condition, the conclusions previously obtained by solving the ODE are confirmed. It is also possible to analyse the regime in which the pressure gradient is greater than the weight of the pure liquid, in which case a region of strictly zero void fraction develops surrounding the axis of the duct (in upward flow). When the code is applied to a turbulent condition, it is shown that the conclusions obtained for the laminar condition can also be applied, but within a range of pressure gradient limited by two transition values (θ_1 and θ_2). An analysis of transitions θ_1 and θ_2 allows us to conclude that their origin is a sudden increase of lateral
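
    The compensation argument above gives the centreline gas fraction directly: with -dp/dz = ρ_eff·g and ρ_eff = (1 - ε_G)·ρ_L + ε_G·ρ_G, ε_G is linear in the applied pressure gradient. The numbers below are illustrative air/water-like values, not the study's data.

        rho_L, rho_G, g = 1000.0, 1.2, 9.81       # kg/m^3, kg/m^3, m/s^2
        dpdz = -9000.0                            # applied pressure gradient, Pa/m

        eps_G = (rho_L - (-dpdz) / g) / (rho_L - rho_G)
        # -dpdz/g is about 917 kg/m^3, so eps_G is about 0.08 for these numbers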

  13. Development of a Practical Hydrogen Storage System Based on Liquid Organic Hydrogen Carriers and a Homogeneous Catalyst

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, Craig [Hawaii Hydrogen Carriers, LLC, Honolulu, HI (United States); Brayton, Daniel [Hawaii Hydrogen Carriers, LLC, Honolulu, HI (United States); Jorgensen, Scott W. [General Motors, LLC, Warren, MI (United States). Research and Development Center. Chemical and Material Systems Lab.; Hou, Peter [General Motors, LLC, Warren, MI (United States). Research and Development Center. Chemical and Material Systems Lab.

    2017-03-24

    The objectives of this project were: 1) optimize a hydrogen storage media based on LOC/homogeneous pincer catalyst (carried out at Hawaii Hydrogen Carriers, LLC) and 2) develop space, mass and energy efficient tank and reactor system to house and release hydrogen from the media (carried out at General Motor Research Center).

  14. Developing Fully Online Pre-Service Music and Arts Education Courses

    Science.gov (United States)

    Lierse, Sharon

    2015-01-01

    Charles Darwin University (CDU) offers education courses for students who want to teach in Australian schools. The university is unique due to its geographic location, proximity to Asia and its high Indigenous population compared to the rest of the country. Many courses are offered fully online including music education for pre-service teachers.…

  15. A Fully Developed Flow Thermofluid Model for Topology Optimization of 3D-Printed Air-Cooled Heat Exchangers

    DEFF Research Database (Denmark)

    Haertel, Jan Hendrik Klaas; Nellis, Gregory F.

    2017-01-01

    In this work, density-based topology optimization is applied to the design of the air-side surface of dry-cooled power plant condensers. A topology optimization model assuming a steady-state, thermally and fluid dynamically fully developed internal flow is developed and used for this application....

  16. Developing an automated water emitting-sensing system, based on integral tensiometers placed in homogenous environment.

    Science.gov (United States)

    Dabach, Sharon; Shani, Uri

    2010-05-01

    As the population grows, irrigated agriculture is using more water and fertilizers to supply the growing food demand. However, the uptake by various plants is only 30 to 50% of the water applied. The remaining water flows to surface water and groundwater and causes their contamination by fertilizers or other toxins such as herbicides or pesticides. To improve the water use efficiency of crops and decrease the drainage below the root zone, irrigation water should be applied according to the plant demand. The aim of this work is to develop an automated irrigation system based on real-time feedback from an inexpensive and reliable integrated sensing system. This system will supply water to plants according to their demand, without any user interference during the entire growth season. To achieve this goal a sensor (Geo-Tensiometer) was designed and tested. This sensor has better contact with the surrounding soil, is more reliable and much cheaper than the ceramic cup tensiometer. A lysimeter experiment was conducted to evaluate a subsurface drip irrigation regime based on the Geo-Tensiometer and compare it to a daily irrigation regime. All of the drippers were wrapped in Geo-textile. By integrating the Geo-Tensiometer within the Geo-textile which surrounds the drippers, we created a homogenous media in the entire lysimeter in which the reading of the matric potential takes place. This media, the properties of which are set and known to us, encourages root growth therein. Root density in this media is very high; therefore most of the plant water uptake is from this area. The irrigation system in treatment A irrigated when the matric potential reached a threshold which was set every morning automatically by the system. The daily treatment included a single irrigation each morning that was set to return 120% of the evapotranspiration of the previous day. All Geo-Tensiometers were connected to an automated washing system, that flushed air trapped in the Geo

  17. Development of triple scale finite element analyses based on crystallographic homogenization methods

    International Nuclear Information System (INIS)

    Nakamachi, Eiji

    2004-01-01

    A crystallographic homogenization procedure is implemented in the piezoelectric and elastic-crystalline plastic finite element (FE) code to assess the macro-continuum properties of piezoelectric ceramics and of BCC and FCC sheet metals. The triple-scale hierarchical structure consists of an atom cluster, a crystal aggregation and a macro-continuum. In this paper, we focus on a triple-scale numerical analysis for piezoelectric material and apply it to assess a macro-continuum material property. First, we calculate the material properties of the perovskite crystal of the piezoelectric material XYO3 (such as BaTiO3 and PbTiO3) by employing the ab-initio molecular analysis code CASTEP. Next, measured results of SEM and EBSD observations of crystal orientation distributions, shapes and boundaries of a real material (BaTiO3) are employed to define the inhomogeneity of the crystal aggregation, which corresponds to a unit cell of the micro-structure and satisfies the periodicity condition. This procedure constitutes the first scaling up, from the molecular level to the crystal aggregation. Finally, the conventional homogenization procedure is implemented in the FE code to evaluate a macro-continuum property. This final procedure constitutes the second scaling up, from the crystal aggregation (unit cell) to the macro-continuum. This triple-scale analysis is applied to the design of piezoelectric ceramics and finds an optimum crystal orientation distribution, for which the macroscopic piezoelectric constant d33 has a maximum value

  18. Development of fully dense and high performance powder metallurgy HSLA steel using HIP method

    Science.gov (United States)

    Liu, Wensheng; Pang, Xinkuan; Ma, Yunzhu; Cai, Qingshan; Zhu, Wentan; Liang, Chaoping

    2018-05-01

    In order to solve the problem that the mechanical properties of powder metallurgy (P/M) steels are much lower than those of traditional cast steels of the same composition, owing to their porosity, a fully dense high-strength low-alloy (HSLA) steel with excellent mechanical properties was fabricated through hot isostatic pressing (HIP) using gas-atomized powders. The granular structure of the P/M HIPed steel, composed of bainitic ferrite and martensite-austenite (M-A) islands, is obtained without the need for any rapid cooling. The P/M HIPed steel exhibits a combination of tensile strength and ductility that surpasses that of conventional cast steel and P/M sintered steel, confirming the feasibility of fabricating high-performance P/M steel through appropriate microstructural control and manufacturing process.

  19. Reflector homogenization

    International Nuclear Information System (INIS)

    Sanchez, R.; Ragusa, J.; Santandrea, S.

    2004-01-01

    The problem of the determination of a homogeneous reflector that preserves a set of prescribed albedo is considered. Duality is used for a direct estimation of the derivatives needed in the iterative calculation of the optimal homogeneous cross sections. The calculation is based on the preservation of collapsed multigroup albedo obtained from detailed reference calculations and depends on the low-order operator used for core calculations. In this work we analyze diffusion and transport as low-order operators and argue that the P0 transfers are the best choice for the unknown cross sections to be adjusted. Numerical results illustrate the new approach for SPN core calculations. (Author)

  20. Reflector homogenization

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, R.; Ragusa, J.; Santandrea, S. [Commissariat a l' Energie Atomique, Direction de l' Energie Nucleaire, Service d' Etudes de Reacteurs et de Modelisation Avancee, CEA de Saclay, DM2S/SERMA 91 191 Gif-sur-Yvette cedex (France)]. e-mail: richard.sanchez@cea.fr

    2004-07-01

    The problem of the determination of a homogeneous reflector that preserves a set of prescribed albedo is considered. Duality is used for a direct estimation of the derivatives needed in the iterative calculation of the optimal homogeneous cross sections. The calculation is based on the preservation of collapsed multigroup albedo obtained from detailed reference calculations and depends on the low-order operator used for core calculations. In this work we analyze diffusion and transport as low-order operators and argue that the P0 transfers are the best choice for the unknown cross sections to be adjusted. Numerical results illustrate the new approach for SPN core calculations. (Author)

  1. Methods for assessing homogeneity in ThO2--UO2 fuels (LWBR Development Program)

    International Nuclear Information System (INIS)

    Berman, R.M.

    1978-06-01

    ThO₂-UO₂ solid solutions fabricated as LWBR fuel pellets are examined for uniform uranium distribution by means of autoradiography. Kodak NTA plates are used. Images of inhomogeneities are 29 ± 10 microns larger in diameter than the high-urania segregations that caused them, due to the range of alpha particles in the emulsion, and an appropriate correction must be made. Photographic density is approximately linear with urania content in the region between underexposure and overexposure, but the slope of the calibration curve varies with aging and growth of alpha activity from the parasitic ²³²U and its decay products. A calibration must therefore be performed using two known points--the average photographic density (corresponding to the average composition) and an extrapolated background (corresponding to zero urania). As part of production pellet inspection, plates are evaluated by inspectors, who count segregations by size classes. This is supplemented by microdensitometer scans of the autoradiograph and by electron probe studies of the original sample if apparent homogeneity is marginal
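
    The two-point calibration described above amounts to a straight line through the extrapolated background (zero urania) and the plate-average density (average composition); the numbers below are illustrative only.

        D_background = 0.12       # extrapolated photographic density at zero urania (hypothetical)
        D_average    = 0.85       # average photographic density of the plate (hypothetical)
        w_average    = 3.0        # average urania content, wt% (hypothetical)

        slope = (D_average - D_background) / w_average

        def urania_content(D):
            return (D - D_background) / slope

        w_segregation = urania_content(1.4)   # content implied by a segregation's density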

  2. Development and Verification of a Fully Coupled Simulator for Offshore Wind Turbines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Jonkman, J. M.; Buhl, M. L. Jr.

    2007-01-01

    This report outlines the development of an analysis tool capable of analyzing a variety of wind turbine, support platform, and mooring system configurations.The simulation capability was tested by model-to-model comparisons to ensure its correctness.

  3. Numerical solution of fully developed heat transfer problem with constant wall temperature and application to isosceles triangle and parabolic ducts

    International Nuclear Information System (INIS)

    Karabulut, Halit; Ipci, Duygu; Cinar, Can

    2016-01-01

    Highlights: • A numerical method has been developed for fully developed flows with constant wall temperature. • The governing equations were transformed to boundary-fitted coordinates. • The Nusselt number of the parabolic duct has been investigated. • Validation of the numerical method has been made by comparison with published data. - Abstract: In motor vehicles the use of more compact radiators has several advantages, such as improving the aerodynamic form of cars, reducing the weight and volume of the cars, reducing material consumption and environmental pollution, and enabling a faster increase of the engine coolant temperature after start-up and thereby improving the thermal efficiency. For the design of efficient and compact radiators, the robust determination of the heat transfer coefficient becomes imperative. In this study the external heat transfer coefficient of the radiator has been investigated for hydrodynamically and thermally fully developed flows in channels with constant wall temperature. In such a situation the numerical treatment of the problem results in a trivial solution. To find a non-trivial solution the problem is treated either as an eigenvalue problem or as a thermally developing flow problem. In this study a numerical solution procedure has been developed and the heat transfer coefficients of the fully developed flow in triangular and parabolic air channels were investigated. The governing equations were transformed to boundary-fitted coordinates and numerically solved. The non-trivial solution was obtained by guessing the temperature of a grid point within the solution domain. The correction of the guessed temperature was performed by smoothing the temperature profile along a line passing through that grid point. Results were compared with literature data and found to be consistent.

  4. Development of a homogeneous pulse shape discriminating flow-cell radiation detection system

    International Nuclear Information System (INIS)

    Hastie, K.H.; DeVol, T.A.; Fjeld, R.A.

    1999-01-01

    A homogeneous flow-cell radiation detection system which utilizes coincidence counting and pulse shape discrimination circuitry was assembled and tested with five commercially available liquid scintillation cocktails. Two of the cocktails, Ultima Flo (Packard) and Mono Flow 5 (National Diagnostics), have low viscosities and are intended for flow applications; and three of the cocktails, Optiphase HiSafe 3 (Wallac), Ultima Gold AB (Packard), and Ready Safe (Beckman), have higher viscosities and are intended for static applications. The low viscosity cocktails were modified with 1-methylnaphthalene to increase their capability for alpha/beta pulse shape discrimination. The sample loading and pulse shape discriminator setting were optimized to give the lowest minimum detectable concentration for a 30 s count time. Of the higher viscosity cocktails, Optiphase HiSafe 3 had the lowest minimum detectable activities for alpha and beta radiation, 0.2 and 0.4 Bq/ml for ²³³U and ⁹⁰Sr/⁹⁰Y, respectively, for a 30 s count time. The sample loading was 70% and the corresponding alpha/beta spillover was 5.5%. Of the low viscosity cocktails, Mono Flow 5 modified with 2.5% (by volume) 1-methylnaphthalene resulted in the lowest minimum detectable activities for alpha and beta radiation; 0.3 and 0.5 Bq/ml for ²³³U and ⁹⁰Sr/⁹⁰Y, respectively, for a 30 s count time. The sample loading was 50%, and the corresponding alpha/beta spillover was 16.6%. HiSafe 3 at a 10% sample loading was used to evaluate the system under simulated flow conditions

  5. Visual gene developer: a fully programmable bioinformatics software for synthetic gene optimization

    Directory of Open Access Journals (Sweden)

    McDonald Karen

    2011-08-01

    Full Text Available Abstract Background Direct gene synthesis is becoming more popular owing to decreases in gene synthesis pricing. Compared with using natural genes, gene synthesis provides a good opportunity to optimize gene sequence for specific applications. In order to facilitate gene optimization, we have developed a stand-alone software called Visual Gene Developer. Results The software not only provides general functions for gene analysis and optimization along with an interactive user-friendly interface, but also includes unique features such as programming capability, dedicated mRNA secondary structure prediction, artificial neural network modeling, network & multi-threaded computing, and user-accessible programming modules. The software allows a user to analyze and optimize a sequence using main menu functions or specialized module windows. Alternatively, gene optimization can be initiated by designing a gene construct and configuring an optimization strategy. A user can choose several predefined or user-defined algorithms to design a complicated strategy. The software provides expandable functionality as platform software supporting module development using popular script languages such as VBScript and JScript in the software programming environment. Conclusion Visual Gene Developer is useful for both researchers who want to quickly analyze and optimize genes, and those who are interested in developing and testing new algorithms in bioinformatics. The software is available for free download at http://www.visualgenedeveloper.net.

  6. Development of CMOS Pixel Sensors fully adapted to the ILD Vertex Detector Requirements

    CERN Document Server

    Winter, Marc; Besson, Auguste; Claus, Gilles; Dorokhov, Andrei; Goffe, Mathieu; Hu-Guo, Christine; Morel, Frederic; Valin, Isabelle; Voutsinas, Georgios; Zhang, Liang

    2012-01-01

    CMOS Pixel Sensors are making steady progress towards the specifications of the ILD vertex detector. Recent developments are summarised, which show that these devices are close to complying with all major requirements, in particular the read-out speed needed to cope with the beam-related background. This achievement is grounded on the double-sided ladder concept, which allows combining the signals generated by a single particle in two different sensors, one devoted to spatial resolution and the other to time stamping, both assembled on the same mechanical support. The status of the development is overviewed, as well as the plans to finalise it using an advanced CMOS process.

  7. Numerical investigation of fully-developed magneto-hydro-dynamic flows in ducts

    International Nuclear Information System (INIS)

    Dajeh, D. A.

    1996-01-01

    In this paper a numerical study is presented for fully developed magneto-hydro-dynamic flows in ducts under a uniform transverse applied magnetic field. A finite difference scheme comprising a modified ADI (Alternating Direction Implicit) method and a SOUR (Successive Over- and Under-Relaxation) method is used to solve the set of governing equations. Computations are carried out for different duct shapes over a wide range of Hartmann numbers, up to five thousand, an important parameter in nuclear fusion reactor design. (author). 16 refs., 7 figs., 3 tabs

  8. Evaluation and development the routing protocol of a fully functional simulation environment for VANETs

    Science.gov (United States)

    Ali, Azhar Tareq; Warip, Mohd Nazri Mohd; Yaakob, Naimah; Abduljabbar, Waleed Khalid; Atta, Abdu Mohammed Ali

    2017-11-01

    Vehicular Ad-hoc Networks (VANETs) are an area of wireless technology that is attracting a great deal of interest. There are still several areas of VANETs, such as security, routing protocols and medium access control, that lack large amounts of research. There is also a lack of freely available simulators that can quickly and accurately simulate VANETs. The main goal of this paper is to develop a freely available VANETs simulator and to evaluate popular mobile ad-hoc network routing protocols in several VANETs scenarios. The VANETs simulator consists of a network simulator and a traffic (mobility) simulator, with a client-server application keeping the two simulators in sync. The VANETs simulator also models buildings to create a more realistic wireless network environment. Ad-hoc On-demand Distance Vector routing (AODV), Dynamic Source Routing (DSR) and Dynamic MANET On-demand (DYMO) were initially simulated in city, country and highway environments to provide an overall evaluation.

  9. Altered regional homogeneity in the development of minimal hepatic encephalopathy: a resting-state functional MRI study.

    Directory of Open Access Journals (Sweden)

    Ling Ni

    Full Text Available BACKGROUND: Little is known about how spontaneous brain activity progresses from non-hepatic encephalopathy (non-HE) to minimal HE (MHE). The purpose of this study was to evaluate the evolution pattern of spontaneous brain activity in cirrhotic patients using resting-state fMRI with a regional homogeneity (ReHo) method. METHODOLOGY/PRINCIPAL FINDINGS: Resting-state fMRI data were acquired in 47 cirrhotic patients (minimal HE [MHE], n = 20, and non-HE, n = 27) and 25 age- and sex-matched healthy controls. Kendall's coefficient of concordance (KCC) was used to measure regional homogeneity. The regional homogeneity maps were compared with ANOVA tests among the MHE, non-HE, and healthy control groups and with t-tests between each pair in a voxel-wise manner. Correlation analyses were performed to explore the relationships between regional ReHo values and Child-Pugh scores, number connection test type A (NCT-A), digit symbol test (DST) scores, and venous blood ammonia levels. Compared with healthy controls, both MHE and non-HE patients showed decreased ReHo in the bilateral frontal, parietal and temporal lobes and increased ReHo in the bilateral caudate. Compared with non-HE patients, MHE patients showed decreased ReHo in the bilateral precuneus, cuneus and supplementary motor area (SMA). The NCT-A of cirrhotic patients negatively correlated with ReHo values in the precuneus, cuneus and lingual gyrus. DST scores positively correlated with ReHo values in the cuneus, precuneus and lingual gyrus, and negatively correlated with ReHo values in the bilateral caudate (P<0.05, AlphaSim corrected). CONCLUSIONS/SIGNIFICANCE: Diffuse abnormal homogeneity of baseline brain activity was nonspecific for MHE; only the progressively decreased ReHo in the SMA and the cuneus, especially the latter, might be associated with the development of MHE. ReHo analysis may be potentially valuable for detecting the development from non-HE to MHE.
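
    For context, ReHo is conventionally computed as Kendall's coefficient of concordance over the time series of a voxel and its nearest neighbours. The sketch below shows the standard KCC formula (without tie correction) on synthetic data; it is illustrative only and not the processing pipeline used in this study.

        import numpy as np
        from scipy.stats import rankdata

        def kendalls_w(ts):
            """Kendall's coefficient of concordance for ts of shape (k voxels, n time points).
            Standard formula without tie correction; values range from 0 to 1."""
            k, n = ts.shape
            ranks = np.vstack([rankdata(row) for row in ts])   # rank each voxel's time series
            R = ranks.sum(axis=0)                              # rank sums per time point
            S = ((R - R.mean()) ** 2).sum()
            return 12.0 * S / (k ** 2 * (n ** 3 - n))

        rng = np.random.default_rng(0)
        # 27 voxels (a 3x3x3 cluster) sharing a common signal plus noise
        cluster = np.tile(np.sin(np.linspace(0, 6, 50)), (27, 1)) + 0.3 * rng.standard_normal((27, 50))
        print(round(kendalls_w(cluster), 3))                   # high (close to 1) for synchronized voxels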

  10. A Comparative Study of the Effect of Homogeneous and Heterogeneous Collaborative Interaction on the Development of EFL Learners’ Writing Skill

    Directory of Open Access Journals (Sweden)

    Parviz Maftoon

    2009-05-01

    Full Text Available This study investigates the effect of homogeneous and heterogeneous peer interaction on the development of Iranian EFL learners’ writing skill. Sixty female students of TEFL participated in the study. The participants were divided into two groups based on their English proficiency test scores. The homogeneous group consisted of 14 participants paired with partners with similar English proficiency test scores, while the heterogeneous group consisted of 16 participants who were paired with partners who had higher test scores. The pairs had interaction and peer collaboration before carrying out three types of writing tasks. The Repeated Measures ANOVA was used to compare the student writers’ pretest writing scores with their three post-test scores. The results showed that both groups, very similarly, had significantly higher post-test scores in all three writing tasks. The findings are explained based on the sociocultural theory and Vygotsky’s notion of the zone of proximal development (ZPD. The study offers several important pedagogical implications and suggestions for further research.

  11. Development of a fully automated network system for long-term health-care monitoring at home.

    Science.gov (United States)

    Motoi, K; Kubota, S; Ikarashi, A; Nogawa, M; Tanaka, S; Nemoto, T; Yamakoshi, K

    2007-01-01

    Daily monitoring of health condition at home is very important not only as an effective scheme for early diagnosis and treatment of cardiovascular and other diseases, but also for prevention and control of such diseases. From this point of view, we have developed a prototype room for fully automated monitoring of various vital signs. From the results of preliminary experiments using this room, it was confirmed that (1) ECG and respiration during bathing, (2) excretion weight and blood pressure, and (3) respiration and cardiac beat during sleep could be monitored with reasonable accuracy by the sensor system installed in bathtub, toilet and bed, respectively.

  12. The development of thermal models for a UF6 transport container in a fully engulfing fire

    International Nuclear Information System (INIS)

    Lomas, J.; Clayton, D.G.

    1993-01-01

    This paper describes the recent development work on a lumped-parameter model known as BURST3, created by BNFL to examine the physics of the heating problem. The predictions of this model were compared with the results obtained by Mallett in 1965, in which small (3.5, 5 and 8 inch diameter) cylinders were exposed to a fire. In general, the comparison is good; however, there are some differences, particularly in the speed of response of the wall temperature to the heating from the fire. The model was further modified to allow conditions of partial and full insulation to be investigated. The partially insulated condition simulates the Japanese proposal to insulate the ends of the container only, leaving the cylinder bare between the stiffening rings. The results obtained with our modified model support the predictions of Abe et al. that the partially insulated cylinder will survive the fire test. The analysis of a completely insulated container has indicated that a minimal thickness of insulation provides sufficient protection to allow survival in the fire test. A discussion of additional improvements to the lumped-parameter model is presented. (J.P.N.)
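
    As a rough illustration of what a lumped-parameter fire-heating model does (a single-node sketch with illustrative property values, not BURST3 itself), a bare container wall exposed to an 800 C engulfing fire can be advanced in time with a simple explicit energy balance:

        # One-node lumped-parameter sketch (not BURST3): wall heated by an 800 C engulfing
        # fire through radiation and convection. All property values are illustrative.
        sigma = 5.67e-8                        # Stefan-Boltzmann constant, W/m^2K^4
        eps_f, eps_w, h_c = 0.9, 0.8, 10.0     # fire/wall emissivities, convective coefficient W/m^2K
        m_cp_per_area = 19.0e3                 # wall heat capacity per unit area, J/m^2K (~5 mm steel)
        T_fire, T = 800.0 + 273.15, 38.0 + 273.15

        dt, t_end = 0.5, 30 * 60               # time step (s) and 30-minute fire duration
        for step in range(int(t_end / dt)):
            q = eps_f * eps_w * sigma * (T_fire**4 - T**4) + h_c * (T_fire - T)   # W/m^2
            T += q * dt / m_cp_per_area                                           # explicit Euler update
        print(f"bare wall temperature after 30 min: {T - 273.15:.0f} C")

    The bare wall approaches the fire temperature well within the test duration, which is why the insulation options discussed in the abstract matter.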

  13. The development of fully dynamic rotating machine models for nuclear training simulators

    International Nuclear Information System (INIS)

    Birsa, J.J.

    1990-01-01

    Prior to beginning the development of an enhanced set of electrical plant models for several nuclear training simulators, an extensive literature search was conducted to evaluate and select rotating machine models for use on these simulators. These models include the main generator, diesel generators, in-plant electric power distribution and off-site power. From the results of this search, various models were investigated and several were selected for further evaluation. Several computer studies were performed on the selected models in order to determine their suitability for use in a training simulator environment. One surprising result of this study was that a number of established, classical models could not be made to reproduce actual plant steady-state data over the range necessary for a training simulator. This evaluation process and its results are presented in this paper. Various historical, as well as contemporary, electrical models of rotating machines are discussed. Specific criteria for selection of rotating machine models for training simulator use are presented

  14. Developing the theory of nonstationary neutron transport in a homogeneous infinite medium. γk coefficients

    International Nuclear Information System (INIS)

    Trukhanov, G.Ya.

    2005-01-01

    The time-dependent neutron transport theory of G.Ya. Trukhanov and S.A. Podosenov is developed further. The errors in calculating the power-series expansion coefficients γk of this theory are estimated. The power-series convergence radius is found to be R = |χ1,2| = 0.9595, and the convergence speed of the series is also estimated

  15. The development of a small inherently safe homogeneous reactor for the production of medical isotopes

    Energy Technology Data Exchange (ETDEWEB)

    Carlin, G.E.; Bonin, H.W., E-mail: george.carlin@rmc.ca [Royal Military College of Canada, Kingston, Ontario (Canada)

    2013-07-01

    The use of radioisotopes for various procedures in the health care industry has become one of the most important practices in medicine. New interest has been found in the use of liquid fueled nuclear reactors to produce these isotopes due to the ease of fuel processing and ability to efficiently use LEU as the fuel source. A version of this reactor is being developed at the Royal Military College of Canada to act as a successor to the SLOWPOKE-2 platform. The thermal hydraulic and transient characteristics of a 20 kWt version are being studied to verify inherent safety abilities. (author)

  16. Development of K-Basin High-Strength Homogeneous Sludge Simulants and Correlations Between Unconfined Compressive Strength and Shear Strength

    Energy Technology Data Exchange (ETDEWEB)

    Onishi, Yasuo; Baer, Ellen BK; Chun, Jaehun; Yokuda, Satoru T.; Schmidt, Andrew J.; Sande, Susan; Buchmiller, William C.

    2011-02-20

    potential for erosion, it is important to compare the measured shear strength to penetrometer measurements and to develop a correlation (or correlations) between UCS measured by a pocket penetrometer and direct shear strength measurements for various homogeneous and heterogeneous simulants. This study developed 11 homogeneous simulants, whose shear strengths vary from 4 to 170 kPa. With these simulants, we developed correlations between UCS measured by a Geotest E-280 pocket penetrometer and shear strength values measured by a Geonor H-60 hand-held vane tester and a more sophisticated bench-top unit, the Haake M5 rheometer. This was achieved with side-by-side measurements of the shear strength and UCS of the homogeneous simulants. The homogeneous simulants developed under this study consist of kaolin clay, plaster of Paris, and amorphous alumina CP-5 with water. The simulants also include modeling clay. The shear strength of most of these simulants is sensitive to various factors, including the simulant size, the intensity of mixing, and the curing time, even with given concentrations of simulant components. Table S.1 summarizes these 11 simulants and their shear strengths.
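
    As an illustration of the kind of penetrometer-to-vane correlation described (made-up data points, not the PNNL measurements; the classical rule of thumb that shear strength is roughly half the UCS is used only to generate the example):

        import numpy as np

        # Illustrative fit (invented data): relate pocket-penetrometer UCS readings (kPa)
        # to vane shear strength (kPa) with a simple least-squares line.
        ucs_kpa   = np.array([10.0, 25.0, 60.0, 120.0, 240.0, 350.0])
        shear_kpa = np.array([4.8, 13.1, 29.5, 58.0, 123.0, 171.0])

        slope, intercept = np.polyfit(ucs_kpa, shear_kpa, 1)
        r = np.corrcoef(ucs_kpa, shear_kpa)[0, 1]
        print(f"shear ~= {slope:.2f} * UCS + {intercept:.1f} kPa   (r = {r:.3f})")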

  17. Benchmarking monthly homogenization algorithms

    Science.gov (United States)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    . Training was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.

  18. Homogeneous groups of plants, development scenarios, and basic configurations on the cogeneration systems optimization from the alcohol sector

    International Nuclear Information System (INIS)

    Silva Walter, A.C. da; Bajay, S.V.; Carrillo, J.L.L.

    1990-01-01

    The evaluation of introducing or diffusing new technologies at a macroeconomic level using microeconomic information can be carried out through the careful selection of a small number of homogeneous groups of plants from the point of view of the main technical parameters being considered. In this paper this concept is applied to the study of cogeneration in sugar and alcohol producing plants. The statistical techniques of cluster analysis, regression and mean-value testing are used. Basic cogeneration plant designs are proposed for alternative development scenarios for this industrial branch. These scenarios are based upon differing assumptions about the expansion of the alcohol market, the use of surplus sugar cane bagasse as a saleable commodity, as a fuel or as a raw material, and price expectations for the sale of surplus power from the cogeneration plants to the local grid. (author)

  19. Homogeneous Subgroups of Young Children with Autism Improve Phenotypic Characterization in the Study to Explore Early Development.

    Science.gov (United States)

    Wiggins, Lisa D; Tian, Lin H; Levy, Susan E; Rice, Catherine; Lee, Li-Ching; Schieve, Laura; Pandey, Juhi; Daniels, Julie; Blaskey, Lisa; Hepburn, Susan; Landa, Rebecca; Edmondson-Pretzel, Rebecca; Thompson, William

    2017-11-01

    The objective of this study was to identify homogeneous classes of young children with autism spectrum disorder (ASD) to improve phenotypic characterization. Children were enrolled in the Study to Explore Early Development between 2 and 5 years of age. A total of 707 children were classified with ASD after a comprehensive evaluation with strict diagnostic algorithms. Four classes of children with ASD were identified from latent class analysis: mild language delay with cognitive rigidity, mild language and motor delay with dysregulation, general developmental delay, and significant developmental delay with repetitive motor behaviors. We conclude that a four-class phenotypic model of children with ASD best describes our data and improves phenotypic characterization of young children with ASD. Implications for screening, diagnosis, and research are discussed.

  20. Preparation of the new certified reference material Virginia Tobacco Leaves (CTA-VTL-2) and development of suitable methods for checking the homogeneity

    International Nuclear Information System (INIS)

    Dybczynski, R.; Polkowska-Motrenko, H.; Samczynski, Z.; Szopa, Z.

    1994-01-01

    The long-term aim of this project has been the preparation of a new biological reference material, tobacco leaves of the 'Virginia' type, and its certification for the content of as large a number of trace elements as possible. Further aims have been the development of suitable methods for checking the homogeneity, with special emphasis on the homogeneity of small samples, and a critical analysis of the performance of various analytical techniques

  1. Onset of nucleate boiling and onset of fully developed subcooled boiling detection using pressure transducers signals spectral analysis

    International Nuclear Information System (INIS)

    Maprelian, Eduardo; Castro, Alvaro Alvim de; Ting, Daniel Kao Sun

    1999-01-01

    The experimental technique used for detection of subcooled boiling through analysis of the fluctuations contained in pressure transducer signals is presented. The experimental part of this work was conducted at the Institut fuer Kerntechnik und zerstoerungsfreie Pruefverfahren von Hannover (IKPH, Germany) in a thermal-hydraulic circuit with an annular-geometry test section containing one electrically heated rod. Piezoresistive pressure sensors are used for detection of the onset of nucleate boiling (ONB) and the onset of fully developed boiling (OFDB) using spectral analysis/signal correlation techniques. Experimental results are interpreted by phenomenological analysis of these two points and compared with existing correlations. The results allow us to conclude that this technique is adequate for the detection and monitoring of the ONB and OFDB. (author)
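
    As an illustration of the signal-processing idea (our own sketch with synthetic signals and an arbitrary frequency band, not the IKPH analysis): boiling adds broadband pressure fluctuations, so a band-limited RMS computed from a Welch power spectral density rises at the ONB and OFDB.

        import numpy as np
        from scipy.signal import welch

        fs = 2000.0                                           # 2 kHz sampling, 10 s records
        rng = np.random.default_rng(1)
        n = int(10 * fs)
        single_phase = 0.02 * rng.standard_normal(n)          # quiescent pressure fluctuations
        boiling      = single_phase + 0.1 * rng.standard_normal(n)   # extra broadband content

        def band_rms(signal, fmin=100.0, fmax=800.0):
            f, pxx = welch(signal, fs=fs, nperseg=2048)
            band = (f >= fmin) & (f <= fmax)
            df = f[1] - f[0]
            return np.sqrt(pxx[band].sum() * df)              # RMS pressure in the band

        print(band_rms(single_phase), band_rms(boiling))      # the second value is clearly larger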

  2. Fully developed laminar flow of non-Newtonian liquids through annuli: comparison of numerical calculations with experiments

    Energy Technology Data Exchange (ETDEWEB)

    Escudier, M.P.; Smith, S. [Department of Engineering, Mechanical Engineering, University of Liverpool, Brownlow Hill, Liverpool L69 3GH (United Kingdom); Oliveira, P.J. [Departamento de Engenharia Electromecanica, Universidade da Beira Interior, Rua Marques D' Avila e Boloma, 6200 Covilha (Portugal); Pinho, F.T. [Centro de Estudos de Fenomenos de Transporte, DEMEGI, Faculdade de Engenharia, Universidade do Porto, Rua Roberto Frias, 4200-465 Porto (Portugal)

    2002-07-01

    Experimental data are reported for fully developed laminar flow of a shear-thinning liquid through both a concentric and an 80% eccentric annulus with and without centrebody rotation. The working fluid was an aqueous solution of 0.1% xanthan gum and 0.1% carboxymethylcellulose for which the flow curve is well represented by the Cross model. Comparisons are reported between numerical calculations and the flow data, as well as with other laminar annular-flow data for a variety of shear-thinning liquids previously reported in the literature. In general, the calculations are in good quantitative agreement with the experimental data, even in situations where viscoelastic effects, neglected in the calculations, would be expected to play a role. (orig.)
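
    For reference, the Cross model mentioned above has the standard form (quoted here from the general rheology literature; the fitted parameter values for the 0.1% xanthan gum / 0.1% carboxymethylcellulose solution are not reproduced):

        \eta(\dot{\gamma}) \;=\; \eta_\infty \;+\; \frac{\eta_0 - \eta_\infty}{1 + (\lambda \dot{\gamma})^{m}}

    where \eta_0 and \eta_\infty are the zero- and infinite-shear-rate viscosities, \lambda is a characteristic time constant and m is a dimensionless exponent controlling the degree of shear thinning.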

  3. An Experimental and Simulation Study of Early Flame Development in a Homogeneous-charge Spark-Ignition Engine

    Directory of Open Access Journals (Sweden)

    Shekhawat Y.

    2017-09-01

    Full Text Available An integrated experimental and Large-Eddy Simulation (LES study is presented for homogeneous premixed combustion in a spark-ignition engine. The engine is a single-cylinder two-valve optical research engine with transparent liner and piston: the Transparent Combustion Chamber (TCC engine. This is a relatively simple, open engine configuration that can be used for LES model development and validation by other research groups. Pressure-based combustion analysis, optical diagnostics and LES have been combined to generate new physical insight into the early stages of combustion. The emphasis has been on developing strategies for making quantitative comparisons between high-speed/high-resolution optical diagnostics and LES using common metrics for both the experiments and the simulations, and focusing on the important early flame development period. Results from two different LES turbulent combustion models are presented, using the same numerical methods and computational mesh. Both models yield Cycle-to-Cycle Variations (CCV in combustion that are higher than what is observed in the experiments. The results reveal strengths and limitations of the experimental diagnostics and the LES models, and suggest directions for future diagnostic and simulation efforts. In particular, it has been observed that flame development between the times corresponding to the laminar-to-turbulent transition and 1% mass-burned fraction are especially important in establishing the subsequent combustion event for each cycle. This suggests a range of temporal and spatial scales over which future experimental and simulation efforts should focus.

  4. Homogeneous development and segregation - Power dynamics in the urban context: San Jose project case of Manizales city

    International Nuclear Information System (INIS)

    Noguera de Echeverri, Ana Patricia; Gomez Sanchez, Diana Marcela

    2013-01-01

    This article seeks to show specific situations in which power is mobilized through urban dynamics in the context of development as a discourse that generates homogeneous models of the city. These models are imposed on local contexts to generate economic progress, but their implementation is linked to urban conflicts related to segregation and social exclusion. This points to the inconsistency between global development discourses and the local conditions of the communities that directly face the results of their application. The arguments presented below are the result of various investigations carried out by the Environmental Thought research group at the National University of Colombia, Manizales campus, in the context of the environmental crisis, development and the urban environment. In the period 2011-2012, we addressed the topic of the environmental and aesthetic configurations of the city of Manizales in terms of spatial planning and urban living. This research is the most concrete support for the contextual references expressed in this article, which are based on extensive fieldwork conducted across different social sectors of the city.

  5. Prediction of gas volume fraction in fully-developed gas-liquid flow in a vertical pipe

    Energy Technology Data Exchange (ETDEWEB)

    Islam, A.S.M.A.; Adoo, N.A.; Bergstrom, D.J., E-mail: nana.adoo@usask.ca [University of Saskatchewan, Department of Mechanical Engineering, Saskatoon, SK (Canada); Wang, D.F. [Canadian Nuclear Laboratories, Chalk River, ON (Canada)

    2015-07-01

    An Eulerian-Eulerian two-fluid model has been implemented for the prediction of the gas volume fraction profile in turbulent upward gas-liquid flow in a vertical pipe. The two-fluid transport equations are discretized using the finite volume method and a low Reynolds number κ-ε turbulence model is used to predict the turbulence field for the liquid phase. The contribution to the effective turbulence by the gas phase is modeled by a bubble induced turbulent viscosity. For the fully-developed flow being considered, the gas volume fraction profile is calculated using the radial momentum balance for the bubble phase. The model potentially includes the effect of bubble size on the interphase forces and turbulence model. The results obtained are in good agreement with experimental data from the literature. The one-dimensional formulation being developed allows for the efficient assessment and further development of both turbulence and two-fluid models for multiphase flow applications in the nuclear industry. (author)

  6. Prediction of gas volume fraction in fully-developed gas-liquid flow in a vertical pipe

    International Nuclear Information System (INIS)

    Islam, A.S.M.A.; Adoo, N.A.; Bergstrom, D.J.; Wang, D.F.

    2015-01-01

    An Eulerian-Eulerian two-fluid model has been implemented for the prediction of the gas volume fraction profile in turbulent upward gas-liquid flow in a vertical pipe. The two-fluid transport equations are discretized using the finite volume method and a low Reynolds number κ-ε turbulence model is used to predict the turbulence field for the liquid phase. The contribution to the effective turbulence by the gas phase is modeled by a bubble induced turbulent viscosity. For the fully-developed flow being considered, the gas volume fraction profile is calculated using the radial momentum balance for the bubble phase. The model potentially includes the effect of bubble size on the interphase forces and turbulence model. The results obtained are in good agreement with experimental data from the literature. The one-dimensional formulation being developed allows for the efficient assessment and further development of both turbulence and two-fluid models for multiphase flow applications in the nuclear industry. (author)

  7. The Qatar National Historic Environment Record: a Platform for the Development of a Fully-Integrated Cultural Heritage Management Application

    Science.gov (United States)

    Cuttler, R. T. H.; Tonner, T. W. W.; Al-Naimi, F. A.; Dingwall, L. M.; Al-Hemaidi, N.

    2013-07-01

    The development of the Qatar National Historic Environment Record (QNHER) by the Qatar Museums Authority and the University of Birmingham in 2008 was based on a customised, bilingual Access database and ArcGIS. While both platforms are stable and well supported, neither was designed for the documentation and retrieval of cultural heritage data. As a result it was decided to develop a custom application using Open Source code. The core module of this application is now complete and is orientated towards the storage and retrieval of geospatial heritage data for the curation of heritage assets. Based on MIDAS Heritage data standards and regionally relevant thesauri, it is a truly bilingual system. Significant attention has been paid to the user interface, which is user-friendly and intuitive. Based on a suite of web services and accessed through a web browser, the system makes full use of internet resources such as Google Maps and Bing Maps. The application avoids long-term vendor 'tie-ins' and, as a fully integrated data management system, is now an important tool for both cultural resource managers and heritage researchers in Qatar.

  8. Resistance calculation of un-fully developed two-phase flow through high differential pressure regulating valves

    International Nuclear Information System (INIS)

    Xu Mingyang; Wang Wenran; Wang Jiaying

    1999-01-01

    To reduce the flow velocity in a high differential pressure regulating valve with a labyrinth, a complicated valve core structure was designed with a tortuous flow path made from reversed double elbows. It is very difficult to calculate the pressure drop of the un-fully developed two-phase flow at high temperature and pressure which flows through such a valve core. A calculation method called the 'constant (varying) pressure-drop, step-by-step progressing design method' was therefore developed. The complicated flow path was decomposed into a series of independent resistance units and, with the valve stem end progressing step by step, the dimensions of the flow path were designed in accordance with the principle that in every position the total pressure drop of the valve should amount to that required by the design goal curve. In the course of calculating the total pressure drop, the valve flow path was likewise divided into a series of independent resistance units. The experimental results show that the designed flow characteristics are approximately consistent with the flow characteristics measured in the test
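
    As a sketch of the resistance-unit bookkeeping described above (illustrative loss coefficients, flow areas and fluid properties; the simpler single-phase form is shown, not the authors' two-phase method):

        # Total pressure drop as a sum over a series of independent resistance units,
        # each contributing K * rho * v^2 / 2. All numbers below are illustrative only.
        rho = 740.0                          # hot-water density, kg/m^3
        m_dot = 2.5                          # mass flow rate, kg/s
        units = [                            # (loss coefficient K, flow area m^2) per elbow unit
            (2.1, 4.0e-4), (2.1, 4.0e-4), (1.8, 3.5e-4), (1.8, 3.5e-4), (1.5, 3.0e-4),
        ]

        dp_total = 0.0
        for K, area in units:
            v = m_dot / (rho * area)         # velocity in this unit
            dp_total += K * rho * v * v / 2.0
        print(f"total pressure drop: {dp_total / 1e6:.2f} MPa")

    In the design method of the abstract, the unit dimensions are then adjusted step by step so that the accumulated drop matches the design goal curve at every stem position.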

  9. Development of the System Dynamics Code using Homogeneous Equilibrium Model for S-CO{sub 2} Brayton cycle Transient Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Seong Jun; Lee, Won Woong; Oh, Bongseong; Lee, Jeong Ik [KAIST, Daejeon (Korea, Republic of)

    2016-10-15

    The features of the S-CO{sub 2} Brayton cycle come from the small compression work obtained by designing the compressor inlet close to the critical point of CO{sub 2}. This means that the system can operate in a two-phase or subcritical state during transient situations such as changes in cooling-system performance, load variations, etc. Since there is no operating MW-scale S-CO{sub 2} Brayton cycle system in the world yet, using an analytical code is the only way to predict the system behavior and develop operating strategies for S-CO{sub 2} Brayton cycles. Therefore, the development of a credible system code is an important part of practical S-CO{sub 2} system research. The current status of the system analysis code developed at KAIST for S-CO{sub 2} Brayton cycle transient analyses and its verification results are presented in this paper. To avoid convergence errors during phase-changing flow calculations in the GAMMA+ code, the authors have developed a system analysis code using the Homogeneous Equilibrium Model (HEM) for S-CO{sub 2} Brayton cycle transient analysis. The backbone of the in-house code is the GAMMA+1.0 code, but the new code treats the fluid quality by tracking the system enthalpy at every time step. Thus, the code adopts pressure and enthalpy as the independent scalar variables, tracking the system enthalpy to update the quality of the system at every time step. The heat conduction solution method, the heat transfer correlations and the pipe frictional losses are taken from the GAMMA+ code.
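
    As a minimal sketch of the pressure/enthalpy-to-quality update described above (the CoolProp property library used here is our own assumption for illustration, not something named in the paper):

        # Below the critical point, the HEM quality follows from the saturation enthalpies
        # at the node pressure. CoolProp is used here purely as a convenient property source.
        from CoolProp.CoolProp import PropsSI

        def co2_quality(p_pa, h_j_per_kg):
            h_f = PropsSI("H", "P", p_pa, "Q", 0, "CO2")     # saturated liquid enthalpy
            h_g = PropsSI("H", "P", p_pa, "Q", 1, "CO2")     # saturated vapour enthalpy
            x = (h_j_per_kg - h_f) / (h_g - h_f)
            return min(max(x, 0.0), 1.0)                     # clamp for subcooled / superheated states

        p = 6.0e6                                            # 6 MPa, below the CO2 critical pressure
        h = PropsSI("H", "P", p, "Q", 0.4, "CO2")            # an enthalpy known to sit at x = 0.4
        print(co2_quality(p, h))                             # ~0.4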

  10. Fully developed natural convection heat and mass transfer in a vertical annular porous medium with asymmetric wall temperatures and concentrations

    International Nuclear Information System (INIS)

    Cheng, C.-Y.

    2006-01-01

    This work examines the effects of the modified Darcy number, the buoyancy ratio and the inner radius-gap ratio on the fully developed natural convection heat and mass transfer in a vertical annular non-Darcy porous medium with asymmetric wall temperatures and concentrations. The exact solutions for the important characteristics of fluid flow, heat transfer, and mass transfer are derived by using a non-Darcy flow model. The modified Darcy number is related to the flow resistance of the porous matrix. For the free convection heat and mass transfer in an annular duct filled with porous media, increasing the modified Darcy number tends to increase the volume flow rate, total heat rate added to the fluid, and the total species rate added to the fluid. Moreover, an increase in the buoyancy ratio or in the inner radius-gap ratio leads to an increase in the volume flow rate, the total heat rate added to the fluid, and the total species rate added to the fluid

  11. Mixed Convective Fully Developed Flow in a Vertical Channel in the Presence of Thermal Radiation and Viscous Dissipation

    Directory of Open Access Journals (Sweden)

    Prasad K.V.

    2017-02-01

    Full Text Available The effect of thermal radiation and viscous dissipation on a combined free and forced convective flow in a vertical channel is investigated for a fully developed flow regime. The Boussinesq and Rosseland approximations are considered in the modeling of the conduction-radiation heat transfer with thermal boundary conditions (isothermal-thermal, isoflux-thermal, and isothermal-flux). The coupled nonlinear governing equations are solved analytically using the Differential Transform Method (DTM) and the regular perturbation method (PM). The results are analyzed graphically for various governing parameters such as the mixed convection parameter, radiation parameter, Brinkman number and perturbation parameter for equal and different wall temperatures. It is found that viscous dissipation enhances the flow reversal in the case of a downward flow while it counters the flow in the case of an upward flow. A comparison of the DTM and PM results shows the versatility of the DTM. The skin friction and the wall temperature gradient are presented for different values of the physical parameters and the salient features are analyzed.
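
    For context, the Rosseland (diffusion) approximation referred to above models the radiative heat flux in its standard form as (the paper's exact non-dimensionalisation is not reproduced here):

        q_r \;=\; -\,\frac{4\sigma^{*}}{3 k^{*}}\,\frac{\partial T^{4}}{\partial y}

    where \sigma^{*} is the Stefan-Boltzmann constant and k^{*} the mean absorption coefficient of the medium.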

  12. Identification of flow structures in fully developed canonical and wavy channels by means of modal decomposition techniques

    Science.gov (United States)

    Ghebali, Sacha; Garicano-Mena, Jesús; Ferrer, Esteban; Valero, Eusebio

    2018-04-01

    A Dynamic Mode Decomposition (DMD) of Direct Numerical Simulations (DNS) of fully developed channel flows is undertaken in order to study the main differences in flow features between a plane-channel flow and a passively “controlled” flow wherein the mean friction was reduced relative to the baseline by modifying the geometry in order to generate a streamwise-periodic spanwise pressure gradient, as is the case for an oblique wavy wall. The present analysis reports POD and DMD modes for the plane channel, jointly with the application of a sparsity-promoting method, as well as a reconstruction of the Reynolds shear stress with the dynamic modes. Additionally, a dynamic link between the streamwise velocity fluctuations and the friction on the wall is sought by means of a composite approach both in the plane and wavy cases. One of the DMD modes associated with the wavy-wall friction exhibits a meandering motion which was hardly identifiable on the instantaneous friction fluctuations.
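
    As a reference point for the method, a minimal sketch of the standard "exact DMD" algorithm on a snapshot matrix is shown below (a generic implementation on synthetic data, not the authors' code or their sparsity-promoting variant):

        import numpy as np

        def dmd(X, r=10):
            """Standard DMD sketch: snapshots X of shape (n_points, n_times).
            Returns eigenvalues and modes of the rank-r best-fit linear operator."""
            X1, X2 = X[:, :-1], X[:, 1:]
            U, s, Vh = np.linalg.svd(X1, full_matrices=False)
            U, s, V = U[:, :r], s[:r], Vh.conj().T[:, :r]
            Atilde = U.conj().T @ X2 @ V / s          # projected operator (division scales columns)
            eigvals, W = np.linalg.eig(Atilde)
            modes = X2 @ V / s @ W                    # "exact DMD" modes
            return eigvals, modes

        # Tiny synthetic test: a single travelling wave gives eigenvalues on the unit circle.
        x, t = np.linspace(0, 2 * np.pi, 200), np.linspace(0, 4 * np.pi, 101)
        X = np.sin(x[:, None] - t[None, :])
        lam, phi = dmd(X, r=2)
        print(np.abs(lam))                            # ~[1, 1] for a neutrally stable wave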

  13. Electroviscous effects in steady fully developed flow of a power-law liquid through a cylindrical microchannel

    International Nuclear Information System (INIS)

    Bharti, Ram P.; Harvie, Dalton J.E.; Davidson, Malcolm R.

    2009-01-01

    Electroviscous effects in steady, fully developed, pressure-driven flow of power-law liquids through a uniform cylindrical microchannel have been investigated numerically by solving the Poisson-Boltzmann and the momentum equations using a finite difference method. The pipe wall is considered to have uniform surface charge density and the liquid is assumed to be a symmetric 1:1 electrolyte solution. Electroviscous resistance reduces the velocity adjacent to the wall, relative to the velocity on the axis. The effect is shown to be greater when the liquid is shear-thinning, and less when it is shear-thickening, than it is for Newtonian flow. For overlapping electrical double layers and elevated surface charge density, the electroviscous reduction in the near-wall velocity can form an almost stationary (zero shear) layer there when the liquid is shear-thinning. In that case, the liquid behaves approximately as if it is flowing through a channel of reduced diameter. The induced axial electrical field shows only a weak dependence on the power-law index with the dependence being greatest for shear-thinning liquids. This field exhibits a local maximum as surface charge density increases from zero, even though the corresponding electrokinetic resistance increases monotonically. The magnitude of the electroviscous effect on the apparent viscosity, as measured by the ratio of the apparent and physical consistency indices, decreases monotonically as the power-law index increases. Thus, overall, the electroviscous effect is stronger in shear-thinning, and weaker in shear-thickening liquids, than it is when the liquid is Newtonian.
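
    For reference, the power-law (Ostwald-de Waele) model underlying this analysis is, in its standard form,

        \tau \;=\; m\,\dot{\gamma}^{\,n}, \qquad \eta_{\mathrm{app}} \;=\; m\,\dot{\gamma}^{\,n-1}

    with consistency index m and flow-behaviour index n; n < 1 gives shear-thinning and n > 1 shear-thickening behaviour, matching the regimes compared in the abstract.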

  14. Isogeometric variational multiscale large-eddy simulation of fully-developed turbulent flow over a wavy wall

    KAUST Repository

    Chang, Kyungsik

    2012-09-01

    We report on the isogeometric residual-based variational multiscale (VMS) large eddy simulation of a fully developed turbulent flow over a wavy wall. To assess the predictive capability of the VMS modeling framework, we compare its predictions against the results from direct numerical simulation (DNS) and large eddy simulation (LES) and, when available, against experimental measurements. We use C 1 quadratic B-spline basis functions to represent the smooth geometry of the sinusoidal lower wall and the solution variables. The Reynolds numbers of the flows considered are 6760 and 30,000 based on the bulk velocity and average channel height. The ratio of amplitude to wavelength (α/λ) of the sinusoidal wavy surface is set to 0.05. The computational domain is 2λ×1.05λ×λ in the streamwise, wall-normal and spanwise directions, respectively. For the Re=6760 case, mean averaged quantities, including velocity and pressure profiles, and the separation/reattachment points in the recirculation region, are compared with DNS and experimental data. The turbulent kinetic energy and Reynolds stress are in good agreement with benchmark data. Coherent structures over the wavy wall are observed in isosurfaces of the Q-criterion and show similar features to those previously reported in the literature. Comparable accuracy to DNS solutions is obtained with at least one order of magnitude fewer degrees of freedom. For the Re=30,000 case, good agreement was obtained for mean wall shear stress and velocity profiles compared with available LES results reported in the literature. © 2012 Elsevier Ltd.
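
    The Q-criterion isosurfaces mentioned above are based on the standard definition (given here for reference, not quoted from the paper):

        Q \;=\; \tfrac{1}{2}\left(\lVert \boldsymbol{\Omega} \rVert^{2} - \lVert \mathbf{S} \rVert^{2}\right)

    where S and \Omega are the symmetric and antisymmetric parts of the velocity-gradient tensor; regions with Q > 0 are identified as vortical coherent structures.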

  15. Isogeometric variational multiscale large-eddy simulation of fully-developed turbulent flow over a wavy wall

    KAUST Repository

    Chang, Kyungsik; Hughes, Thomas Jr R; Calo, Victor M.

    2012-01-01

    We report on the isogeometric residual-based variational multiscale (VMS) large eddy simulation of a fully developed turbulent flow over a wavy wall. To assess the predictive capability of the VMS modeling framework, we compare its predictions against the results from direct numerical simulation (DNS) and large eddy simulation (LES) and, when available, against experimental measurements. We use C 1 quadratic B-spline basis functions to represent the smooth geometry of the sinusoidal lower wall and the solution variables. The Reynolds numbers of the flows considered are 6760 and 30,000 based on the bulk velocity and average channel height. The ratio of amplitude to wavelength (α/λ) of the sinusoidal wavy surface is set to 0.05. The computational domain is 2λ×1.05λ×λ in the streamwise, wall-normal and spanwise directions, respectively. For the Re=6760 case, mean averaged quantities, including velocity and pressure profiles, and the separation/reattachment points in the recirculation region, are compared with DNS and experimental data. The turbulent kinetic energy and Reynolds stress are in good agreement with benchmark data. Coherent structures over the wavy wall are observed in isosurfaces of the Q-criterion and show similar features to those previously reported in the literature. Comparable accuracy to DNS solutions is obtained with at least one order of magnitude fewer degrees of freedom. For the Re=30,000 case, good agreement was obtained for mean wall shear stress and velocity profiles compared with available LES results reported in the literature. © 2012 Elsevier Ltd.

  16. Code development for analysis of MHD pressure drop reduction in a liquid metal blanket using insulation technique based on a fully developed flow model

    International Nuclear Information System (INIS)

    Smolentsev, Sergey; Morley, Neil; Abdou, Mohamed

    2005-01-01

    The paper presents details of a new numerical code for analysis of a fully developed MHD flow in a channel of a liquid metal blanket using various insulation techniques. The code has specially been designed for channels with a 'sandwich' structure of several materials with different physical properties. The code includes a finite-volume formulation, automatically generated Hartmann number sensitive meshes, and effective convergence acceleration technique. Tests performed at Ha ∼ 10 4 have showed very good accuracy. As an illustration, two blanket flows have been considered: Pb-17Li flow in a channel with a silicon carbide flow channel insert, and Li flow in a channel with insulating coating
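
    For context, the Hartmann number governing the "Hartmann number sensitive meshes" mentioned above is conventionally defined as

        Ha \;=\; B_{0}\,L\,\sqrt{\sigma/\mu}

    where B_0 is the applied magnetic field, L a characteristic duct dimension, \sigma the electrical conductivity and \mu the dynamic viscosity; the Hartmann boundary layers that such meshes must resolve have a thickness of order L/Ha, which is why Ha ~ 10 4 is numerically demanding.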

  17. Student Centered Homogeneous Ability Grouping: Using Bronfenbrenner's Theory of Human Development to Investigate the Ecological Factors Contributing to the Academic Achievement of High School Students in Mathematics

    Science.gov (United States)

    Webb, Karla Denise

    2011-01-01

    The purpose of this qualitative study was to explore the interconnectedness of the environment, human development, and the factors that influence students' academic performance in a homogeneous ability grouped mathematics classroom. The study consisted of four African American urban high school juniors, 2 male and 2 female. During the 12 week…

  18. The Effect of Rolling As-Cast and Homogenized U-10Mo Samples on the Microstructure Development and Recovery Curves

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, Vineet V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Paxton, Dean M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lavender, Curt A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Burkes, Douglas [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-07-30

    Over the past several years Pacific Northwest National Laboratory (PNNL) has been actively involved in supporting the U.S. Department of Energy National Nuclear Security Administration Office of Material Management and Minimization (formerly Global Threat Reduction Initiative). The U.S. High- Power Research Reactor (USHPRR) project is developing alternatives to existing highly enriched uranium alloy fuel to reduce the proliferation threat. One option for a high-density metal fuel is uranium alloyed with 10 wt% molybdenum (U-10Mo). Forming the U-10Mo fuel plates/foils via rolling is an effective technique and is actively being pursued as part of the baseline manufacturing process. The processing of these fuel plates requires systematic investigation/understanding of the pre- and post-rolling microstructure, end-state mechanical properties, residual stresses, and defects, their effect on the mill during processing, and eventually, their in-reactor performance. In the work documented herein, studies were conducted to determine the effect of cold and hot rolling the as-cast and homogenized U-10Mo on its microstructure and hardness. The samples were homogenized at 900°C for 48 h, then later annealed for several durations and temperatures to investigate the effect on the material’s microstructure and hardness. The rolling of the as-cast plate, both hot and cold, was observed to form a molybdenum-rich and -lean banded structure. The cold rolling was ineffective, and in some cases exacerbated the as-cast defects. The grains elongated along the rolling direction and formed a pancake shape, while the carbides fractured perpendicularly to the rolling direction and left porosity between fractured particles of UC. The subsequent annealing of these samples at sub-eutectoid temperatures led to rapid precipitation of the ' lamellar phase, mainly in the molybdenum-lean regions. Annealing the samples above the eutectoid temperature did not refine the grain size or the banded

  19. Homogeneous crystal nucleation in polymers.

    Science.gov (United States)

    Schick, C; Androsch, R; Schmelzer, J W P

    2017-11-15

    The pathway of crystal nucleation significantly influences the structure and properties of semi-crystalline polymers. Crystal nucleation is normally heterogeneous at low supercooling, and homogeneous at high supercooling, of the polymer melt. Homogeneous nucleation in bulk polymers has been, so far, hardly accessible experimentally, and was even doubted to occur at all. This topical review summarizes experimental findings on homogeneous crystal nucleation in polymers. Recently developed fast scanning calorimetry, with cooling and heating rates up to 10 6 K s -1 , allows for detailed investigations of nucleation near and even below the glass transition temperature, including analysis of nuclei stability. As for other materials, the maximum homogeneous nucleation rate for polymers is located close to the glass transition temperature. In the experiments discussed here, it is shown that polymer nucleation is homogeneous at such temperatures. Homogeneous nucleation in polymers is discussed in the framework of the classical nucleation theory. The majority of our observations are consistent with the theory. The discrepancies may guide further research, particularly experiments to progress theoretical development. Progress in the understanding of homogeneous nucleation is much needed, since most of the modelling approaches dealing with polymer crystallization exclusively consider homogeneous nucleation. This is also the basis for advancing theoretical approaches to the much more complex phenomena governing heterogeneous nucleation.
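
    For reference, the classical nucleation theory framework referred to above gives the steady-state homogeneous nucleation rate and the work of forming a critical spherical nucleus in the standard textbook form

        J \;=\; J_{0}\,\exp\!\left(-\frac{\Delta G^{*}}{k_{B} T}\right), \qquad \Delta G^{*} \;=\; \frac{16\pi\,\sigma^{3}}{3\,\Delta g_{v}^{2}}

    where \sigma is the crystal-melt interfacial free energy and \Delta g_{v} the bulk free-energy difference per unit volume of crystal.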

  20. Fully portable blood irradiator

    International Nuclear Information System (INIS)

    Hungate, F.P.; Riemath, W.F.; Bunnell, L.R.

    1980-01-01

    A fully portable blood irradiator was developed using the beta emitter thulium-170 as the radiation source and vitreous carbon as the body of the irradiator, matrix for isotope encapsulation, and blood interface material. These units were placed in exteriorized arteriovenous shunts in goats, sheep, and dogs and the effects on circulating lymphocytes and on skin allograft retention times measured. The present work extends these studies by establishing baseline data for skin graft rejection times in untreated animals

  1. Homogeneous Finsler Spaces

    CERN Document Server

    Deng, Shaoqiang

    2012-01-01

    "Homogeneous Finsler Spaces" is the first book to emphasize the relationship between Lie groups and Finsler geometry, and the first to show the validity in using Lie theory for the study of Finsler geometry problems. This book contains a series of new results obtained by the author and collaborators during the last decade. The topic of Finsler geometry has developed rapidly in recent years. One of the main reasons for its surge in development is its use in many scientific fields, such as general relativity, mathematical biology, and phycology (study of algae). This monograph introduc

  2. Renormalization group in the theory of fully developed turbulence. Problem of the infrared relevant corrections to the Navier-Stokes equation

    International Nuclear Information System (INIS)

    Antonov, N.V.; Borisenok, S.V.; Girina, V.I.

    1996-01-01

    Within the framework of the renormalization group approach to the theory of fully developed turbulence we consider the problem of possible IR relevant corrections to the Navier-Stokes equation. We formulate an exact criterion of the actual IR relevance of the corrections. In accordance with this criterion we verify the IR relevance for certain classes of composite operators. 17 refs., 2 tabs

  3. Development and application of an ultratrace method for speciation of organotin compounds in cryogenically archived and homogenized biological materials

    Energy Technology Data Exchange (ETDEWEB)

    Point, David; Davis, W.C.; Christopher, Steven J.; Ellisor, Michael B.; Pugh, Rebecca S.; Becker, Paul R. [Hollings Marine Laboratory, National Institute of Standards and Technology, Analytical Chemistry Division, Charleston, SC (United States); Donard, Olivier F.X. [Laboratoire de Chimie Analytique BioInorganique et Environnement UMR 5034 du CNRS, Pau (France); Porter, Barbara J.; Wise, Stephen A. [National Institute of Standards and Technology, Analytical Chemistry Division, Gaithersburg, MD (United States)

    2007-04-15

    An accurate, ultra-sensitive and robust method for speciation of mono, di, and tributyltin (MBT, DBT, and TBT) by speciated isotope-dilution gas chromatography-inductively coupled plasma-mass spectrometry (SID-GC-ICPMS) has been developed for quantification of butyltin concentrations in cryogenic biological materials maintained in an uninterrupted cryo-chain from storage conditions through homogenization and bottling. The method significantly reduces the detection limits, to the low pg g{sup -1} level (as Sn), and was validated by using the European reference material (ERM) CE477, mussel tissue, produced by the Institute for Reference Materials and Measurements. It was applied to three different cryogenic biological materials - a fresh-frozen mussel tissue (SRM 1974b) together with complex materials, a protein-rich material (whale liver control material, QC03LH03), and a lipid-rich material (whale blubber, SRM 1945) containing up to 72% lipids. The commutability between frozen and freeze-dried materials with regard to spike equilibration/interaction, extraction efficiency, and the absence of detectable transformations was carefully investigated by applying complementary methods and by varying extraction conditions and spiking strategies. The inter-method results enabled assignment of reference concentrations of butyltins in cryogenic SRMs and control materials for the first time. The reference concentrations of MBT, DBT, and TBT in SRM 1974b were 0.92 {+-} 0.06, 2.7 {+-} 0.4, and 6.58 {+-} 0.19 ng g{sup -1} as Sn (wet-mass), respectively; in SRM 1945 they were 0.38 {+-} 0.06, 1.19 {+-} 0.26, and 3.55 {+-} 0.44 ng g{sup -1}, respectively, as Sn (wet-mass). In QC03LH03, DBT and TBT concentrations were 30.0 {+-} 2.7 and 2.26 {+-} 0.38 ng g{sup -1} as Sn (wet-mass). The concentration range of butyltins in these materials is one to three orders of magnitude lower than in ERM CE477. This study demonstrated that cryogenically processed and stored biological materials are

  4. Multiple-pass high-pressure homogenization of milk for the development of pasteurization-like processing conditions.

    Science.gov (United States)

    Ruiz-Espinosa, H; Amador-Espejo, G G; Barcenas-Pozos, M E; Angulo-Guerrero, J O; Garcia, H S; Welti-Chanes, J

    2013-02-01

    Multiple-pass ultrahigh pressure homogenization (UHPH) was used for reducing microbial population of both indigenous spoilage microflora in whole raw milk and a baroresistant pathogen (Staphylococcus aureus) inoculated in whole sterile milk to define pasteurization-like processing conditions. Response surface methodology was followed and multiple response optimization of UHPH operating pressure (OP) (100, 175, 250 MPa) and number of passes (N) (1-5) was conducted through overlaid contour plot analysis. Increasing OP and N had a significant effect (P pasteurization. Multiple-pass UHPH optimized conditions might help in producing safe milk without the detrimental effects associated with thermal pasteurization. © 2012 The Society for Applied Microbiology.

  5. Spinor structures on homogeneous spaces

    International Nuclear Information System (INIS)

    Lyakhovskii, V.D.; Mudrov, A.I.

    1993-01-01

    For multidimensional models of the interaction of elementary particles, the problem of constructing and classifying spinor fields on homogeneous spaces is exceptionally important. An algebraic criterion for the existence of spinor structures on homogeneous spaces used in multidimensional models is developed. A method of explicit construction of spinor structures is proposed, and its effectiveness is demonstrated in examples. The results are of particular importance for harmonic decomposition of spinor fields

  6. Design of arbitrarily homogeneous permanent magnet systems for NMR and MRI: theory and experimental developments of a simple portable magnet.

    Science.gov (United States)

    Hugon, Cedric; D'Amico, Francesca; Aubert, Guy; Sakellariou, Dimitris

    2010-07-01

    Starting from general results of magnetostatics, we give fundamental considerations on the design and characterization of permanent magnets for NMR based on harmonic analysis and symmetry. We then propose a simple geometry that takes advantage of some of these considerations and discuss the practical aspects of the assembly of a real magnet based on this geometry, involving the characterization of its elements, the optimization of the layout and the correction of residual inhomogeneities due to material and geometry imperfections. We report with this low-cost, light-weight magnet (100 euros and 1.8 kg including the aluminum frame) a field of 120 mT (5.1 MHz proton) with a 10 ppm natural homogeneity over a sphere of 1.5 mm in diameter. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  7. Split energy–helicity cascades in three-dimensional homogeneous and isotropic turbulence

    NARCIS (Netherlands)

    Biferale, L.; Musacchio, S.; Toschi, F.

    2013-01-01

    We investigate the transfer properties of energy and helicity fluctuations in fully developed homogeneous and isotropic turbulence by changing the nature of the nonlinear Navier–Stokes terms. We perform a surgery of all possible interactions, by keeping only those triads that have sign-definite

  8. High Pressure Homogenization of Porcine Pepsin Protease: Effects on Enzyme Activity, Stability, Milk Coagulation Profile and Gel Development

    Science.gov (United States)

    Leite Júnior, Bruno Ricardo de Castro; Tribst, Alline Artigiani Lima; Cristianini, Marcelo

    2015-01-01

    This study investigated the effect of high pressure homogenization (HPH) (up to 190 MPa) on porcine pepsin (proteolytic and milk-clotting activities), and the consequences of using the processed enzyme in milk coagulation and gel formation (rheological profile, proteolysis, syneresis, and microstructure). Although the proteolytic activity (PA) was not altered immediately after the HPH process, it reduced during enzyme storage, with a 5% decrease after 60 days of storage for samples obtained with the enzyme processed at 50, 100 and 150 MPa. HPH increased the milk-clotting activity (MCA) of the enzyme processed at 150 MPa, being 15% higher than the MCA of non-processed samples after 60 days of storage. The enzyme processed at 150 MPa produced faster aggregation and a more consistent milk gel (G’ value 92% higher after 90 minutes) when compared with the non-processed enzyme. In addition, the gels produced with the enzyme processed at 150 MPa showed greater syneresis after 40 minutes of coagulation (forming a more compact protein network) and lower porosity (evidenced by confocal microscopy). These effects on the milk gel can be associated with the increment in MCA and reduction in PA caused by the effects of HPH on pepsin during storage. According to the results, HPH stands out as a process capable of changing the proteolytic characteristics of porcine pepsin, with improvements on the milk coagulation step and gel characteristics. Therefore, the porcine pepsin submitted to HPH process can be a suitable alternative for the production of cheese. PMID:25938823

  9. Advanced TEM Characterization for the Development of 28-14nm nodes based on fully-depleted Silicon-on-Insulator Technology

    International Nuclear Information System (INIS)

    Servanton, G; Clement, L; Lepinay, K; Lorut, F; Pantel, R; Pofelski, A; Bicais, N

    2013-01-01

    The growing demand for wireless multimedia applications (smartphones, tablets, digital cameras) requires the development of devices combining both high speed performances and low power consumption. A recent technological breakthrough making a good compromise between these two antagonist conditions has been proposed: the 28-14nm CMOS transistor generations based on a fully-depleted Silicon-on-Insulator (FD-SOI) performed on a thin Si film of 5-6nm. In this paper, we propose to review the TEM characterization challenges that are essential for the development of extremely power-efficient System on Chip (SoC)

  10. Mechanical Homogenization Increases Bacterial Homogeneity in Sputum

    Science.gov (United States)

    Stokell, Joshua R.; Khan, Ammad

    2014-01-01

    Sputum obtained from patients with cystic fibrosis (CF) is highly viscous and often heterogeneous in bacterial distribution. Adding dithiothreitol (DTT) is the standard method for liquefaction prior to processing sputum for molecular detection assays. To determine if DTT treatment homogenizes the bacterial distribution within sputum, we measured the difference in mean total bacterial abundance and abundance of Burkholderia multivorans between aliquots of DTT-treated sputum samples with and without a mechanical homogenization (MH) step using a high-speed dispersing element. Additionally, we measured the effect of MH on bacterial abundance. We found a significant difference between the mean bacterial abundances in aliquots that were subjected to only DTT treatment and those of the aliquots which included an MH step (all bacteria, P = 0.04; B. multivorans, P = 0.05). There was no significant effect of MH on bacterial abundance in sputum. Although our results are from a single CF patient, they indicate that mechanical homogenization increases the homogeneity of bacteria in sputum. PMID:24759710

  11. Development of a fully-coupled, all states, all hazards Level 2 PSA at Leibstadt Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Zvoncek, Pavol; Nusbaumer, Olivier [Safety Compliance and Technical Support Department, Leibstadt Nuclear Power Plant, Leibstadt (Switzerland); Torri, Alfred [Risk Management Associates, Inc., Encinitas (United States)

    2017-03-15

    This paper describes the development process, the innovative techniques used and insights gained from the latest integrated, full scope, multistate Level 2 PSA analysis conducted at the Leibstadt Nuclear Power Plant (KKL), Switzerland. KKL is a modern single-unit General Electric Boiling Water Reactor (BWR/6) with Mark III Containment, and a power output of 3600MWth/1200MWe, the highest among the five operating reactors in Switzerland. A Level 2 Probabilistic Safety Assessment (PSA) analyses accident phenomena in nuclear power plants, identifies ways in which radioactive releases from plants can occur and estimates release pathways, magnitude and frequency. This paper attempts to give an overview of the advanced modeling techniques that have been developed and implemented for the recent KKL Level 2 PSA update, with the aim of systematizing the analysis and modeling processes, as well as complying with the relatively prescriptive Swiss requirements for PSA. The analysis provides significant insights into the absolute and relative importance of risk contributors and accident prevention and mitigation measures. Thanks to several newly developed techniques and an integrated approach, the KKL Level 2 PSA report exhibits a high degree of reviewability and maintainability, and transparently highlights the most important risk contributors to Large Early Release Frequency (LERF) with respect to initiating events, components, operator actions or seismic component failure probabilities (fragilities)

  12. Development of a Fully Automated Guided Wave System for In-Process Cure Monitoring of CFRP Composite Laminates

    Science.gov (United States)

    Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.; Yaun, Fuh-Gwo

    2016-01-01

    A guided wave-based in-process cure monitoring technique for carbon fiber reinforced polymer (CFRP) composites was investigated at NASA Langley Research Center. A key cure transition point (vitrification) was identified and the degree of cure was monitored using metrics such as amplitude and time of arrival (TOA) of guided waves. Using an automated system preliminarily developed in this work, high-temperature piezoelectric transducers were utilized to interrogate a twenty-four ply unidirectional composite panel fabricated from Hexcel (Registered Trademark) IM7/8552 prepreg during cure. It was shown that the amplitude of the guided wave increased sharply around vitrification and the TOA curve possessed an inverse relationship with degree of cure. The work is a first step in demonstrating the feasibility of transitioning the technique to perform in-process cure monitoring in an autoclave, defect detection during cure, and ultimately a closed-loop process control to maximize composite part quality and consistency.

  13. Development of and feedback on a fully automated virtual reality system for online training in weight management skills.

    Science.gov (United States)

    Thomas, J Graham; Spitalnick, Josh S; Hadley, Wendy; Bond, Dale S; Wing, Rena R

    2015-01-01

    Virtual reality (VR) technology can provide a safe environment for observing, learning, and practicing use of behavioral weight management skills, which could be particularly useful in enhancing minimal contact online weight management programs. The Experience Success (ES) project developed a system for creating and deploying VR scenarios for online weight management skills training. Virtual environments populated with virtual actors allow users to experiment with implementing behavioral skills via a PC-based point and click interface. A culturally sensitive virtual coach guides the experience, including planning for real-world skill use. Thirty-seven overweight/obese women provided feedback on a test scenario focused on social eating situations. They reported that the scenario gave them greater skills, confidence, and commitment for controlling eating in social situations. © 2014 Diabetes Technology Society.

  14. Status of the development of a fully integrated code system for the simulation of high temperature reactor cores

    Energy Technology Data Exchange (ETDEWEB)

    Kasselmann, Stefan, E-mail: s.kasselmann@fz-juelich.de [Institute of Energy and Climate Research, Nuclear Waste Management and Reactor Safety (IEK-6), Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Druska, Claudia [Institute of Energy and Climate Research, Nuclear Waste Management and Reactor Safety (IEK-6), Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Herber, Stefan [Institute of Energy and Climate Research, Nuclear Waste Management and Reactor Safety (IEK-6), Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Lehrstuhl für Reaktorsicherheit und -technik, RWTH Aachen, 52062 Aachen (Germany); Jühe, Stephan [Lehrstuhl für Reaktorsicherheit und -technik, RWTH Aachen, 52062 Aachen (Germany); Keller, Florian; Lambertz, Daniela; Li, Jingjing; Scholthaus, Sarah; Shi, Dunfu [Institute of Energy and Climate Research, Nuclear Waste Management and Reactor Safety (IEK-6), Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Xhonneux, Andre; Allelein, Hans-Josef [Institute of Energy and Climate Research, Nuclear Waste Management and Reactor Safety (IEK-6), Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Lehrstuhl für Reaktorsicherheit und -technik, RWTH Aachen, 52062 Aachen (Germany)

    2014-05-01

    The HTR code package (HCP) is a new code system, which couples a variety of stand-alone codes for the simulation of different aspects of HTR. HCP will allow the steady-state and transient operating conditions of a 3D reactor core to be simulated including new features such as spatially resolved fission product release calculations or production and transport of graphite dust. For this code the latest programming techniques and standards are applied. As a first step an object-oriented data model was developed which features a high level of readability because it is based on problem-specific data types like Nuclide, Reaction, ReactionHandler, CrossSectionSet, etc. Those classes help to encapsulate and therefore hide specific implementations, which are not relevant with respect to physics. HCP will make use of one consistent data library for which an automatic generation tool was developed. The new data library consists of decay information, cross sections, fission yields, scattering matrices etc. for all available nuclides (e.g. ENDF/B-VII.1). The data can be stored in different formats such as binary, ASCII or XML. The new burn up code TNT (Topological Nuclide Transmutation) applies graph theory to represent nuclide chains and to minimize the calculation effort when solving the burn up equations. New features are the use of energy-dependent fission yields or the calculation of thermal power for decay, fission and capture reactions. With STACY (source term analysis code system) the fission product release for steady state as well as accident scenarios can be simulated for each fuel batch. For a full-core release calculation several thousand fuel elements are tracked while passing through the core. This models the stochastic behavior of a pebble bed in a realistic manner. In this paper we report on the current status of the HCP and present first results, which prove the applicability of the selected approach.
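
    As an illustration of the problem-specific data types named above (Nuclide, Reaction, ReactionHandler, CrossSectionSet), the following is a minimal Python sketch; the class fields and structure are assumptions made for illustration only and do not reflect the actual HCP implementation.

    ```python
    # Illustrative sketch only: the problem-specific data types named in the HCP abstract.
    # All fields are assumptions; the real HCP classes are not described in this record.
    from dataclasses import dataclass, field
    from typing import Dict, List


    @dataclass
    class Nuclide:
        name: str              # e.g. "U-235"
        half_life_s: float     # decay half-life in seconds (use float("inf") for stable nuclides)


    @dataclass
    class Reaction:
        parent: Nuclide
        daughter: Nuclide
        reaction_type: str     # e.g. "decay", "fission", "capture"


    @dataclass
    class CrossSectionSet:
        nuclide: Nuclide
        group_xs: List[float] = field(default_factory=list)  # multigroup cross sections (barns)


    @dataclass
    class ReactionHandler:
        """Collects reactions per parent nuclide, hiding storage details from the physics code."""
        reactions: Dict[str, List[Reaction]] = field(default_factory=dict)

        def add(self, reaction: Reaction) -> None:
            self.reactions.setdefault(reaction.parent.name, []).append(reaction)
    ```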

  15. Homogenization of neutronic diffusion models

    International Nuclear Information System (INIS)

    Capdebosq, Y.

    1999-09-01

    In order to study and simulate nuclear reactor cores, one needs access to the neutron distribution in the core. In practice, the description of this density of neutrons is given by a system of diffusion equations, coupled by non-differential exchange terms. The strong heterogeneity of the medium constitutes a major obstacle to the numerical computation of these models at reasonable cost, so homogenization appears compulsory. Heuristic methods have been developed since the origin by nuclear physicists, under a periodicity assumption on the coefficients. They consist in performing a fine computation on a single periodicity cell, solving the system on the whole domain with homogeneous coefficients, and reconstructing the neutron density by multiplying the solutions of the two computations. The objectives of this work are to provide a mathematically rigorous basis for this factorization method, to obtain the exact formulas of the homogenized coefficients, and to begin the study of geometries where two periodic media are placed side by side. The first result of this thesis concerns eigenvalue problem models, which are used to characterize the state of criticality of the reactor, under a symmetry assumption on the coefficients. The convergence of the homogenization process is proved, and formulas for the homogenized coefficients are given. We then show that, without symmetry assumptions, a drift phenomenon appears. It is characterized by means of a real Bloch wave method, which gives the homogenized limit in the general case. These results for the critical problem are then adapted to the evolution model. Finally, the homogenization of the critical problem in the case of two side-by-side periodic media is studied on a one-dimensional, one-equation model. (authors)
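
    The factorization described in this abstract can be written schematically as follows; the two-scale notation is assumed here for illustration and may differ from that used in the thesis.

    ```latex
    % Schematic two-scale factorization: psi solves the fine eigenvalue problem on one
    % periodicity cell, u solves the homogenized problem on the whole domain, and the
    % neutron density is reconstructed as their product.
    \phi_\varepsilon(x) \;\approx\; \psi\!\left(\frac{x}{\varepsilon}\right) u(x),
    \qquad
    -\nabla\cdot\left(D^{\mathrm{hom}}\nabla u\right) + \Sigma^{\mathrm{hom}} u
      \;=\; \frac{1}{k_{\mathrm{eff}}}\left(\nu\Sigma_f\right)^{\mathrm{hom}} u .
    ```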

  16. Functionality and homogeneity.

    NARCIS (Netherlands)

    2011-01-01

    Functionality and homogeneity are two of the five Sustainable Safety principles. The functionality principle aims for roads to have but one exclusive function and distinguishes between traffic function (flow) and access function (residence). The homogeneity principle aims at differences in mass,

  17. Homogenization of Mammalian Cells.

    Science.gov (United States)

    de Araújo, Mariana E G; Lamberti, Giorgia; Huber, Lukas A

    2015-11-02

    Homogenization is the name given to the methodological steps necessary for releasing organelles and other cellular constituents as a free suspension of intact individual components. Most homogenization procedures used for mammalian cells (e.g., cavitation pump and Dounce homogenizer) rely on mechanical force to break the plasma membrane and may be supplemented with osmotic or temperature alterations to facilitate membrane disruption. In this protocol, we describe a syringe-based homogenization method that does not require specialized equipment, is easy to handle, and gives reproducible results. The method may be adapted for cells that require hypotonic shock before homogenization. We routinely use it as part of our workflow to isolate endocytic organelles from mammalian cells. © 2015 Cold Spring Harbor Laboratory Press.

  18. Spontaneous compactification to homogeneous spaces

    International Nuclear Information System (INIS)

    Mourao, J.M.

    1988-01-01

    The spontaneous compactification of extra dimensions to compact homogeneous spaces is studied. The methods developed within the framework of coset space dimensional reduction scheme and the most general form of invariant metrics are used to find solutions of spontaneous compactification equations

  19. The fully Mobile City Government Project (MCity)

    DEFF Research Database (Denmark)

    Scholl, Hans; Fidel, Raya; Mai, Jens Erik

    2006-01-01

    The Fully Mobile City Government Project, also known as MCity, is an interdisciplinary research project on the premises, requirements, and effects of fully mobile, wirelessly connected applications (FWMC). The project will develop an analytical framework for interpreting the interaction and inter...

  20. The SPH homogeneization method

    International Nuclear Information System (INIS)

    Kavenoky, Alain

    1978-01-01

    The homogeneization of a uniform lattice is a rather well understood topic while difficult problems arise if the lattice becomes irregular. The SPH homogeneization method is an attempt to generate homogeneized cross sections for an irregular lattice. Section 1 summarizes the treatment of an isolated cylindrical cell with an entering surface current (in one velocity theory); Section 2 is devoted to the extension of the SPH method to assembly problems. Finally Section 3 presents the generalisation to general multigroup problems. Numerical results are obtained for a PXR rod bundle assembly in Section 4

  1. Homogeneity of Inorganic Glasses

    DEFF Research Database (Denmark)

    Jensen, Martin; Zhang, L.; Keding, Ralf

    2011-01-01

    Homogeneity of glasses is a key factor determining their physical and chemical properties and overall quality. However, quantification of the homogeneity of a variety of glasses is still a challenge for glass scientists and technologists. Here, we show a simple approach by which the homogeneity of different glass products can be quantified and ranked. This approach is based on determination of both the optical intensity and dimension of the striations in glasses. These two characteristic values are obtained using the image processing method established recently. The logarithmic ratio between...

  2. A literature review on biotic homogenization

    OpenAIRE

    Guangmei Wang; Jingcheng Yang; Chuangdao Jiang; Hongtao Zhao; Zhidong Zhang

    2009-01-01

    Biotic homogenization is the process whereby the genetic, taxonomic and functional similarity of two or more biotas increases over time. As a new research agenda for conservation biogeography, biotic homogenization has become a rapidly emerging topic of interest in ecology and evolution over the past decade. However, research on this topic is rare in China. Herein, we introduce the development of the concept of biotic homogenization, and then discuss methods to quantify its three components (...

  3. Study of the hovering period and bubble size in fully developed pool nucleate boiling of saturated liquid with a time-dependent heat source

    International Nuclear Information System (INIS)

    Pasamehmetoglu, K.O.; Nelson, R.A.

    1987-01-01

    In this paper, the bubble behavior in saturated pool boiling with a time-dependent heat source is analyzed. The study is restricted to the period from fully developed nucleate boiling until critical heat flux occurs. The hovering period and the departure volume of the bubble are selected as the characteristic parameters for bubble behavior. These parameters are quantified by solving the equation of motion for an idealized bubble. This equation is solved for cases in which the surface heat flux changes linearly and exponentially as a function of time. After nondimensionalization, the results are compared directly with the results of the steady-state problem. The comparison shows that the transient heat input has practically no effect on the hovering period. However, the transient heat flux causes a decreased volume at bubble departure. The volume decrease is dependent on the severity of the transient. These results are in qualitative agreement with the experimental observation quoted in the literature

  4. Homogenization approach in engineering

    International Nuclear Information System (INIS)

    Babuska, I.

    1975-10-01

    Homogenization is an approach which studies the macrobehavior of a medium by its microproperties. Problems with a microstructure play an essential role in such fields as mechanics, chemistry, physics, and reactor engineering. Attention is concentrated on a simple specific model problem to illustrate results and problems typical of the homogenization approach. Only the diffusion problem is treated here, but some statements are made about the elasticity of composite materials. The differential equation is solved for linear cases with and without boundaries and for the nonlinear case. 3 figures, 1 table

  5. Hybrid diffusion–transport spatial homogenization method

    International Nuclear Information System (INIS)

    Kooreman, Gabriel; Rahnema, Farzad

    2014-01-01

    Highlights: • A new hybrid diffusion–transport homogenization method. • An extension of the consistent spatial homogenization (CSH) transport method. • Auxiliary cross section makes homogenized diffusion consistent with heterogeneous diffusion. • An on-the-fly re-homogenization in transport. • The method is faster than fine-mesh transport by 6–8 times. - Abstract: A new hybrid diffusion–transport homogenization method has been developed by extending the consistent spatial homogenization (CSH) transport method to include diffusion theory. As in the CSH method, an “auxiliary cross section” term is introduced into the source term, making the resulting homogenized diffusion equation consistent with its heterogeneous counterpart. The method then utilizes an on-the-fly re-homogenization in transport theory at the assembly level in order to correct for core environment effects on the homogenized cross sections and the auxiliary cross section. The method has been derived in general geometry and tested in a 1-D boiling water reactor (BWR) core benchmark problem for both controlled and uncontrolled configurations. The method has been shown to converge to the reference solution with less than 1.7% average flux error in less than one third of the computational time of the CSH method – 6 to 8 times faster than fine-mesh transport
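
    A schematic of the idea described above, written in generic notation rather than the authors' exact formulation: the auxiliary cross-section term sits in the source of the homogenized diffusion equation and is re-evaluated on the fly from assembly-level transport calculations.

    ```latex
    % Schematic only (not the paper's exact equations): the homogenized diffusion
    % equation carries an auxiliary term chosen so that its solution reproduces the
    % heterogeneous (transport) reaction rates.
    -\nabla\cdot\left(\bar{D}\,\nabla\bar{\phi}\right) + \bar{\Sigma}_r\,\bar{\phi}
      \;=\; \bar{Q} \;+\; \Sigma_{\mathrm{aux}}\,\bar{\phi}.
    ```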

  6. Dynamics of homogeneous nucleation

    DEFF Research Database (Denmark)

    Toxværd, Søren

    2015-01-01

    The classical nucleation theory for homogeneous nucleation is formulated as a theory for a density fluctuation in a supersaturated gas at a given temperature. But molecular dynamics simulations reveal that it is small cold clusters which initiate the nucleation. The temperature in the nucleating...

  7. Homogeneous bilateral block shifts

    Indian Academy of Sciences (India)

    Douglas class were classified in [3]; they are unilateral block shifts of arbitrary block size (i.e. dim H(n) can be anything). However, no examples of irreducible homogeneous bilateral block shifts of block size larger than 1 were known until now.

  8. Homogeneity and Entropy

    Science.gov (United States)

    Tignanelli, H. L.; Vazquez, R. A.; Mostaccio, C.; Gordillo, S.; Plastino, A.

    1990-11-01

    RESUMEN (translated): We present a methodology for the analysis of homogeneity based on Information Theory, applicable to samples of observational data. ABSTRACT: Standard concepts that underlie Information Theory are employed in order to design a methodology that enables one to analyze the homogeneity of a given data sample. Key: DATA ANALYSIS

  9. Homogeneous Poisson structures

    International Nuclear Information System (INIS)

    Shafei Deh Abad, A.; Malek, F.

    1993-09-01

    We provide an algebraic definition for the Schouten product and give a decomposition for any homogeneous Poisson structure in any n-dimensional vector space. A large class of n-homogeneous Poisson structures in R^k is also characterized. (author). 4 refs

  10. The effect of particle size and concentration on the flow properties of a homogeneous slurry

    International Nuclear Information System (INIS)

    Abbas, M.A.; Crowe, C.T.

    1986-01-01

    This paper presents the results of the effects of particle size and concentration on the velocity distribution in the fully developed flow of a homogeneous slurry. The slurry consisted of chloroform and silica gel with matched index of refraction to enable Laser-Doppler anemometry (LDA) measurements through the mixture. Slurries with two particle sizes and solids concentration up to 30% by volume were studied. Measurements were made over a Reynolds number range of 1,200 to 30,000

  11. WAVECALC: an Excel-VBA spreadsheet to model the characteristics of fully developed waves and their influence on bottom sediments in different water depths

    Science.gov (United States)

    Le Roux, Jacobus P.; Demirbilek, Zeki; Brodalka, Marysia; Flemming, Burghard W.

    2010-10-01

    The generation and growth of waves in deep water is controlled by winds blowing over the sea surface. In fully developed sea states, where winds and waves are in equilibrium, wave parameters may be calculated directly from the wind velocity. We provide an Excel spreadsheet to compute the wave period, length, height and celerity, as well as horizontal and vertical particle velocities for any water depth, bottom slope, and distance below the reference water level. The wave profile and propagation can also be visualized for any water depth, modeling the sea surface change from sinusoidal to trochoidal and finally cnoidal profiles into shallow water. Bedload entrainment is estimated under both the wave crest and the trough, using the horizontal water particle velocity at the top of the boundary layer. The calculations are programmed in an Excel file called WAVECALC, which is available online to authorized users. Although many of the recently published formulas are based on theoretical arguments, the values agree well with several existing theories and limited field and laboratory observations. WAVECALC is a user-friendly program intended for sedimentologists, coastal engineers and oceanographers, as well as marine ecologists and biologists. It provides a rapid means to calculate many wave characteristics required in coastal and shallow marine studies, and can also serve as an educational tool.
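
    The record does not reproduce the WAVECALC formulas themselves; as a minimal sketch of the kind of computation involved, the following Python snippet evaluates the standard linear-wave dispersion relation to obtain wavelength and celerity for a given period and water depth (function names and example values are illustrative only).

    ```python
    # Not the WAVECALC algorithm: a minimal sketch of the standard linear-wave dispersion
    # relation, L = (g*T^2 / 2*pi) * tanh(2*pi*d / L), iterated to convergence, giving
    # wavelength and celerity for a wave of period T in water depth d.
    import math

    def wavelength(T: float, d: float, g: float = 9.81, tol: float = 1e-6) -> float:
        """Wavelength (m) for period T (s) in depth d (m) from linear wave theory."""
        L0 = g * T**2 / (2.0 * math.pi)      # deep-water wavelength
        L = L0
        for _ in range(100):
            L_new = L0 * math.tanh(2.0 * math.pi * d / L)
            if abs(L_new - L) < tol:
                break
            L = L_new
        return L_new

    def celerity(T: float, d: float) -> float:
        """Phase speed (m/s) = wavelength / period."""
        return wavelength(T, d) / T

    if __name__ == "__main__":
        T, d = 8.0, 10.0                      # example: an 8 s wave in 10 m of water
        print(f"L = {wavelength(T, d):.1f} m, c = {celerity(T, d):.2f} m/s")
    ```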

  12. Development of a fully automated open-column chemical-separation system—COLUMNSPIDER—and its application to Sr-Nd-Pb isotope analyses of igneous rock samples

    Science.gov (United States)

    Miyazaki, Takashi; Vaglarov, Bogdan Stefanov; Takei, Masakazu; Suzuki, Masahiro; Suzuki, Hiroaki; Ohsawa, Kouzou; Chang, Qing; Takahashi, Toshiro; Hirahara, Yuka; Hanyu, Takeshi; Kimura, Jun-Ichi; Tatsumi, Yoshiyuki

    A fully automated open-column resin-bed chemical-separation system, named COLUMNSPIDER, has been developed. The system consists of a programmable micropipetting robot that dispenses chemical reagents and sample solutions into an open-column resin bed for elemental separation. After the initial set up of resin columns, chemical reagents, and beakers for the separated chemical components, all separation procedures are automated. As many as ten samples can be eluted in parallel in a single automated run. Many separation procedures, such as radiogenic isotope ratio analyses for Sr and Nd, involve the use of multiple column separations with different resin columns, chemical reagents, and beakers of various volumes. COLUMNSPIDER completes these separations using multiple runs. Programmable functions, including the positioning of the micropipetter, reagent volume, and elution time, enable flexible operation. Optimized movements for solution take-up and high-efficiency column flushing allow the system to perform as precisely as when carried out manually by a skilled operator. Procedural blanks, examined for COLUMNSPIDER separations of Sr, Nd, and Pb, are low and negligible. The measured Sr, Nd, and Pb isotope ratios for JB-2 and Nd isotope ratios for JB-3 and BCR-2 rock standards all fall within the ranges reported previously in high-accuracy analyses. COLUMNSPIDER is a versatile tool for the efficient elemental separation of igneous rock samples, a process that is both labor intensive and time consuming.

  13. Numerical simulations of heat transfer in an annular fuel channel with three-dimensional spacer ribs set up periodically under a fully developed turbulent flow

    International Nuclear Information System (INIS)

    Takase, Kazuyuki; Akino, Norio

    1996-06-01

    Thermal-hydraulic characteristics of an annular fuel channel with spacer ribs for high temperature gas-cooled reactors were analyzed numerically by three-dimensional heat transfer computations under a fully developed turbulent flow. The two-equation κ-ε turbulence model was applied to the present turbulent analysis. In particular, the κ-ε turbulence model constants and the turbulent Prandtl number were improved from the previous standard values proposed by Jones and Launder in order to obtain heat transfer predictions with higher accuracy. Consequently, heat transfer coefficients and friction factors in the spacer-ribbed fuel channel were predicted with sufficient accuracy in the range of Reynolds number exceeding 3000. It was clarified quantitatively from the present study that the main mechanism for the heat transfer augmentation in the spacer-ribbed fuel channel was the combined effect of the turbulence promoter effect of the spacer ribs and the velocity acceleration effect of a reduction in the channel cross-section. (author)

  14. Development and verification of a new wind speed forecasting system using an ensemble Kalman filter data assimilation technique in a fully coupled hydrologic and atmospheric model

    Science.gov (United States)

    Williams, John L.; Maxwell, Reed M.; Monache, Luca Delle

    2013-12-01

    Wind power is rapidly gaining prominence as a major source of renewable energy. Harnessing this promising energy source is challenging because of the chaotic nature of wind and its inherently intermittent nature. Accurate forecasting tools are critical to support the integration of wind energy into power grids and to maximize its impact on renewable energy portfolios. We have adapted the Data Assimilation Research Testbed (DART), a community software facility which includes the ensemble Kalman filter (EnKF) algorithm, to expand our capability to use observational data to improve forecasts produced with a fully coupled hydrologic and atmospheric modeling system, the ParFlow (PF) hydrologic model and the Weather Research and Forecasting (WRF) mesoscale atmospheric model, coupled via mass and energy fluxes across the land surface, and resulting in the PF.WRF model. Numerous studies have shown that soil moisture distribution and land surface vegetative processes profoundly influence atmospheric boundary layer development and weather processes on local and regional scales. We have used the PF.WRF model to explore the connections between the land surface and the atmosphere in terms of land surface energy flux partitioning and coupled variable fields including hydraulic conductivity, soil moisture, and wind speed and demonstrated that reductions in uncertainty in these coupled fields realized through assimilation of soil moisture observations propagate through the hydrologic and atmospheric system. The sensitivities found in this study will enable further studies to optimize observation strategies to maximize the utility of the PF.WRF-DART forecasting system.
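
    For reference, the generic ensemble Kalman filter analysis step underlying ensemble tools such as DART can be written as below. This is only a schematic of the standard stochastic EnKF; DART offers several filter variants, and the PF.WRF-specific state vector and observation operator are not detailed in this record.

    ```latex
    % Generic EnKF analysis step: each forecast ensemble member x_i^f is updated with
    % observations y_i through the Kalman gain K, with the forecast covariance P^f
    % estimated from the ensemble itself.
    x_i^{a} = x_i^{f} + K\left(y_i - H x_i^{f}\right), \qquad
    K = P^{f} H^{T}\left(H P^{f} H^{T} + R\right)^{-1}, \qquad
    P^{f} \approx \frac{1}{N-1}\sum_{i=1}^{N}\left(x_i^{f}-\bar{x}^{f}\right)\left(x_i^{f}-\bar{x}^{f}\right)^{T}.
    ```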

  15. Homogeneous group, research, institution

    Directory of Open Access Journals (Sweden)

    Francesca Natascia Vasta

    2014-09-01

    The work outlines the complex connection among empirical research, therapeutic programs and the host institution. The current state of research in Italy is considered: the Italian research field is analyzed and critical issues are outlined, namely a lack of results regarding both the therapeutic processes and the effectiveness of group-analytic treatment of eating disorders. The work investigates a homogeneous eating disorders group conducted in an eating disorders outpatient service. First, we present the methodological steps the research is based on, including the strong connection between theory and clinical tools. Secondly, the clinical tools are described and the results commented on. Finally, our results suggest the necessity of validating some more specific hypotheses: verifying the relationship between clinical improvement (reduction of the sense of exclusion and of painful emotions) and specific group therapeutic processes; and verifying the relationship between depressive feelings, relapses and the transition through a more differentiated group field. Keywords: Homogeneous group; Eating disorders; Institutional field; Therapeutic outcome

  16. Homogeneous turbulence dynamics

    CERN Document Server

    Sagaut, Pierre

    2018-01-01

    This book provides state-of-the-art results and theories in homogeneous turbulence, including anisotropy and compressibility effects with extension to quantum turbulence, magneto-hydrodynamic turbulence and turbulence in non-Newtonian fluids. Each chapter is devoted to a given type of interaction (strain, rotation, shear, etc.), and presents and compares experimental data, numerical results, analysis of the Reynolds stress budget equations and advanced multipoint spectral theories. The role of both linear and non-linear mechanisms is emphasized. The link between the statistical properties and the dynamics of coherent structures is also addressed. Despite its restriction to homogeneous turbulence, the book is of interest to all people working in turbulence, since the basic physical mechanisms which are present in all turbulent flows are explained. The reader will find a unified presentation of the results and a clear presentation of existing controversies. Special attention is given to bridge the results obta...

  17. Homogen Mur - et udviklingsprojekt

    DEFF Research Database (Denmark)

    Dahl, Torben; Beim, Anne; Sørensen, Peter

    1997-01-01

    Mølletorvet in Slagelse is the first building in Denmark in which the external wall is made of homogeneous, load-bearing and insulating clay blocks. The building demonstrates a number of the possibilities that the use of homogeneous block masonry offers with regard to structural design, energy performance and architecture.

  18. Homogenization of resonant chiral metamaterials

    DEFF Research Database (Denmark)

    Andryieuski, Andrei; Menzel, C.; Rockstuhl, Carsten

    2010-01-01

    Homogenization of metamaterials is a crucial issue as it allows one to describe their optical response in terms of effective wave parameters as, e.g., propagation constants. In this paper we consider the possible homogenization of chiral metamaterials. We show that for meta-atoms of a certain size a critical density exists above which increasing coupling between neighbouring meta-atoms prevents a reasonable homogenization; this provides an analytical criterion for performing the homogenization and a tool to predict the homogenization limit. We show that strong coupling between meta-atoms of chiral metamaterials may prevent their homogenization at all.

  19. Benchmarking homogenization algorithms for monthly data

    Science.gov (United States)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratiannil, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.; Willett, K.

    2013-09-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies. The algorithms were validated against a realistic benchmark dataset. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including i) the centered root mean square error relative to the true homogeneous values at various averaging scales, ii) the error in linear trend estimates and iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.
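
    A minimal sketch (an assumed implementation, not the HOME project's own code) of two of the performance metrics named above, the centered root-mean-square error against the true homogeneous series and the error in the linear trend estimate:

    ```python
    # Two of the benchmark metrics described above, applied to a homogenized series
    # and the corresponding "true" homogeneous series.
    import numpy as np

    def centered_rmse(homogenized: np.ndarray, truth: np.ndarray) -> float:
        """CRMSE: root-mean-square error after removing each series' own mean."""
        h = homogenized - homogenized.mean()
        t = truth - truth.mean()
        return float(np.sqrt(np.mean((h - t) ** 2)))

    def trend_error(homogenized: np.ndarray, truth: np.ndarray, time: np.ndarray) -> float:
        """Difference between the least-squares linear trends of the two series."""
        trend_h = np.polyfit(time, homogenized, 1)[0]
        trend_t = np.polyfit(time, truth, 1)[0]
        return float(trend_h - trend_t)
    ```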

  20. Homogeneous M2 duals

    International Nuclear Information System (INIS)

    Figueroa-O’Farrill, José; Ungureanu, Mara

    2016-01-01

    Motivated by the search for new gravity duals to M2 branes with N>4 supersymmetry — equivalently, M-theory backgrounds with Killing superalgebra osp(N|4) for N>4 — we classify (except for a small gap) homogeneous M-theory backgrounds with symmetry Lie algebra so(n)⊕so(3,2) for n=5,6,7. We find that there are no new backgrounds with n=6,7 but we do find a number of new (to us) backgrounds with n=5. All backgrounds are metrically products of the form AdS₄×P⁷, with P Riemannian and homogeneous under the action of SO(5), or S⁴×Q⁷ with Q Lorentzian and homogeneous under the action of SO(3,2). At least one of the new backgrounds is supersymmetric (albeit with only N=2) and we show that it can be constructed from a supersymmetric Freund-Rubin background via a Wick rotation. Two of the new backgrounds have only been approximated numerically.

  1. Homogeneous M2 duals

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa-O’Farrill, José [School of Mathematics and Maxwell Institute for Mathematical Sciences,The University of Edinburgh,James Clerk Maxwell Building, The King’s Buildings, Peter Guthrie Tait Road,Edinburgh EH9 3FD, Scotland (United Kingdom); Ungureanu, Mara [Humboldt-Universität zu Berlin, Institut für Mathematik,Unter den Linden 6, 10099 Berlin (Germany)

    2016-01-25

    Motivated by the search for new gravity duals to M2 branes with N>4 supersymmetry — equivalently, M-theory backgrounds with Killing superalgebra osp(N|4) for N>4 — we classify (except for a small gap) homogeneous M-theory backgrounds with symmetry Lie algebra so(n)⊕so(3,2) for n=5,6,7. We find that there are no new backgrounds with n=6,7 but we do find a number of new (to us) backgrounds with n=5. All backgrounds are metrically products of the form AdS₄×P⁷, with P Riemannian and homogeneous under the action of SO(5), or S⁴×Q⁷ with Q Lorentzian and homogeneous under the action of SO(3,2). At least one of the new backgrounds is supersymmetric (albeit with only N=2) and we show that it can be constructed from a supersymmetric Freund-Rubin background via a Wick rotation. Two of the new backgrounds have only been approximated numerically.

  2. Computational Method for Atomistic-Continuum Homogenization

    National Research Council Canada - National Science Library

    Chung, Peter

    2002-01-01

    The homogenization method is used as a framework for developing a multiscale system of equations involving atoms at zero temperature at the small scale and continuum mechanics at the very large scale...

  3. Development of a Real-Time PCR Protocol Requiring Minimal Handling for Detection of Vancomycin-Resistant Enterococci with the Fully Automated BD Max System.

    Science.gov (United States)

    Dalpke, Alexander H; Hofko, Marjeta; Zimmermann, Stefan

    2016-09-01

    Vancomycin-resistant enterococci (VRE) are an important cause of health care-associated infections, resulting in significant mortality and a significant economic burden in hospitals. Active surveillance for at-risk populations contributes to the prevention of infections with VRE. The availability of a combination of automation and molecular detection procedures for rapid screening would be beneficial. Here, we report on the development of a laboratory-developed PCR for detection of VRE which runs on the fully automated Becton Dickinson (BD) Max platform, which combines DNA extraction, PCR setup, and real-time PCR amplification. We evaluated two protocols: one using a liquid master mix and the other employing commercially ordered dry-down reagents. The BD Max VRE PCR was evaluated in two rounds with 86 and 61 rectal elution swab (eSwab) samples, and the results were compared to the culture results. The sensitivities of the different PCR formats were 84 to 100% for vanA and 83.7 to 100% for vanB; specificities were 96.8 to 100% for vanA and 81.8 to 97% for vanB. The use of dry-down reagents and the ExK DNA-2 kit for extraction showed that the samples were less inhibited (3.3%) than they were by the use of the liquid master mix (14.8%). Adoption of a cutoff threshold cycle of 35 for discrimination of vanB-positive samples allowed an increase of specificity to 87.9%. The performance of the BD Max VRE assay equaled that of the BD GeneOhm VanR assay, which was run in parallel. The use of dry-down reagents simplifies the assay and omits any need to handle liquid PCR reagents. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
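
    Purely as an illustration of the cutoff rule described above, the following hypothetical helper applies a threshold cycle of 35 to call vanB positivity; the function name and interface are invented for this sketch and are not part of the BD Max workflow.

    ```python
    # Illustrative only (helper name is hypothetical): applying the cutoff threshold
    # cycle adopted above (Ct <= 35) to call vanB positivity from a real-time PCR run.
    from typing import Optional

    def call_vanB(ct_value: Optional[float], cutoff: float = 35.0) -> str:
        """Return 'positive' if amplification occurred at or before the cutoff cycle."""
        if ct_value is None:
            return "negative"  # no amplification detected
        return "positive" if ct_value <= cutoff else "negative"

    print(call_vanB(31.2))  # positive
    print(call_vanB(38.5))  # negative (late amplification treated as nonspecific)
    ```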

  4. Android Fully Loaded

    CERN Document Server

    Huddleston, Rob

    2012-01-01

    Fully loaded with the latest tricks and tips on your new Android! Android smartphones are so hot, they're soaring past iPhones on the sales charts. And the second edition of this muscular little book is equally impressive--it's packed with tips and tricks for getting the very most out of your latest-generation Android device. Start Facebooking and tweeting with your Android mobile, scan barcodes to get pricing and product reviews, download your favorite TV shows--the book is positively bursting with practical and fun how-tos. Topics run the gamut from using speech recognition, location-based m

  5. HOMOGENEOUS NUCLEAR POWER REACTOR

    Science.gov (United States)

    King, L.D.P.

    1959-09-01

    A homogeneous nuclear power reactor utilizing forced circulation of the liquid fuel is described. The reactor does not require fuel handling outside of the reactor vessel during any normal operation including complete shutdown to room temperature, the reactor being self-regulating under extreme operating conditions and controlled by the thermal expansion of the liquid fuel. The liquid fuel utilized is a uranium, phosphoric acid, and water solution which requires no gas exhaust system or independent gas recombining system, thereby eliminating the handling of radiolytic gas.

  6. Homogeneity spoil spectroscopy

    International Nuclear Information System (INIS)

    Hennig, J.; Boesch, C.; Martin, E.; Grutter, R.

    1987-01-01

    One of the problems of in vivo MR spectroscopy of P-31 is spectra localization. Surface coil spectroscopy, which is the method of choice for clinical applications, suffers from the high-intensity signal from subcutaneous muscle tissue, which masks the spectrum of interest from deeper structures. In order to suppress this signal while maintaining the simplicity of surface coil spectroscopy, the authors introduced a small sheet of ferromagnetically dotted plastic between the surface coil and the body. This sheet destroys locally the field homogeneity and therefore all signal from structures around the coil. The very high reproducibility of the simple experimental procedure allows long-term studies important for monitoring tumor therapy

  7. Physical applications of homogeneous balls

    CERN Document Server

    Scarr, Tzvi

    2005-01-01

    One of the mathematical challenges of modern physics lies in the development of new tools to efficiently describe different branches of physics within one mathematical framework. This text introduces precisely such a broad mathematical model, one that gives a clear geometric expression of the symmetry of physical laws and is entirely determined by that symmetry. The first three chapters discuss the occurrence of bounded symmetric domains (BSDs) or homogeneous balls and their algebraic structure in physics. The book further provides a discussion of how to obtain a triple algebraic structure ass

  8. Benchmarking homogenization algorithms for monthly data

    Directory of Open Access Journals (Sweden)

    V. K. C. Venema

    2012-01-01

    precipitation data. Training the users on homogenization software was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that automatic algorithms can perform as well as manual ones.

  9. In-silico experiments on characteristic time scale at a shear-free gas-liquid interface in fully developed turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Nagaosa, Ryuichi [Research Center for Compact Chemical System (CCS), AIST, 4-2-1 Nigatake, Miyagino, Sendai 983-8551 (Japan); Handler, Robert A, E-mail: ryuichi.nagaosa@aist.go.jp [Department of Mechanical Engineering, Texas A and M University, College Station, TX 77843-3123 (United States)

    2011-12-22

    The purpose of this study is to model scalar transfer mechanisms in a fully developed turbulence for accurate predictions of the turbulent scalar flux across a shear-free gas-liquid interface. The concept of the surface-renewal approximation (Danckwerts, 1951) is introduced in this study to establish the predictive models for the interfacial scalar flux. Turbulent flow realizations obtained by a direct numerical simulation technique are employed to prepare details of three-dimensional information on turbulence in the region very close to the interface. Two characteristic time scales at the interface have been examined for exact prediction of the scalar transfer flux. One is the time scale which is the reciprocal of the root-mean-square surface divergence, T_γ = (γγ)^(−1/2), where γ is the surface divergence. The other time scale to be examined is T_S = Λ/V, where Λ is the zero-correlation length of the surface divergence as the interfacial length scale, and V is the root-mean-square velocity fluctuation in the streamwise direction as the interfacial velocity scale. The results of this study suggest that T_γ is slightly unsatisfactory for correlating the turbulent scalar flux at the gas-liquid interface based on the surface-renewal approximation. It is also found that the proportionality constant appears to be 0.19, which differs from that observed in the laboratory experiments, 0.34 (Komori, Murakami, and Ueda, 1989). It is concluded that the time scale T_γ is a different kind of time scale from that observed in the laboratory experiments. On the other hand, the present in-silico experiments indicate that T_S predicts the turbulent scalar flux based on the surface-renewal approximation in a satisfactory manner. It is also elucidated that the proportionality constant for T_S is approximately 0.36, which is very close to that found by the laboratory experiments. This fact shows

  10. In-silico experiments on characteristic time scale at a shear-free gas-liquid interface in fully developed turbulence

    International Nuclear Information System (INIS)

    Nagaosa, Ryuichi; Handler, Robert A

    2011-01-01

    The purpose of this study is to model scalar transfer mechanisms in a fully developed turbulence for accurate predictions of the turbulent scalar flux across a shear-free gas-liquid interface. The concept of the surface-renewal approximation (Danckwerts, 1951) is introduced in this study to establish the predictive models for the interfacial scalar flux. Turbulent flow realizations obtained by a direct numerical simulation technique are employed to prepare details of three-dimensional information on turbulence in the region very close to the interface. Two characteristic time scales at the interface have been examined for exact prediction of the scalar transfer flux. One is the time scale which is the reciprocal of the root-mean-square surface divergence, T_γ = (γγ)^(−1/2), where γ is the surface divergence. The other time scale to be examined is T_S = Λ/V, where Λ is the zero-correlation length of the surface divergence as the interfacial length scale, and V is the root-mean-square velocity fluctuation in the streamwise direction as the interfacial velocity scale. The results of this study suggest that T_γ is slightly unsatisfactory for correlating the turbulent scalar flux at the gas-liquid interface based on the surface-renewal approximation. It is also found that the proportionality constant appears to be 0.19, which differs from that observed in the laboratory experiments, 0.34 (Komori, Murakami, and Ueda, 1989). It is concluded that the time scale T_γ is a different kind of time scale from that observed in the laboratory experiments. On the other hand, the present in-silico experiments indicate that T_S predicts the turbulent scalar flux based on the surface-renewal approximation in a satisfactory manner. It is also elucidated that the proportionality constant for T_S is approximately 0.36, which is very close to that found by the laboratory experiments. This fact shows that the time scale T_S appears to be essentially the same as the time scale
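
    For context, the surface-renewal approximation referred to in both records scales the interfacial transfer coefficient with the square root of diffusivity over a renewal time scale; a schematic form, using the notation of this record, is given below (the constants quoted above are roughly 0.19 for T_γ and 0.36 for T_S, against a laboratory value of 0.34).

    ```latex
    % Surface-renewal (Danckwerts-type) scaling: k_L is the interfacial transfer
    % coefficient, D the diffusivity, T the renewal time scale and C the
    % proportionality constant examined in this study.
    k_L \;=\; C\,\sqrt{\frac{D}{T}}, \qquad
    T_\gamma = \left(\overline{\gamma\gamma}\right)^{-1/2}, \qquad
    T_S = \frac{\Lambda}{V}.
    ```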

  11. Development of a three dimensional homogeneous calculation model for the BFS-62 critical experiment. Preparation of adjusted equivalent measured values for sodium void reactivity values. Final report

    International Nuclear Information System (INIS)

    Manturov, G.; Semenov, M.; Seregin, A.; Lykova, L.

    2004-01-01

    The BFS-62 critical experiments are currently used as a 'benchmark' for the verification of IPPE codes and nuclear data, which have been used in the study of loading a significant amount of Pu in fast reactors. The BFS-62 experiments have been performed at the BFS-2 critical facility of IPPE (Obninsk). The experimental program has been arranged in such a way that the effect of replacing the uranium dioxide blanket by the steel reflector, as well as the effect of replacing UOX by MOX, on the main characteristics of the reactor model was studied. A wide experimental program, including measurements of the criticality (keff), spectral indices, radial and axial fission rate distributions, control rod mock-up worth, the sodium void reactivity effect (SVRE) and some other important nuclear physics parameters, was carried out in the core. A series of 4 BFS-62 critical assemblies has been designed for studying the changes in BN-600 reactor physics from the existing state to a hybrid core. All the assemblies model the reactor state prior to refueling, i.e. with all control rod mock-ups withdrawn from the core. The following items are chosen for the analysis in this report: description of the critical assembly BFS-62-3A as the 3rd assembly in a series of 4 BFS critical assemblies studying the BN-600 reactor with a MOX-UOX hybrid zone and steel reflector; development of a 3D homogeneous calculation model for the BFS-62-3A critical experiment as the mock-up of the BN-600 reactor with hybrid zone and steel reflector; evaluation of the measured nuclear physics parameters keff and SVRE (sodium void reactivity effect); preparation of adjusted equivalent measured values for keff and SVRE. The main series of calculations is performed using the 3D HEX-Z diffusion code TRIGEX in 26 groups, with the ABBN-93 cross-section set. In addition, precise calculations are made, in 299 groups and with a Ps approximation in scattering, by the Monte Carlo code MMKKENO and the discrete ordinates code TWODANT. All calculations are based on the common system

  12. Homogeneous instantons in bigravity

    International Nuclear Information System (INIS)

    Zhang, Ying-li; Sasaki, Misao; Yeom, Dong-han

    2015-01-01

    We study homogeneous gravitational instantons, conventionally called the Hawking-Moss (HM) instantons, in bigravity theory. The HM instantons describe the amplitude of quantum tunneling from a false vacuum to the true vacuum. Corrections to General Relativity (GR) are found in a closed form. Using the result, we discuss the following two issues: reduction to the de Rham-Gabadadze-Tolley (dRGT) massive gravity and the possibility of preference for a large e-folding number in the context of the Hartle-Hawking (HH) no-boundary proposal. In particular, concerning the dRGT limit, it is found that the tunneling through the so-called self-accelerating branch is exponentially suppressed relative to the normal branch, and the probability becomes zero in the dRGT limit. As far as HM instantons are concerned, this could imply that the reduction from bigravity to the dRGT massive gravity is ill-defined.

  13. Fully electric waste collection

    CERN Multimedia

    Anaïs Schaeffer

    2015-01-01

    Since 15 June, Transvoirie, which provides waste collection services throughout French-speaking Switzerland, has been using a fully electric lorry for its collections on the CERN site – a first for the region!   Featuring a motor powered by electric batteries that charge up when the brakes are used, the new lorry that roams the CERN site is as green as can be. And it’s not only the motor that’s electric: its waste compactor and lifting mechanism are also electrically powered*, making it the first 100% electric waste collection vehicle in French-speaking Switzerland. Considering that a total of 15.5 tonnes of household waste and paper/cardboard are collected each week from the Meyrin and Prévessin sites, the benefits for the environment are clear. This improvement comes as part of CERN’s contract with Transvoirie, which stipulates that the firm must propose ways of becoming more environmentally friendly (at no extra cost to CERN). *The was...

  14. New Development in Selective Laser Melting of Ti-6Al-4V: A Wider Processing Window for the Achievement of Fully Lamellar α + β Microstructures

    Science.gov (United States)

    Lui, E. W.; Xu, W.; Pateras, A.; Qian, M.; Brandt, M.

    2017-12-01

    Recent progress has shown that Ti-6Al-4V fabricated by selective laser melting (SLM) can achieve a fully lamellar α + β microstructure using 60 µm layer thickness in the as-built state via in situ martensite decomposition by manipulating the processing parameters. The potential to broaden the processing window was explored in this study by increasing the layer thickness to the less commonly used 90 µm. Fully lamellar α + β microstructures were produced in the as-built state using inter-layer times in the range of 1-12 s. Microstructural features such as the α-lath thickness and morphology were sensitive to both build height and inter-layer time. The α-laths produced using the inter-layer time of 1 s were much coarser than those produced with the inter-layer time of 12 s. The fine fully lamellar α + β structure resulted in tensile ductility of 11% and yield strength of 980 MPa. The tensile properties can be further improved by minimizing the presence of process-induced defects.

  15. Homogenization Issues in the Combustion of Heterogeneous Solid Propellants

    Science.gov (United States)

    Chen, M.; Buckmaster, J.; Jackson, T. L.; Massa, L.

    2002-01-01

    We examine random packs of discs or spheres, models for ammonium-perchlorate-in-binder propellants, and discuss their average properties. An analytical strategy is described for calculating the mean or effective heat conduction coefficient in terms of the heat conduction coefficients of the individual components, and the results are verified by comparison with those of direct numerical simulations (dns) for both 2-D (disc) and 3-D (sphere) packs across which a temperature difference is applied. Similarly, when the surface regression speed of each component is related to the surface temperature via a simple Arrhenius law, an analytical strategy is developed for calculating an effective Arrhenius law for the combination, and these results are verified using dns in which a uniform heat flux is applied to the pack surface, causing it to regress. These results are needed for homogenization strategies necessary for fully integrated 2-D or 3-D simulations of heterogeneous propellant combustion.
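
    The paper's analytical strategy is not reproduced in this record; as a simple point of reference under stated assumptions, the classical Wiener (series/parallel) bounds bracket the isotropic effective conductivity of a two-component pack, as in the sketch below (all numerical values are placeholders, not data from the paper).

    ```python
    # Not the authors' method: classical Wiener bounds on the effective heat conduction
    # coefficient of a two-component mixture (AP particles of volume fraction f in binder).
    def wiener_bounds(k_ap: float, k_binder: float, f: float):
        """Return (lower, upper) bounds on the effective conductivity."""
        upper = f * k_ap + (1.0 - f) * k_binder            # parallel arrangement (arithmetic mean)
        lower = 1.0 / (f / k_ap + (1.0 - f) / k_binder)    # series arrangement (harmonic mean)
        return lower, upper

    lo, hi = wiener_bounds(k_ap=0.4, k_binder=0.2, f=0.7)  # placeholder values only
    print(f"effective conductivity between {lo:.3f} and {hi:.3f} W/(m.K)")
    ```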

  16. The relationship between continuum homogeneity and statistical homogeneity in cosmology

    International Nuclear Information System (INIS)

    Stoeger, W.R.; Ellis, G.F.R.; Hellaby, C.

    1987-01-01

    Although the standard Friedmann-Lemaitre-Robertson-Walker (FLRW) Universe models are based on the concept that the Universe is spatially homogeneous, up to the present time no definition of this concept has been proposed that could in principle be tested by observation. Such a definition is here proposed, based on a simple spatial averaging procedure, which relates observable properties of the Universe to the continuum homogeneity idea that underlies the FLRW models. It turns out that the statistical homogeneity often used to describe the distribution of matter on a large scale does not imply spatial homogeneity according to this definition, and so cannot be simply related to a FLRW Universe model. Values are proposed for the homogeneity parameter and length scale of homogeneity of the Universe. (author)

  17. The development of a fully-integrated immune response model (FIRM) simulator of the immune response through integration of multiple subset models.

    Science.gov (United States)

    Palsson, Sirus; Hickling, Timothy P; Bradshaw-Pierce, Erica L; Zager, Michael; Jooss, Karin; O'Brien, Peter J; Spilker, Mary E; Palsson, Bernhard O; Vicini, Paolo

    2013-09-28

    The complexity and multiscale nature of the mammalian immune response provides an excellent test bed for the potential of mathematical modeling and simulation to facilitate mechanistic understanding. Historically, mathematical models of the immune response focused on subsets of the immune system and/or specific aspects of the response. Mathematical models have been developed for the humoral side of the immune response, or for the cellular side, or for cytokine kinetics, but rarely have they been proposed to encompass the overall system complexity. We propose here a framework for integration of subset models, based on a system biology approach. A dynamic simulator, the Fully-integrated Immune Response Model (FIRM), was built in a stepwise fashion by integrating published subset models and adding novel features. The approach used to build the model includes the formulation of the network of interacting species and the subsequent introduction of rate laws to describe each biological process. The resulting model represents a multi-organ structure, comprised of the target organ where the immune response takes place, circulating blood, lymphoid T, and lymphoid B tissue. The cell types accounted for include macrophages, a few T-cell lineages (cytotoxic, regulatory, helper 1, and helper 2), and B-cell activation to plasma cells. Four different cytokines were accounted for: IFN-γ, IL-4, IL-10 and IL-12. In addition, generic inflammatory signals are used to represent the kinetics of IL-1, IL-2, and TGF-β. Cell recruitment, differentiation, replication, apoptosis and migration are described as appropriate for the different cell types. The model is a hybrid structure containing information from several mammalian species. The structure of the network was built to be physiologically and biochemically consistent. Rate laws for all the cellular fate processes, growth factor production rates and half-lives, together with antibody production rates and half-lives, are provided. The
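
    As a toy illustration of the modeling pattern described above (define species, write rate laws for recruitment, replication and apoptosis, then integrate the resulting ODEs), and not of FIRM itself, the following sketch integrates a two-state cell/cytokine system; all parameter values and rate-law forms are placeholders.

    ```python
    # Toy two-state sketch (not FIRM): a cell population with recruitment, cytokine-driven
    # replication and apoptosis, coupled to a single cytokine with production and decay.
    from scipy.integrate import solve_ivp

    def rhs(t, y, k_rec=0.5, k_rep=0.2, k_apo=0.1, k_prod=1.0, k_deg=0.3):
        cells, cytokine = y
        dcells = k_rec + k_rep * cells * cytokine / (1.0 + cytokine) - k_apo * cells
        dcytokine = k_prod * cells - k_deg * cytokine
        return [dcells, dcytokine]

    sol = solve_ivp(rhs, (0.0, 50.0), y0=[1.0, 0.0], max_step=0.5)
    print(f"final cell count ~ {sol.y[0, -1]:.2f}, cytokine ~ {sol.y[1, -1]:.2f}")
    ```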

  18. The specific features of self-action of high-power laser radiation propagating through a fully ionised cold plasma and the development of modulation instability

    International Nuclear Information System (INIS)

    Aleshkevich, Viktor A; Kartashev, Ya V; Vysloukh, Victor A

    2000-01-01

    The specific features of the propagation of soliton-like light beams through a fully ionised two-dimensional cold plasma are considered employing analytical and numerical methods commonly used in nonlinear optics. Exact soliton profiles for the lower and upper soliton branches are found numerically in the presence of optical bistability. It is shown that the interaction of incoherent soliton-like laser beams in such a plasma may result both in the destruction of one of the beams and in production of new ones. The regime of the modulation instability of a plane wave propagating through a cold laser-produced plasma is studied. (nonlinear optical phenomena)

  19. Homogenization of linearly anisotropic scattering cross sections in a consistent B1 heterogeneous leakage model

    International Nuclear Information System (INIS)

    Marleau, G.; Debos, E.

    1998-01-01

    One of the main problems encountered in cell calculations is that of spatial homogenization, where one associates with a heterogeneous cell a homogeneous set of cross sections. The homogenization process is in fact trivial when a totally reflected cell without leakage is fully homogenized, since it involves only a flux-volume weighting of the isotropic cross sections. When anisotropic leakage models are considered, in addition to homogenizing the isotropic cross sections, the anisotropic scattering cross section must also be considered. The simple option, which consists of using the same homogenization procedure for both the isotropic and anisotropic components of the scattering cross section, leads to inconsistencies between the homogeneous and homogenized transport equations. Here we present a method for homogenizing the anisotropic scattering cross sections that resolves these inconsistencies. (author)
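
    The flux-volume weighting mentioned above for a fully homogenized, totally reflected cell can be written in its standard form (notation assumed here for illustration; i runs over the regions of the heterogeneous cell):

    ```latex
    % Standard flux-volume weighting of a cross section of type x over the cell regions i.
    \Sigma_x^{\mathrm{hom}} \;=\; \frac{\sum_i \Sigma_{x,i}\,\phi_i\,V_i}{\sum_i \phi_i\,V_i}.
    ```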

  20. Homogenization of resonant chiral metamaterials

    OpenAIRE

    Andryieuski, Andrei; Menzel, Christoph; Rockstuhl, Carsten; Malureanu, Radu; Lederer, Falk; Lavrinenko, Andrei

    2010-01-01

    Homogenization of metamaterials is a crucial issue as it allows one to describe their optical response in terms of effective wave parameters as e.g. propagation constants. In this paper we consider the possible homogenization of chiral metamaterials. We show that for meta-atoms of a certain size a critical density exists above which increasing coupling between neighboring meta-atoms prevents a reasonable homogenization. On the contrary, a dilution in excess will induce features reminiscent of pho...

  1. Bilipschitz embedding of homogeneous fractals

    OpenAIRE

    Lü, Fan; Lou, Man-Li; Wen, Zhi-Ying; Xi, Li-Feng

    2014-01-01

    In this paper, we introduce a class of fractals named homogeneous sets based on some measure versions of homogeneity, uniform perfectness and doubling. This fractal class includes all Ahlfors-David regular sets, but most of them are irregular in the sense that they may have different Hausdorff dimensions and packing dimensions. Using Moran sets as main tool, we study the dimensions, bilipschitz embedding and quasi-Lipschitz equivalence of homogeneous fractals.

  2. Homogeneous versus heterogeneous zeolite nucleation

    NARCIS (Netherlands)

    Dokter, W.H.; Garderen, van H.F.; Beelen, T.P.M.; Santen, van R.A.; Bras, W.

    1995-01-01

    Aggregates of fractal dimension were found in the intermediate gel phases that organize prior to nucleation and crystallization of silicalite from a homogeneous reaction mixture. Small- and wide-angle X-ray scattering studies prove that for zeolites nucleation may be homogeneous or

  3. Homogenization theory in reactor lattices

    International Nuclear Information System (INIS)

    Benoist, P.

    1986-02-01

    The purpose of the theory of homogenization of reactor lattices is to determine, by means of transport theory, the constants of a homogeneous medium equivalent to a given lattice, which allows one to treat the reactor as a whole by diffusion theory. In this note, the problem is presented with emphasis on simplicity, as far as possible [fr

  4. Assembly homogenization techniques for light water reactor analysis

    International Nuclear Information System (INIS)

    Smith, K.S.

    1986-01-01

    Recent progress in the development and application of advanced assembly homogenization methods for light water reactor analysis is reviewed. Practical difficulties arising from conventional flux-weighting approximations are discussed and numerical examples given. The mathematical foundations for homogenization methods are outlined. Two methods, Equivalence Theory and Generalized Equivalence Theory, which are theoretically capable of eliminating homogenization error, are reviewed. Practical means of obtaining approximate homogenized parameters are presented and numerical examples are used to contrast the two methods. Applications of these techniques to PWR baffle/reflector homogenization and BWR bundle homogenization are discussed. Nodal solutions to realistic reactor problems are compared to fine-mesh PDQ calculations, and the accuracy of the advanced homogenization methods is established. Remaining problem areas are investigated, and directions for future research are suggested. (author)
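
    As a pointer to the quantity that Generalized Equivalence Theory adds to the homogenized cross sections, the assembly discontinuity factor on a node surface s is conventionally defined as the ratio of heterogeneous to homogeneous surface fluxes (a schematic definition supplied here, not quoted from the review)
    \[
    f_s \;=\; \frac{\phi_s^{\mathrm{het}}}{\phi_s^{\mathrm{hom}}},
    \]
    which allows the nodal flux to be discontinuous across node interfaces while surface currents and node-averaged reaction rates are preserved.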

  5. Homogeneous deuterium exchange using rhenium and platinum chloride catalysts

    International Nuclear Information System (INIS)

    Fawdry, R.M.

    1979-01-01

    Previous studies of homogeneous hydrogen isotope exchange are mostly confined to one catalyst, the tetrachloroplatinite salt. Recent reports have indicated that chloride salts of iridium and rhodium may also be homogeneous exchange catalysts similar to the tetrachloroplatinite, but with much lower activities. Exchange by these homogeneous catalysts is frequently accompanied by metal precipitation with the termination of homogeneous exchange, particularly in the case of alkane exchange. The studies presented in this thesis describe two different approaches to overcome this limitation of homogeneous hydrogen isotope exchange catalysts. The first approach was to improve the stability of an existing homogeneous catalyst and the second was to develop a new homogeneous exchange catalyst which is free of the instability limitation

  6. Multipartite fully nonlocal quantum states

    International Nuclear Information System (INIS)

    Almeida, Mafalda L.; Cavalcanti, Daniel; Scarani, Valerio; Acin, Antonio

    2010-01-01

    We present a general method for characterizing the quantum correlations obtained after local measurements on multipartite systems. Sufficient conditions for a quantum system to be fully nonlocal according to a given partition, as well as being (genuinely) multipartite fully nonlocal, are derived. These conditions allow us to identify all completely connected graph states as multipartite fully nonlocal quantum states. Moreover, we show that this feature can also be observed in mixed states: the tensor product of five copies of the Smolin state, a biseparable and bound entangled state, is multipartite fully nonlocal.

  7. Operational flood-forecasting in the Piemonte region – development and verification of a fully distributed physically-oriented hydrological model

    Directory of Open Access Journals (Sweden)

    D. Rabuffetti

    2009-03-01

    Full Text Available A hydrological model for real-time flood forecasting for Civil Protection services requires reliability and rapidity. At present, computational capabilities overcome the rapidity needs even when a fully distributed hydrological model is adopted for a large river catchment such as the Upper Po river basin closed at Ponte Becca (nearly 40 000 km2). This approach allows simulating the whole domain and obtaining the responses of large as well as of medium and small sized sub-catchments. The FEST-WB hydrological model (Mancini, 1990; Montaldo et al., 2007; Rabuffetti et al., 2008) is implemented. The calibration and verification activities are based on more than 100 flood events that occurred along the main tributaries of the Po river in the period 2000–2003. More than 300 meteorological stations are used to obtain the forcing fields; 10 cross sections with continuous and reliable discharge time series are used for calibration, while verification is performed on about 40 monitored cross sections. Furthermore, meteorological forecasting models are used to force the hydrological model with Quantitative Precipitation Forecasts (QPFs) over a 36 h horizon in "operational setting" experiments. Particular care is devoted to understanding how the QPF affects the accuracy of the Quantitative Discharge Forecasts (QDFs) and to assessing the impact of QDF uncertainty on the warning system reliability. Results are presented both in terms of QDF and of warning issues, highlighting the importance of an "operational based" verification approach.

  8. The homogeneous marginal utility of income assumption

    NARCIS (Netherlands)

    Demuynck, T.

    2015-01-01

    We develop a test to verify if every agent from a population of heterogeneous consumers has the same marginal utility of income function. This homogeneous marginal utility of income assumption is often (implicitly) used in applied demand studies because it has nice aggregation properties and

  9. Homogeneous Spaces and Equivariant Embeddings

    CERN Document Server

    Timashev, DA

    2011-01-01

    Homogeneous spaces of linear algebraic groups lie at the crossroads of algebraic geometry, theory of algebraic groups, classical projective and enumerative geometry, harmonic analysis, and representation theory. By standard reasons of algebraic geometry, in order to solve various problems on a homogeneous space it is natural and helpful to compactify it keeping track of the group action, i.e. to consider equivariant completions or, more generally, open embeddings of a given homogeneous space. Such equivariant embeddings are the subject of this book. We focus on classification of equivariant em

  10. Fully Depleted Charge-Coupled Devices

    International Nuclear Information System (INIS)

    Holland, Stephen E.

    2006-01-01

    We have developed fully depleted, back-illuminated CCDs that build upon earlier research and development efforts directed towards technology development of silicon-strip detectors used in high-energy-physics experiments. The CCDs are fabricated on the same type of high-resistivity, float-zone-refined silicon that is used for strip detectors. The use of high-resistivity substrates allows for thick depletion regions, on the order of 200-300 μm, with corresponding high detection efficiency for near-infrared and soft x-ray photons. We compare the fully depleted CCD to the p-i-n diode upon which it is based, and describe the use of fully depleted CCDs in astronomical and x-ray imaging applications

  11. Qualitative analysis of homogeneous universes

    International Nuclear Information System (INIS)

    Novello, M.; Araujo, R.A.

    1980-01-01

    The qualitative behaviour of cosmological models is investigated in two cases: homogeneous and isotropic Universes containing viscous fluids in a Stokesian non-linear regime; and rotating expanding universes in a state in which matter is out of thermal equilibrium. (Author) [pt

  12. A second stage homogenization method

    International Nuclear Information System (INIS)

    Makai, M.

    1981-01-01

    A second homogenization is needed before the diffusion calculation of the core of large reactors. Such a second-stage homogenization is outlined here. Our starting point is the Floquet theorem, which states that the diffusion equation for a periodic core always has a particular solution of the form e^(jBx) u(x). It is pointed out that the perturbation series expansion of the function u can be derived by solving eigenvalue problems, and the eigenvalues serve to define homogenized cross sections. With the help of these eigenvalues a homogenized diffusion equation can be derived, the solution of which is cos Bx, the macroflux. It is shown that the flux can be expressed as a series in the buckling. The leading term in this series is the well known Wigner-Seitz formula. Finally three examples are given: periodic absorption, a cell with an absorber pin in the cell centre, and a cell of three regions. (orig.)
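
    A schematic one-group rendering of the construction sketched above (notation added here, not the author's): the lattice flux is written in the Floquet form phi(x) = e^(jBx) u(x) with u periodic over the cell, and the macroflux obeys the homogenized balance
    \[
    D_{\mathrm{hom}}B^{2}\,\Phi(x) + \Sigma_{a,\mathrm{hom}}\,\Phi(x) = \frac{1}{k}\,\nu\Sigma_{f,\mathrm{hom}}\,\Phi(x), \qquad \Phi(x)=\cos Bx ,
    \]
    with the homogenized constants defined through the eigenvalue problems obtained order by order in the buckling B.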

  13. Homogenization methods for heterogeneous assemblies

    International Nuclear Information System (INIS)

    Wagner, M.R.

    1980-01-01

    The third session of the IAEA Technical Committee Meeting is concerned with the problem of homogenization of heterogeneous assemblies. Six papers will be presented on the theory of homogenization and on practical procedures for deriving homogenized group cross sections and diffusion coefficients. The problem of finding so-called ''equivalent'' diffusion theory parameters for use in global reactor calculations is of great practical importance. In spite of this, it is fair to say that the present state of the theory of second homogenization is far from satisfactory. In fact, there is not even a uniquely accepted approach to the problem of deriving equivalent group diffusion parameters. Common agreement exists only about the fact that the conventional flux-weighting technique provides only a first approximation, which might lead to acceptable results in certain cases, but certainly does not guarantee the basic requirement of conservation of reaction rates

  14. A personal view on homogenization

    International Nuclear Information System (INIS)

    Tartar, L.

    1987-02-01

    The evolution of some ideas is first described. Under the name homogenization are collected all the mathematical results which help in understanding the relations between the microstructure of a material and its macroscopic properties. Homogenization results are presented through a critically detailed bibliography. The mathematical models considered are systems of partial differential equations, supposed to describe some properties at a scale ε, and we want to understand what happens to the solutions as ε tends to 0

  15. Numerical Analysis of Fully Developed Laminar Flow in Trapezoidal and Sinusoidal Grooves with Shear Stress at the Liquid-Vapor Interface

    National Research Council Canada - National Science Library

    Thomas, Scott

    2000-01-01

    .... A computer model was developed using a finite difference solution which finds the mean velocity, Poiseuille number, and volumetric flow rate in terms of the groove geometry, meniscus contact angle...

  16. Homogenization of High-Contrast Brinkman Flows

    KAUST Repository

    Brown, Donald L.

    2015-04-16

    Modeling porous flow in complex media is a challenging problem. Not only is the problem inherently multiscale but, due to high contrast in permeability values, flow velocities may differ greatly throughout the medium. To avoid complicated interface conditions, the Brinkman model is often used for such flows [O. Iliev, R. Lazarov, and J. Willems, Multiscale Model. Simul., 9 (2011), pp. 1350--1372]. Instead of permeability variations and contrast being contained in the geometric media structure, this information is contained in a highly varying and high-contrast coefficient. In this work, we present two main contributions. First, we develop a novel homogenization procedure for the high-contrast Brinkman equations by constructing correctors and carefully estimating the residuals. Understanding the relationship between scales and contrast values is critical to obtaining useful estimates. Therefore, standard convergence-based homogenization techniques [G. A. Chechkin, A. L. Piatniski, and A. S. Shamev, Homogenization: Methods and Applications, Transl. Math. Monogr. 234, American Mathematical Society, Providence, RI, 2007, G. Allaire, SIAM J. Math. Anal., 23 (1992), pp. 1482--1518], although a powerful tool, are not applicable here. Our second point is that the Brinkman equations, in certain scaling regimes, are invariant under homogenization. Unlike in the case of Stokes-to-Darcy homogenization [D. Brown, P. Popov, and Y. Efendiev, GEM Int. J. Geomath., 2 (2011), pp. 281--305, E. Marusic-Paloka and A. Mikelic, Boll. Un. Mat. Ital. A (7), 10 (1996), pp. 661--671], the results presented here under certain velocity regimes yield a Brinkman-to-Brinkman upscaling that allows using a single software platform to compute on both microscales and macroscales. In this paper, we discuss the homogenized Brinkman equations. We derive auxiliary cell problems to build correctors and calculate effective coefficients for certain velocity regimes. Due to the boundary effects, we construct
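
    For orientation, the Brinkman model referred to in this record can be written in a common form with a single viscosity (a simplification of the high-contrast, spatially varying setting the authors actually analyse)
    \[
    -\mu\,\Delta\mathbf{u} \;+\; \mu\,\kappa^{-1}(\mathbf{x})\,\mathbf{u} \;+\; \nabla p \;=\; \mathbf{f}, \qquad \nabla\cdot\mathbf{u}=0 ,
    \]
    which behaves like Stokes flow where the permeability \(\kappa\) is large and like Darcy flow where it is small; it is this equation that is shown to be invariant under homogenization in certain velocity and contrast regimes.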

  17. Specifications for the development of a fully three-dimensional numerical groundwater model for regional mass transport of radionuclides from a deep waste repository

    Energy Technology Data Exchange (ETDEWEB)

    Prickett, T.A.

    1980-04-01

    Specifications are given which are necessary to develop a three-dimensional numerical model capable of simulating regional mass transport of radionuclides from a deep waste repository. The model to be developed will include all of the significant mass transport processes including flow, chemical, and thermal advection, mechanical dispersion, molecular diffusion, ion exchange reactions, and radioactive decay. The model specifications also include that density and viscosity fluid properties be functions of pressure, temperature, and concentration and take into account fluid and geologic heterogeneities by allowing possible assignment of individual values to every block of the model. The model specifications furthermore include the repository shape, input/output information, boundary conditions, and the need for documentation and a user's manual. Model code validation can be accomplished with the included known analytical or laboratory solutions. It is recommended that an existing finite-difference model (developed by INTERCOMP and INTERA, Inc.) be used as a starting point either as an acceptable basic code for modification or as a pattern for the development of a completely different numerical scheme. A ten-step plan is given to outline the general procedure for development of the code.

  18. Specifications for the development of a fully three-dimensional numerical groundwater model for regional mass transport of radionuclides from a deep waste repository

    International Nuclear Information System (INIS)

    Prickett, T.A.

    1980-04-01

    Specifications are given which are necessary to develop a three-dimensional numerical model capable of simulating regional mass transport of radionuclides from a deep waste repository. The model to be developed will include all of the significant mass transport processes including flow, chemical, and thermal advection, mechanical dispersion, molecular diffusion, ion exchange reactions, and radioactive decay. The model specifications also include that density and viscosity fluid properties be functions of pressure, temperature, and concentration and take into account fluid and geologic heterogeneities by allowing possible assignment of individual values to every block of the model. The model specifications furthermore include the repository shape, input/output information, boundary conditions, and the need for documentation and a user's manual. Model code validation can be accomplished with the included known analytical or laboratory solutions. It is recommended that an existing finite-difference model (developed by INTERCOMP and INTERA, Inc.) be used as a starting point either as an acceptable basic code for modification or as a pattern for the development of a completely different numerical scheme. A ten-step plan is given to outline the general procedure for development of the code
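
    The transport processes enumerated in the two records above are conventionally gathered into a single advection-dispersion equation; a schematic single-species form with linear sorption (retardation factor R) and first-order decay is (a sketch added here, not the INTERCOMP/INTERA formulation)
    \[
    R\,\frac{\partial C}{\partial t} \;=\; \nabla\!\cdot\!\left(\mathbf{D}\,\nabla C\right) \;-\; \mathbf{v}\cdot\nabla C \;-\; \lambda R\,C ,
    \]
    where \(\mathbf{D}\) lumps mechanical dispersion and molecular diffusion, \(\mathbf{v}\) is the pore velocity supplied by the flow solution (itself coupled back through the pressure-, temperature- and concentration-dependent density and viscosity), and \(\lambda\) is the radioactive decay constant.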

  19. Fully Employing Software Inspections Data

    Science.gov (United States)

    Shull, Forrest; Feldmann, Raimund L.; Seaman, Carolyn; Regardie, Myrna; Godfrey, Sally

    2009-01-01

    Software inspections provide a proven approach to quality assurance for software products of all kinds, including requirements, design, code, test plans, among others. Common to all inspections is the aim of finding and fixing defects as early as possible, and thereby providing cost savings by minimizing the amount of rework necessary later in the lifecycle. Measurement data, such as the number and type of found defects and the effort spent by the inspection team, provide not only direct feedback about the software product to the project team but are also valuable for process improvement activities. In this paper, we discuss NASA's use of software inspections and the rich set of data that has resulted. In particular, we present results from analysis of inspection data that illustrate the benefits of fully utilizing that data for process improvement at several levels. Examining such data across multiple inspections or projects allows team members to monitor and trigger cross project improvements. Such improvements may focus on the software development processes of the whole organization as well as improvements to the applied inspection process itself.

  20. 7 CFR 58.920 - Homogenization.

    Science.gov (United States)

    2010-01-01

    § 58.920 Homogenization. Where applicable concentrated products shall be homogenized for the... homogenization and the pressure at which homogenization is accomplished will be that which accomplishes the most...

  1. Automation in trace-element chemistry - Development of a fully automated on-line preconcentration device for trace analysis of heavy metals with atomic spectroscopy

    International Nuclear Information System (INIS)

    Michaelis, M.R.A.

    1990-01-01

    The scope of this work was the development of an automated system for trace element preconcentration to be used with and integrated into atomic spectroscopic methods such as flame atomic absorption spectrometry (FAAS), graphite furnace atomic absorption spectrometry (GFAAS) or atomic emission spectroscopy with inductively coupled plasma (ICP-AES). Based on the newly developed cellulose-based chelating cation exchangers ethylenediamine-triacetic acid cellulose (EDTrA-Cellulose) and sulfonated-oxine cellulose, a flexible, computer-controlled instrument for automation of preconcentration and/or matrix separation of heavy metals is described. The most important properties of these materials are fast exchange kinetics, good selectivity against alkali and alkaline earth elements, good flow characteristics, and good stability of the material and the chelating functions against changes in the pH values of the reagents necessary in the process. The combination of the preconcentration device for on-line determinations of Zn, Cd, Pb, Ni, Fe, Co, Mn, V, Cu, La, U and Th is described for FAAS and for ICP-AES with a simultaneous spectrometer. Signal enhancement factors of 70 are achieved by preconcentration of 10 ml and on-line determination with FAAS, owing to signal quantification in peak-height mode. For GFAAS and for sequential ICP, methods for off-line preconcentration are given. The optimization and adaptation of the interface to the different characteristics of the analytical instrumentation is emphasized. For evaluation and future developments with respect to the determination and/or preconcentration of anionic species such as As, Se and Sb, instrument modifications are proposed and a development software is described. (Author)

  2. A generalized model for homogenized reflectors

    International Nuclear Information System (INIS)

    Pogosbekyan, Leonid; Kim, Yeong Il; Kim, Young Jin; Joo, Hyung Kook

    1996-01-01

    A new concept of equivalent homogenization is proposed. The concept employs a new set of homogenized parameters: homogenized cross sections (XS) and an interface matrix (IM), which relates partial currents at the cell interfaces. The idea of the interface matrix generalizes the idea of discontinuity factors (DFs), proposed and developed by K. Koebke and K. Smith. The method of K. Smith can be simulated within the framework of the new method, while the new method approximates the heterogeneous cell better in the case of steep flux gradients at the cell interfaces. The attractive features of the new concept are: improved accuracy, simplicity of incorporation in existing codes, and numerical expenses equal to those of K. Smith's approach. The new concept is useful for: (a) explicit reflector/baffle simulation; (b) control blades simulation; (c) mixed UO 2 /MOX core simulation. The offered model has been incorporated in a finite difference code and in the nodal code PANBOX. The numerical results show good accuracy of core calculations and insensitivity of the homogenized parameters with respect to in-core conditions

  3. Genetic Homogenization of Composite Materials

    Directory of Open Access Journals (Sweden)

    P. Tobola

    2009-04-01

    Full Text Available The paper is focused on numerical studies of the electromagnetic properties of composite materials used for the construction of small airplanes. The discussion concentrates on the genetic homogenization of composite layers and of composite layers with a slot. The homogenization aims to reduce the CPU-time demands of EMC computational models of electrically large airplanes. First, a methodology for creating a 3-dimensional numerical model of a composite material in CST Microwave Studio is proposed, focusing on a sufficient accuracy of the model. Second, a proper implementation of a genetic optimization in Matlab is discussed. Third, an association of the optimization script with a simplified 2-dimensional model of the homogeneous equivalent layer in Comsol Multiphysics is proposed, with EMC issues in mind. Results of the computations are experimentally verified.
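
    The genetic optimization step lends itself to a compact sketch. The toy script below, written in Python rather than the Matlab of the paper, evolves two effective-layer parameters with a bare-bones evolutionary loop (truncation selection plus mutation, no crossover); the fitness is a placeholder, whereas in the paper it would be evaluated by comparing the Comsol model of the homogeneous equivalent layer against the CST reference, which cannot be reproduced here. All names, bounds and values are invented for illustration.

    ```python
    # Hypothetical, heavily simplified genetic-style optimization of two
    # effective-layer parameters (sigma_eff, eps_r_eff). The real fitness would
    # call a full-wave solver; a quadratic placeholder keeps the sketch runnable.
    import numpy as np

    rng = np.random.default_rng(1)
    target = np.array([3.5e3, 4.2])               # pretend "true" effective values
    bounds = np.array([[1e2, 1e4], [1.0, 10.0]])  # search ranges for the two genes

    def fitness(pop):
        # placeholder objective: normalized distance to the target parameters
        return np.sum(((pop - target) / (bounds[:, 1] - bounds[:, 0])) ** 2, axis=1)

    pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(40, 2))   # initial population
    for generation in range(100):
        scores = fitness(pop)
        parents = pop[np.argsort(scores)][:20]                    # keep the best half
        children = parents[rng.integers(0, 20, 40)]               # clone parents
        children = children + rng.normal(0.0, 0.01, children.shape) * (
            bounds[:, 1] - bounds[:, 0])                          # Gaussian mutation
        pop = np.clip(children, bounds[:, 0], bounds[:, 1])

    best = pop[np.argmin(fitness(pop))]
    print("best effective parameters found:", best)
    ```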

  4. Large-scale Homogenization of Bulk Materials in Mammoth Silos

    NARCIS (Netherlands)

    Schott, D.L.

    2004-01-01

    This doctoral thesis concerns the large-scale homogenization of bulk materials in mammoth silos. The objective of this research was to determine the best stacking and reclaiming method for homogenization in mammoth silos. For this purpose a simulation program was developed to estimate the

  5. A new concept of equivalent homogenization method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Pogoskekyan, Leonid; Kim, Young Il; Ju, Hyung Kook; Chang, Moon Hee [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-07-01

    A new concept of equivalent homogenization is proposed. The concept employs a new set of homogenized parameters: homogenized cross sections (XS) and an interface matrix (IM), which relates partial currents at the cell interfaces. The idea of the interface matrix generalizes the idea of discontinuity factors (DFs), proposed and developed by K. Koebke and K. Smith. The offered concept covers both those of K. Koebke and K. Smith; both of them can be simulated within the framework of the new concept. The offered concept also covers the Siemens KWU approach for baffle/reflector simulation, where the equivalent homogenized reflector XS are derived from the conservation of the response matrix at the interface in 1D semi-infinite slab geometry. The IM and XS of the new concept satisfy the same assumption about response matrix conservation in 1D semi-infinite slab geometry. It is expected that the new concept provides a more accurate approximation of the heterogeneous cell, especially in the case of steep flux gradients at the cell interfaces. The attractive features of the new concept are: improved accuracy, simplicity of incorporation in existing codes, and numerical expenses equal to those of K. Smith's approach. The new concept is useful for: (a) explicit reflector/baffle simulation; (b) control blades simulation; (c) mixed UO{sub 2}/MOX core simulation. The offered model has been incorporated in a finite difference code and in the nodal code PANDOX. The numerical results show good accuracy of core calculations and insensitivity of the homogenized parameters with respect to in-core conditions. 9 figs., 7 refs. (Author).
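
    Schematically (notation added here, not the authors'), the interface matrix plays the role of a response matrix for the homogenized cell: it maps the incoming partial currents on the cell faces onto the outgoing ones,
    \[
    \mathbf{J}^{\mathrm{out}} \;=\; \mathbf{T}\,\mathbf{J}^{\mathrm{in}} ,
    \]
    and the homogenized cross sections together with \(\mathbf{T}\) are chosen so that this response is conserved; Koebke- and Smith-type discontinuity factors are then recovered as special cases of the same relation.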

  6. AQUEOUS HOMOGENEOUS REACTOR TECHNICAL PANEL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Diamond, D.J.; Bajorek, S.; Bakel, A.; Flanagan, G.; Mubayi, V.; Skarda, R.; Staudenmeier, J.; Taiwo, T.; Tonoike, K.; Tripp, C.; Wei, T.; Yarsky, P.

    2010-12-03

    Considerable interest has been expressed for developing a stable U.S. production capacity for medical isotopes and particularly for molybdenum-99 (99Mo). This is motivated by recent reductions in production and supply worldwide. Consistent with U.S. nonproliferation objectives, any new production capability should not use highly enriched uranium fuel or targets. Consequently, Aqueous Homogeneous Reactors (AHRs) are under consideration for potential 99Mo production using low-enriched uranium. Although the Nuclear Regulatory Commission (NRC) has guidance to facilitate the licensing process for non-power reactors, that guidance is focused on reactors with fixed, solid fuel and hence, not applicable to an AHR. A panel was convened to study the technical issues associated with normal operation and potential transients and accidents of an AHR that might be designed for isotope production. The panel has produced the requisite AHR licensing guidance for three chapters that exist now for non-power reactor licensing: Reactor Description, Reactor Coolant Systems, and Accident Analysis. The guidance is in two parts for each chapter: 1) standard format and content a licensee would use and 2) the standard review plan the NRC staff would use. This guidance takes into account the unique features of an AHR such as the fuel being in solution; the fission product barriers being the vessel and attached systems; the production and release of radiolytic and fission product gases and their impact on operations and their control by a gas management system; and the movement of fuel into and out of the reactor vessel.

  7. Smooth homogeneous structures in operator theory

    CERN Document Server

    Beltita, Daniel

    2005-01-01

    Geometric ideas and techniques play an important role in operator theory and the theory of operator algebras. Smooth Homogeneous Structures in Operator Theory builds the background needed to understand this circle of ideas and reports on recent developments in this fruitful field of research. Requiring only a moderate familiarity with functional analysis and general topology, the author begins with an introduction to infinite dimensional Lie theory with emphasis on the relationship between Lie groups and Lie algebras. A detailed examination of smooth homogeneous spaces follows. This study is illustrated by familiar examples from operator theory and develops methods that allow endowing such spaces with structures of complex manifolds. The final section of the book explores equivariant monotone operators and Kähler structures. It examines certain symmetry properties of abstract reproducing kernels and arrives at a very general version of the construction of restricted Grassmann manifolds from the theory of loo...

  8. A comparative histologic study on furcal perforation repair with Root MTA and Pro Root MTA in fully developed teeth in dog

    Directory of Open Access Journals (Sweden)

    Rahimi S.

    2005-07-01

    Full Text Available Background and Aim: The goal of endodontics is to seal the root canal system completely and three-dimensionally from the orifice to the apical constriction. Hence, perforations occurring during root canal therapy or caused by caries or resorption must be sealed and obturated with ideal materials. The aim of this study was to histologically compare two kinds of mineral trioxide aggregate, Root MTA and Pro Root MTA, for furcal perforation repair in developed teeth in dogs. Materials and Methods: In this experimental study, thirty teeth consisting of the second, third and fourth mandibular premolars of five German shepherd dogs were selected. Twenty-four teeth were randomly divided into four experimental groups (6 teeth each). One pair of Root MTA and Pro Root MTA groups was studied at a one-month interval and the other at a three-month interval. The positive and negative control groups each contained three teeth. In the positive control group, perforations were not treated, and the negative control group contained intact teeth. In the experimental groups, perforations were repaired with Root MTA or Pro Root MTA after one week of exposure to the oral cavity. After the time intervals the animals were subjected to vital perfusion and 6 μm histologic sections were prepared. Inflammation and hard tissue formation were ranked by the Cox criteria. Data were analysed using Mann-Whitney and Chi-Square statistical tests with P<0.05. Conclusion: Mineral trioxide aggregate is an adequate material for furcal perforation repair in dog's teeth. Root MTA could be a good substitute for Pro Root MTA considering its lower cost and similar characteristics.

  9. Fully Integrated Biochip Platforms for Advanced Healthcare

    Directory of Open Access Journals (Sweden)

    Giovanni De Micheli

    2012-08-01

    Full Text Available Recent advances in microelectronics and biosensors are enabling the development of innovative biochips for advanced healthcare by providing fully integrated platforms for continuous monitoring of a large set of human disease biomarkers. Continuous monitoring of several human metabolites can be addressed by using fully integrated and minimally invasive devices located in the sub-cutis, typically in the peritoneal region. This extends the techniques of continuous monitoring of glucose currently being pursued with diabetic patients. However, several issues have to be considered in order to succeed in developing fully integrated and minimally invasive implantable devices. These innovative devices require a high degree of integration, minimally invasive surgery, long-term biocompatibility, security and privacy in data transmission, high reliability, high reproducibility, high specificity, low detection limit and high sensitivity. Recent advances in the field have already proposed possible solutions for several of these issues. The aim of the present paper is to present a broad spectrum of recent results and to propose future directions of development in order to obtain fully implantable systems for the continuous monitoring of the human metabolism in advanced healthcare applications.

  10. Fully integrated biochip platforms for advanced healthcare.

    Science.gov (United States)

    Carrara, Sandro; Ghoreishizadeh, Sara; Olivo, Jacopo; Taurino, Irene; Baj-Rossi, Camilla; Cavallini, Andrea; de Beeck, Maaike Op; Dehollain, Catherine; Burleson, Wayne; Moussy, Francis Gabriel; Guiseppi-Elie, Anthony; De Micheli, Giovanni

    2012-01-01

    Recent advances in microelectronics and biosensors are enabling the development of innovative biochips for advanced healthcare by providing fully integrated platforms for continuous monitoring of a large set of human disease biomarkers. Continuous monitoring of several human metabolites can be addressed by using fully integrated and minimally invasive devices located in the sub-cutis, typically in the peritoneal region. This extends the techniques of continuous monitoring of glucose currently being pursued with diabetic patients. However, several issues have to be considered in order to succeed in developing fully integrated and minimally invasive implantable devices. These innovative devices require a high degree of integration, minimally invasive surgery, long-term biocompatibility, security and privacy in data transmission, high reliability, high reproducibility, high specificity, low detection limit and high sensitivity. Recent advances in the field have already proposed possible solutions for several of these issues. The aim of the present paper is to present a broad spectrum of recent results and to propose future directions of development in order to obtain fully implantable systems for the continuous monitoring of the human metabolism in advanced healthcare applications.

  11. Development of salt and pH-induced solidified floating organic droplets homogeneous liquid-liquid microextraction for extraction of ten pyrethroid insecticides in fresh fruits and fruit juices followed by gas chromatography-mass spectrometry.

    Science.gov (United States)

    Torbati, Mohammadali; Farajzadeh, Mir Ali; Torbati, Mostafa; Nabil, Ali Akbar Alizadeh; Mohebbi, Ali; Afshar Mogaddam, Mohammad Reza

    2018-01-01

    A new microextraction method named salt and pH-induced homogeneous liquid-liquid microextraction has been developed in a home-made extraction device for the extraction and preconcentration of some pyrethroid insecticides from different fruit juice samples prior to gas chromatography-mass spectrometry. In the present work, an extraction device made from two parallel glass tubes with different lengths and diameters was used in the microextraction procedure. In this method, a homogeneous solution of the sample solution and an extraction solvent (pivalic acid) was broken by performing an acid-base reaction, and the extraction solvent was produced throughout the solution. The produced droplets of the extraction solvent rose through the solution and were solidified using an ice bath. They were collected without a centrifugation step. Under the optimum conditions, the limits of detection and quantification were obtained in the ranges of 0.006-0.038 and 0.023-0.134 ng mL−1, respectively. The enrichment factors and extraction recoveries of the selected analytes were in the ranges of 365-460 and 73-92%, respectively. The relative standard deviations were lower than 9% for intra-day (n = 6) and inter-day (n = 4) precisions at a concentration of 1 ng mL−1 of each analyte. Finally, some fruit juice samples were effectively analyzed by the proposed method. Copyright © 2017 Elsevier B.V. All rights reserved.
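
    For context, the two figures of merit quoted above are linked through the phase-volume ratio; in the usual microextraction convention (symbols added here, not taken from the paper)
    \[
    EF \;=\; \frac{C_{\mathrm{sed}}}{C_{0}}, \qquad ER\,(\%) \;=\; EF\,\frac{V_{\mathrm{sed}}}{V_{\mathrm{aq}}}\times 100 ,
    \]
    where \(C_{0}\) and \(V_{\mathrm{aq}}\) refer to the initial aqueous sample and \(C_{\mathrm{sed}}\), \(V_{\mathrm{sed}}\) to the collected (solidified) extractant droplet.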

  12. Physics of fully ionized regions

    International Nuclear Information System (INIS)

    Flower, D.

    1975-01-01

    In this paper the term fully ionised regions is taken to embrace both planetary nebulae and the so-called 'H II' regions, referred to as H+ regions. Whilst these two types of gaseous nebulae are very different from an evolutionary standpoint, they are physically very similar, being characterised by photoionisation of a low-density plasma by a hot star. (Auth.)

  13. Homogenized group cross sections by Monte Carlo

    International Nuclear Information System (INIS)

    Van Der Marck, S. C.; Kuijper, J. C.; Oppe, J.

    2006-01-01

    Homogenized group cross sections play a large role in making reactor calculations efficient. Because of this significance, many codes exist that can calculate these cross sections based on certain assumptions. However, for the application to the High Flux Reactor (HFR) in Petten, the Netherlands, the limitations of such codes imply that the core calculations would become less accurate when using homogenized group cross sections (HGCS). Therefore we developed a method to calculate HGCS based on a Monte Carlo program, for which we chose MCNP. The implementation involves an addition to MCNP, and a set of small executables to perform suitable averaging after the MCNP run(s) have completed. Here we briefly describe the details of the method, and we report on two tests we performed to show the accuracy of the method and its implementation. By now, this method is routinely used in preparation of the cycle-to-cycle core calculations for HFR. (authors)
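
    The averaging step mentioned above amounts, in essence, to taking ratios of Monte Carlo tallies; a schematic definition of a homogenized group constant over a region V and energy group g is (notation added here)
    \[
    \bar{\Sigma}_{x,g} \;=\; \frac{\int_V \int_{E\in g} \Sigma_x(\mathbf{r},E)\,\phi(\mathbf{r},E)\,dE\,dV}{\int_V \int_{E\in g} \phi(\mathbf{r},E)\,dE\,dV},
    \]
    where numerator and denominator are both estimated from track-length tallies accumulated during the MCNP run(s).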

  14. Noise level arrangement in determined zones of homogenous development of green areas on the example of the spa park in Inowrocław

    Science.gov (United States)

    Sztubecka, Małgorzata; Skiba, Marta

    2016-11-01

    Noise measurements are usually carried out in developed areas as well as in the surroundings of traffic routes, providing a basis for actions to limit noise influence on the neighboring areas. Noise measurements in park areas are rare, owing to the belief that these areas are silent zones. Such an attitude cannot be justified. This article aims to assess the noise appearing in determined subzones of the spa park in Inowrocław. The research carried out shows that traffic noise does not have any important influence on the acoustic climate of the park; it is the people who stay there who generate more noise. Comparative analysis demonstrates the appearance and penetration of noise from the zones with higher noise levels into those with lower levels.

  15. Electro-magnetostatic homogenization of bianisotropic metamaterials

    OpenAIRE

    Fietz, Chris

    2012-01-01

    We apply the method of asymptotic homogenization to metamaterials with microscopically bianisotropic inclusions to calculate a full set of constitutive parameters in the long wavelength limit. Two different implementations of electromagnetic asymptotic homogenization are presented. We test the homogenization procedure on two different metamaterial examples. Finally, the analytical solution for long wavelength homogenization of a one dimensional metamaterial with microscopically bi-isotropic i...

  16. Achieving ultrafine grained and homogeneous AA1050/ZnO nanocomposite with well-developed high angle grain boundaries through accumulative press bonding

    Energy Technology Data Exchange (ETDEWEB)

    Amirkhanlou, Sajjad, E-mail: s.amirkhanlou@aut.ac.ir [Young Researchers and Elite Club, Najafabad Branch, Islamic Azad University, Najafabad (Iran, Islamic Republic of); Ketabchi, Mostafa; Parvin, Nader; Askarian, Masoomeh [Department of Mining and Metallurgical Engineering, Amirkabir University of Technology, Tehran (Iran, Islamic Republic of); Carreño, Fernando [Department of Physical Metallurgy, CENIM-CSIC, Av. Gregorio del Amo 8, 28040 Madrid (Spain)

    2015-03-11

    Aluminum matrix nanocomposites with 2 vol% ZnO nanoparticles were produced using accumulative press bonding (APB) as a very effective and novel severe plastic deformation process. The microstructure and mechanical properties of the specimens were characterized by field-emission scanning electron microscopy (FE-SEM), scanning transmission electron microscopy (STEM), electron backscatter diffraction (EBSD) and tensile testing. The microstructure of the AA1050/ZnO nanocomposite showed a uniform distribution of ZnO nanoparticles throughout the aluminum matrix. STEM and EBSD observations revealed that an ultrafine-grained Al/ZnO nanocomposite with an average grain size of <500 nm and well-developed high angle grain boundaries (80% high angle boundaries and 37° average misorientation angle) was successfully obtained by performing 14 cycles of the APB process. When the number of APB cycles increased, the tensile strength of the Al/ZnO nanocomposite improved and reached 228 MPa after 14 cycles, which was 2.6 and 1.3 times greater than the values obtained for annealed (raw material, 88 MPa) and monolithic aluminum (180 MPa), respectively.

  17. Observational homogeneity of the Universe

    International Nuclear Information System (INIS)

    Bonnor, W.B.; Ellis, G.F.R.

    1986-01-01

    A new approach to observational homogeneity is presented. The observation that stars and galaxies in distant regions appear similar to those nearby may be taken to imply that matter has had a similar thermodynamic history in widely separated parts of the Universe (the Postulate of Uniform Thermal Histories, or PUTH). The supposition is now made that similar thermodynamic histories imply similar dynamical histories. Then the distant apparent similarity is evidence for spatial homogeneity of the Universe. General Relativity is used to test this idea, taking a perfect fluid model and implementing PUTH by the condition that the density and entropy per baryon shall be the same function of the proper time along all galaxy world-lines. (author)

  18. Conclusions about homogeneity and devitrification

    International Nuclear Information System (INIS)

    Larche, F.

    1997-01-01

    A lot of experimental data concerning the homogeneity and devitrification of R7T7 glass have been published. It appears that: - the crystallization process is very limited, - the interfaces due to bubbles and the container wall favor crystallization locally, but the crystallized volume fraction always remains below a few per cent, and - crystallization has no damaging long-term effects as far as leaching tests can be trusted. (A.C.)

  19. Is charity a homogeneous good?

    OpenAIRE

    Backus, Peter

    2010-01-01

    In this paper I estimate income and price elasticities of donations to six different charitable causes to test the assumption that charity is a homogeneous good. In the US, charitable donations can be deducted from taxable income. This has long been recognized as producing a price, or taxprice, of giving equal to one minus the marginal tax rate faced by the donor. A substantial portion of the economic literature on giving has focused on estimating price and income elasticities of giving as th...

  20. Rainwater drained through fully filled pipes

    Energy Technology Data Exchange (ETDEWEB)

    Kuhn, B; Koestel, P

    1989-02-01

    The conventional rainwater drainage system according to DIN 1986 has always been a problem area in building services as far as the occupancy of installation shafts and ducts is concerned. The excavation work and the necessary gravity lines are considered expensive, and the required slope complicates the installation further. Based on these considerations, the rain-draining system with fully filled pipes has been developed. DIN 1986, edition June 1988, part 1, point 6.1.1 allows rainwater pipes operated as planned, fully filled, to be installed without slope. An enterprise specialised in building services investigated all system laws, because only by a hydraulically exact balance can the function of the rainwater drainage system operated by negative and positive pressure be ensured. The results of those investigations are integrated in a computer program developed for this purpose.

  1. Fully NLO Parton Shower in QCD

    International Nuclear Information System (INIS)

    Skrzypek, M.; Jadach, S.; Slawinska, M.; Gituliar, O.; Kusina, A.; Placzek, W.

    2011-01-01

    The project of constructing a complete NLO-level Parton Shower Monte Carlo for QCD processes, developed at IFJ PAN in Krakow, is reviewed. Four issues are discussed: (1) the extension of the standard inclusive collinear factorization into a new, fully exclusive scheme; (2) reconstruction of the LO Parton Shower in the new scheme; (3) inclusion of the exclusive NLO corrections into the hard process and (4) inclusion of the exclusive NLO corrections into the evolution (ladder) part. (authors)

  2. To fully exert the important role of natural gas in building a modern energy security system in China: An understanding of China's National 13th Five-Year Plan for Natural Gas Development

    Directory of Open Access Journals (Sweden)

    Zhen Wang

    2017-07-01

    Full Text Available With the successive introduction of 13th Five-Year Plans for natural gas development by governments at all levels, and with much more attention paid to haze governance by the relevant departments, natural gas, as one of the major energy sources, has entered an era of strategic opportunity. In view of this, based upon China's National 13th Five-Year Plan for Natural Gas Development formulated by the National Development and Reform Commission, the development trend of the natural gas sector over the 13th Five-Year Plan period is predicted in terms of the supply side, the demand side, the pricing system, infrastructure construction, etc., and some feasible proposals are made for the whole industrial chain. On the supply side, natural gas will be characterized by availability, accessibility, assurance, affordability and accountability in the production and supply chains. On the demand side, air pollution treatment will indirectly stimulate growth in gas consumption, gas power generation will become dominant, and natural gas as a transportation fuel will bring good new opportunities. It is therefore believed that, as present natural gas development is restricted by both the gas pricing system and infrastructure construction, further reform should be strengthened to break the barriers of systems and mechanisms; and that, owing to the many uncertainties in the natural gas market, the decisive role of the market in resource allocation should be fully exerted to ensure that natural gas becomes a main force in building a dependable energy strategic system in present and future China.

  3. Axiomatisation of fully probabilistic design

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav; Kroupa, Tomáš

    2012-01-01

    Roč. 186, č. 1 (2012), s. 105-113 ISSN 0020-0255 R&D Projects: GA MŠk(CZ) 2C06001; GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : Bayesian decision making * Fully probabilistic design * Kullback–Leibler divergence * Unified decision making Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.643, year: 2012 http://library.utia.cas.cz/separaty/2011/AS/karny-0367271.pdf
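
    For readers unfamiliar with the keyword list, fully probabilistic design selects the randomized decision strategy by minimizing the Kullback-Leibler divergence of the resulting closed-loop distribution f from an ideal distribution f^I (a schematic statement of the criterion, not quoted from the paper):
    \[
    D\left(f \,\middle\|\, f^{I}\right) \;=\; \int f(x)\,\ln\frac{f(x)}{f^{I}(x)}\,dx .
    \]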

  4. Electrical conductivity of a fully ionized plasma in a magnetic field

    International Nuclear Information System (INIS)

    Vaucher, B.; Vaclavik, J.; Schneider, H.

    1975-01-01

    In this experimental work the authors have investigated the electrical conductivity of a homogeneous fully ionized plasma in a homogeneous magnetic field. In particular, the conductivity perpendicular to the magnetic field was studied by means of the magnetoacoustic resonance for different values of the parameter ω_c/γ_ei, where ω_c is the electron cyclotron frequency and γ_ei is the collision frequency between electrons and ions. (Auth.)
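
    A simple Drude-type estimate, not taken from the paper, shows why ω_c/γ_ei is the controlling parameter: the DC conductivity across the magnetic field is reduced relative to the parallel value roughly as
    \[
    \sigma_{\perp} \;\approx\; \frac{\sigma_{\parallel}}{1+\left(\omega_c/\gamma_{ei}\right)^{2}} ,
    \]
    so the magnetoacoustic-resonance measurements probe the transition from the effectively unmagnetized regime (ω_c << γ_ei) to the strongly magnetized one (ω_c >> γ_ei).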

  5. Heterotic strings on homogeneous spaces

    International Nuclear Information System (INIS)

    Israel, D.; Kounnas, C.; Orlando, D.; Petropoulos, P.M.

    2005-01-01

    We construct heterotic string backgrounds corresponding to families of homogeneous spaces as exact conformal field theories. They contain left cosets of compact groups by their maximal tori supported by NS-NS 2-forms and gauge field fluxes. We give the general formalism and modular-invariant partition functions, then we consider some examples such as SU(2)/U(1) ≃ S² (already described in a previous paper) and the SU(3)/U(1)² flag space. As an application we construct new supersymmetric string vacua with magnetic fluxes and a linear dilaton. (Abstract Copyright [2005], Wiley Periodicals, Inc.)

  6. Physics of fully depleted CCDs

    International Nuclear Information System (INIS)

    Holland, S E; Bebek, C J; Kolbe, W F; Lee, J S

    2014-01-01

    In this work we present simple, physics-based models for two effects that have been noted in the fully depleted CCDs that are presently used in the Dark Energy Survey Camera. The first effect is the observation that the point-spread function increases slightly with the signal level. This is explained by considering the effect on charge-carrier diffusion due to the reduction in the magnitude of the channel potential as collected signal charge acts to partially neutralize the fixed charge in the depleted channel. The resulting reduced voltage drop across the carrier drift region decreases the vertical electric field and increases the carrier transit time. The second effect is the observation of low-level, concentric ring patterns seen in uniformly illuminated images. This effect is shown to be most likely due to lateral deflection of charge during the transit of the photo-generated carriers to the potential wells as a result of lateral electric fields. The lateral fields are a result of space charge in the fully depleted substrates arising from resistivity variations inherent to the growth of the high-resistivity silicon used to fabricate the CCDs

  7. Homogenization scheme for acoustic metamaterials

    KAUST Repository

    Yang, Min

    2014-02-26

    We present a homogenization scheme for acoustic metamaterials that is based on reproducing the lowest orders of scattering amplitudes from a finite volume of metamaterials. This approach is noted to differ significantly from that of coherent potential approximation, which is based on adjusting the effective-medium parameters to minimize scatterings in the long-wavelength limit. With the aid of metamaterials' eigenstates, the effective parameters, such as mass density and elastic modulus can be obtained by matching the surface responses of a metamaterial's structural unit cell with a piece of homogenized material. From the Green's theorem applied to the exterior domain problem, matching the surface responses is noted to be the same as reproducing the scattering amplitudes. We verify our scheme by applying it to three different examples: a layered lattice, a two-dimensional hexagonal lattice, and a decorated-membrane system. It is shown that the predicted characteristics and wave fields agree almost exactly with numerical simulations and experiments and the scheme's validity is constrained by the number of dominant surface multipoles instead of the usual long-wavelength assumption. In particular, the validity extends to the full band in one dimension and to regimes near the boundaries of the Brillouin zone in two dimensions.

  8. ISOTOPE METHODS IN HOMOGENEOUS CATALYSIS.

    Energy Technology Data Exchange (ETDEWEB)

    BULLOCK,R.M.; BENDER,B.R.

    2000-12-01

    The use of isotope labels has had a fundamentally important role in the determination of mechanisms of homogeneously catalyzed reactions. Mechanistic data is valuable since it can assist in the design and rational improvement of homogeneous catalysts. There are several ways to use isotopes in mechanistic chemistry. Isotopes can be introduced into controlled experiments and followed to see where they go or don't go; in this way, Libby, Calvin, Taube and others used isotopes to elucidate mechanistic pathways for very different, yet important chemistries. Another important isotope method is the study of kinetic isotope effects (KIEs) and equilibrium isotope effects (EIEs). Here the mere observation of where a label winds up is no longer enough - what matters is how much slower (or faster) a labeled molecule reacts than the unlabeled material. The most careful studies essentially involve the measurement of isotope fractionation between a reference ground state and the transition state. Thus kinetic isotope effects provide unique data unavailable from other methods, since information about the transition state of a reaction is obtained. Because getting an experimental glimpse of transition states is really tantamount to understanding catalysis, kinetic isotope effects are very powerful.
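
    For completeness, the quantities referred to are simple ratios of rate and equilibrium constants for the labeled and unlabeled species, e.g. for H/D substitution
    \[
    \mathrm{KIE} \;=\; \frac{k_{\mathrm{H}}}{k_{\mathrm{D}}}, \qquad \mathrm{EIE} \;=\; \frac{K_{\mathrm{H}}}{K_{\mathrm{D}}},
    \]
    so a kinetic isotope effect measurably different from unity signals that bonding to the isotopic atom changes between the ground state and the transition state.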

  9. Economic growth, energy consumption and CO2 emissions in OECD (Organization for Economic Co-operation and Development)'s transport sector: A fully modified bi-directional relationship approach

    International Nuclear Information System (INIS)

    Saboori, Behnaz; Sapri, Maimunah; Baba, Maizan bin

    2014-01-01

    This paper explores the bi-directional long-run relationship between energy consumption in the road transport sector, CO 2 emissions and economic growth in OECD countries. Using time series data from 1960 to 2008 and employing the Fully Modified Ordinary Least Squares cointegration approach, the paper shows a positive, significant long-run bi-directional relationship between CO 2 emissions and economic growth, between road sector energy consumption and economic growth, and between CO 2 emissions and road sector energy consumption in all the OECD countries. To examine the response of each of the variables to shocks in the value of the other variables, the generalized impulse response approach is employed. The response of CO 2 emissions to economic growth is initially positive in most cases, but it is relatively shorter than its initial response to road transport sector energy consumption. Moreover, in most cases, the response of carbon emissions to road transport sector energy consumption lasts longer than its response to economic growth. This implies that most of the CO 2 emissions from transport come from energy consumption; thus long-run policies related to the efficient use of energy and to shifting to biofuel, renewable and nuclear energy can bring major benefits in mitigating GHG (Greenhouse Gas) emissions. - Highlights: • The relationship between GDP, energy and CO 2 in OECD's transport is investigated. • The Fully Modified Ordinary Least Squares cointegration approach was employed. • There is a positive long-run bi-directional relationship between the variables. • The response of CO 2 to GDP is shorter than its response to energy consumption
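
    The record does not include the estimation code; as a rough illustration of the kind of long-run relationship being tested, the sketch below runs a plain Engle-Granger cointegration check with statsmodels on synthetic series. This is a simpler procedure than the Fully Modified OLS estimator the authors actually use, and all variable names and data are invented for illustration.

    ```python
    # Hypothetical illustration: Engle-Granger cointegration check between log GDP
    # and log transport-sector energy use on synthetic data. This is NOT the FMOLS
    # estimator of the paper, only a basic test for a shared stochastic trend.
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.tsa.stattools import coint

    rng = np.random.default_rng(0)
    n = 49                                                    # e.g. annual data 1960-2008
    trend = np.cumsum(rng.normal(0.02, 0.02, n))              # common stochastic trend
    log_gdp = 10.0 + trend + rng.normal(0, 0.01, n)           # synthetic log GDP
    log_energy = 2.0 + 0.8 * trend + rng.normal(0, 0.01, n)   # synthetic log energy use

    # Engle-Granger test: the null hypothesis is "no cointegration"
    t_stat, p_value, _ = coint(log_gdp, log_energy)
    print(f"EG t-statistic = {t_stat:.2f}, p-value = {p_value:.3f}")

    # Static long-run regression; FMOLS would additionally correct this OLS
    # estimate for serial correlation and regressor endogeneity.
    ols = sm.OLS(log_gdp, sm.add_constant(log_energy)).fit()
    print(ols.params)
    ```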

  10. Peripheral nerve magnetic stimulation: influence of tissue non-homogeneity

    Directory of Open Access Journals (Sweden)

    Papazov Sava P

    2003-12-01

    Full Text Available Abstract Background Peripheral nerves are situated in a highly non-homogeneous environment, including muscles, bones, blood vessels, etc. Time-varying magnetic field stimulation of the median and ulnar nerves in the carpal region is studied, with special consideration of the influence of non-homogeneities. Methods A detailed three-dimensional finite element model (FEM) of the anatomy of the wrist region was built to assess the induced current distribution by external magnetic stimulation. The electromagnetic field distribution in the non-homogeneous domain was defined as an internal Dirichlet problem using the finite element method. The boundary conditions were obtained by analysis of the vector potential field excited by external current-driven coils. Results The results include evaluation and graphical representation of the induced current field distribution at various stimulation coil positions. A comparative study for the real non-homogeneous structure with anisotropic conductivities of the tissues and a mock homogeneous medium is also presented. The possibility of achieving selective stimulation of either of the two nerves is assessed. Conclusion The model developed could be useful in theoretical prediction of the current distribution in the nerves during diagnostic stimulation and therapeutic procedures involving electromagnetic excitation. The errors in applying homogeneous domain modeling rather than real non-homogeneous biological structures are demonstrated. The practical implications of the applied approach are valid for any arbitrary weakly conductive medium.

  11. Monte Carlo simulation of fully Markovian stochastic geometries

    International Nuclear Information System (INIS)

    Lepage, Thibaut; Delaby, Lucie; Malvagi, Fausto; Mazzolo, Alain

    2010-01-01

    The interest in resolving the equation of transport in stochastic media has continued to increase these last years. For binary stochastic media it is often assumed that the geometry is Markovian, which is never the case in usual environments. In the present paper, based on rigorous mathematical theorems, we construct fully two-dimensional Markovian stochastic geometries and we study their main properties. In particular, we determine a percolation threshold p_c, equal to 0.586 ± 0.0015 for such geometries. Finally, Monte Carlo simulations are performed through these geometries and the results compared to homogeneous geometries. (author)
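
    As a rough illustration of what "Markovian" means for a binary mixture, the sketch below samples the one-dimensional analogue of such a geometry, in which the chord lengths in each material are exponentially distributed. The paper itself constructs genuinely two-dimensional Markovian geometries from Poisson line tessellations; the sampler, names and parameters here are only illustrative.

    ```python
    # Hypothetical 1D analogue of a Markovian binary stochastic geometry:
    # alternating segments of materials 0 and 1 whose lengths are drawn from
    # exponential distributions with means lam0 and lam1.
    import numpy as np

    def sample_markovian_segments(length, lam0, lam1, rng):
        """Return a list of (material, start, end) tuples covering [0, length]."""
        segments, x = [], 0.0
        mat = int(rng.integers(0, 2))      # random starting material
        while x < length:
            lam = lam0 if mat == 0 else lam1
            chord = rng.exponential(lam)
            segments.append((mat, x, min(x + chord, length)))
            x += chord
            mat = 1 - mat                  # alternate materials at each interface
        return segments

    rng = np.random.default_rng(42)
    segs = sample_markovian_segments(length=100.0, lam0=1.0, lam1=2.0, rng=rng)

    # Volume fraction of material 0; it should approach lam0/(lam0 + lam1) = 1/3
    # as the sampled domain grows.
    frac0 = sum(end - start for mat, start, end in segs if mat == 0) / 100.0
    print(f"sampled volume fraction of material 0: {frac0:.3f}")
    ```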

  12. Fully implicit kinetic modelling of collisional plasmas

    International Nuclear Information System (INIS)

    Mousseau, V.A.

    1996-05-01

    This dissertation describes a numerical technique, Matrix-Free Newton Krylov, for solving a simplified Vlasov-Fokker-Planck equation. This method is both deterministic and fully implicit, and may not have been a viable option before current developments in numerical methods. Results are presented that indicate the efficiency of the Matrix-Free Newton Krylov method for these fully-coupled, nonlinear integro-differential equations. The use of and requirement for advanced differencing are also shown. To this end, implementations of Chang-Cooper differencing and flux limited Quadratic Upstream Interpolation for Convective Kinematics (QUICK) are presented. Results are given for a fully kinetic ion-electron problem with a self-consistent electric field calculated from the ion and electron distribution functions. This numerical method, including advanced differencing, provides accurate solutions, which converge quickly on workstation class machines. It is demonstrated that efficient steady-state solutions of the non-linear integro-differential equation can be achieved, obtaining quadratic convergence, without incurring the large memory requirements of an integral operator. Model problems are presented which simulate plasma impinging on a plate with both high and low neutral particle recycling, typical of a divertor in a Tokamak device. These model problems demonstrate the performance of the new solution method
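
    The dissertation's solver is not reproduced in this record, but the core idea of a matrix-free Newton-Krylov iteration can be sketched with SciPy's built-in implementation on a toy nonlinear boundary-value problem; the residual, grid and tolerances below are invented for illustration and are far simpler than the Vlasov-Fokker-Planck residual discussed above.

    ```python
    # Minimal matrix-free Newton-Krylov sketch: solve -u'' + u**3 = 1 on (0, 1)
    # with u(0) = u(1) = 0, discretized by central differences. The Jacobian is
    # never formed explicitly; newton_krylov only needs residual evaluations.
    import numpy as np
    from scipy.optimize import newton_krylov

    n = 100
    h = 1.0 / (n + 1)

    def residual(u):
        # interior-point residual of the discretized nonlinear BVP
        upad = np.concatenate(([0.0], u, [0.0]))      # Dirichlet boundary values
        lap = (upad[:-2] - 2.0 * upad[1:-1] + upad[2:]) / h**2
        return -lap + u**3 - 1.0

    u0 = np.zeros(n)                                  # initial guess
    u = newton_krylov(residual, u0, method='lgmres', f_tol=1e-10)
    print("max |residual| =", np.abs(residual(u)).max())
    ```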

  13. Improving homogeneity by dynamic speed limit systems.

    NARCIS (Netherlands)

    Nes, N. van; Brandenberg, S. & Twisk, D.A.M.

    2010-01-01

    Homogeneity of driving speeds is an important variable in determining road safety; more homogeneous driving speeds increase road safety. This study investigates the effect of introducing dynamic speed limit systems on homogeneity of driving speeds. A total of 46 subjects twice drove a route along 12

  14. 7 CFR 58.636 - Homogenization.

    Science.gov (United States)

    2010-01-01

    7 CFR 58.636 - Homogenization. Section 58.636, Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards... Procedures § 58.636 Homogenization. Homogenization of the pasteurized mix shall be accomplished to...

  15. The homogeneous geometries of real hyperbolic space

    DEFF Research Database (Denmark)

    Castrillón López, Marco; Gadea, Pedro Martínez; Swann, Andrew Francis

    We describe the holonomy algebras of all canonical connections of homogeneous structures on real hyperbolic spaces in all dimensions. The structural results obtained then lead to a determination of the types, in the sense of Tricerri and Vanhecke, of the corresponding homogeneous tensors. We use...... our analysis to show that the moduli space of homogeneous structures on real hyperbolic space has two connected components....

  16. Orthogonality Measurement for Homogenous Projects-Bases

    Science.gov (United States)

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

    The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  17. Applications of a systematic homogenization theory for nodal diffusion methods

    International Nuclear Information System (INIS)

    Zhang, Hong-bin; Dorning, J.J.

    1992-01-01

    The authors have recently developed a self-consistent and systematic lattice cell and fuel bundle homogenization theory, based on a multiple-spatial-scales asymptotic expansion of the transport equation in the ratio of the mean free path to the reactor characteristic dimension, for use with nodal diffusion methods. The mathematical development leads naturally to self-consistent analytical expressions for homogenized diffusion coefficients and cross sections and flux discontinuity factors to be used in nodal diffusion calculations. The expressions for the homogenized nuclear parameters that follow from the systematic homogenization theory (SHT) are different from those for the traditional flux- and volume-weighted (FVW) parameters. The calculations summarized here show that the systematic homogenization theory developed recently for nodal diffusion methods yields accurate values for k_eff and assembly powers even when compared with the results of a fine-mesh transport calculation. Thus, it provides a practical alternative to equivalence theory and GET (Ref. 3) and to simplified equivalence theory, which requires auxiliary fine-mesh calculations for assemblies embedded in a typical environment to determine the discontinuity factors and the equivalent diffusion coefficient for a homogenized assembly.
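
    For reference, the traditional flux- and volume-weighted (FVW) homogenization that the abstract contrasts with SHT reduces to a reaction-rate-preserving average of fine-mesh cross sections. The sketch below illustrates only that FVW baseline with made-up numbers; the SHT expressions and the discontinuity factors they produce are not reproduced here.

```python
import numpy as np

# Flux-and-volume-weighted (FVW) homogenization over the regions of an assembly:
# the homogenized cross section preserves the assembly-integrated reaction rate.
# Region volumes, fluxes and cross sections below are illustrative assumptions.
V   = np.array([1.0, 1.0, 0.5, 0.5])       # region volumes
phi = np.array([1.00, 0.92, 0.78, 0.70])   # region-averaged scalar fluxes
sig = np.array([0.10, 0.11, 0.35, 0.02])   # region cross sections (e.g. absorption)

sigma_hom = np.sum(sig * phi * V) / np.sum(phi * V)
print(f"FVW homogenized cross section: {sigma_hom:.4f}")
```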

  18. The evaporative vector: Homogeneous systems

    International Nuclear Information System (INIS)

    Klots, C.E.

    1987-05-01

    Molecular beams of van der Waals molecules are the subject of much current research. Among the methods used to form these beams, three - sputtering, laser ablation, and the sonic nozzle expansion of neat gases - yield what are now recognized to be "warm clusters." They contain enough internal energy to undergo a number of first-order processes, in particular that of evaporation. Because of this evaporation and its attendant cooling, the properties of such clusters are time-dependent. The states of matter which can be arrived at via an evaporative vector on a typical laboratory time-scale are discussed. Topics include the (1) temperatures, (2) metastability, (3) phase transitions, (4) kinetic energies of fragmentation, and (5) the expression of magical properties, all for evaporating homogeneous clusters.

  19. Osteoarthritic cartilage is more homogeneous than healthy cartilage

    DEFF Research Database (Denmark)

    Qazi, Arish A; Dam, Erik B; Nielsen, Mads

    2007-01-01

    it evolves as a consequence of disease and thereby can be used as a progression biomarker. MATERIALS AND METHODS: A total of 283 right and left knees from 159 subjects aged 21 to 81 years were scanned using a Turbo 3D T1 sequence on a 0.18-T MRI Esaote scanner. The medial compartment of the tibial cartilage...... sheet was segmented using a fully automatic voxel classification scheme based on supervised learning. From the segmented cartilage sheet, homogeneity was quantified by measuring entropy from the distribution of signal intensities inside the compartment. Each knee was examined by radiography...... of the region was evaluated by testing for overfitting. Three different regularization techniques were evaluated for reducing overfitting errors. RESULTS: The P values for separating the different groups based on cartilage homogeneity were 2 x 10(-5) (KL 0 versus KL 1) and 1 x 10(-7) (KL 0 versus KL >0). Using...

  20. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  1. Restaurant No. 1 fully renovated

    CERN Document Server

    2007-01-01

    The Restaurant No. 1 team. After several months of patience and goodwill on the part of our clients, we are delighted to announce that the major renovation work which began in September 2006 has now been completed. From 21 May 2007 we look forward to welcoming you to a completely renovated restaurant area designed with you in mind. The restaurant team wishes to thank all its clients for their patience and loyalty. Particular attention has been paid in the new design to creating a spacious serving area and providing a wider choice of dishes. The new restaurant area has been designed as an open-plan space to enable you to view all the dishes before making your selection and to move around freely from one food access point to another. It comprises user-friendly areas that fully comply with hygiene standards. From now on you will be able to pick and choose to your heart's content. We invite you to try out wok cooking or some other speciality. Or select a pizza or a plate of pasta with a choice of two sauces fr...

  2. Development of on-chip fully automated immunoassay system "μTASWako i30" to measure the changes in glycosylation profiles of alpha-fetoprotein in patients with hepatocellular carcinoma.

    Science.gov (United States)

    Kurosawa, Tatsuo; Watanabe, Mitsuo

    2016-12-01

    Glycosylation profiles change significantly during oncogenesis, and aberrant glycosylation can be used as a cancer biomarker in clinical settings. Different glycoforms can be separately detected using lectin affinity electrophoresis and lectin array-based methods. However, most methodologies and procedures require experienced technicians to perform the assays and expertise to interpret the results. To apply glycomarkers in clinical practice, a robust assay system with an easy-to-use workflow is required. Wako's μTASWako i30, a fully automated immunoanalyzer, was developed for in vitro diagnostics based on microfluidic technology. It utilizes the principles of the liquid-phase binding assay, where immunoreactions are performed in a liquid phase, and of electrokinetic analyte transport assay. Capillary electrophoresis on a microfluidic chip has enabled the detection of different glycoform types of alpha-fetoprotein (AFP), a serum biomarker for hepatocellular carcinoma. AFP with altered glycosylation can be separated on electrophoresis based on its reactivity to Lens culinaris agglutinin. The glycoform AFP-L3 is reportedly more specific for hepatocellular carcinoma. The assay system provides high sensitivity and rapid results within 9 min. The test results for the ratio of AFP-L3 to total AFP using the μTASWako i30 correlate with those of the conventional methodology. The μTASWako assay system and its technology can be utilized for glycosylation analysis in the postgenomic era. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. A game-theoretic formulation of the homogeneous self-reconfiguration problem

    KAUST Repository

    Pickem, Daniel

    2015-12-15

    In this paper we formulate the homogeneous two- and three-dimensional self-reconfiguration problem over discrete grids as a constrained potential game. We develop a game-theoretic learning algorithm based on the Metropolis-Hastings algorithm that solves the self-reconfiguration problem in a globally optimal fashion. Both a centralized and a fully decentralized algorithm are presented and we show that the only stochastically stable state is the potential function maximizer, i.e. the desired target configuration. These algorithms compute transition probabilities in such a way that even though each agent acts in a self-interested way, the overall collective goal of self-reconfiguration is achieved. Simulation results confirm the feasibility of our approach and show convergence to desired target configurations.

  4. A game-theoretic formulation of the homogeneous self-reconfiguration problem

    KAUST Repository

    Pickem, Daniel; Egerstedt, Magnus; Shamma, Jeff S.

    2015-01-01

    In this paper we formulate the homogeneous two- and three-dimensional self-reconfiguration problem over discrete grids as a constrained potential game. We develop a game-theoretic learning algorithm based on the Metropolis-Hastings algorithm that solves the self-reconfiguration problem in a globally optimal fashion. Both a centralized and a fully decentralized algorithm are presented and we show that the only stochastically stable state is the potential function maximizer, i.e. the desired target configuration. These algorithms compute transition probabilities in such a way that even though each agent acts in a self-interested way, the overall collective goal of self-reconfiguration is achieved. Simulation results confirm the feasibility of our approach and show convergence to desired target configurations.
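
    The learning rule described above can be illustrated with a stripped-down sketch (an assumption, not the authors' implementation): agents on a discrete grid repeatedly propose local moves and accept them with a Boltzmann probability in the change of a global potential, so that as the temperature is lowered the chain concentrates on potential-maximizing configurations. Connectivity and the other self-reconfiguration constraints of the paper are omitted, and all parameters are made up.

```python
import numpy as np

# Metropolis-style learning of a target configuration on a grid: each agent acts
# on local proposals only, yet the accepted moves drive the global potential up.
rng = np.random.default_rng(1)
GRID = 8
target = {(3, 3), (3, 4), (4, 3), (4, 4)}        # desired configuration (assumed)
agents = [(0, 0), (0, 7), (7, 0), (7, 7)]        # initial positions (assumed)
T = 0.15                                         # "temperature" of the learning rule

def potential(positions):
    # Higher is better: reward each agent by how close it is to the nearest target cell.
    return -sum(min(abs(x - tx) + abs(y - ty) for tx, ty in target) for x, y in positions)

def neighbors(cell):
    x, y = cell
    cand = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [(a, b) for a, b in cand if 0 <= a < GRID and 0 <= b < GRID]

for step in range(20000):
    i = rng.integers(len(agents))                # pick one agent at random
    occupied = set(agents)
    free = [c for c in neighbors(agents[i]) if c not in occupied]
    if not free:
        continue
    proposal = agents.copy()
    proposal[i] = free[rng.integers(len(free))]
    dphi = potential(proposal) - potential(agents)
    # Metropolis acceptance in the potential difference
    # (the Hastings proposal-ratio correction is omitted for simplicity).
    if rng.random() < min(1.0, np.exp(dphi / T)):
        agents = proposal

print("final positions:", sorted(agents), " potential:", potential(agents))
```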

  5. Homogenization of the lipid profile values.

    Science.gov (United States)

    Pedro-Botet, Juan; Rodríguez-Padial, Luis; Brotons, Carlos; Esteban-Salán, Margarita; García-Lerín, Aurora; Pintó, Xavier; Lekuona, Iñaki; Ordóñez-Llanos, Jordi

    Analytical reports from the clinical laboratory are essential to guide clinicians about which lipid profile values should be considered altered and, therefore, require intervention. Unfortunately, there is great heterogeneity in the lipid values reported as "normal, desirable, recommended or reference" by clinical laboratories. This can complicate clinical decisions and be a barrier to achieving the therapeutic goals for cardiovascular prevention. A recent international recommendation has added a new source of heterogeneity in the interpretation of the lipid profile: the possibility of measuring it without previous fasting. All this justifies the need for a document that adapts the existing knowledge to the clinical practice of our health system. In this regard, professionals from different scientific societies involved in the measurement and use of lipid profile data have developed this document to establish recommendations that facilitate their homogenization. Copyright © 2017. Published by Elsevier España, S.L.U.

  6. Method of the characteristics for calculation of VVER without homogenization

    Energy Technology Data Exchange (ETDEWEB)

    Suslov, I.R.; Komlev, O.G.; Novikova, N.N.; Zemskov, E.A.; Tormyshev, I.V.; Melnikov, K.G.; Sidorov, E.B. [Institute of Physics and Power Engineering, Obninsk (Russian Federation)

    2005-07-01

    The first stage of the development of the characteristics code MCCG3D for calculation of VVER-type reactors without homogenization is presented. A parallel MPI version of the code was developed and tested on a PC cluster running Linux. Further development of the MCCG3D code for design-level calculations with full-scale space-distributed feedbacks is discussed. For validation of the MCCG3D code we use the critical assembly VENUS-2. Geometrical models with and without homogenization have been used. With both models the MCCG3D results agree well with the experimental power distribution and with results generated by the other codes, but the model without homogenization provides better results. The perturbation theory for the MCCG3D code was developed and implemented in the module KEFSFGG. The calculations with KEFSFGG are in good agreement with direct calculations. (authors)

  7. Fully CMOS-compatible titanium nitride nanoantennas

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, Justin A., E-mail: jabriggs@stanford.edu [Department of Applied Physics, Stanford University, 348 Via Pueblo Mall, Stanford, California 94305 (United States); Department of Materials Science and Engineering, Stanford University, 496 Lomita Mall, Stanford, California 94305 (United States); Naik, Gururaj V.; Baum, Brian K.; Dionne, Jennifer A. [Department of Materials Science and Engineering, Stanford University, 496 Lomita Mall, Stanford, California 94305 (United States); Petach, Trevor A.; Goldhaber-Gordon, David [Department of Physics, Stanford University, 382 Via Pueblo Mall, Stanford, California 94305 (United States)

    2016-02-01

    CMOS-compatible fabrication of plasmonic materials and devices will accelerate the development of integrated nanophotonics for information processing applications. Using low-temperature plasma-enhanced atomic layer deposition (PEALD), we develop a recipe for fully CMOS-compatible titanium nitride (TiN) that is plasmonic in the visible and near infrared. Films are grown on silicon, silicon dioxide, and epitaxially on magnesium oxide substrates. By optimizing the plasma exposure per growth cycle during PEALD, carbon and oxygen contamination are reduced, lowering undesirable loss. We use electron beam lithography to pattern TiN nanopillars with varying diameters on silicon in large-area arrays. In the first reported single-particle measurements on plasmonic TiN, we demonstrate size-tunable darkfield scattering spectroscopy in the visible and near infrared regimes. The optical properties of this CMOS-compatible material, combined with its high melting temperature and mechanical durability, comprise a step towards fully CMOS-integrated nanophotonic information processing.

  8. Reciprocity theory of homogeneous reactions

    Science.gov (United States)

    Agbormbai, Adolf A.

    1990-03-01

    The reciprocity formalism is applied to the homogeneous gaseous reactions in which the structure of the participating molecules changes upon collision with one another, resulting in a change in the composition of the gas. The approach is applied to various classes of dissociation, recombination, rearrangement, ionizing, and photochemical reactions. It is shown that for the principle of reciprocity to be satisfied it is necessary that all chemical reactions exist in complementary pairs which consist of the forward and backward reactions. The backward reaction may be described by either the reverse or inverse process. The forward and backward processes must satisfy the same reciprocity equation. Because the number of dynamical variables is usually unbalanced on both sides of a chemical equation, it is necessary that this balance be established by including as many of the dynamical variables as needed before the reciprocity equation can be formulated. Statistical transformation models of the reactions are formulated. The models are classified under the titles free exchange, restricted exchange and simplified restricted exchange. The special equations for the forward and backward processes are obtained. The models are consistent with the H theorem and Le Chatelier's principle. The models are also formulated in the context of the direct simulation Monte Carlo method.

  9. Moral Beliefs and Cognitive Homogeneity

    Directory of Open Access Journals (Sweden)

    Nevia Dolcini

    2018-04-01

    Full Text Available The Emotional Perception Model of moral judgment intends to account for experientialism about morality and moral reasoning. In explaining how moral beliefs are formed and applied in practical reasoning, the model attempts to overcome the mismatch between reason and action/desire: morality isn't about reasons for action, yet moral beliefs, if caused by desires, may play a motivational role in (moral) agency. The account allows for two kinds of moral beliefs: genuine moral beliefs, which enjoy a relation to desire, and motivationally inert moral beliefs acquired in ways other than experience. Such an etiology-based dichotomy of concepts, I will argue, leads to the undesirable view of cognition as a non-homogeneous phenomenon. Moreover, the distinction between the two kinds of moral beliefs would entail a further dichotomy encompassing the domain of moral agency: one and the same action might possibly be either genuinely moral or not moral, if performed by individuals lacking the capacity for moral feelings, such as psychopaths.

  10. Homogeneous modes of cosmological instantons

    Energy Technology Data Exchange (ETDEWEB)

    Gratton, Steven; Turok, Neil

    2001-06-15

    We discuss the O(4) invariant perturbation modes of cosmological instantons. These modes are spatially homogeneous in Lorentzian spacetime and thus not relevant to density perturbations. But their properties are important in establishing the meaning of the Euclidean path integral. If negative modes are present, the Euclidean path integral is not well defined, but may nevertheless be useful in an approximate description of the decay of an unstable state. When gravitational dynamics is included, counting negative modes requires a careful treatment of the conformal factor problem. We demonstrate that for an appropriate choice of coordinate on phase space, the second order Euclidean action is bounded below for normalized perturbations and has a finite number of negative modes. We prove that there is a negative mode for many gravitational instantons of the Hawking-Moss or Coleman-De Luccia type, and discuss the associated spectral flow. We also investigate Hawking-Turok constrained instantons, which occur in a generic inflationary model. Implementing the regularization and constraint proposed by Kirklin, Turok and Wiseman, we find that those instantons leading to substantial inflation do not possess negative modes. Using an alternate regularization and constraint motivated by reduction from five dimensions, we find a negative mode is present. These investigations shed new light on the suitability of Euclidean quantum gravity as a potential description of our universe.

  11. Homogeneous modes of cosmological instantons

    International Nuclear Information System (INIS)

    Gratton, Steven; Turok, Neil

    2001-01-01

    We discuss the O(4) invariant perturbation modes of cosmological instantons. These modes are spatially homogeneous in Lorentzian spacetime and thus not relevant to density perturbations. But their properties are important in establishing the meaning of the Euclidean path integral. If negative modes are present, the Euclidean path integral is not well defined, but may nevertheless be useful in an approximate description of the decay of an unstable state. When gravitational dynamics is included, counting negative modes requires a careful treatment of the conformal factor problem. We demonstrate that for an appropriate choice of coordinate on phase space, the second order Euclidean action is bounded below for normalized perturbations and has a finite number of negative modes. We prove that there is a negative mode for many gravitational instantons of the Hawking-Moss or Coleman-De Luccia type, and discuss the associated spectral flow. We also investigate Hawking-Turok constrained instantons, which occur in a generic inflationary model. Implementing the regularization and constraint proposed by Kirklin, Turok and Wiseman, we find that those instantons leading to substantial inflation do not possess negative modes. Using an alternate regularization and constraint motivated by reduction from five dimensions, we find a negative mode is present. These investigations shed new light on the suitability of Euclidean quantum gravity as a potential description of our universe.

  12. Homogeneity and thermodynamic identities in geometrothermodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Quevedo, Hernando [Universidad Nacional Autonoma de Mexico, Instituto de Ciencias Nucleares (Mexico); Universita di Roma 'La Sapienza', Dipartimento di Fisica, Rome (Italy); ICRANet, Rome (Italy); Quevedo, Maria N. [Universidad Militar Nueva Granada, Departamento de Matematicas, Facultad de Ciencias Basicas, Bogota (Colombia); Sanchez, Alberto [CIIDET, Departamento de Posgrado, Queretaro (Mexico)

    2017-03-15

    We propose a classification of thermodynamic systems in terms of the homogeneity properties of their fundamental equations. Ordinary systems correspond to homogeneous functions and non-ordinary systems are given by generalized homogeneous functions. This affects the explicit form of the Gibbs-Duhem relation and Euler's identity. We show that these generalized relations can be implemented in the formalism of black hole geometrothermodynamics in order to completely fix the arbitrariness present in Legendre invariant metrics. (orig.)
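
    Euler's identity for homogeneous functions, which the abstract generalizes to quasi-homogeneous fundamental equations, can be checked numerically. The sketch below does so for an assumed toy function of degree 3/2; it is an illustration of the classical identity only, not of the generalized relations or the black hole metrics discussed in the paper.

```python
import numpy as np

# Numerical check of Euler's identity for a homogeneous function of degree k:
# if f(t*x) = t**k * f(x) for all t > 0, then sum_i x_i * df/dx_i = k * f(x).
# The test function below is an assumption, not a fundamental equation from the paper.
def f(x):
    # Homogeneous of degree k = 3/2 in (x1, x2).
    return x[0] ** 0.5 * x[1] + 0.2 * x[1] ** 1.5

x = np.array([2.0, 3.0])
k = 1.5
eps = 1e-6
grad = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps) for e in np.eye(2)])
print("x . grad f =", x @ grad, "   k * f(x) =", k * f(x))   # the two should agree
```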

  13. Development of a fully automated sequential injection solid-phase extraction procedure coupled to liquid chromatography to determine free 2-hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid in human urine.

    Science.gov (United States)

    León, Zacarías; Chisvert, Alberto; Balaguer, Angel; Salvador, Amparo

    2010-04-07

    2-Hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid, commonly known as benzophenone-3 (BZ3) and benzophenone-4 (BZ4), respectively, are substances widely used as UV filters in cosmetic products in order to absorb UV radiation and protect human skin from direct exposure to the deleterious wavelengths of sunlight. As with other UV filters, there is evidence of their percutaneous absorption. This work describes an analytical method developed to determine trace levels of free BZ3 and BZ4 in human urine. The methodology is based on a solid-phase extraction (SPE) procedure for clean-up and pre-concentration, followed by the monitoring of the UV filters by liquid chromatography-ultraviolet spectrophotometry detection (LC-UV). In order to improve not only the sensitivity and selectivity, but also the precision of the method, the principle of sequential injection analysis was used to automate the SPE process and to transfer the eluates from the SPE to the LC system. The application of a six-channel valve as an interface for the switching arrangements successfully allowed the on-line connection of SPE sample processing with LC analysis. The SPE process for BZ3 and BZ4 was performed using octadecyl (C18) and diethylaminopropyl (DEA) modified silica microcolumns, respectively, in which the analytes were retained and eluted selectively. Due to the matrix effects, the determination was based on standard addition quantification and was fully validated. The relative standard deviations of the results were 13% and 6% for BZ3 and BZ4, respectively, whereas the limits of detection were 60 and 30 ng mL(-1), respectively. The method was satisfactorily applied to determine BZ3 and BZ4 in urine from volunteers that had applied a sunscreen cosmetic containing both UV filters. Copyright 2010 Elsevier B.V. All rights reserved.

  14. Development of a fully automated sequential injection solid-phase extraction procedure coupled to liquid chromatography to determine free 2-hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid in human urine

    International Nuclear Information System (INIS)

    Leon, Zacarias; Chisvert, Alberto; Balaguer, Angel; Salvador, Amparo

    2010-01-01

    2-Hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid, commonly known as benzophenone-3 (BZ3) and benzophenone-4 (BZ4), respectively, are substances widely used as UV filters in cosmetic products in order to absorb UV radiation and protect human skin from direct exposure to the deleterious wavelengths of sunlight. As with other UV filters, there is evidence of their percutaneous absorption. This work describes an analytical method developed to determine trace levels of free BZ3 and BZ4 in human urine. The methodology is based on a solid-phase extraction (SPE) procedure for clean-up and pre-concentration, followed by the monitoring of the UV filters by liquid chromatography-ultraviolet spectrophotometry detection (LC-UV). In order to improve not only the sensitivity and selectivity, but also the precision of the method, the principle of sequential injection analysis was used to automate the SPE process and to transfer the eluates from the SPE to the LC system. The application of a six-channel valve as an interface for the switching arrangements successfully allowed the on-line connection of SPE sample processing with LC analysis. The SPE process for BZ3 and BZ4 was performed using octadecyl (C18) and diethylaminopropyl (DEA) modified silica microcolumns, respectively, in which the analytes were retained and eluted selectively. Due to the matrix effects, the determination was based on standard addition quantification and was fully validated. The relative standard deviations of the results were 13% and 6% for BZ3 and BZ4, respectively, whereas the limits of detection were 60 and 30 ng mL(-1), respectively. The method was satisfactorily applied to determine BZ3 and BZ4 in urine from volunteers that had applied a sunscreen cosmetic containing both UV filters.

  15. Homogenization in thermoelasticity: application to composite materials

    Energy Technology Data Exchange (ETDEWEB)

    Peyroux, R [Lab. de Mecanique et Genie Civil, Univ. Montpellier 2, 34 Montpellier (France); Licht, C [Lab. de Mecanique et Genie Civil, Univ. Montpellier 2, 34 Montpellier (France)

    1993-11-01

    One of the obstacles to the industrial use of metal matrix composite materials is the damage they rapidly undergo when subjected to cyclic thermal loadings; local thermal stresses of high level can develop, sometimes near or above the elastic limit, due to the mismatch of the elastic and thermal coefficients of the fibers and the matrix. For the same reasons, early cracks can appear in composites like ceramic-ceramic. Therefore, we investigate the linear thermoelastic behaviour of heterogeneous materials, taking account of the isentropic coupling term in the heat conduction equation. In the case of periodic materials, recent results using homogenization theory allowed us to describe the macroscopic and microscopic behaviour of such materials. This paper is concerned with the numerical simulation of this problem by a finite element method, using a multiscale approach. (orig.).

  16. Towards a Scalable Fully-Implicit Fully-coupled Resistive MHD Formulation with Stabilized FE Methods

    Energy Technology Data Exchange (ETDEWEB)

    Shadid, J N; Pawlowski, R P; Banks, J W; Chacon, L; Lin, P T; Tuminaro, R S

    2009-06-03

    This paper presents an initial study that is intended to explore the development of a scalable fully-implicit stabilized unstructured finite element (FE) capability for low-Mach-number resistive MHD. The discussion considers the development of the stabilized FE formulation and the underlying fully-coupled preconditioned Newton-Krylov nonlinear iterative solver. To enable robust, scalable and efficient solution of the large-scale sparse linear systems generated by the Newton linearization, fully-coupled algebraic multilevel preconditioners are employed. Verification results demonstrate the expected order-of-accuracy for the stabilized FE discretization of a 2D vector potential form for the steady and transient solution of the resistive MHD system. In addition, this study puts forth a set of challenging prototype problems that include the solution of an MHD Faraday conduction pump, a hydromagnetic Rayleigh-Bénard linear stability calculation, and a magnetic island coalescence problem. Initial results that explore the scaling of the solution methods are presented on up to 4096 processors for problems with up to 64M unknowns on a Cray XT3/4. Additionally, a large-scale proof-of-capability calculation for 1 billion unknowns for the MHD Faraday pump problem on 24,000 cores is presented.

  17. A novel fully integrated handheld gamma camera

    International Nuclear Information System (INIS)

    Massari, R.; Ucci, A.; Campisi, C.; Scopinaro, F.; Soluri, A.

    2016-01-01

    In this paper, we present an innovative, fully integrated handheld gamma camera, designed to combine in the same device the gamma-ray detector, the display and the embedded computing system. The low power consumption allows the prototype to be battery operated. To be useful in radioguided surgery, an intraoperative gamma camera must be very easy to handle, since it must be moved to find a suitable view. Consequently, we have developed the first prototype of a fully integrated, compact and lightweight gamma camera for fast imaging of radiopharmaceuticals. The device can operate without cables across the sterile field, so it may be easily used in the operating theater for radioguided surgery. The prototype consists of a Silicon Photomultiplier (SiPM) array coupled with a proprietary scintillation structure based on CsI(Tl) crystals. To read the SiPM output signals, we have developed very low power readout electronics and a dedicated analog-to-digital conversion system. One of the most critical aspects we faced in designing the prototype was the low power consumption, which is mandatory for a battery operated device. We have applied this detection device to the lymphoscintigraphy technique (sentinel lymph node mapping), comparing the results obtained with those of a commercial gamma camera (Philips SKYLight). The results confirm a rapid response of the device and an adequate spatial resolution for use in scintigraphic imaging. This work confirms the feasibility of a small gamma camera with an integrated display. This device is designed for radioguided surgery and small organ imaging, but it could be easily combined with surgical navigation systems.

  18. A novel fully integrated handheld gamma camera

    Energy Technology Data Exchange (ETDEWEB)

    Massari, R.; Ucci, A.; Campisi, C. [Biostructure and Bioimaging Institute (IBB), National Research Council of Italy (CNR), Rome (Italy); Scopinaro, F. [University of Rome “La Sapienza”, S. Andrea Hospital, Rome (Italy); Soluri, A., E-mail: alessandro.soluri@ibb.cnr.it [Biostructure and Bioimaging Institute (IBB), National Research Council of Italy (CNR), Rome (Italy)

    2016-10-01

    In this paper, we present an innovative, fully integrated handheld gamma camera, designed to combine in the same device the gamma-ray detector, the display and the embedded computing system. The low power consumption allows the prototype to be battery operated. To be useful in radioguided surgery, an intraoperative gamma camera must be very easy to handle, since it must be moved to find a suitable view. Consequently, we have developed the first prototype of a fully integrated, compact and lightweight gamma camera for fast imaging of radiopharmaceuticals. The device can operate without cables across the sterile field, so it may be easily used in the operating theater for radioguided surgery. The prototype consists of a Silicon Photomultiplier (SiPM) array coupled with a proprietary scintillation structure based on CsI(Tl) crystals. To read the SiPM output signals, we have developed very low power readout electronics and a dedicated analog-to-digital conversion system. One of the most critical aspects we faced in designing the prototype was the low power consumption, which is mandatory for a battery operated device. We have applied this detection device to the lymphoscintigraphy technique (sentinel lymph node mapping), comparing the results obtained with those of a commercial gamma camera (Philips SKYLight). The results confirm a rapid response of the device and an adequate spatial resolution for use in scintigraphic imaging. This work confirms the feasibility of a small gamma camera with an integrated display. This device is designed for radioguided surgery and small organ imaging, but it could be easily combined with surgical navigation systems.

  19. Self-consolidating concrete homogeneity

    Directory of Open Access Journals (Sweden)

    Jarque, J. C.

    2007-08-01

    Full Text Available Concrete instability may lead to the non-uniform distribution of its properties. The homogeneity of self-consolidating concrete in vertically cast members was therefore explored in this study, analyzing both resistance to segregation and pore structure uniformity. To this end, two series of concretes were prepared, self-consolidating and traditional vibrated materials, with different w/c ratios and types of cement. The results showed that self-consolidating concretes exhibit high resistance to segregation, albeit slightly lower than found in the traditional mixtures. The pore structure in the former, however, tended to be slightly more uniform, probably as a result of less intense bleeding. Such concretes are also characterized by greater bulk density, lower porosity and smaller mean pore size, which translates into a higher resistance to pressurized water. For pore diameters of over about 0.5 μm, however, the pore size distribution was found to be similar to the distribution in traditional concretes, with similar absorption rates.

  20. Bounds for nonlinear composites via iterated homogenization

    Science.gov (United States)

    Ponte Castañeda, P.

    2012-09-01

    Improved estimates of the Hashin-Shtrikman-Willis type are generated for the class of nonlinear composites consisting of two well-ordered, isotropic phases distributed randomly with prescribed two-point correlations, as determined by the H-measure of the microstructure. For this purpose, a novel strategy for generating bounds has been developed utilizing iterated homogenization. The general idea is to make use of bounds that may be available for composite materials in the limit when the concentration of one of the phases (say phase 1) is small. It then follows from the theory of iterated homogenization that it is possible, under certain conditions, to obtain bounds for more general values of the concentration, by gradually adding small amounts of phase 1 in incremental fashion, and sequentially using the available dilute-concentration estimate, up to the final (finite) value of the concentration (of phase 1). Such an approach can also be useful when available bounds are expected to be tighter for certain ranges of the phase volume fractions. This is the case, for example, for the "linear comparison" bounds for porous viscoplastic materials, which are known to be comparatively tighter for large values of the porosity. In this case, the new bounds obtained by the above-mentioned "iterated" procedure can be shown to be much improved relative to the earlier "linear comparison" bounds, especially at low values of the porosity and high triaxialities. Consistent with the way in which they have been derived, the new estimates are, strictly, bounds only for the class of multi-scale, nonlinear composites consisting of two well-ordered, isotropic phases that are distributed with prescribed H-measure at each stage in the incremental process. However, given the facts that the H-measure of the sequential microstructures is conserved (so that the final microstructures can be shown to have the same H-measure), and that H-measures are insensitive to length scales, it is conjectured

  1. Position-dependency of Fuel Pin Homogenization in a Pressurized Water Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Woong; Kim, Yonghee [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2016-05-15

    By considering the multi-physics effects more comprehensively, it is possible to acquire precise local parameters which can result in a more accurate core design and safety assessment. A conventional approach of the multi-physics neutronics calculation for the pressurized water reactor (PWR) is to apply nodal methods. Since the nodal methods are basically based on the use of assembly-wise homogenized parameters, additional pin power reconstruction processes are necessary to obtain local power information. In the past, pin-by-pin core calculation was impractical due to the limited computational hardware capability. With the rapid advancement of computer technology, it is now perhaps quite practical to perform the direct pin-by-pin core calculation. As such, fully heterogeneous transport solvers based on both stochastic and deterministic methods have been developed for the acquisition of exact local parameters. However, the 3-D transport reactor analysis is still challenging because of the very high computational requirement. Position-dependency of the fuel pin homogenized cross sections in a small PWR core has been quantified via comparison of infinite FA and 2-D whole core calculations with the use of high-fidelity MC simulations. It is found that the pin environmental effect is especially obvious in FAs bordering the baffle reflector regions. It is also noted that the downscattering cross section is rather sensitive to the spectrum changes of the pins. It is expected that the pinwise homogenized cross sections need to be corrected somehow for accurate pin-by-pin core calculations in the peripheral region of the reactor core.

  2. Selection of suitable prodrug candidates for in vivo studies via in vitro studies; the correlation of prodrug stability in between cell culture homogenates and human tissue homogenates.

    Science.gov (United States)

    Tsume, Yasuhiro; Amidon, Gordon L

    2012-01-01

    To determine the correlations/discrepancies of drug stabilities between the homogenates of human cultured cells and of human tissues. Amino acid/dipeptide monoester prodrugs of floxuridine were chosen as the model drugs. The stabilities (half-lives) of floxuridine prodrugs in human tissue (pancreas, liver, and small intestine) homogenates were obtained and compared with those in cell culture homogenates (AsPC-1, Capan-2, and Caco-2 cells) as well as human liver microsomes. The correlations of prodrug stability in human small bowel tissue homogenate vs. Caco-2 cell homogenate, human liver tissue homogenate vs. human liver microsomes, and human pancreatic tissue homogenate vs. pancreatic cell (AsPC-1 and Capan-2) homogenates were examined. The stabilities of floxuridine prodrugs in human small bowel homogenate correlated strongly with those in Caco-2 cell homogenate (slope = 1.0-1.3, r2 = 0.79-0.98). The stabilities of those prodrugs in human pancreas tissue homogenate also correlated well with those in AsPC-1 and Capan-2 cell homogenates (slope = 0.5-0.8, r2 = 0.58-0.79). However, the correlations of prodrug stabilities between human liver tissue homogenates and human liver microsomes were weaker than the others (slope = 1.3-1.9, r2 = 0.07-0.24). The correlations of drug stabilities in cultured cell homogenates and in human tissue homogenates were compared; the results exhibited a wide range of correlations (r2 = 0.07-0.98). These in vitro studies in cell homogenates would be good tools to predict drug stabilities in vivo and to select drug candidates for further development. In this series of experiments, 5'-O-D-valyl-floxuridine and 5'-O-L-phenylalanyl-L-tyrosyl-floxuridine would be selected as candidates for orally targeted drug delivery in cancer chemotherapy due to their relatively good stabilities compared to the other tested prodrugs.

  3. Multilevel Monte Carlo Approaches for Numerical Homogenization

    KAUST Repository

    Efendiev, Yalchin R.; Kronsbein, Cornelia; Legoll, Frédéric

    2015-01-01

    it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison

  4. Investigations into homogenization of electromagnetic metamaterials

    DEFF Research Database (Denmark)

    Clausen, Niels Christian Jerichau

    This dissertation encompasses homogenization methods, with special interest in their application to metamaterial homogenization. The first method studied is the Floquet-Bloch method, which is based on the assumption of a material being infinitely periodic. Its field can then be expanded in term

  5. Influence of homogenization treatment on physicochemical properties and enzymatic hydrolysis rate of pure cellulose fibers.

    Science.gov (United States)

    Jacquet, N; Vanderghem, C; Danthine, S; Blecker, C; Paquot, M

    2013-02-01

    The aim of this study was to compare the effect of different homogenization treatments on the physicochemical properties and the enzymatic hydrolysis rate of a pure bleached cellulose. The results obtained show that homogenization treatments improve the enzymatic hydrolysis rate of the cellulose fibers by 25 to 100%, depending on the homogenization treatment applied. Characterization of the samples also showed that homogenization had an impact on some physicochemical properties of the cellulose. For moderate treatment intensities (pressure below 500 b and degree of homogenization below 25), an increase in water retention values (WRV) that correlated with the increase in the hydrolysis rate was highlighted. Results also showed that the overall crystallinity of the cellulose appeared not to be impacted by the homogenization treatment. For higher treatment intensities, homogenized cellulose samples developed a stable three-dimensional network that contributes to decreased cellulase mobility and slows down the hydrolysis process.

  6. Development of a new extraction method based on counter current salting-out homogenous liquid-liquid extraction followed by dispersive liquid-liquid microextraction: Application for the extraction and preconcentration of widely used pesticides from fruit juices.

    Science.gov (United States)

    Farajzadeh, Mir Ali; Feriduni, Behruz; Mogaddam, Mohammad Reza Afshar

    2016-01-01

    In this paper, a new extraction method based on counter current salting-out homogenous liquid-liquid extraction (CCSHLLE) followed by dispersive liquid-liquid microextraction (DLLME) has been developed for the extraction and preconcentration of widely used pesticides in fruit juice samples prior to their analysis by gas chromatography-flame ionization detection (GC-FID). In this method, a small column is first filled with sodium chloride as a separation reagent, and a mixture of water (or fruit juice) and acetonitrile is passed through the column. As the mixture passes through, the sodium chloride dissolves and fine droplets of acetonitrile form due to the salting-out effect. The droplets rise through the remaining mixture and collect as a separate layer. Then, the collected organic phase (acetonitrile) is removed with a syringe and mixed with 1,1,2,2-tetrachloroethane (extraction solvent at µL level). In the second step, for further enrichment of the analytes, the above mixture is injected into 5 mL de-ionized water placed in a test tube with a conical bottom in order to dissolve the acetonitrile into the water and to obtain a sedimented phase at µL-level volume containing the enriched analytes. Under the optimal extraction conditions (extraction solvent, 1.5 mL acetonitrile; pH, 7; flow rate, 0.5 mL min(-1); preconcentration solvent, 20 µL 1,1,2,2-tetrachloroethane; NaCl concentration, 5%, w/w; and centrifugation rate and time, 5000 rpm and 5 min, respectively), the extraction recoveries and enrichment factors ranged from 87% to 96% and 544 to 600, respectively. Repeatability of the proposed method, expressed as relative standard deviations, ranged from 2% to 6% for intra-day (n=6, C=250 or 500 µg L(-1)) and inter-day (n=4, C=250 or 500 µg L(-1)) precisions. Limits of detection were between 2 and 12 µg L(-1). Finally, the proposed method was applied to the determination of the target pesticide residues in the juice samples. Copyright © 2015

  7. Homogeneity of Prototypical Attributes in Soccer Teams

    Directory of Open Access Journals (Sweden)

    Christian Zepp

    2015-09-01

    Full Text Available Research indicates that the homogeneous perception of prototypical attributes influences several intragroup processes. The aim of the present study was to describe the homogeneous perception of the prototype and to identify specific prototypical subcategories, which are perceived as homogeneous within sport teams. The sample consists of N = 20 soccer teams with a total of N = 278 athletes (age M = 23.5 years, SD = 5.0 years. The results reveal that subcategories describing the cohesiveness of the team and motivational attributes are mentioned homogeneously within sport teams. In addition, gender, identification, team size, and the championship ranking significantly correlate with the homogeneous perception of prototypical attributes. The results are discussed on the basis of theoretical and practical implications.

  8. Multilevel Monte Carlo Approaches for Numerical Homogenization

    KAUST Repository

    Efendiev, Yalchin R.

    2015-10-01

    In this article, we study the application of multilevel Monte Carlo (MLMC) approaches to numerical random homogenization. Our objective is to compute the expectation of some functionals of the homogenized coefficients, or of the homogenized solutions. This is accomplished within MLMC by considering different sizes of representative volumes (RVEs). Many inexpensive computations with the smallest RVE size are combined with fewer expensive computations performed on larger RVEs. Likewise, when it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison to a standard Monte Carlo method. Numerical results are presented for both one-dimensional and two-dimensional test-cases that illustrate the efficiency of the approach.
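
    A compact sketch of the telescoping-sum idea behind MLMC as described above, with a placeholder "RVE computation" standing in (as an assumption) for the real homogenization solves: coarse levels are sampled heavily, fine levels sparingly, and consecutive levels share the same random draw so that the correction terms have small variance.

```python
import numpy as np

rng = np.random.default_rng(2)

def level_sample(level, n_samples):
    # Placeholder for coupled RVE/homogenization computations on two consecutive
    # levels (an assumed toy model): both levels share the same random draw z, which
    # is what gives the MLMC correction terms their small variance. Bias and noise
    # both shrink as the level (RVE size / mesh resolution) increases.
    z = rng.standard_normal(n_samples)
    def q(l):
        return 1.0 + 0.5 ** (l + 1) + 0.3 * 0.5 ** l * z
    return q(0) if level == 0 else q(level) - q(level - 1)

n_levels = 4
N = [4000 // 2 ** l + 50 for l in range(n_levels)]   # many coarse, few fine samples

# Telescoping MLMC estimator: E[Q_L] = E[Q_0] + sum_l E[Q_l - Q_{l-1}]
estimate = sum(level_sample(l, N[l]).mean() for l in range(n_levels))
print("MLMC estimate of the homogenized quantity:", round(estimate, 4))
```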

  9. The Homogeneous Interior-Point Algorithm: Nonsymmetric Cones, Warmstarting, and Applications

    DEFF Research Database (Denmark)

    Skajaa, Anders

    algorithms for these problems is still limited. The goal of this thesis is to investigate and shed light on two computational aspects of homogeneous interior-point algorithms for convex conic optimization: The first part studies the possibility of devising a homogeneous interior-point method aimed at solving...... problems involving constraints that require nonsymmetric cones in their formulation. The second part studies the possibility of warmstarting the homogeneous interior-point algorithm for conic problems. The main outcome of the first part is the introduction of a completely new homogeneous interior-point algorithm designed to solve nonsymmetric convex conic optimization problems. The algorithm is presented in detail and then analyzed. We prove its convergence and complexity. From a theoretical viewpoint, it is fully competitive with other algorithms and from a practical viewpoint, we show that it holds lots...

  10. Pyroxene Homogenization and the Isotopic Systematics of Eucrites

    Science.gov (United States)

    Nyquist, L. E.; Bogard, D. D.

    1996-01-01

    The original Mg-Fe zoning of eucritic pyroxenes has in nearly all cases been partly homogenized, an observation that has been combined with other petrographic and compositional criteria to establish a scale of thermal "metamorphism" for eucrites. To evaluate hypotheses explaining development of conditions on the HED parent body (Vesta?) leading to pyroxene homogenization against their chronological implications, it is necessary to know whether pyroxene metamorphism was recorded in the isotopic systems. However, identifying the effects of the thermal metamorphism with specific effects in the isotopic systems has been difficult, due in part to a lack of correlated isotopic and mineralogical studies of the same eucrites. Furthermore, isotopic studies often place high demands on analytical capabilities, resulting in slow growth of the isotopic database. Additionally, some isotopic systems would not respond in a direct and sensitive way to pyroxene homogenization. Nevertheless, sufficient data exist to generalize some observations, and to identify directions of potentially fruitful investigations.

  11. Homogenization technique for strongly heterogeneous zones in research reactors

    International Nuclear Information System (INIS)

    Lee, J.T.; Lee, B.H.; Cho, N.Z.; Oh, S.K.

    1991-01-01

    This paper reports on an iterative homogenization method using transport theory in a one-dimensional cylindrical cell model, developed to improve the homogenized cross sections for strongly heterogeneous zones in research reactors. The flux-weighted homogenized cross sections are modified by a correction factor, the cell flux ratio under an albedo boundary condition. The albedo at the cell boundary is iteratively determined to reflect the effects of the geometry and material properties of the adjacent cells. This method has been tested with a simplified core model of the Korea Multipurpose Research Reactor. The results demonstrate that the reaction rates of an off-center control shroud cell, the multiplication factor, and the power distribution of the reactor core are close to those of the fine-mesh heterogeneous transport model.

  12. Homogeneity of Moral Judgment? Apprentices Solving Business Conflicts.

    Science.gov (United States)

    Beck, Klaus; Heinrichs, Karin; Minnameier, Gerhard; Parche-Kawik, Kirsten

    In an ongoing longitudinal study that started in 1994, the moral development of business apprentices is being studied. The focal point of this project is a critical analysis of L. Kohlberg's thesis of homogeneity, according to which people should judge every moral issue from the point of view of their "modal" stage (the most frequently…

  13. Simulations of fully deformed oscillating flux tubes

    Science.gov (United States)

    Karampelas, K.; Van Doorsselaere, T.

    2018-02-01

    Context. In recent years, a number of numerical studies have been focusing on the significance of the Kelvin-Helmholtz instability in the dynamics of oscillating coronal loops. This process enhances the transfer of energy into smaller scales, and has been connected with heating of coronal loops, when dissipation mechanisms, such as resistivity, are considered. However, the turbulent layer is expected near the outer regions of the loops. Therefore, the effects of wave heating are expected to be confined to the loop's external layers, leaving their denser inner parts without a heating mechanism. Aim. In the current work we aim to study the spatial evolution of wave heating effects from a footpoint driven standing kink wave in a coronal loop. Methods: Using the MPI-AMRVAC code, we performed ideal, three-dimensional magnetohydrodynamic simulations of footpoint driven transverse oscillations of a cold, straight coronal flux tube, embedded in a hotter environment. We have also constructed forward models for our simulation using the FoMo code. Results: The developed transverse wave induced Kelvin-Helmholtz (TWIKH) rolls expand throughout the tube cross-section, and cover it entirely. This turbulence significantly alters the initial density profile, leading to a fully deformed cross section. As a consequence, the resistive and viscous heating rate both increase over the entire loop cross section. The resistive heating rate takes its maximum values near the footpoints, while the viscous heating rate at the apex. Conclusions: We conclude that even a monoperiodic driver can spread wave heating over the whole loop cross section, potentially providing a heating source in the inner loop region. Despite the loop's fully deformed structure, forward modelling still shows the structure appearing as a loop. A movie attached to Fig. 1 is available at https://www.aanda.org

  14. String pair production in non homogeneous backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Bolognesi, S. [Department of Physics “E. Fermi” University of Pisa, and INFN - Sezione di Pisa,Largo Pontecorvo, 3, Ed. C, 56127 Pisa (Italy); Rabinovici, E. [Racah Institute of Physics, The Hebrew University of Jerusalem,91904 Jerusalem (Israel); Tallarita, G. [Departamento de Ciencias, Facultad de Artes Liberales,Universidad Adolfo Ibáñez, Santiago 7941169 (Chile)

    2016-04-28

    We consider string pair production in non homogeneous electric backgrounds. We study several particular configurations which can be addressed with the Euclidean world-sheet instanton technique, the analogue of the world-line instanton for particles. In the first case the string is suspended between two D-branes in flat space-time, in the second case the string lives in AdS and terminates on one D-brane (this realizes the holographic Schwinger effect). In some regions of parameter space the result is well approximated by the known analytical formulas, either the particle pair production in non-homogeneous background or the string pair production in homogeneous background. In other cases we see effects which are intrinsically stringy and related to the non-homogeneity of the background. The pair production is enhanced already for particles in time-dependent electric field backgrounds. The string nature enhances this even further. For spatially varying electric background fields, the string pair production is less suppressed than the rate of particle pair production. We discuss in some detail how the critical field is affected by the non-homogeneity, for both time- and space-dependent electric field backgrounds. We also comment on what could be an interesting new prediction for the small field limit. The third case we consider is pair production in holographic confining backgrounds with homogeneous and non-homogeneous fields.

  15. String pair production in non homogeneous backgrounds

    International Nuclear Information System (INIS)

    Bolognesi, S.; Rabinovici, E.; Tallarita, G.

    2016-01-01

    We consider string pair production in non homogeneous electric backgrounds. We study several particular configurations which can be addressed with the Euclidean world-sheet instanton technique, the analogue of the world-line instanton for particles. In the first case the string is suspended between two D-branes in flat space-time, in the second case the string lives in AdS and terminates on one D-brane (this realizes the holographic Schwinger effect). In some regions of parameter space the result is well approximated by the known analytical formulas, either the particle pair production in non-homogeneous background or the string pair production in homogeneous background. In other cases we see effects which are intrinsically stringy and related to the non-homogeneity of the background. The pair production is enhanced already for particles in time-dependent electric field backgrounds. The string nature enhances this even further. For spatially varying electric background fields the string pair production is less suppressed than the rate of particle pair production. We discuss in some detail how the critical field is affected by the non-homogeneity, for both time- and space-dependent electric field backgrounds. We also comment on what could be an interesting new prediction for the small field limit. The third case we consider is pair production in holographic confining backgrounds with homogeneous and non-homogeneous fields.

  16. Stochastic model of milk homogenization process using Markov's chain

    Directory of Open Access Journals (Sweden)

    A. A. Khvostov

    2016-01-01

    Full Text Available The development of a mathematical model of the homogenization process for dairy products is considered in this work. The model is based on the theory of Markov chains: a Markov chain with discrete states and a continuous parameter, for which the homogenization pressure is taken, forms the basis of the model structure. The model is implemented in the MathWorks Simulink™ structural modeling environment. Identification of the model parameters was carried out by minimizing the standard deviation from the experimental data for each fraction of the fat phase of the dairy product. The experimental data set consisted of processed micrographic images of the fat-globule size distributions of whole milk samples subjected to homogenization at different pressures. The Pattern Search method with the Latin Hypercube search algorithm from the Global Optimization Toolbox library was used for optimization. The accuracy of the calculations, averaged over all fractions, was 0.88% (relative share of units); the maximum relative error was 3.7% at a homogenization pressure of 30 MPa, which may be due to the very abrupt change in the particle size distribution from that of the original milk at the beginning of the homogenization process and the lack of experimental data at homogenization pressures below this value. The proposed mathematical model allows the volume and mass distribution profiles of the fat phase (fat globules) in the product to be calculated as a function of the homogenization pressure, and can be used in laboratory and research studies of dairy product composition, as well as in the calculation, design and modeling of process equipment for dairy industry enterprises.
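
    As a rough illustration of the modelling idea described above (not the authors' Simulink implementation), the sketch below evolves a discrete-state Markov chain over fat-globule size classes, with the homogenization pressure playing the role of the continuous parameter; the generator matrix, size classes and initial distribution are hypothetical placeholders that would in practice be identified from the micrographic data.

        import numpy as np
        from scipy.linalg import expm

        # Hypothetical example: four fat-globule size classes (large ... small).
        # Q is a generator matrix (rows sum to zero); entries are illustrative only.
        Q = np.array([
            [-0.12,  0.08,  0.03,  0.01],
            [ 0.00, -0.09,  0.06,  0.03],
            [ 0.00,  0.00, -0.05,  0.05],
            [ 0.00,  0.00,  0.00,  0.00],   # smallest class is absorbing
        ])                                  # units: 1/MPa, pressure plays the role of "time"

        p0 = np.array([0.85, 0.10, 0.04, 0.01])     # size distribution of the raw milk

        def size_distribution(pressure_mpa):
            """Distribution over size classes after homogenization at a given pressure."""
            return p0 @ expm(Q * pressure_mpa)

        for pressure in (0.0, 10.0, 30.0):
            print(pressure, np.round(size_distribution(pressure), 3))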

  17. Fully connected network of superconducting qubits in a cavity

    International Nuclear Information System (INIS)

    Tsomokos, Dimitris I; Ashhab, Sahel; Nori, Franco

    2008-01-01

    A fully connected qubit network is considered, where every qubit interacts with every other one. When the interactions between the qubits are homogeneous, the system is a special case of the finite Lipkin-Meshkov-Glick (LMG) model. We propose a natural implementation of this model using superconducting qubits in state-of-the-art circuit QED. The ground state, the low-lying energy spectrum and the dynamical evolution are investigated. We find that, under realistic conditions, highly entangled states of Greenberger-Horne-Zeilinger (GHZ) and W types can be generated. We also comment on the influence of disorder on the system and discuss the possibility of simulating complex quantum systems, such as Sherrington-Kirkpatrick (SK) spin glasses, with superconducting qubit networks.
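
    As a hedged illustration of the homogeneous-interaction limit mentioned above (not of the circuit-QED implementation itself), the snippet below builds an isotropic LMG-type Hamiltonian, H = -(λ/N) Σ_{i<j} (σx_i σx_j + σy_i σy_j) - h Σ_i σz_i, for a handful of qubits by exact diagonalization; the coupling values are arbitrary.

        import numpy as np
        from functools import reduce

        sx = np.array([[0, 1], [1, 0]], dtype=complex)
        sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
        sz = np.array([[1, 0], [0, -1]], dtype=complex)
        I2 = np.eye(2, dtype=complex)

        def op(single, site, n):
            """Embed a single-qubit operator at a given site of an n-qubit register."""
            return reduce(np.kron, [single if k == site else I2 for k in range(n)])

        def lmg_hamiltonian(n, lam=1.0, h=0.5):
            """Isotropic LMG-type Hamiltonian with homogeneous all-to-all couplings."""
            H = np.zeros((2**n, 2**n), dtype=complex)
            for i in range(n):
                for j in range(i + 1, n):
                    H -= (lam / n) * (op(sx, i, n) @ op(sx, j, n) + op(sy, i, n) @ op(sy, j, n))
                H -= h * op(sz, i, n)
            return H

        n = 4                                   # small enough for exact diagonalization
        evals = np.linalg.eigvalsh(lmg_hamiltonian(n))
        print("ground-state energy:", evals[0])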

  18. Dynamic contact angle cycling homogenizes heterogeneous surfaces.

    Science.gov (United States)

    Belibel, R; Barbaud, C; Mora, L

    2016-12-01

    In order to reduce restenosis, the development of an appropriate coating material for metallic stents has been a challenge for biomedicine and scientific research over the past decade. Therefore, biodegradable copolymers of poly((R,S)-3,3 dimethylmalic acid) (PDMMLA) were prepared in order to develop a new coating exhibiting different custom groups in its side chain and being able to carry a drug. This material will be in direct contact with cells and blood. It consists of carboxylic acid and hexylic groups used for hydrophilic and hydrophobic character, respectively. The study of this material's wettability and dynamic surface properties is of importance due to the influence of the chemistry and the potential motility of these chemical groups on cell adhesion and polymer hydrolysis kinetics. Cassie theory was used for the theoretical correction of contact angles of these chemically heterogeneous surface coatings. Dynamic Surface Analysis was used as a practical homogenizer of chemically heterogeneous surfaces by cycling in water over many cycles. In this work, we confirmed that, unlike the receding contact angle, the advancing contact angle is influenced by a difference of only 10% in acidic groups (%A) in the side chains of the polymers. It decreases linearly with increasing acidity percentage. Hysteresis (H) is also a sensitive parameter, which is discussed in this paper. Finally, we conclude that cycling provides real information, thus avoiding the theoretical Cassie correction. H(10) is the parameter most sensitive to %A. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Ecological homogenization of residential macrosystems

    Science.gov (United States)

    Peter M. Groffman; Meghan Avolio; Jeannine Cavender-Bares; Neil D. Bettez; J. Morgan Grove; Sharon J. Hall; Sarah E. Hobbie; Kelli L. Larson; Susannah B. Lerman; Dexter H. Locke; James B. Heffernan; Jennifer L. Morse; Christopher Neill; Kristen C. Nelson; Jarlath O' Neil-Dunne; Diane E. Pataki; Colin Polsky; Rinku Roy Chowdhury; Tara L. E. Trammell

    2017-01-01

    Similarities in planning, development and culture within urban areas may lead to the convergence of ecological processes on continental scales. Transdisciplinary, multi-scale research is now needed to understand and predict the impact of human-dominated landscapes on ecosystem structure and function.

  20. Poisson-Jacobi reduction of homogeneous tensors

    International Nuclear Information System (INIS)

    Grabowski, J; Iglesias, D; Marrero, J C; Padron, E; Urbanski, P

    2004-01-01

    The notion of homogeneous tensors is discussed. We show that there is a one-to-one correspondence between multivector fields on a manifold M, homogeneous with respect to a vector field Δ on M, and first-order polydifferential operators on a closed submanifold N of codimension 1 such that Δ is transversal to N. This correspondence relates the Schouten-Nijenhuis bracket of multivector fields on M to the Schouten-Jacobi bracket of first-order polydifferential operators on N and generalizes the Poissonization of Jacobi manifolds. Actually, it can be viewed as a super-Poissonization. This procedure of passing from a homogeneous multivector field to a first-order polydifferential operator can also be understood as a sort of reduction; in the standard case, a half of a Poisson reduction. A dual version of the above correspondence yields in particular the correspondence between Δ-homogeneous symplectic structures on M and contact structures on N.

  1. Homogenization and Control of Lattice Structures

    National Research Council Canada - National Science Library

    Blankenship, G. L

    1985-01-01

    ...., trusses may be modeled by beam equations). Using a technique from the mathematics of asymptotic analysis called "homogenization," the author shows how such approximations may be derived in a systematic way that avoids errors made using...

  2. Homogenization of High-Contrast Brinkman Flows

    KAUST Repository

    Brown, Donald L.; Efendiev, Yalchin R.; Li, Guanglian; Savatorova, Viktoria

    2015-01-01

    , Homogenization: Methods and Applications, Transl. Math. Monogr. 234, American Mathematical Society, Providence, RI, 2007, G. Allaire, SIAM J. Math. Anal., 23 (1992), pp. 1482--1518], although a powerful tool, are not applicable here. Our second point

  3. A fully implantable rodent neural stimulator

    Science.gov (United States)

    Perry, D. W. J.; Grayden, D. B.; Shepherd, R. K.; Fallon, J. B.

    2012-02-01

    The ability to electrically stimulate neural and other excitable tissues in behaving experimental animals is invaluable for both the development of neural prostheses and basic neurological research. We developed a fully implantable neural stimulator that is able to deliver two channels of intra-cochlear electrical stimulation in the rat. It is powered via a novel omni-directional inductive link and includes an on-board microcontroller with integrated radio link, programmable current sources and switching circuitry to generate charge-balanced biphasic stimulation. We tested the implant in vivo and were able to elicit both neural and behavioural responses. The implants continued to function for up to five months in vivo. While targeted to cochlear stimulation, with appropriate electrode arrays the stimulator is well suited to stimulating other neurons within the peripheral or central nervous systems. Moreover, it includes significant on-board data acquisition and processing capabilities, which could potentially make it a useful platform for telemetry applications, where there is a need to chronically monitor physiological variables in unrestrained animals.
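
    The charge-balanced biphasic stimulation mentioned above can be illustrated with a minimal waveform sketch; the amplitude and timing values below are hypothetical and unrelated to the actual implant firmware. The point is simply that the anodic phase returns the charge delivered by the cathodic phase, so the net charge per pulse is nominally zero.

        import numpy as np

        def biphasic_pulse(amplitude_ua=100.0, phase_us=25.0, gap_us=10.0, dt_us=1.0):
            """Charge-balanced biphasic current pulse: cathodic phase, interphase gap, anodic phase."""
            n_phase = int(phase_us / dt_us)
            n_gap = int(gap_us / dt_us)
            return np.concatenate([
                -amplitude_ua * np.ones(n_phase),   # cathodic (negative) phase
                np.zeros(n_gap),                    # interphase gap
                +amplitude_ua * np.ones(n_phase),   # anodic (positive) phase
            ])

        pulse = biphasic_pulse()
        print("net charge (pC):", pulse.sum() * 1.0)   # uA x us = pC; ~0 by construction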

  4. Homogenized thermal conduction model for particulate foods

    OpenAIRE

    Chinesta , Francisco; Torres , Rafael; Ramón , Antonio; Rodrigo , Mari Carmen; Rodrigo , Miguel

    2002-01-01

    This paper deals with the definition of an equivalent thermal conductivity for particulate foods. A homogenized thermal model is used to assess the effect of particulate spatial distribution and differences in thermal conductivities. We prove that the spatial average of the conductivity can be used in a homogenized heat transfer model if the conductivity differences among the food components are not very large, usually the highest conductivity ratio between the foods ...

  5. Layout optimization using the homogenization method

    Science.gov (United States)

    Suzuki, Katsuyuki; Kikuchi, Noboru

    1993-01-01

    A generalized layout problem involving sizing, shape, and topology optimization is solved by using the homogenization method for three-dimensional linearly elastic shell structures in order to seek a possibility of establishment of an integrated design system of automotive car bodies, as an extension of the previous work by Bendsoe and Kikuchi. A formulation of a three-dimensional homogenized shell, a solution algorithm, and several examples of computing the optimum layout are presented in this first part of the two articles.

  6. Diffusion piecewise homogenization via flux discontinuity ratios

    International Nuclear Information System (INIS)

    Sanchez, Richard; Dante, Giorgio; Zmijarevic, Igor

    2013-01-01

    We analyze piecewise homogenization with flux-weighted cross sections and preservation of averaged currents at the boundary of the homogenized domain. Introduction of a set of flux discontinuity ratios (FDR) that preserve reference interface currents leads to preservation of averaged region reaction rates and fluxes. We consider the class of numerical discretizations with one degree of freedom per volume and per surface and prove that when the homogenization and computing meshes are equal there is a unique solution for the FDRs which exactly preserve interface currents. For diffusion sub-meshing we introduce a Jacobian-Free Newton-Krylov method and for all cases considered obtain an 'exact' numerical solution (eight digits for the interface currents). The homogenization is completed by extending the familiar full assembly homogenization via flux discontinuity factors to the sides of regions lying on the boundary of the piecewise homogenized domain. Finally, for the familiar nodal discretization we numerically find that the FDRs obtained with no sub-mesh (nearly at no cost) can be effectively used for whole-core diffusion calculations with sub-mesh. This is not the case, however, for cell-centered finite differences. (authors)

  7. Homogenization models for 2-D grid structures

    Science.gov (United States)

    Banks, H. T.; Cioranescu, D.; Rebnord, D. A.

    1992-01-01

    In the past several years, we have pursued efforts related to the development of accurate models for the dynamics of flexible structures made of composite materials. Rather than viewing periodicity and sparseness as obstacles to be overcome, we exploit them to our advantage. We consider a variational problem on a domain that has large, periodically distributed holes. Using homogenization techniques we show that the solution to this problem is in some topology 'close' to the solution of a similar problem that holds on a much simpler domain. We study the behavior of the solution of the variational problem as the holes increase in number, but decrease in size in such a way that the total amount of material remains constant. The result is an equation that is in general more complex, but with a domain that is simply connected rather than perforated. We study the limit of the solution as the amount of material goes to zero. This second limit will, in most cases, retrieve much of the simplicity that was lost in the first limit without sacrificing the simplicity of the domain. Finally, we show that these results can be applied to the case of a vibrating Love-Kirchhoff plate with Kelvin-Voigt damping. We rely heavily on earlier results of (Du), (CS) for the static, undamped Love-Kirchhoff equation. Our efforts here result in a modification of those results to include both time dependence and Kelvin-Voigt damping.

  8. Lagrangian statistics in compressible isotropic homogeneous turbulence

    Science.gov (United States)

    Yang, Yantao; Wang, Jianchun; Shi, Yipeng; Chen, Shiyi

    2011-11-01

    In this work we conducted the Direct Numerical Simulation (DNS) of a forced compressible isotropic homogeneous turbulence and investigated the flow statistics from the Lagrangian point of view, namely the statistics is computed following the passive tracers trajectories. The numerical method combined the Eulerian field solver which was developed by Wang et al. (2010, J. Comp. Phys., 229, 5257-5279), and a Lagrangian module for tracking the tracers and recording the data. The Lagrangian probability density functions (p.d.f.'s) have then been calculated for both kinetic and thermodynamic quantities. In order to isolate the shearing part from the compressing part of the flow, we employed the Helmholtz decomposition to decompose the flow field (mainly the velocity field) into the solenoidal and compressive parts. The solenoidal part was compared with the incompressible case, while the compressibility effect showed up in the compressive part. The Lagrangian structure functions and cross-correlation between various quantities will also be discussed. This work was supported in part by the China's Turbulence Program under Grant No.2009CB724101.
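
    A common way to carry out the Helmholtz decomposition mentioned above on a triply periodic velocity field is in Fourier space, where the compressive (curl-free) part is the projection of the transformed velocity onto the wavevector; the sketch below uses a random field as a stand-in for actual DNS data.

        import numpy as np

        def helmholtz_decompose(u):
            """Split a periodic field u[3, N, N, N] into solenoidal and compressive parts."""
            n = u.shape[1]
            k = np.fft.fftfreq(n) * n                      # integer wavenumbers
            kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
            kvec = np.stack([kx, ky, kz])
            k2 = np.sum(kvec**2, axis=0)
            k2[0, 0, 0] = 1.0                              # avoid division by zero at k = 0

            u_hat = np.stack([np.fft.fftn(u[i]) for i in range(3)])
            div_hat = np.sum(kvec * u_hat, axis=0)         # proportional to k . u_hat
            u_comp_hat = kvec * div_hat / k2               # projection onto the wavevector
            u_comp = np.real(np.stack([np.fft.ifftn(u_comp_hat[i]) for i in range(3)]))
            return u - u_comp, u_comp                      # (solenoidal, compressive)

        rng = np.random.default_rng(0)
        u = rng.standard_normal((3, 32, 32, 32))
        u_sol, u_comp = helmholtz_decompose(u)
        print(np.allclose(u, u_sol + u_comp))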

  9. Generalized quantum theory of recollapsing homogeneous cosmologies

    International Nuclear Information System (INIS)

    Craig, David; Hartle, James B.

    2004-01-01

    A sum-over-histories generalized quantum theory is developed for homogeneous minisuperspace type A Bianchi cosmological models, focusing on the particular example of the classically recollapsing Bianchi type-IX universe. The decoherence functional for such universes is exhibited. We show how the probabilities of decoherent sets of alternative, coarse-grained histories of these model universes can be calculated. We consider in particular the probabilities for classical evolution defined by a suitable coarse graining. For a restricted class of initial conditions and coarse grainings we exhibit the approximate decoherence of alternative histories in which the universe behaves classically and those in which it does not. For these situations we show that the probability is near unity for the universe to recontract classically if it expands classically. We also determine the relative probabilities of quasiclassical trajectories for initial states of WKB form, recovering for such states a precise form of the familiar heuristic 'J·dΣ' rule of quantum cosmology, as well as a generalization of this rule to generic initial states

  10. Quantum aspects of photon propagation in transparent infinite homogeneous media

    International Nuclear Information System (INIS)

    Nistor, Rudolf Emil

    2008-01-01

    The energy balance between photon and medium during light propagation, arising from a specific continuous interaction between a single photon and a homogeneous, infinite medium (fully ionized plasma or a transparent dielectric), was studied. We obtained a wave equation for the interacting photon. To explain the interaction in quantum terms, we assume a certain photon-medium interaction energy, macroscopically materialized by the existence of the refractive index. It turns out that the interaction is of a scalar type, for a particle of vanishing rest mass and spin 1 subjected to both scalar and vector fields. We found an expression for the propagation equation of the photon through a non-dissipative medium, using a coupling between the photon spin vector S and the scalar interaction fields (E_S, H_S). (authors)

  11. Optimal truss and frame design from projected homogenization-based topology optimization

    DEFF Research Database (Denmark)

    Larsen, S. D.; Sigmund, O.; Groen, J. P.

    2018-01-01

    In this article, we propose a novel method to obtain a near-optimal frame structure, based on the solution of a homogenization-based topology optimization model. The presented approach exploits the equivalence between Michell’s problem of least-weight trusses and a compliance minimization problem ... using optimal rank-2 laminates in the low volume fraction limit. In a fully automated procedure, a discrete structure is extracted from the homogenization-based continuum model. This near-optimal structure is post-optimized as a frame, where the bending stiffness is continuously decreased, to allow...

  12. Real-time direct cell concentration and viability determination using a fully automated microfluidic platform for standalone process monitoring

    DEFF Research Database (Denmark)

    Rodrigues de Sousa Nunes, Pedro André; Kjaerulff, S.; Dufva, Martin

    2015-01-01

    system performance by monitoring in real time the cell concentration and viability of yeast extracted directly from an in-house made bioreactor. This is the first demonstration of using the Dean drag force, generated due to the implementation of a curved microchannel geometry in conjunction with high...... flow rates, to promote passive mixing of cell samples and thus homogenization of the diluted cell plug. The autonomous operation of the fluidics furthermore allows implementation of intelligent protocols for administering air bubbles from the bioreactor in the microfluidic system, so...... and thereby ensure optimal cell production, by prolonging the fermentation cycle and increasing the bioreactor output. In this work, we report on the development of a fully automated microfluidic system capable of extracting samples directly from a bioreactor, diluting the sample, staining the cells...

  13. PVT Panels. Fully renewable and competitive

    International Nuclear Information System (INIS)

    Bakker, M.; Strootman, K.J.; Jong, M.J.M.

    2003-10-01

    A photovoltaic/thermal (PVT) panel is a combination of photovoltaic cells with a solar thermal collector, generating solar electricity and solar heat simultaneously. PVT panels generate more solar energy per unit surface area than a combination of separate PV panels and solar thermal collectors, and share the aesthetic advantage of PV. After several years of research, PVT panels have been developed into a product that is now ready for market introduction. One of the most promising system concepts, consisting of 25 m² of PVT panels and a ground-coupled heat pump, has been simulated in TRNSYS, and has been found to be able to fully cover both the building-related electricity and heat consumption, while keeping the long-term average ground temperature constant. The cost and payback time of such a system have been determined; it has been found that the payback time of this system is approximately two-thirds of the payback time of an identical system but with 21 m² of PV panels and 4 m² of solar thermal collectors. Finally, by looking at the expected growth in the PV and solar thermal collector market, the market potential for PVT panels has been found to be very large.

  14. A new fully automated TLD badge reader

    International Nuclear Information System (INIS)

    Kannan, S.; Ratna, P.; Kulkarni, M.S.

    2003-01-01

    At present, personnel monitoring in India is carried out using a number of manual and semiautomatic TLD badge readers and the BARC TL dosimeter badge designed during 1970. Of late, the manual TLD badge readers have been almost completely replaced by semiautomatic readers with a number of performance improvements, such as the use of hot gas heating to reduce the readout time considerably, a PC-based design with storage of the glow curve for every dosimeter, on-line dose computation and printout of dose reports, etc. However, the semiautomatic system suffers from the lack of a machine-readable ID code on the badge, and the physical design of the dosimeter card is not readily compatible with automation. This paper describes a fully automated TLD badge reader developed in the RSS Division, using a new TLD badge with a machine-readable ID code. The new PC-based reader has a built-in reader for the ID code, which takes the form of an array of holes on the dosimeter card. The reader has a number of self-diagnostic features to ensure a high degree of reliability. (author)

  15. Engineering aspects of a fully mirrored endoscope

    International Nuclear Information System (INIS)

    Terra, A.; Huber, A.; Schweer, B.; Mertens, Ph.; Arnoux, G.; Balshaw, N.; Brezinsek, S.; Egner, S.; Hartl, M.; Kampf, D.; Klammer, J.; Lambertz, H.T.; Morlock, C.; Murari, A.; Reindl, M.; Sanders, S.; Sergienko, G.; Spencer, G.

    2013-01-01

    Highlights: ► Replacement of JET diagnostics to match the new ITER-like Wall. ► The endoscope tests an ITER-like design with mirror-based optics only. ► Withstanding and diagnostic capability during plasma operation and disruptions. ► Engineering process from design to installation and procurement. -- Abstract: The development of optical diagnostics, like endoscopes, compatible with the ITER environment (metallic plasma-facing components, neutron-proof optics, etc.) is a challenge, but current tokamaks such as JET provide opportunities to test fully working concepts. This paper describes the engineering aspects of a fully mirrored endoscope that has recently been designed, procured and installed on JET. The system must operate in a very strict environment with high temperature, high magnetic fields up to B = 4 T and rapid field variations (∂B/∂t ∼ 100 T/s) that induce high stresses due to eddy currents in the front mirror assembly. It must be designed to withstand high mechanical loads especially during disruptions, which lead to accelerations of about 7 g at 14 Hz. For the JET endoscope, when the plasma thermal loading, direct and indirect, was added to the assumed disruption loads, the reserve factor, defined as a ratio of yield strength over summed up von Mises stresses, was close to 1 for the mirror components. To ensure reliable operation, several analyses were performed to evaluate the thermo-mechanical performance of the endoscope and a final validation was obtained from mechanical and thermal tests, before the system's final installation in May 2011. During the tests, the variation of the field-of-view angle was kept below 1° despite the high thermal gradient on the endoscope head (∂T/∂x ∼ 500 K/m). In parallel, to ensure long-term operation and to prevent undesirable performance degradation, a shutter system was also implemented in order to reduce impurity deposition on in-vessel mirrors and also to allow in situ transmission calibration.

  16. Fully 3D refraction correction dosimetry system

    International Nuclear Information System (INIS)

    Manjappa, Rakesh; Makki, S Sharath; Kanhirodan, Rajan; Kumar, Rajesh; Vasu, Ram Mohan

    2016-01-01

    The irradiation of selective regions in a polymer gel dosimeter results in an increase in optical density and refractive index (RI) at those regions. An optical tomography-based dosimeter depends on rayline path through the dosimeter to estimate and reconstruct the dose distribution. The refraction of light passing through a dose region results in artefacts in the reconstructed images. These refraction errors are dependent on the scanning geometry and collection optics. We developed a fully 3D image reconstruction algorithm, algebraic reconstruction technique-refraction correction (ART-rc), that corrects for the refractive index mismatches present in a gel dosimeter scanner not only at the boundary, but also for any rayline refraction due to multiple dose regions inside the dosimeter. In this study, simulation and experimental studies have been carried out to reconstruct a 3D dose volume using 2D CCD measurements taken for various views. The study also focuses on the effectiveness of using different refractive-index matching media surrounding the gel dosimeter. Since the optical density is assumed to be low for a dosimeter, the filtered backprojection is routinely used for reconstruction. We carry out the reconstructions using conventional algebraic reconstruction (ART) and refractive index corrected ART (ART-rc) algorithms. Reconstructions based on the FDK algorithm for cone-beam tomography have also been carried out for comparison. Line scanners and point detectors are used to obtain reconstructions plane by plane. Rays passing through a dose region with an RI mismatch do not reach the detector in the same plane, depending on the angle of incidence and RI. In the fully 3D scanning setup using 2D array detectors, light rays that undergo refraction are still collected and hence can still be accounted for in the reconstruction algorithm. It is found that, for the central region of the dosimeter, the usable radius using ART-rc algorithm with water as RI matched
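
    For context, the core of an algebraic reconstruction technique is the Kaczmarz-style row update sketched below. The refraction-corrected ray tracing that distinguishes ART-rc from plain ART is not reproduced here, and the system matrix in the example is a random stand-in rather than a real projection geometry.

        import numpy as np

        def art(A, b, n_iter=50, relax=0.5, x0=None):
            """Plain ART (Kaczmarz) solver for A x = b, where A holds ray-path weights."""
            m, n = A.shape
            x = np.zeros(n) if x0 is None else x0.copy()
            row_norms = np.sum(A * A, axis=1)
            for _ in range(n_iter):
                for i in range(m):
                    if row_norms[i] == 0.0:
                        continue
                    x += relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
            return x

        # Toy stand-in for a projection problem (not a gel-dosimeter geometry).
        rng = np.random.default_rng(1)
        A = rng.random((200, 100))
        x_true = rng.random(100)
        print(np.linalg.norm(art(A, A @ x_true) - x_true))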

  17. Fully 3D refraction correction dosimetry system.

    Science.gov (United States)

    Manjappa, Rakesh; Makki, S Sharath; Kumar, Rajesh; Vasu, Ram Mohan; Kanhirodan, Rajan

    2016-02-21

    The irradiation of selective regions in a polymer gel dosimeter results in an increase in optical density and refractive index (RI) at those regions. An optical tomography-based dosimeter depends on rayline path through the dosimeter to estimate and reconstruct the dose distribution. The refraction of light passing through a dose region results in artefacts in the reconstructed images. These refraction errors are dependent on the scanning geometry and collection optics. We developed a fully 3D image reconstruction algorithm, algebraic reconstruction technique-refraction correction (ART-rc), that corrects for the refractive index mismatches present in a gel dosimeter scanner not only at the boundary, but also for any rayline refraction due to multiple dose regions inside the dosimeter. In this study, simulation and experimental studies have been carried out to reconstruct a 3D dose volume using 2D CCD measurements taken for various views. The study also focuses on the effectiveness of using different refractive-index matching media surrounding the gel dosimeter. Since the optical density is assumed to be low for a dosimeter, the filtered backprojection is routinely used for reconstruction. We carry out the reconstructions using conventional algebraic reconstruction (ART) and refractive index corrected ART (ART-rc) algorithms. Reconstructions based on the FDK algorithm for cone-beam tomography have also been carried out for comparison. Line scanners and point detectors are used to obtain reconstructions plane by plane. Rays passing through a dose region with an RI mismatch do not reach the detector in the same plane, depending on the angle of incidence and RI. In the fully 3D scanning setup using 2D array detectors, light rays that undergo refraction are still collected and hence can still be accounted for in the reconstruction algorithm. It is found that, for the central region of the dosimeter, the usable radius using ART-rc algorithm with water as RI matched

  18. Homogeneous anisotropic solutions of topologically massive gravity with a cosmological constant and their homogeneous deformations

    International Nuclear Information System (INIS)

    Moutsopoulos, George

    2013-01-01

    We solve the equations of topologically massive gravity (TMG) with a potentially non-vanishing cosmological constant for homogeneous metrics without isotropy. We only reproduce known solutions. We also discuss their homogeneous deformations, possibly with isotropy. We show that de Sitter space and hyperbolic space cannot be infinitesimally homogeneously deformed in TMG. We clarify some of their Segre–Petrov types and discuss the warped de Sitter spacetime. (paper)

  19. Homogeneous anisotropic solutions of topologically massive gravity with a cosmological constant and their homogeneous deformations

    Science.gov (United States)

    Moutsopoulos, George

    2013-06-01

    We solve the equations of topologically massive gravity (TMG) with a potentially non-vanishing cosmological constant for homogeneous metrics without isotropy. We only reproduce known solutions. We also discuss their homogeneous deformations, possibly with isotropy. We show that de Sitter space and hyperbolic space cannot be infinitesimally homogeneously deformed in TMG. We clarify some of their Segre-Petrov types and discuss the warped de Sitter spacetime.

  20. Investigation of terbium scandate as an alternative gate dielectric in fully depleted transistors

    OpenAIRE

    Roeckerath, M.; Lopes, J. M. J.; Durgun Özben, E.; Urban, C.; Schubert, J.; Mantl, S.; Jia, Y.; Schlom, D.G.

    2010-01-01

    Terbium scandate thin films were deposited by e-gun evaporation on (100) silicon substrates. Rutherford backscattering spectrometry and x-ray diffraction studies revealed homogeneous chemical compositions of the films. A dielectric constant of 26 and C-V curves with small hystereses were measured as well as low leakage current densities of <1 nA/cm². Fully depleted n-type field-effect transistors on thin silicon-on-insulator substrates with terbium scandate gate dielectrics were fabricated ...

  1. Rapid biotic homogenization of marine fish assemblages

    Science.gov (United States)

    Magurran, Anne E.; Dornelas, Maria; Moyes, Faye; Gotelli, Nicholas J.; McGill, Brian

    2015-01-01

    The role human activities play in reshaping biodiversity is increasingly apparent in terrestrial ecosystems. However, the responses of entire marine assemblages are not well-understood, in part, because few monitoring programs incorporate both spatial and temporal replication. Here, we analyse an exceptionally comprehensive 29-year time series of North Atlantic groundfish assemblages monitored over 5° latitude to the west of Scotland. These fish assemblages show no systematic change in species richness through time, but steady change in species composition, leading to an increase in spatial homogenization: the species identity of colder northern localities increasingly resembles that of warmer southern localities. This biotic homogenization mirrors the spatial pattern of unevenly rising ocean temperatures over the same time period suggesting that climate change is primarily responsible for the spatial homogenization we observe. In this and other ecosystems, apparent constancy in species richness may mask major changes in species composition driven by anthropogenic change. PMID:26400102

  2. Two-Dimensional Homogeneous Fermi Gases

    Science.gov (United States)

    Hueck, Klaus; Luick, Niclas; Sobirey, Lennart; Siegl, Jonas; Lompe, Thomas; Moritz, Henning

    2018-02-01

    We report on the experimental realization of homogeneous two-dimensional (2D) Fermi gases trapped in a box potential. In contrast to harmonically trapped gases, these homogeneous 2D systems are ideally suited to probe local as well as nonlocal properties of strongly interacting many-body systems. As a first benchmark experiment, we use a local probe to measure the density of a noninteracting 2D Fermi gas as a function of the chemical potential and find excellent agreement with the corresponding equation of state. We then perform matter wave focusing to extract the momentum distribution of the system and directly observe Pauli blocking in a near unity occupation of momentum states. Finally, we measure the momentum distribution of an interacting homogeneous 2D gas in the crossover between attractively interacting fermions and bosonic dimers.
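
    The noninteracting benchmark mentioned above has a simple closed form in two dimensions, because the single-particle density of states is constant: per spin state, n = (m k_B T / 2πħ²) ln(1 + e^{μ/k_B T}). The sketch below evaluates this equation of state for illustrative parameter values; the species and temperature are assumptions, not taken from the paper.

        import numpy as np

        HBAR = 1.054571817e-34   # J s
        KB = 1.380649e-23        # J / K

        def density_2d_ideal_fermi(mu, temperature, mass):
            """Areal density (per spin state) of a noninteracting 2D Fermi gas."""
            return (mass * KB * temperature) / (2.0 * np.pi * HBAR**2) \
                * np.log1p(np.exp(mu / (KB * temperature)))

        m_li6 = 9.988e-27        # mass of 6Li in kg (an assumption for illustration)
        temperature = 100e-9     # 100 nK, illustrative
        for mu_over_kt in (-2.0, 0.0, 2.0):
            mu = mu_over_kt * KB * temperature
            print(mu_over_kt, density_2d_ideal_fermi(mu, temperature, m_li6), "m^-2")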

  3. Online screening of homogeneous catalyst performance using reaction detection mass spectrometry

    NARCIS (Netherlands)

    Martha, C.T.; Elders, N.; Krabbe, J.G.; Kool, J.; Niessen, W.M.A.; Orru, R.V.A.; Irth, H.

    2008-01-01

    An integrated online screening system was developed to rapidly screen homogeneous catalysts for activity toward a selected synthesis. The continuous-flow system comprises standard HPLC pumps for the delivery of substrates, an HPLC autosampler for the injection of homogeneous catalysts, a

  4. A calderón-preconditioned single source combined field integral equation for analyzing scattering from homogeneous penetrable objects

    KAUST Repository

    Valdés, Felipe

    2011-06-01

    A new regularized single source equation for analyzing scattering from homogeneous penetrable objects is presented. The proposed equation is a linear combination of a Calderón-preconditioned single source electric field integral equation and a single source magnetic field integral equation. The equation is immune to low-frequency and dense-mesh breakdown, and free from spurious resonances. Unlike dual source formulations, this equation involves operator products that cannot be discretized using standard procedures for discretizing standalone electric, magnetic, and combined field operators. Instead, the single source equation proposed here is discretized using a recently developed technique that achieves a well-conditioned mapping from div- to curl-conforming function spaces, thereby fully respecting the space mapping properties of the operators involved, and guaranteeing accuracy and stability. Numerical results show that the proposed equation and discretization technique give rise to rapidly convergent solutions. They also validate the equation's resonance-free character. © 2006 IEEE.

  5. A fully integrated 16 channel digitally trimmed pulse shaping amplifier

    International Nuclear Information System (INIS)

    Hearn, W.E.; Wright, M.E.

    1993-11-01

    A fully integrated CMOS pulse shaping amplifier has been developed at LBL. All frequency dependent networks are included on the chip. Provision is made for tuning to compensate for process variations. The overall architecture and details of the circuitry are discussed. Test results are presented

  6. Homogeneous Charge Compression Ignition Combustion: Challenges and Proposed Solutions

    Directory of Open Access Journals (Sweden)

    Mohammad Izadi Najafabadi

    2013-01-01

    Full Text Available Engine and car manufacturers are facing demands for fuel efficiency and low emissions from both consumers and governments. Homogeneous charge compression ignition (HCCI) is an alternative combustion technology that is cleaner and more efficient than other types of combustion. Although HCCI engines offer greater thermal efficiency and lower NOx emissions than traditional engines, HCCI combustion has several main difficulties, such as control of ignition timing, limited power output, and weak cold-start capability. In this study a literature review on the HCCI engine has been performed, and HCCI challenges and proposed solutions have been investigated from the point of view of ignition timing, which is the main problem of this engine. HCCI challenges have been investigated by many IC engine researchers during the last decade, but practical solutions have not yet been presented for a fully HCCI engine. Some of the proposed solutions have slow response times and some are technically difficult to implement, so it seems that the fully HCCI engine needs more investigation before mass production, and future research and application should be considered as part of an effort to achieve low-temperature combustion over a wide range of operating conditions in an IC engine.

  7. Internal homogenization: effective permittivity of a coated sphere.

    Science.gov (United States)

    Chettiar, Uday K; Engheta, Nader

    2012-10-08

    The concept of internal homogenization is introduced as a complementary approach to the conventional homogenization schemes, which could be termed as external homogenization. The theory for the internal homogenization of the permittivity of subwavelength coated spheres is presented. The effective permittivity derived from the internal homogenization of coreshells is discussed for plasmonic and dielectric constituent materials. The effective model provided by the homogenization is a useful design tool in constructing coated particles with desired resonant properties.
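
    A concrete instance of internal homogenization is the classical quasi-static equivalent-permittivity formula for a subwavelength core-shell sphere, sketched below; this is the textbook dipolar result and is offered as a hedged illustration of the idea, not as a substitute for the paper's derivation or its discussion of validity.

        def coated_sphere_eps_eff(eps_core, eps_shell, r_core, r_shell):
            """Quasi-static effective permittivity of a core-shell (coated) sphere.

            The coated sphere is replaced by a homogeneous sphere of radius r_shell
            with the same dipolar polarizability.
            """
            f = (r_core / r_shell) ** 3          # core volume fraction within the outer radius
            num = (eps_core + 2 * eps_shell) + 2 * f * (eps_core - eps_shell)
            den = (eps_core + 2 * eps_shell) - f * (eps_core - eps_shell)
            return eps_shell * num / den

        # Illustrative values: a dielectric core inside a lossy plasmonic shell.
        print(coated_sphere_eps_eff(eps_core=2.1, eps_shell=-5.0 + 0.4j, r_core=40e-9, r_shell=50e-9))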

  8. Statistical methods for assessment of blend homogeneity

    DEFF Research Database (Denmark)

    Madsen, Camilla

    2002-01-01

    In this thesis the use of various statistical methods to address some of the problems related to assessment of the homogeneity of powder blends in tablet production is discussed. It is not straightforward to assess the homogeneity of a powder blend. The reason is partly that in bulk materials ..., it is shown how to set up parametric acceptance criteria for the batch that give a high confidence that future samples with a probability larger than a specified value will pass the USP three-class criteria. Properties and robustness of proposed changes to the USP test for content uniformity are investigated...

  9. Flows and chemical reactions in homogeneous mixtures

    CERN Document Server

    Prud'homme, Roger

    2013-01-01

    Flows with chemical reactions can occur in various fields such as combustion, process engineering, aeronautics, the atmospheric environment and aquatics. The examples of application chosen in this book mainly concern homogeneous reactive mixtures that can occur in propellers within the fields of process engineering and combustion: - propagation of sound and monodimensional flows in nozzles, which may include disequilibria of the internal modes of the energy of molecules; - ideal chemical reactors, stabilization of their steady operation points in the homogeneous case of a perfect mixture and c

  10. Homogenized description and retrieval method of nonlinear metasurfaces

    Science.gov (United States)

    Liu, Xiaojun; Larouche, Stéphane; Smith, David R.

    2018-03-01

    A patterned, plasmonic metasurface can strongly scatter incident light, functioning as an extremely low-profile lens, filter, reflector or other optical device. When the metasurface is patterned uniformly, its linear optical properties can be expressed using effective surface electric and magnetic polarizabilities obtained through a homogenization procedure. The homogenized description of a nonlinear metasurface, however, presents challenges both because of the inherent anisotropy of the medium as well as the much larger set of potential wave interactions available, making it challenging to assign effective nonlinear parameters to the otherwise inhomogeneous layer of metamaterial elements. Here we show that a homogenization procedure can be developed to describe nonlinear metasurfaces, which derive their nonlinear response from the enhanced local fields arising within the structured plasmonic elements. With the proposed homogenization procedure, we are able to assign effective nonlinear surface polarization densities to a nonlinear metasurface, and link these densities to the effective nonlinear surface susceptibilities and averaged macroscopic pumping fields across the metasurface. These effective nonlinear surface polarization densities are further linked to macroscopic nonlinear fields through the generalized sheet transition conditions (GSTCs). By inverting the GSTCs, the effective nonlinear surface susceptibilities of the metasurfaces can be solved for, leading to a generalized retrieval method for nonlinear metasurfaces. The application of the homogenization procedure and the GSTCs are demonstrated by retrieving the nonlinear susceptibilities of a SiO2 nonlinear slab. As an example, we investigate a nonlinear metasurface which presents nonlinear magnetoelectric coupling in near infrared regime. The method is expected to apply to any patterned metasurface whose thickness is much smaller than the wavelengths of operation, with inclusions of arbitrary geometry

  11. Parametric dependence of two-plasmon decay in homogeneous plasma

    International Nuclear Information System (INIS)

    Dimitrijevic, Dejan R

    2010-01-01

    A hydrodynamic model of two-plasmon decay in a homogeneous plasma slab near the quarter-critical density is constructed in order to improve our understanding of the spatio-temporal evolution of the daughter electron plasma waves in plasma in the course of the instability. The scaling of the amplitudes of the participating waves with laser and plasma parameters is investigated. The secondary coupling of two daughter electron plasma waves with an ion-acoustic wave is assumed to be the principal mechanism of saturation of the instability. The impact of the inherently nonresonant nature of this secondary coupling on the development of two-plasmon decay is researched and it is shown to significantly influence the electron plasma wave dynamics. Its inclusion leads to nonuniformity of the spatial profile of the instability and causes the burst-like pattern of the instability development, which should result in the burst-like hot-electron production in homogeneous plasma.

  12. Niobium bonds as homogeneous catalysts for the cyclotrimerization of alkynes

    International Nuclear Information System (INIS)

    Du Toit, C.J.

    1984-05-01

    The activity and selectivity of the catalytic system MX5 with M = Nb or Ta and X = Cl− or Br−, and of (CH3)3TaCl2, with regard to the reaction rate and product formation in the reaction with alkynes were evaluated. A measuring technique was developed with which the reaction path of the oligomerization reactions of alkynes with homogeneous catalysts in a nitrogen atmosphere can be followed spectrophotometrically.

  13. Lower bounds for the circuit size of partially homogeneous polynomials

    Czech Academy of Sciences Publication Activity Database

    Le, Hong-Van

    2017-01-01

    Roč. 225, č. 4 (2017), s. 639-657 ISSN 1072-3374 Institutional support: RVO:67985840 Keywords : partially homogeneous polynomials * polynomials Subject RIV: BA - General Mathematics OBOR OECD: Computer sciences, information science, bioinformathics (hardware development to be 2.2, social aspect to be 5.8) https://link.springer.com/article/10.1007/s10958-017-3483-4

  14. Homogenization versus homogenization-free method to measure muscle glycogen fractions.

    Science.gov (United States)

    Mojibi, N; Rasouli, M

    2016-12-01

    Glycogen is extracted from animal tissues with or without homogenization using cold perchloric acid. Three methods were compared for the determination of glycogen in rat muscle at different physiological states. Two groups of five rats were kept at rest or subjected to 45 minutes of muscular activity. The glycogen fractions were extracted and measured using the three methods. The data from the homogenization method show that total glycogen decreased following 45 min of physical activity and that the change occurred entirely in acid-soluble glycogen (ASG), while acid-insoluble glycogen (AIG) did not change significantly. Similar results were obtained using the "total-glycogen-fractionation" method. The findings of the "homogenization-free" method indicate that the acid-insoluble fraction (AIG) was the main portion of muscle glycogen and that the majority of changes occurred in the AIG fraction. The results of the "homogenization" method are identical with those of "total glycogen fractionation", but differ from those of the "homogenization-free" protocol. The ASG fraction is the major portion of muscle glycogen and is the more metabolically active form.

  15. Effect of Microstructure Constraints on the Homogenized Elastic Constants of Elastomeric Sylgard/GMB Syntactic Foam.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Judith Alice [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Steck, Daniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brown, Judith Alice [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Long, Kevin Nicholas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    Previous numerical studies of Sylgard filled with glass microballoons (GMB) have relied on various microstructure idealizations to achieve a large range of volume fractions with high mesh quality. This study investigates how different microstructure idealizations and constraints affect the apparent homogenized elastic constants in the virgin state of the material, in which all GMBs are intact and perfectly bonded to the Sylgard matrix, and in the fully damaged state of the material in which all GMBs are destroyed. In the latter state, the material behaves as an elastomeric foam. Four microstructure idealizations are considered relating to how GMBs are packed into a representative volume element (RVE): (1) no boundary penetration nor GMB-GMB overlap, (2) GMB-GMB overlap, (3) boundary penetration, and (4) boundary penetration and GMB-GMB overlap. First order computational homogenization with kinematically uniform displacement boundary conditions (KUBCs) was employed to determine the homogenized (apparent) bulk and shear moduli for the four microstructure idealizations in the intact and fully broken GMB material states. It was found that boundary penetration has a significant effect on the shear modulus for microstructures with intact GMBs, but that neither boundary penetration nor GMB overlap have a significant effect on homogenized properties for microstructures with fully broken GMBs. The primary conclusion of the study is that future investigations into Sylgard/GMB micromechanics should either force GMBs to stay within the RVE fully and/or use periodic BCs (PBCs) to eliminate the boundary penetration issues. The implementation of PBCs requires the improvement of existing tools in Sandia’s Sierra/SM code.
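
    For readers less familiar with first-order computational homogenization, the post-processing step behind the reported apparent moduli is straightforward: under a kinematically uniform displacement boundary condition u = ε0·x, the apparent bulk and shear moduli follow from the volume-averaged stress. The sketch below shows only that bookkeeping; the averaged stresses would come from an RVE finite-element solve (not reproduced here), and the numbers used are hypothetical.

        import numpy as np

        def apparent_bulk(avg_stress, eps_vol):
            """Apparent bulk modulus from a hydrostatic KUBC, eps0 = (eps_vol / 3) * I."""
            return np.trace(avg_stress) / (3.0 * eps_vol)

        def apparent_shear(avg_stress, gamma12):
            """Apparent shear modulus from a pure-shear KUBC with engineering shear gamma12."""
            return avg_stress[0, 1] / gamma12

        # Hypothetical volume-averaged stresses (Pa) from two RVE load cases.
        sigma_hydro = np.diag([3.2e6, 3.1e6, 3.3e6])        # response to eps_vol = 1e-3
        sigma_shear = np.array([[0.0, 0.8e6, 0.0],
                                [0.8e6, 0.0, 0.0],
                                [0.0, 0.0, 0.0]])           # response to gamma12 = 1e-3
        print("K_app =", apparent_bulk(sigma_hydro, 1e-3), "Pa")
        print("G_app =", apparent_shear(sigma_shear, 1e-3), "Pa")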

  16. CMOS current controlled fully balanced current conveyor

    International Nuclear Information System (INIS)

    Wang Chunhua; Zhang Qiujing; Liu Haiguang

    2009-01-01

    This paper presents a current-controlled fully balanced second-generation current conveyor circuit (CFBCCII). The proposed circuit has the traits of a fully balanced architecture, and its X-Y terminals are current controllable. Based on the CFBCCII, two biquadratic universal filters are also proposed as its applications. The CFBCCII circuits and the two filters were fabricated with Chartered 0.35-μm CMOS technology; with ±1.65 V power supply voltage, the total power consumption of the CFBCCII circuit is 3.6 mW. Comparisons between measured and HSpice simulation results are also given.

  17. Fully exponentially correlated wavefunctions for small atoms

    Energy Technology Data Exchange (ETDEWEB)

    Harris, Frank E. [Department of Physics, University of Utah, Salt Lake City, UT 84112 and Quantum Theory Project, University of Florida, P.O. Box 118435, Gainesville, FL 32611 (United States)

    2015-01-22

    Fully exponentially correlated atomic wavefunctions are constructed from exponentials in all the interparticle coordinates, in contrast to correlated wavefunctions of the Hylleraas form, in which only the electron-nuclear distances occur exponentially, with electron-electron distances entering only as integer powers. The full exponential correlation causes many-configuration wavefunctions to converge with expansion length more rapidly than either orbital formulations or correlated wavefunctions of the Hylleraas type. The present contribution surveys the effectiveness of fully exponentially correlated functions for the three-body system (the He isoelectronic series) and reports their application to a four-body system (the Li atom)
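
    The distinction drawn above can be made explicit. Writing r_1, r_2 for the electron-nuclear distances and r_12 for the electron-electron distance in a two-electron atom, a Hylleraas-type expansion versus a fully exponentially correlated expansion reads, schematically (spin and permutational symmetry factors suppressed),

        \[
        \Psi_{\mathrm{Hyl}} = \sum_{i,j,k} c_{ijk}\, r_1^{\,i}\, r_2^{\,j}\, r_{12}^{\,k}\,
          e^{-\alpha r_1 - \beta r_2},
        \qquad
        \Psi_{\mathrm{exp}} = \sum_{n} c_n\,
          e^{-\alpha_n r_1 - \beta_n r_2 - \gamma_n r_{12}},
        \]

    so that in the second form the interelectronic distance also appears in the exponent, which is the feature responsible for the faster convergence with expansion length reported here.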

  18. Synthesis of silica nanosphere from homogeneous and ...

    Indian Academy of Sciences (India)

    WINTEC

    avoid it, reaction in heterogeneous system using CTABr was carried out. Nanosized silica sphere with ... Homogeneous system contains a mixture of ethanol, water, aqueous ammonia and ... heated to 823 K (rate, 1 K/min) in air and kept at this.

  19. Gravitational Metric Tensor Exterior to Rotating Homogeneous ...

    African Journals Online (AJOL)

    The covariant and contravariant metric tensors exterior to a homogeneous spherical body rotating uniformly about a common φ axis with constant angular velocity ω are constructed. The constructed metric tensors in this gravitational field have seven non-zero distinct components. The Lagrangian for this gravitational field is ...

  20. Homogeneous nucleation of water in synthetic air

    NARCIS (Netherlands)

    Fransen, M.A.L.J.; Sachteleben, E.; Hruby, J.; Smeulders, D.M.J.; DeMott, P.J.; O'Dowd, C.D.

    2013-01-01

    Homogeneous nucleation rates for water vapor in synthetic air are measured by means of a Pulse-Expansion Wave Tube (PEWT). A comparison of the experimental nucleation rates with the Classical Nucleation Theory (CNT) shows that a more elaborate model is necessary to describe supercooled water

  1. Homogeneity in Social Groups of Iraqis

    NARCIS (Netherlands)

    Gresham, J.; Saleh, F.; Majid, S.

    With appreciation to the Royal Institute for Inter-Faith Studies for initiating the Second World Congress for Middle Eastern Studies, this paper summarizes findings on homogeneity in community-level social groups derived from inter-ethnic research conducted during 2005 among Iraqi Arabs and Kurds

  2. Abelian gauge theories on homogeneous spaces

    International Nuclear Information System (INIS)

    Vassilevich, D.V.

    1992-07-01

    An algebraic technique of separation of gauge modes in Abelian gauge theories on homogeneous spaces is proposed. An effective potential for the Maxwell-Chern-Simons theory on S³ is calculated. A generalization of the Chern-Simons action is suggested and analysed with the example of SU(3)/U(1) × U(1). (author). 11 refs

  3. Benchmarking homogenization algorithms for monthly data

    Czech Academy of Sciences Publication Activity Database

    Venema, V. K. C.; Mestre, O.; Aquilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertačník, G.; Szentimrey, T.; Štěpánek, Petr; Zahradníček, Pavel; Viarre, J.; Mueller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Duran, M. P.; Likso, T.; Esteban, P.; Brandsma, T.

    2012-01-01

    Roč. 8, č. 1 (2012), s. 89-115 ISSN 1814-9324 Institutional support: RVO:67179843 Keywords : climate data * instrumental time-series * greater alpine region * homogeneity test * variability * inhomogeneities Subject RIV: EH - Ecology, Behaviour Impact factor: 3.556, year: 2012

  4. Extension theorems for homogenization on lattice structures

    Science.gov (United States)

    Miller, Robert E.

    1992-01-01

    When applying homogenization techniques to problems involving lattice structures, it is necessary to extend certain functions defined on a perforated domain to a simply connected domain. This paper provides general extension operators which preserve bounds on derivatives of order l. Only the special case of honeycomb structures is considered.

  5. Homogeneous scintillating LKr/Xe calorimeters

    International Nuclear Information System (INIS)

    Chen, M.; Mullins, M.; Pelly, D.; Shotkin, S.; Sumorok, K.; Akyuz, D.; Chen, E.; Gaudreau, M.P.J.; Bolozdynya, A.; Tchernyshev, V.; Goritchev, P.; Khovansky, V.; Koutchenkov, A.; Kovalenko, A.; Lebedenko, V.; Vinogradov, V.; Gusev, L.; Sheinkman, V.; Krasnokutsky, R.N.; Shuvalov, R.S.; Fedyakin, N.N.; Sushkov, V.; Akopyan, M.; Doke, T.; Kikuchi, J.; Hitachi, A.; Kashiwagi, T.; Masuda, K.; Shibamura, E.; Ishida, N.; Sugimoto, S.

    1993-01-01

    Recent R and D work on full length scintillating homogeneous liquid xenon/krypton (LXe/Kr) cells has established the essential properties for precision EM calorimeters: In-situ calibration using α's, radiation hardness as well as the uniformity required for δE/E≅0.5% for e/γ's above 50 GeV. (orig.)

  6. Traffic planning for non-homogeneous traffic

    Indian Academy of Sciences (India)

    Western traffic planning methodologies mostly address the concerns of homogeneous traffic and therefore often prove inadequate in solving problems involving ... Transportation Research and Injury Prevention Programme, Indian Institute of Technology, Hauz Khas, New Delhi 110 016; Civil and Architectural Engineering ...

  7. Inverse acoustic problem of N homogeneous scatterers

    DEFF Research Database (Denmark)

    Berntsen, Svend

    2002-01-01

    The three-dimensional inverse acoustic medium problem of N homogeneous objects with known geometry and location is considered. It is proven that one scattering experiment is sufficient for the unique determination of the complex wavenumbers of the objects. The mapping from the scattered fields...

  8. Mach's principle in spatially homogeneous spacetimes

    International Nuclear Information System (INIS)

    Tipler, F.J.

    1978-01-01

    On the basis of Mach's Principle it is concluded that the only singularity-free solution to the empty space Einstein equations is flat space. It is shown that the only singularity-free solution to the empty space Einstein equations which is spatially homogeneous and globally hyperbolic is in fact suitably identified Minkowski space. (Auth.)

  9. Water Filtration through Homogeneous Granulated Charge

    Directory of Open Access Journals (Sweden)

    A. M. Krautsou

    2005-01-01

    Full Text Available General relationship for calculation of water filtration through homogeneous granulated charge has been obtained. The obtained relationship has been compared with experimental data. Discrepancies between calculated and experimental values do not exceed 6 % throughout the entire investigated range.

  10. Time-domain single-source integral equations for analyzing scattering from homogeneous penetrable objects

    KAUST Repository

    Valdés, Felipe

    2013-03-01

    Single-source time-domain electric- and magnetic-field integral equations for analyzing scattering from homogeneous penetrable objects are presented. Their temporal discretization is effected by using shifted piecewise polynomial temporal basis functions and a collocation testing procedure, thus allowing for a marching-on-in-time (MOT) solution scheme. Unlike dual-source formulations, single-source equations involve space-time domain operator products, for which spatial discretization techniques developed for standalone operators do not apply. Here, the spatial discretization of the single-source time-domain integral equations is achieved by using the high-order divergence-conforming basis functions developed by Graglia alongside the high-order divergence- and quasi curl-conforming (DQCC) basis functions of Valdés. The combination of these two sets allows for a well-conditioned mapping from div- to curl-conforming function spaces that fully respects the space-mapping properties of the space-time operators involved. Numerical results corroborate the fact that the proposed procedure guarantees accuracy and stability of the MOT scheme. © 2012 IEEE.

  11. Development of a fully organic, low solid and high temperature resistant well completion fluid

    Institute of Scientific and Technical Information of China (English)

    周文; 王贵松; 任艳增; 王滨

    2012-01-01

    The composition of a fully organic, low solid content well completion fluid system with a density of 1.65 g/cm³ is introduced, together with the single-agent screening process and the compatibility evaluation of the supporting treatment agents. The mass ratio of the high-density organic salt base solution in this system was determined as m(formate) : m(organic compound salt) : m(salt crystallization inhibitor) : m(water) = 286 : 44 : 40 : 100, and the supporting treatment agent, the high temperature resistant reservoir protective agent YH-YB01, was screened and evaluated. The system has a wide range of application and is a good high temperature resistant, low solid content, high-density completion fluid system.

  12. CMS on the GRID: Toward a fully distributed computing architecture

    International Nuclear Information System (INIS)

    Innocente, Vincenzo

    2003-01-01

    The computing systems required to collect, analyse and store the physics data at LHC would need to be distributed and global in scope. CMS is actively involved in several grid-related projects to develop and deploy a fully distributed computing architecture. We present here recent developments of tools for automating job submission and for serving data to remote analysis stations. Plans for further test and deployment of a production grid are also described

  13. Quantum Fully Homomorphic Encryption with Verification

    DEFF Research Database (Denmark)

    Alagic, Gorjan; Dulek, Yfke; Schaffner, Christian

    2017-01-01

    Fully-homomorphic encryption (FHE) enables computation on encrypted data while maintaining secrecy. Recent research has shown that such schemes exist even for quantum computation. Given the numerous applications of classical FHE (zero-knowledge proofs, secure two-party computation, obfuscation, e...

  14. Fully conditional specification in multivariate imputation

    NARCIS (Netherlands)

    van Buuren, S.; Brand, J. P.L.; Groothuis-Oudshoorn, C. G.M.; Rubin, D. B.

    2006-01-01

    The use of the Gibbs sampler with fully conditionally specified models, where the distribution of each variable given the other variables is the starting point, has become a popular method to create imputations in incomplete multivariate data. The theoretical weakness of this approach is that the
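
    The record above describes the chained-equations idea only in words. A minimal Python sketch under simplifying assumptions (linear regressions with normal residual draws, a fixed number of Gibbs-style sweeps, data values invented) is given below; it illustrates fully conditional specification, not the authors' MICE software.

        # Minimal fully-conditional-specification (chained equations) imputation sketch.
        # Each variable with missing values is regressed on all the others and re-imputed
        # in turn, Gibbs-sampler style. Illustrative only; proper MICE also draws the
        # regression parameters from their posterior.
        import numpy as np

        def fcs_impute(X, n_iter=10, seed=0):
            rng = np.random.default_rng(seed)
            X = np.array(X, dtype=float)
            miss = np.isnan(X)
            X[miss] = np.take(np.nanmean(X, axis=0), np.where(miss)[1])  # start from column means
            for _ in range(n_iter):
                for j in range(X.shape[1]):
                    if not miss[:, j].any():
                        continue
                    obs = ~miss[:, j]
                    A = np.c_[np.ones(len(X)), np.delete(X, j, axis=1)]  # intercept + other columns
                    beta, *_ = np.linalg.lstsq(A[obs], X[obs, j], rcond=None)
                    resid_sd = np.std(X[obs, j] - A[obs] @ beta)
                    pred = A[miss[:, j]] @ beta
                    X[miss[:, j], j] = pred + rng.normal(0.0, resid_sd, pred.shape)
            return X

        data = np.array([[1.0, 2.1, np.nan], [2.0, np.nan, 6.2], [3.0, 6.1, 9.1], [4.0, 8.2, np.nan]])
        print(fcs_impute(data))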

  15. Equipment for fully automatic radiographic pipe inspection

    International Nuclear Information System (INIS)

    Basler, G.; Sperl, H.; Weinschenk, K.

    1977-01-01

    The patent describes a device for fully automatic radiographic testing of large pipes with longitudinal welds. Furthermore the invention enables automatic marking of films in radiographic inspection with regard to a ticketing of the test piece and of that part of it where testing took place. (RW) [de

  16. Faster Fully-Dynamic minimum spanning forest

    DEFF Research Database (Denmark)

    Holm, Jacob; Rotenberg, Eva; Wulff-Nilsen, Christian

    2015-01-01

    We give a new data structure for the fully-dynamic minimum spanning forest problem in simple graphs. Edge updates are supported in O(log⁴n / log log n) expected amortized time per operation, improving the O(log⁴n) amortized bound of Holm et al. (STOC’98, JACM’01). We also provide a deterministic data...
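
    For contrast with the amortized polylogarithmic bound quoted above, the sketch below is the naive baseline such data structures improve upon: recompute the minimum spanning forest from scratch (Kruskal with union-find) after every edge insertion or deletion, i.e. O(m log n) per update. It is only a reference point for what "fully dynamic" means, not the paper's structure.

        # Naive fully-dynamic minimum spanning forest: rerun Kruskal after each update.
        class NaiveDynamicMSF:
            def __init__(self):
                self.edges = {}                              # (u, v) with u < v  ->  weight

            def insert(self, u, v, w):
                self.edges[(min(u, v), max(u, v))] = w
                return self.msf()

            def delete(self, u, v):
                self.edges.pop((min(u, v), max(u, v)), None)
                return self.msf()

            def msf(self):
                parent = {}
                def find(x):
                    parent.setdefault(x, x)
                    while parent[x] != x:                    # path halving
                        parent[x] = parent[parent[x]]
                        x = parent[x]
                    return x
                forest = []
                for (u, v), w in sorted(self.edges.items(), key=lambda e: e[1]):
                    ru, rv = find(u), find(v)
                    if ru != rv:
                        parent[ru] = rv
                        forest.append((u, v, w))
                return forest

        msf = NaiveDynamicMSF()
        msf.insert(1, 2, 3.0); msf.insert(2, 3, 1.0)
        print(msf.insert(1, 3, 2.0))                         # the weight-3 edge drops out of the forest
        print(msf.delete(2, 3))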

  17. Feasibility Study of Aseptic Homogenization: Affecting Homogenization Steps on Quality of Sterilized Coconut Milk

    Directory of Open Access Journals (Sweden)

    Phungamngoen Chanthima

    2016-01-01

    Full Text Available Coconut milk is one of the most important protein-rich food sources available today. Separation of the emulsion into an aqueous phase and a cream phase commonly occurs, and this is an unacceptable physical defect of both fresh and processed coconut milk. Since homogenization steps are known to affect the stability of coconut milk, this work aimed to study the effect of homogenization steps on coconut milk quality. The samples were subjected to high-speed homogenization in the range of 5000-15000 rpm under sterilization temperatures of 120-140 °C for 15 min. The results showed that emulsion stability increased with increasing homogenization speed: smaller fat particles were generated that dispersed easily in the continuous phase, leading to high stability. On the other hand, when a high sterilization temperature was applied, the stability of the coconut milk decreased, the fat globule size increased, the L value decreased and the b value increased. Homogenization after heating led to higher stability than homogenization before heating, because it reduced the particle size of the coconut milk after aggregation during the sterilization process. The results imply that homogenization after the sterilization process may play an important role in the quality of the sterilized coconut milk.

  18. The coherent state on SUq(2) homogeneous space

    International Nuclear Information System (INIS)

    Aizawa, N; Chakrabarti, R

    2009-01-01

    The generalized coherent states for quantum groups introduced by Jurčo and Šťovíček are studied in full detail for the simplest example, SUq(2). It is shown that the normalized SUq(2) coherent states enjoy the property of completeness and allow a resolution of the unity. This feature is expected to play a key role in the application of these coherent states in physical models. The homogeneous space of SUq(2), i.e. the q-sphere of Podleś, is reproduced in complex coordinates by using the coherent states. Differential calculus in the complex form on the homogeneous space is developed. The high spin limit of the SUq(2) coherent states is also discussed.

  19. Neutron transport equation - indications on homogenization and neutron diffusion

    International Nuclear Information System (INIS)

    Argaud, J.P.

    1992-06-01

    In a PWR nuclear reactor, the practical study of the neutrons in the core uses the diffusion equation to describe the problem, whereas the most rigorous description of these neutrons is given by the Boltzmann equation, or neutron transport equation. In this paper, we give some theoretical indications on how to obtain a diffusion equation from the general transport equation under simplifying hypotheses. The work is organised as follows: (a) the most general formulations of the transport equation are presented: the integro-differential equation and the integral equation; (b) the theoretical approximation of this Boltzmann equation by a diffusion equation is introduced by way of asymptotic developments; (c) practical homogenization methods for the transport equation are then presented. In particular, the relationships with some general and useful methods in neutronics are shown, and some homogenization methods in energy and space are indicated. Many other points of view and complements are detailed in the text and in the remarks.
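
    For orientation, the approximation summarized in step (b) is usually condensed into the P1 (Fick's law) closure; in standard notation (assumed here, not quoted from the record) it reads

        \vec{J}(\vec{r}) = -D(\vec{r})\,\nabla\phi(\vec{r}), \qquad D \simeq \frac{1}{3\Sigma_{tr}}, \qquad -\nabla\cdot D\nabla\phi + \Sigma_a \phi = S,

    which is the diffusion form solved in practical PWR core calculations once homogenized constants are available.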

  20. Computer modeling of homogenization of boric acid in IRIS pressurizer

    International Nuclear Information System (INIS)

    Rives Sanz, Ronny; Montesinos Otero, Maria Elena; Gonzalez Mantecon, Javier

    2015-01-01

    The integral layout of the IRIS nuclear reactor makes it possible to eliminate the spray system, which is usually used to mitigate in-surge transients and to help boron homogenization. The study of transients with deficient boron homogenization is very important in this technology, because such transients can cause disturbances in the reactor power and insert a strong reactivity into the core. The aim of the present research is to model the IRIS pressurizer using the CFX code, searching for design alternatives that guarantee its intrinsic safety, with focus on the phenomena mentioned above. A symmetric three-dimensional model equivalent to 1/8 of the total geometry was adopted to reduce the mesh size and minimize processing time. The relevant relationships are programmed and incorporated into the code. This paper discusses the model developed and the behavior of the system for representative transient sequences. The results of the analyzed IRIS transients could be applied to the design of the pressurizer internal structures and components. (Author)

  1. Notes on a homogeneous reactor project; Idees sur un projet de reacteur homogene

    Energy Technology Data Exchange (ETDEWEB)

    Benveniste, J; Bernot, J; Eidelman, D; Grenon, M; Portes, L; Raspaud, G; Tachon, J; Vendryes, G [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires; Berthod, L; Cohen de Lara, G; Delachanal, M; Fontanet, P; Halbronn, G [Societe Grenobloise d' Etudes et d' Applications Hydrauliques, 38 (France)

    1958-07-01

    An attempt has been made to develop certain ideas concerning homogeneous reactors. The project under consideration is based on the simultaneous use of a suspension of uranium dispersed in heavy or light water and of boiling in the reactor for heat extraction. However, the studies of suspensions and of boiling are relatively independent and can also be developed for reactors of different types using one or the other. Our aim is a minimum investment in fissile material; for this we propose to extract the steam directly from the core and to make use of a cyclone to accelerate this extraction: a cyclone-type circulation, creating a field of tangential fluid velocities that increase towards the axis, causes the droplets of vapour to accelerate towards the axial vortex in which they are collected; the steam output is then evacuated to the external heat utilisation system, for example an exchanger of the condenser-boiler type. The input speed of water into the reactor being one of the important parameters in the running of the pile, a spiral supply input chamber is used, allowing this speed to be regulated in amount and direction. (author)

  2. Enhancement of anaerobic sludge digestion by high-pressure homogenization.

    Science.gov (United States)

    Zhang, Sheng; Zhang, Panyue; Zhang, Guangming; Fan, Jie; Zhang, Yuxuan

    2012-08-01

    To improve anaerobic sludge digestion efficiency, the effects of high-pressure homogenization (HPH) conditions on the anaerobic sludge digestion were investigated. The VS and TCOD were significantly removed with the anaerobic digestion, and the VS removal and TCOD removal increased with increasing the homogenization pressure and homogenization cycle number; correspondingly, the accumulative biogas production also increased with increasing the homogenization pressure and homogenization cycle number. The optimal homogenization pressure was 50 MPa for one homogenization cycle and 40 MPa for two homogenization cycles. The SCOD of the sludge supernatant significantly increased with increasing the homogenization pressure and homogenization cycle number due to the sludge disintegration. The relationship between the biogas production and the sludge disintegration showed that the accumulative biogas and methane production were mainly enhanced by the sludge disintegration, which accelerated the anaerobic digestion process and improved the methane content in the biogas. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Improvement of the homogeneity of atomized particles dispersed in high uranium density research reactor fuels

    International Nuclear Information System (INIS)

    Kim, Chang-Kyu; Kim, Ki-Hwan; Park, Jong-Man; Lee, Yoon-Sang; Lee, Don-Bae; Sohn, Woong-Hee; Hong, Soon-Hyung

    1998-01-01

    A study on improving the homogeneous dispersion of atomized spherical particles in fuel meats has been performed in connection with the development of high uranium density fuel. In a comparison of various mixing methods, better homogeneity of the mixture was obtained in the order of Spex mill, V-shape tumbler mixer, and off-axis rotating drum mixer. The Spex mill required some laborious work because of its small capacity per batch. Through optimization of the rotating speed parameter of the V-shape tumbler mixer, almost the same homogeneity as with the Spex mill could be obtained. The homogeneity of the fuel meats appeared to improve through extrusion. All extruded fuel meats with 50 vol.% U3Si powder had fairly smooth surfaces. The homogeneity of the fuel meats prepared with the V-shape tumbler mixer was revealed to be fairly good on micrographs. (author)

  4. Microstructure evolution during homogenization of Al–Mn–Fe–Si alloys: Modeling and experimental results

    International Nuclear Information System (INIS)

    Du, Q.; Poole, W.J.; Wells, M.A.; Parson, N.C.

    2013-01-01

    Microstructure evolution during the homogenization heat treatment of Al–Mn–Fe–Si, or AA3xxx, alloys has been investigated using a combination of modeling and experimental studies. The model is fully coupled to CALculation PHAse Diagram (CALPHAD) software and has explicitly taken into account the two different length scales for diffusion encountered in modeling the homogenization process. The model is able to predict the evolution of all the important microstructural features during homogenization, including the inhomogeneous spatial distribution of dispersoids and alloying elements in solution, the dispersoid number density and the size distribution, and the type and fraction of intergranular constituent particles. Experiments were conducted using four direct chill (DC) cast AA3xxx alloys subjected to various homogenization treatments. The resulting microstructures were then characterized using a range of characterization techniques, including optical and electron microscopy, electron micro probe analysis, field emission gun scanning electron microscopy, and electrical resistivity measurements. The model predictions have been compared with the experimental measurements to validate the model. Further, it has been demonstrated that the validated model is able to predict the effects of alloying elements (e.g. Si and Mn) on microstructure evolution. It is concluded that the model provides a time and cost effective tool in optimizing and designing industrial AA3xxx alloy chemistries and homogenization heat treatments

  5. Design of SC solenoid with high homogeneity

    International Nuclear Information System (INIS)

    Yang Xiaoliang; Liu Zhong; Luo Min; Luo Guangyao; Kang Qiang; Tan Jie; Wu Wei

    2014-01-01

    A novel kind of SC (superconducting) solenoid coil is designed to satisfy the homogeneity requirement on the magnetic field. In this paper, we first calculate the current density distribution over the solenoid coil cross section through a linear programming method. A traditional solenoid and a solenoid with a non-rectangular cross section are then designed to produce a central field of up to 7 T with the highest possible homogeneity. After comparing the two designs in terms of field quality, fabrication cost and other aspects, we conclude that the new non-rectangular-section coil can be realized by improving the techniques of framework fabrication and winding. Finally, the outlook and an error analysis for this kind of SC magnet coil are briefly discussed. (authors)
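
    The "current density distribution through a linear programming method" step can be illustrated with a small on-axis model: pick loop currents that hold the axial field inside a tolerance band around a target value while minimizing the total current. The geometry, target field and tolerance below are invented for the sketch and scipy is assumed to be available; this is not the design calculation of the paper.

        # Hedged sketch of field shaping by linear programming: minimize total |current|
        # subject to |Bz(z) - B0| <= tol at sample points on the axis of a stack of loops.
        import numpy as np
        from scipy.optimize import linprog

        mu0 = 4e-7 * np.pi
        R = 0.5                                    # loop radius (m), assumed
        z_loops = np.linspace(-0.4, 0.4, 9)        # loop positions (m), assumed
        z_targets = np.linspace(-0.1, 0.1, 21)     # where homogeneity is enforced
        B0, tol = 1.0, 1e-3                        # target field (T) and allowed deviation

        # on-axis field of each loop at each target point, per ampere
        A = np.array([[mu0 * R**2 / (2.0 * (R**2 + (zt - zl) ** 2) ** 1.5)
                       for zl in z_loops] for zt in z_targets])

        n = len(z_loops)
        c = np.ones(2 * n)                         # currents split into I+ and I- parts
        A_signed = np.hstack([A, -A])
        A_ub = np.vstack([A_signed, -A_signed])
        b_ub = np.concatenate([np.full(len(z_targets), B0 + tol),
                               np.full(len(z_targets), -(B0 - tol))])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (2 * n), method="highs")
        if res.success:
            print("loop currents (A):", res.x[:n] - res.x[n:])
        else:
            print("infeasible: loosen tol, add loops, or widen the loop stack")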

  6. Testing homogeneity in Weibull-regression models.

    Science.gov (United States)

    Bolfarine, Heleno; Valença, Dione M

    2005-10-01

    In survival studies with families or geographical units it may be of interest to test whether such groups are homogeneous for given explanatory variables. In this paper we consider score-type tests for group homogeneity based on a mixing model in which the group effect is modelled as a random variable. As opposed to hazard-based frailty models, this model yields survival times that, conditioned on the random effect, have an accelerated failure time representation. The test statistic requires only estimation of the conventional regression model without the random effect and does not require specifying the distribution of the random effect. The tests are derived for a Weibull regression model, and in the uncensored situation a closed form is obtained for the test statistic. A simulation study is used to compare the power of the tests. The proposed tests are applied to real data sets with censored data.

  7. Core homogenization method for pebble bed reactors

    International Nuclear Information System (INIS)

    Kulik, V.; Sanchez, R.

    2005-01-01

    This work presents a core homogenization scheme for treating a stochastic pebble bed loading in pebble bed reactors. The reactor core is decomposed into macro-domains that contain several pebble types characterized by different degrees of burnup. A stochastic description is introduced to account for pebble-to-pebble and pebble-to-helium interactions within a macro-domain as well as for interactions between macro-domains. Performance of the proposed method is tested for the PROTEUS and ASTRA critical reactor facilities. Numerical simulations accomplished with the APOLLO2 transport lattice code show good agreement with the experimental data for the PROTEUS reactor facility and with the TRIPOLI4 Monte Carlo simulations for the ASTRA reactor configuration. The difference between the proposed method and the traditional volume-averaged homogenization technique is negligible when only one type of fuel pebble is present in the system, but it grows rapidly with the level of pebble heterogeneity. (authors)
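
    The "traditional volume-averaged homogenization technique" used as the baseline above amounts to flux-volume weighting of region-wise cross sections. A minimal sketch, with invented numbers standing in for the regions of one macro-domain:

        # Flux-volume weighted homogenization of a macroscopic cross section.
        import numpy as np

        def flux_volume_homogenize(sigma, flux, volume):
            """Collapse region-wise cross sections into one macro-domain value."""
            sigma, flux, volume = map(np.asarray, (sigma, flux, volume))
            return np.sum(sigma * flux * volume) / np.sum(flux * volume)

        # e.g. two pebble groups of different burnup plus the helium gap (illustrative values)
        print(flux_volume_homogenize(sigma=[0.42, 0.38, 0.02],
                                     flux=[1.0, 0.9, 1.1],
                                     volume=[0.3, 0.3, 0.4]))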

  8. Genetic homogeneity of Fascioloides magna in Austria.

    Science.gov (United States)

    Husch, Christian; Sattmann, Helmut; Hörweg, Christoph; Ursprung, Josef; Walochnik, Julia

    2017-08-30

    The large American liver fluke, Fascioloides magna, is an economically relevant parasite of both domestic and wild ungulates. F. magna was repeatedly introduced into Europe, for the first time already in the 19th century. In Austria, a stable population of F. magna has established in the Danube floodplain forests southeast of Vienna. The aim of this study was to determine the genetic diversity of F. magna in Austria. A total of 26 individuals from various regions within the known area of distribution were investigated for their cytochrome oxidase subunit 1 (cox1) and nicotinamide dehydrogenase subunit 1 (nad1) gene haplotypes. Interestingly, all 26 individuals revealed one and the same haplotype, namely concatenated haplotype Ha5. This indicates a homogenous population of F. magna in Austria and may argue for a single introduction. Alternatively, genetic homogeneity might also be explained by a bottleneck effect and/or genetic drift. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Shape optimization in biomimetics by homogenization modelling

    International Nuclear Information System (INIS)

    Hoppe, Ronald H.W.; Petrova, Svetozara I.

    2003-08-01

    Optimal shape design of microstructured materials has recently attracted a great deal of attention in material science. The shape and the topology of the microstructure have a significant impact on the macroscopic properties. The present work is devoted to the shape optimization of new biomorphic microcellular ceramics produced from natural wood by biotemplating. We are interested in finding the best material-and-shape combination in order to achieve the optimal prespecified performance of the composite material. The computation of the effective material properties is carried out using the homogenization method. Adaptive mesh-refinement technique based on the computation of recovered stresses is applied in the microstructure to find the homogenized elasticity coefficients. Numerical results show the reliability of the implemented a posteriori error estimator. (author)

  10. Cell homogenization methods for pin-by-pin core calculations tested in slab geometry

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Kitamura, Yasunori; Yamane, Yoshihiro

    2004-01-01

    In this paper, the performance of spatial homogenization methods for fuel or non-fuel cells is compared in slab geometry in order to facilitate pin-by-pin core calculations. Since the spatial homogenization methods were mainly developed for fuel assemblies, a systematic study of their performance for cell-level homogenization has not been carried out. The importance of cell-level homogenization is increasing, since pin-by-pin mesh core calculation in actual three-dimensional geometry, which is a less approximate approach than the current advanced nodal methods, is becoming feasible. Four homogenization methods were investigated in this paper: the flux-volume weighting, the generalized equivalence theory, the superhomogenization (SPH) method and the nonlinear iteration method. The last one, the nonlinear iteration method, was tested as a homogenization method for the first time. The calculations were carried out in simplified colorset assembly configurations of a PWR, which are simulated by slab geometries, and homogenization performance was evaluated through comparison with reference cell-heterogeneous calculations. The calculation results revealed that the generalized equivalence theory showed the best performance. Though the nonlinear iteration method can significantly reduce the homogenization error, its performance was not as good as that of the generalized equivalence theory. Through comparison of the results obtained by the generalized equivalence theory and the superhomogenization method, an important byproduct was obtained: a deficiency of the current superhomogenization method was clarified, which could be remedied by incorporating a 'cell-level discontinuity factor between assemblies'.
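
    Of the four methods compared above, the SPH scheme is the most compact to sketch end to end: region cross sections are rescaled by factors mu until the homogenized calculation reproduces prescribed reference region fluxes. The toy below uses a two-region, one-group, fixed-source slab in coarse diffusion with made-up reference fluxes chosen to satisfy neutron balance; it only illustrates the fixed-point idea (including its normalization ambiguity), not the authors' implementation.

        # Toy SPH (superhomogenization) iteration on a two-region, reflective 1-D slab.
        import numpy as np

        def solve_diffusion(D, siga, src, h):
            """Cell-centered finite-difference one-group diffusion, reflective boundaries."""
            n = len(D)
            A = np.zeros((n, n))
            for i in range(n):
                A[i, i] = siga[i] * h
                for j in (i - 1, i + 1):
                    if 0 <= j < n:
                        w = 2.0 * D[i] * D[j] / (D[i] + D[j]) / h   # harmonic-mean coupling
                        A[i, i] += w
                        A[i, j] -= w
            return np.linalg.solve(A, src * h)

        n_fine, width = 20, 1.0                      # fine meshes per homogenized cell, cell width
        h = width / n_fine
        D_cell = np.array([1.2, 0.9])
        siga_cell = np.array([0.30, 0.01])
        src_cell = np.array([1.0, 0.0])              # fixed source in cell 1 only
        phi_ref = np.array([3.0, 10.0])              # pretend reference; 0.30*3 + 0.01*10 = 1 = source
        cell_of = np.repeat([0, 1], n_fine)

        mu = np.ones(2)
        for _ in range(500):
            phi = solve_diffusion((mu * D_cell)[cell_of], (mu * siga_cell)[cell_of],
                                  src_cell[cell_of], h)
            phi_cell = np.array([phi[cell_of == k].mean() for k in range(2)])
            mu_new = phi_ref / phi_cell
            mu_new *= np.sum(phi_ref / mu_new) / np.sum(phi_ref)   # fix the free overall scale
            if np.max(np.abs(mu_new - mu)) < 1e-10:
                mu = mu_new
                break
            mu = 0.5 * (mu + mu_new)                 # mild damping; the fixed point is unchanged
        print("SPH factors:", mu)
        print("macro rates:", mu * siga_cell * phi_cell, "vs reference:", siga_cell * phi_ref)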

  11. Homogenization of variational inequalities for obstacle problems

    International Nuclear Information System (INIS)

    Sandrakov, G V

    2005-01-01

    Results on the convergence of solutions of variational inequalities for obstacle problems are proved. The variational inequalities are defined by a non-linear monotone operator of the second order with periodic rapidly oscillating coefficients and a sequence of functions characterizing the obstacles. Two-scale and macroscale (homogenized) limiting variational inequalities are obtained. Derivation methods for such inequalities are presented. Connections between the limiting variational inequalities and two-scale and macroscale minimization problems are established in the case of potential operators.

  12. Quantum groups and quantum homogeneous spaces

    International Nuclear Information System (INIS)

    Kulish, P.P.

    1994-01-01

    The usefulness of the R-matrix formalism and the reflection equations is demonstrated on examples of the quantum group covariant algebras (quantum homogeneous spaces): quantum Minkowski space-time, quantum sphere and super-sphere. The irreducible representations of some covariant algebras are constructed. The generalization of the reflection equation to super case is given and the existence of the quasiclassical limits is pointed out. (orig.)

  13. Process to produce homogenized reactor fuels

    International Nuclear Information System (INIS)

    Hart, P.E.; Daniel, J.L.; Brite, D.W.

    1980-01-01

    The fuels consist of a mixture of PuO2 and UO2. In order to increase the homogeneity of mechanically mixed fuels, the pellets are sintered in a hydrogen atmosphere with a sufficiently low oxygen potential. This results in a reduction of Pu4+ to Pu3+. The reduction process releases water vapor, which increases the pressure within the PuO2 particles and causes PuO2 to be pressed into the uranium oxide structure. (DG) [de

  14. Homogeneous scintillating LKr/Xe calorimeters

    Energy Technology Data Exchange (ETDEWEB)

    Chen, M.; Mullins, M.; Pelly, D.; Shotkin, S.; Sumorok, K. (Lab. for Nuclear Science, MIT, Cambridge, MA (United States)); Akyuz, D.; Chen, E.; Gaudreau, M.P.J. (Plasma Fusion Center, MIT, Cambridge, MA (United States)); Bolozdynya, A.; Tchernyshev, V.; Goritchev, P.; Khovansky, V.; Koutchenkov, A.; Kovalenko, A.; Lebedenko, V.; Vinogradov, V.; Gusev, L.; Sheinkman, V. (ITEP, Moscow (Russia)); Krasnokutsky, R.N.; Shuvalov, R.S.; Fedyakin, N.N.; Sushkov, V. (IHEP, Serpukhov (Russia)); Akopyan, M. (Inst. for Nuclear Research, Moscow (Russia)); Doke, T.; Kikuchi, J.; Hitachi, A.; Kashiwagi, T. (Science and Eng. Res. Lab., Waseda Univ., Tokyo (Japan)); Masuda, K.; Shibamura, E. (Saitama Coll. of Health (Japan)); Ishida, N. (Seikei Univ. (Japan)); Sugimoto, S. (INS, Univ. Tokyo (Japan))

    1993-03-20

    Recent R and D work on full length scintillating homogeneous liquid xenon/krypton (LXe/Kr) cells has established the essential properties for precision EM calorimeters: In-situ calibration using α's, radiation hardness as well as the uniformity required for δE/E≈0.5% for e/γ's above 50 GeV. (orig.).

  15. Fluoroscopic screen which is optically homogeneous

    International Nuclear Information System (INIS)

    1975-01-01

    A high efficiency fluoroscopic screen for X-ray examination consists of an optically homogeneous crystal plate of fluorescent material such as activated cesium iodide, supported on a transparent protective plate, with the edges of the assembly beveled and optically coupled to a light absorbing compound. The product is dressed to the desired thickness and provided with an X-ray-transparent light-opaque cover. (Auth.)

  16. Correlated equilibria in homogenous good Bertrand competition

    DEFF Research Database (Denmark)

    Jann, Ole; Schottmüller, Christoph

    2015-01-01

    We show that there is a unique correlated equilibrium, identical to the unique Nash equilibrium, in the classic Bertrand oligopoly model with homogenous goods and identical marginal costs. This provides a theoretical underpinning for the so-called "Bertrand paradox" as well as its most general formulation to date. Our proof generalizes to asymmetric marginal costs and arbitrarily many players in the following way: the market price cannot be higher than the second lowest marginal cost in any correlated equilibrium.

  17. Homogeneous Biosensing Based on Magnetic Particle Labels

    KAUST Repository

    Schrittwieser, Stefan

    2016-06-06

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation.

  18. Some properties of spatially homogeneous spacetimes

    International Nuclear Information System (INIS)

    Coomer, G.C.

    1979-01-01

    This paper discusses two features of the universe which are influenced in a fundamental way by the spacetime geometry of the universe. The first is the growth of density fluctuations in the early stages of the evolution of the universe. The second is the propagation of electromagnetic radiation in the universe. A spatially homogeneous universe is assumed in both discussions. The gravitational instability theory of galaxy formation is investigated for a viscous fluid and for a charged, conducting fluid with a magnetic field added as a perturbation. It is found that the growth rate of density perturbations in both cases is lower than in the perfect fluid case. Spatially homogeneous but nonisotropic spacetimes are investigated next. Two perfect fluid solutions of Einstein's field equations are found which have spacelike hypersurfaces with Bianchi type II geometry. An expression for the spectrum of the cosmic microwave background radiation in a spatially homogeneous but nonisotropic universe is found. The expression is then used to determine the angular distribution of the intensity of the radiation in the simpler of the two solutions. When accepted values of the matter density and decoupling temperature are inserted into this solution, values for the age of the universe and the time of decoupling are obtained which agree reasonably well with the values of the standard model of the universe

  19. Commensurability effects in holographic homogeneous lattices

    International Nuclear Information System (INIS)

    Andrade, Tomas; Krikun, Alexander

    2016-01-01

    An interesting application of the gauge/gravity duality to condensed matter physics is the description of a lattice via breaking translational invariance on the gravity side. By making use of global symmetries, it is possible to do so without sacrificing homogeneity of the pertinent bulk solutions, which we thus term "homogeneous holographic lattices." Due to their technical simplicity, these configurations have received a great deal of attention in the last few years and have been shown to correctly describe momentum relaxation and hence (finite) DC conductivities. However, it is not clear whether they are able to capture other lattice effects which are of interest in condensed matter. In this paper we investigate this question, focusing our attention on the phenomenon of commensurability, which arises when the lattice scale is tuned to be equal to (an integer multiple of) another momentum scale in the system. We do so by studying the formation of spatially modulated phases in various models of homogeneous holographic lattices. Our results indicate that the onset of the instability is controlled by the near horizon geometry, which for insulating solutions does carry information about the lattice. However, we observe no sharp connection between the characteristic momentum of the broken phase and the lattice pitch, which calls into question the applicability of these models to the physics of commensurability.

  20. Homogeneous Biosensing Based on Magnetic Particle Labels

    Science.gov (United States)

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J.; Lentijo-Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation. PMID:27275824

  1. Homogeneous Biosensing Based on Magnetic Particle Labels

    KAUST Repository

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang; Lentijo Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschö pe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation.

  2. Testing Homogeneity with the Galaxy Fossil Record

    CERN Document Server

    Hoyle, Ben; Jimenez, Raul; Heavens, Alan; Clarkson, Chris; Maartens, Roy

    2013-01-01

    Observationally confirming spatial homogeneity on sufficiently large cosmological scales is of importance to test one of the underpinning assumptions of cosmology, and is also imperative for correctly interpreting dark energy. A challenging aspect of this is that homogeneity must be probed inside our past lightcone, while observations take place on the lightcone. The history of star formation rates (SFH) in the galaxy fossil record provides a novel way to do this. We calculate the SFH of stacked Luminous Red Galaxy (LRG) spectra obtained from the Sloan Digital Sky Survey. We divide the LRG sample into 12 equal-area contiguous sky patches and 10 redshift slices starting at z = 0.2. To test homogeneity, we calculate the posterior distribution for the excess large-scale variance due to inhomogeneity, and find that the most likely solution is ...

  3. Homogeneous CdTe quantum dots-carbon nanotubes heterostructures

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Kayo Oliveira [Grupo de Pesquisa em Química de Materiais – (GPQM), Departamento de Ciências Naturais, Universidade Federal de São João del-Rei, Campus Dom Bosco, Praça Dom Helvécio, 74, CEP 36301-160, São João del-Rei, MG (Brazil); Bettini, Jefferson [Laboratório Nacional de Nanotecnologia, Centro Nacional de Pesquisa em Energia e Materiais, CEP 13083-970, Campinas, SP (Brazil); Ferrari, Jefferson Luis [Grupo de Pesquisa em Química de Materiais – (GPQM), Departamento de Ciências Naturais, Universidade Federal de São João del-Rei, Campus Dom Bosco, Praça Dom Helvécio, 74, CEP 36301-160, São João del-Rei, MG (Brazil); Schiavon, Marco Antonio, E-mail: schiavon@ufsj.edu.br [Grupo de Pesquisa em Química de Materiais – (GPQM), Departamento de Ciências Naturais, Universidade Federal de São João del-Rei, Campus Dom Bosco, Praça Dom Helvécio, 74, CEP 36301-160, São João del-Rei, MG (Brazil)

    2015-01-15

    The development of homogeneous CdTe quantum dots-carbon nanotubes heterostructures based on electrostatic interactions has been investigated. We report a simple and reproducible non-covalent functionalization route that can be accomplished at room temperature, to prepare colloidal composites consisting of CdTe nanocrystals deposited onto multi-walled carbon nanotubes (MWCNTs) functionalized with a thin layer of polyelectrolytes by layer-by-layer technique. Specifically, physical adsorption of polyelectrolytes such as poly (4-styrene sulfonate) and poly (diallyldimethylammonium chloride) was used to deagglomerate and disperse MWCNTs, onto which we deposited CdTe quantum dots coated with mercaptopropionic acid (MPA), as surface ligand, via electrostatic interactions. Confirmation of the CdTe quantum dots/carbon nanotubes heterostructures was done by transmission and scanning electron microscopies (TEM and SEM), dynamic-light scattering (DLS) together with absorption, emission, Raman and infrared spectroscopies (UV–vis, PL, Raman and FT-IR). Almost complete quenching of the PL band of the CdTe quantum dots was observed after adsorption on the MWCNTs, presumably through efficient energy transfer process from photoexcited CdTe to MWCNTs. - Highlights: • Highly homogeneous CdTe-carbon nanotubes heterostructures were prepared. • Simple and reproducible non-covalent functionalization route. • CdTe nanocrystals homogeneously deposited onto multi-walled carbon nanotubes. • Efficient energy transfer process from photoexcited CdTe to MWCNTs.

  4. FMFT. Fully massive four-loop tadpoles

    Energy Technology Data Exchange (ETDEWEB)

    Pikelner, Andrey [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik

    2017-07-15

    We present FMFT - a package written in FORM that evaluates four-loop fully massive tadpole Feynman diagrams. It is a successor of the MATAD package that has been successfully used to calculate many renormalization group functions at three-loop order in a wide range of quantum field theories especially in the Standard Model. We describe an internal structure of the package and provide some examples of its usage.

  5. FMFT: fully massive four-loop tadpoles

    Science.gov (United States)

    Pikelner, Andrey

    2018-03-01

    We present FMFT - a package written in FORM that evaluates four-loop fully massive tadpole Feynman diagrams. It is a successor of the MATAD package that has been successfully used to calculate many renormalization group functions at three-loop order in a wide range of quantum field theories especially in the Standard Model. We describe an internal structure of the package and provide some examples of its usage.

  6. Fully probabilistic design of hierarchical Bayesian models

    Czech Academy of Sciences Publication Activity Database

    Quinn, A.; Kárný, Miroslav; Guy, Tatiana Valentine

    2016-01-01

    Roč. 369, č. 1 (2016), s. 532-547 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Fully probabilistic design * Ideal distribution * Minimum cross-entropy principle * Bayesian conditioning * Kullback-Leibler divergence * Bayesian nonparametric modelling Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.832, year: 2016 http://library.utia.cas.cz/separaty/2016/AS/karny-0463052.pdf

  7. Asymptotic Expansion Homogenization for Multiscale Nuclear Fuel Analysis

    International Nuclear Information System (INIS)

    2015-01-01

    Engineering scale nuclear fuel performance simulations can benefit by utilizing high-fidelity models running at a lower length scale. Lower length-scale models provide a detailed view of the material behavior that is used to determine the average material response at the macroscale. These lower length-scale calculations may provide insight into material behavior where experimental data is sparse or nonexistent. This multiscale approach is especially useful in the nuclear field, since irradiation experiments are difficult and expensive to conduct. The lower length-scale models complement the experiments by influencing the types of experiments required and by reducing the total number of experiments needed. This multiscale modeling approach is a central motivation in the development of the BISON-MARMOT fuel performance codes at Idaho National Laboratory. These codes seek to provide more accurate and predictive solutions for nuclear fuel behavior. One critical aspect of multiscale modeling is the ability to extract the relevant information from the lower length-scale simulations. One approach, the asymptotic expansion homogenization (AEH) technique, has proven to be an effective method for determining homogenized material parameters. The AEH technique prescribes a system of equations to solve at the microscale that are used to compute homogenized material constants for use at the engineering scale. In this work, we employ AEH to explore the effect of evolving microstructural thermal conductivity and elastic constants on nuclear fuel performance. We show that the AEH approach fits cleanly into the BISON and MARMOT codes and provides a natural, multidimensional homogenization capability.
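
    In the simplest setting, a one-dimensional periodic laminate, the AEH cell problem has a closed-form answer: conduction across the layers homogenizes to the volume-weighted harmonic mean of the layer conductivities, while the arithmetic mean applies along the layers. The numbers below are placeholders, not BISON/MARMOT data.

        # Closed-form AEH result for a 1-D periodic laminate (illustrative values).
        import numpy as np

        k = np.array([4.5, 2.0])        # layer thermal conductivities (W/m-K), assumed
        f = np.array([0.7, 0.3])        # volume fractions, must sum to 1

        k_across = 1.0 / np.sum(f / k)  # across the layers: harmonic mean (the 1-D AEH answer)
        k_along = np.sum(f * k)         # along the layers: arithmetic mean
        print(f"k_eff across layers: {k_across:.3f}, along layers: {k_along:.3f}")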

  8. Development of a fully automated, web-based, tailored intervention promoting regular physical activity among insufficiently active adults with type 2 diabetes: integrating the I-change model, self-determination theory, and motivational interviewing components.

    Science.gov (United States)

    Moreau, Michel; Gagnon, Marie-Pierre; Boudreau, François

    2015-02-17

    Type 2 diabetes is a major challenge for Canadian public health authorities, and regular physical activity is a key factor in the management of this disease. Given that fewer than half of people with type 2 diabetes in Canada are sufficiently active to meet the recommendations, effective programs targeting the adoption of regular physical activity (PA) are in demand for this population. Many researchers argue that Web-based, tailored interventions targeting PA are a promising and effective avenue for sedentary populations like Canadians with type 2 diabetes, but few have described the detailed development of this kind of intervention. This paper aims to describe the systematic development of the Web-based, tailored intervention, Diabète en Forme, promoting regular aerobic PA among adult Canadian francophones with type 2 diabetes. This paper can be used as a reference for health professionals interested in developing similar interventions. We also explored the integration of theoretical components derived from the I-Change Model, Self-Determination Theory, and Motivational Interviewing, which is a potential path for enhancing the effectiveness of tailored interventions on PA adoption and maintenance. The intervention development was based on the program-planning model for tailored interventions of Kreuter et al. An additional step was added to the model to evaluate the intervention's usability prior to the implementation phase. An 8-week intervention was developed. The key components of the intervention include a self-monitoring tool for PA behavior, a weekly action planning tool, and eight tailored motivational sessions based on attitude, self-efficacy, intention, type of motivation, PA behavior, and other constructs and techniques. Usability evaluation, a step added to the program-planning model, helped to make several improvements to the intervention prior to the implementation phase. The intervention development cost was about CDN $59,700 and took approximately

  9. Applications of high and ultra high pressure homogenization for food safety

    Directory of Open Access Journals (Sweden)

    Francesca Patrignani

    2016-08-01

    Full Text Available Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low-temperature long-time (LTLT) and high-temperature short-time (HTST) treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, several non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure (HHP), pulsed electric fields (PEF), ultrasound (US) and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential to provide fresh-like products with a prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350-400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, have permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. Therefore, this review deals with the principal mechanisms of action of high pressure homogenization against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species in relation to the food matrix and to chemico-physical and process variables are reviewed. The combined use of this alternative technology with other non-thermal technologies is also considered.

  10. Modeling the homogenization kinetics of as-cast U-10wt% Mo alloys

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zhijie, E-mail: zhijie.xu@pnnl.gov [Computational Mathematics Group, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Joshi, Vineet [Energy Processes & Materials Division, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Hu, Shenyang [Reactor Materials & Mechanical Design, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Paxton, Dean [Nuclear Engineering and Analysis Group, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Lavender, Curt [Energy Processes & Materials Division, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Burkes, Douglas [Nuclear Engineering and Analysis Group, Pacific Northwest National Laboratory, Richland, WA 99352 (United States)

    2016-04-01

    Low-enriched U-22at% Mo (U–10Mo) alloy has been considered as an alternative material to replace the highly enriched fuels in research reactors. For the U–10Mo to work effectively and replace the existing fuel material, a thorough understanding of the microstructure development from as-cast to the final formed structure is required. The as-cast microstructure typically resembles an inhomogeneous microstructure with regions containing molybdenum-rich and -lean regions, which may affect the processing and possibly the in-reactor performance. This as-cast structure must be homogenized by thermal treatment to produce a uniform Mo distribution. The development of a modeling capability will improve the understanding of the effect of initial microstructures on the Mo homogenization kinetics. In the current work, we investigated the effect of as-cast microstructure on the homogenization kinetics. The kinetics of the homogenization was modeled based on a rigorous algorithm that relates the line scan data of Mo concentration to the gray scale in energy dispersive spectroscopy images, which was used to generate a reconstructed Mo concentration map. The map was then used as realistic microstructure input for physics-based homogenization models, where the entire homogenization kinetics can be simulated and validated against the available experiment data at different homogenization times and temperatures.
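
    A stripped-down version of the workflow described above can be sketched in a few lines: calibrate a gray-scale EDS map to a Mo concentration field, then march it forward with an explicit Fickian diffusion step and watch the concentration spread decay during the anneal. The calibration limits, diffusivity and synthetic "image" are assumptions, not the paper's data.

        # Gray scale -> concentration map, then explicit 2-D diffusion as a homogenization proxy.
        import numpy as np

        def gray_to_concentration(gray, c_min=0.08, c_max=0.14):
            """Linear calibration of gray level (0..255) to Mo weight fraction (assumed)."""
            return c_min + (c_max - c_min) * gray / 255.0

        def homogenize(c, D=1e-14, dx=1e-6, t_total=600.0):
            """Explicit FTCS diffusion with periodic boundaries; dt kept below the stability limit."""
            dt = 0.2 * dx**2 / D                    # stability requires D*dt/dx^2 <= 0.25 in 2-D
            for _ in range(int(t_total / dt)):
                lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
                       np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4.0 * c) / dx**2
                c = c + D * dt * lap
            return c

        rng = np.random.default_rng(1)
        gray = rng.integers(0, 256, size=(64, 64))  # stand-in for an EDS gray-scale image
        c0 = gray_to_concentration(gray)
        c1 = homogenize(c0.copy(), t_total=600.0)
        print("Mo spread before:", c0.std(), "after anneal:", c1.std())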

  11. Homogeneous forming technology of composite materials and its application to dispersion nuclear fuel

    International Nuclear Information System (INIS)

    Hong, Soon Hyun; Ryu, Ho Jin; Sohn, Woong Hee; Kim, Chang Kyu

    1997-01-01

    The powder metallurgy processing technique for metal matrix composites is reviewed and its application to processing homogeneous dispersion nuclear fuel is considered. The homogeneous mixing of reinforcement with matrix powders is a very important step in processing metal matrix composites. The reinforcement can be ceramic particles, whiskers or chopped fibers having high strength and high modulus. The blended powders are consolidated into billets, followed by various deformation processing, such as extrusion, forging, rolling or spinning into final usable shapes. Dispersion nuclear fuel is a class of metal matrix composite consisting of dispersed U-compound fuel particles in a metallic matrix. Dispersion nuclear fuel is fabricated by a powder metallurgy process such as hot pressing followed by hot extrusion, which is similar to that of SiC/Al metal matrix composites. The fabrication of homogeneous dispersion nuclear fuel is very difficult, mainly due to the inhomogeneous mixing characteristics of the powders arising from the quite different densities of uranium alloy powders and aluminum powders. In order to develop homogeneous dispersion nuclear fuel, it is important to investigate the effect of powder characteristics and mixing techniques on the homogeneity of dispersion nuclear fuel. A new quantitative analysis technique for homogeneity needs to be developed for more accurate analysis of homogeneity in dispersion nuclear fuel. (author). 28 refs., 7 figs., 1 tab.
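
    One simple candidate for the quantitative homogeneity measure called for above is the coefficient of variation of local particle fractions taken over equal-sized sampling cells of a micrograph or blend sample; a lower CV means a more homogeneous dispersion. The binary maps below are synthetic stand-ins.

        # Homogeneity as the coefficient of variation of local area fractions.
        import numpy as np

        def homogeneity_cv(particle_map, cell=16):
            h, w = particle_map.shape
            fractions = np.array([particle_map[i:i + cell, j:j + cell].mean()
                                  for i in range(0, h - cell + 1, cell)
                                  for j in range(0, w - cell + 1, cell)])
            return fractions.std() / fractions.mean()

        rng = np.random.default_rng(0)
        well_mixed = rng.random((128, 128)) < 0.5            # ~50 vol% dispersed uniformly
        segregated = np.zeros((128, 128), dtype=bool)        # same loading, badly mixed
        segregated[:, :64] = rng.random((128, 64)) < 0.8
        segregated[:, 64:] = rng.random((128, 64)) < 0.2
        print("CV well mixed:", homogeneity_cv(well_mixed), "CV segregated:", homogeneity_cv(segregated))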

  12. Fully automated MRI-guided robotics for prostate brachytherapy

    International Nuclear Information System (INIS)

    Stoianovici, D.; Vigaru, B.; Petrisor, D.; Muntener, M.; Patriciu, A.; Song, D.

    2008-01-01

    The uncertainties encountered in the deployment of brachytherapy seeds are related to the commonly used ultrasound imager and the basic instrumentation used for the implant. An alternative solution is under development in which a fully automated robot is used to place the seeds according to the dosimetry plan under direct MRI guidance. Incorporation of MRI guidance creates potential for physiological and molecular image-guided therapies. Moreover, MRI-guided brachytherapy also enables re-estimating dosimetry during the procedure, because with MRI the seeds already implanted can be localised. An MRI-compatible robot (MrBot) was developed. The robot is designed for transperineal percutaneous prostate interventions, and customised for fully automated MRI-guided brachytherapy. With different end-effectors, the robot applies to other image-guided interventions of the prostate. The robot is constructed of non-magnetic and dielectric materials and is electricity-free, using pneumatic actuation and optical sensing. A new motor (PneuStep) was purposely developed to set this robot in motion. The robot fits alongside the patient in closed-bore MRI scanners. It is able to stay fully operational during MR imaging without deteriorating the quality of the scan. In vitro, cadaver, and animal tests showed millimetre needle-targeting accuracy and very precise seed placement. The robot was tested without any interference up to 7 T. The robot is the first fully automated robot to function in MRI scanners. Its first application is MRI-guided seed brachytherapy. It is capable of automated, highly accurate needle placement. Extensive testing is in progress prior to clinical trials. Preliminary results show that the robot may become a useful image-guided intervention instrument. (author)

  13. Investigation of methods for hydroclimatic data homogenization

    Science.gov (United States)

    Steirou, E.; Koutsoyiannis, D.

    2012-04-01

    We investigate the methods used for the adjustment of inhomogeneities in temperature time series covering the last 100 years. Based on a systematic study of the scientific literature, we classify and evaluate the observed inhomogeneities in historical and modern time series, as well as their adjustment methods. It turns out that these methods are mainly statistical, not well justified by experiments, and are rarely supported by metadata. In many of the cases studied the proposed corrections are not even statistically significant. From the global database GHCN-Monthly Version 2, we examine all stations containing both raw and adjusted data that satisfy certain criteria of continuity and distribution over the globe. In the United States of America, because of the large number of available stations, stations were chosen after a suitable sampling. In total we analyzed 181 stations globally. For these stations we calculated the differences between the adjusted and non-adjusted linear 100-year trends. It was found that in two thirds of the cases the homogenization procedure increased the positive or decreased the negative temperature trends. One of the most common homogenization methods, 'SNHT for single shifts', was applied to synthetic time series with selected statistical characteristics, occasionally with offsets. The method was satisfactory when applied to independent, normally distributed data, but not to data with long-term persistence. The above results cast some doubt on the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is between 0.4°C and 0.7°C, where these two values are the estimates derived from raw and adjusted data, respectively.
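
    The 'SNHT for single shifts' statistic mentioned above has a compact form and is easy to reproduce: standardize the series, then scan every candidate break position k for the largest T(k) = k*mean1^2 + (n-k)*mean2^2. The sketch below omits the critical values (taken from tables or simulation in practice) and uses a synthetic series with an imposed shift.

        # Alexandersson's standard normal homogeneity test (single shift), bare-bones version.
        import numpy as np

        def snht_single_shift(x):
            x = np.asarray(x, dtype=float)
            z = (x - x.mean()) / x.std(ddof=1)
            n = len(z)
            t = np.array([k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
                          for k in range(1, n)])
            k_best = int(np.argmax(t)) + 1
            return t[k_best - 1], k_best             # statistic and most likely break position

        rng = np.random.default_rng(0)
        series = np.r_[rng.normal(10.0, 1.0, 60), rng.normal(11.0, 1.0, 40)]  # 1-degree shift at index 60
        print(snht_single_shift(series))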

  14. Argentina to fully privatize state owned YPF

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Argentina's Congress has voted to fully privatize state petroleum company Yacimientos Petroliferos Fiscales (YPF), a move the government expects to net at least $8 billion. Despite some political opposition, the vote was 119-10 in favor, with one abstention and opposition party members refusing to participate in the vote. Argentina's President Carlos Menem had threatened to authorize YPF privatization by decree if there was no quorum for a vote. YPF is responsible for 40% of Argentina's oil production. The country has been self-sufficient in crude since 1982. Current production is 563,472 b/d, and proved reserves of oil and gas are valued at $7 billion.

  15. Exponential Stability of Switched Positive Homogeneous Systems

    Directory of Open Access Journals (Sweden)

    Dadong Tian

    2017-01-01

    Full Text Available This paper studies the exponential stability of switched positive nonlinear systems defined by cooperative and homogeneous vector fields. In order to capture the decay rate of such systems, we first consider the subsystems. A sufficient condition for exponential stability of subsystems with time-varying delays is derived. In particular, for the corresponding delay-free systems, we prove that this sufficient condition is also necessary. Then, we present a sufficient condition of exponential stability under minimum dwell time switching for the switched positive nonlinear systems. Some results in the previous literature are extended. Finally, a numerical example is given to demonstrate the effectiveness of the obtained results.

  16. Diffusion piecewise homogenization via flux discontinuity factors

    International Nuclear Information System (INIS)

    Sanchez, Richard; Zmijarevic, Igor

    2011-01-01

    We analyze the calculation of flux discontinuity factors (FDFs) for use with piecewise subdomain assembly homogenization. These coefficients depend on the numerical mesh used to compute the diffusion problem. When the mesh has a single degree of freedom on subdomain interfaces the solution is unique and can be computed independently per subdomain. For all other cases we have implemented an iterative calculation for the FDFs. Our numerical results show that there is no solution to this nonlinear problem but that the iterative algorithm converges towards FDFs values that reproduce subdomains reaction rates with a relatively high precision. In our test we have included both the GET and black-box FDFs. (author)

  17. Tensor harmonic analysis on homogenous space

    International Nuclear Information System (INIS)

    Wrobel, G.

    1997-01-01

    The Hilbert space of tensor functions on a homogeneous space with a compact stability group is considered. The functions are decomposed into a sum of tensor plane waves (defined in the text), components of which are transformed by irreducible representations of the appropriate transformation group. The orthogonality relation and the completeness relation for tensor plane waves are found. The decomposition constitutes a unitary transformation, which allows one to obtain the Parseval equality. The Fourier components can be calculated by means of the Fourier transformation, the form of which is given explicitly. (author)

  18. Multifractal spectra in homogeneous shear flow

    Science.gov (United States)

    Deane, A. E.; Keefe, L. R.

    1988-01-01

    Employing numerical simulations of 3-D homogeneous shear flow, the associated multifractal spectra of the energy dissipation, scalar dissipation and vorticity fields were calculated. The results for 128³ simulations of this flow, and those obtained in recent experiments that analyzed 1- and 2-D intersections of atmospheric and laboratory flows, are in some agreement. A two-scale Cantor set model of the energy cascade process, which describes the experimental results from 1-D intersections quite well, describes the 3-D results only marginally.
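
    For readers unfamiliar with the two-scale Cantor set model mentioned above, the following sketch evaluates the f(α) spectrum of a binomial (equal-scale) multiplicative cascade analytically; the weight p is an illustrative choice and the code is ours, not the authors' analysis.

```python
# Hedged sketch: f(alpha) multifractal spectrum of a two-scale (binomial) Cantor
# cascade, the kind of cascade model the abstract compares with the simulations.
import numpy as np

def binomial_cascade_spectrum(p, q):
    """tau(q), alpha(q) and f(alpha) for a binomial measure with weights p, 1-p
    on two subintervals of length 1/2 (equal-scale Cantor cascade)."""
    w = np.array([p, 1.0 - p])
    Z = np.sum(w[None, :] ** q[:, None], axis=1)             # partition sum p^q + (1-p)^q
    tau = -np.log2(Z)                                         # Z(l) ~ l^tau with l = 2^-n
    alpha = -np.sum(w[None, :] ** q[:, None] * np.log2(w)[None, :], axis=1) / Z  # d tau/dq
    f = q * alpha - tau                                       # Legendre transform
    return tau, alpha, f

q = np.linspace(-10, 10, 401)
tau, alpha, f = binomial_cascade_spectrum(p=0.7, q=q)
print(f"alpha range: [{alpha.min():.3f}, {alpha.max():.3f}], max f = {f.max():.3f}")
```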

  19. Development of the fast, simple and fully validated high performance liquid chromatographic method with diode array detector for quantification of testosterone esters in an oil-based injectable dosage form.

    Science.gov (United States)

    Kozlik, Petr; Tircova, Barbora

    2016-11-01

    Counterfeit steroids are available on the black market, ultimately to consumers who believe they are buying a legitimate pharmaceutical item from the labeled company. In many cases, counterfeit steroids can contain lower doses, or some products can be overdosed. This can unwittingly expose users to significant health risks. The mixture of testosterone propionate, phenylpropionate, isocaproate and decanoate in an oil-based injectable dosage form is among the illicit drugs most misused by a variety of athletes. This study developed a new, fast, simple and reliable HPLC method combined with a simple sample preparation step to determine testosterone propionate, phenylpropionate, isocaproate and decanoate in an oil-based injectable dosage form without the use of sophisticated and expensive instrumentation. The developed analytical procedure provides high sample throughput: the LC analysis takes only 6 min and the one-step preparation of the oil matrix takes approximately 10 min, with precision ranging from 1.03 to 3.38% (RSD) and accuracy (relative error %) within ±2.01%. The method was found to be precise, linear, accurate, sensitive, selective and robust for routine screening of commercial pharmaceutical products for counterfeit drugs based on the content of the mentioned testosterone esters in their oil-based injectable dosage form. It was successfully applied to the analysis of nine samples of commercial testosterone mixtures purchased from various sources and will be further used as an effective screening method for the determination of the above testosterone esters in samples confiscated by the Institute of Forensic Science (Slovakia) during illegal trade. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    Science.gov (United States)

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  1. Soft Ultrathin Electronics Innervated Adaptive Fully Soft Robots.

    Science.gov (United States)

    Wang, Chengjun; Sim, Kyoseung; Chen, Jin; Kim, Hojin; Rao, Zhoulyu; Li, Yuhang; Chen, Weiqiu; Song, Jizhou; Verduzco, Rafael; Yu, Cunjiang

    2018-03-01

    Soft robots outperform conventional hard robots through significantly enhanced safety, adaptability, and complex motions. The development of fully soft robots, especially robots made entirely from smart soft materials to mimic soft animals, is still nascent. In addition, to date, existing soft robots cannot adapt themselves to the surrounding environment, i.e., sensing and adaptive motion or response, like animals. Here, fully soft robots innervated with compliant ultrathin sensing and actuating electronics, which can sense the environment and perform soft-bodied crawling adaptively, mimicking an inchworm, are reported. The soft robots are constructed with actuators of open-mesh shaped ultrathin deformable heaters, sensors of single-crystal Si optoelectronic photodetectors, and a thermally responsive artificial muscle of carbon-black-doped liquid-crystal elastomer (LCE-CB) nanocomposite. The results demonstrate that adaptive crawling locomotion can be realized through the conjugation of sensing and actuation, where the sensors sense the environment and the actuators respond correspondingly to control the locomotion autonomously through regulating the deformation of the LCE-CB bimorphs and thereby the locomotion of the robots. The strategy of innervating soft sensing and actuating electronics with artificial muscles paves the way for the development of smart autonomous soft robots. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Trial production and characterisation of fully calcia-stabilised zirconia

    International Nuclear Information System (INIS)

    George, A.M.; Karkhanavala, M.D.

    1980-01-01

    A process for the manufacture of stabilized zirconia powder has been developed. The process is quite versatile since stabilization is achieved at relatively low temperatures (950-1000°C) and it can be used for the manufacture of either fully or partially calcia-stabilized zirconia. A 100 kg trial batch of fully stabilized zirconia powder was produced accordingly at the Indian Rare Earths Ltd. plant and its characteristics were evaluated by XRD, microscopy, surface area and density measurements. The powder on firing at 1400°C showed considerable volume shrinkage, as expected. On manual compaction with a phosphatic binder and firing for 8-10 h at 1300-1400°C, sintered shapes having bulk densities around 80-85% T.D. are easily obtained. Details of the measurements and the prospective industrial applications of the material are discussed. (auth.)

  3. Plane wave interaction with a homogeneous warm plasma sphere

    International Nuclear Information System (INIS)

    Ruppin, R.

    1975-01-01

    A Mie type theory for the scattering and absorption properties of a homogeneous warm plasma sphere is developed. The theory is applied to the calculation of the extinction cross section of plasma spheres, and the effects of Landau damping and collisional damping on the spectra are discussed. The dependence of the main resonance and of the Tonks-Dattner resonances on the physical parameters characterizing the sphere and its surroundings is investigated. The spectrum is shown to be insensitive to the boundary conditions which specify the behaviour of the electrons at the surface of the sphere (author)

  4. Neutron guide geometries for homogeneous phase space volume transformation

    International Nuclear Information System (INIS)

    Stüßer, N.; Bartkowiak, M.; Hofmann, T.

    2014-01-01

    We extend geometries for recently developed optical guide systems that perform homogeneous phase space volume transformations on neutron beams. These modules allow rotating beam directions and can simultaneously compress or expand the beam cross-section. Guide systems combining these modules offer the possibility to optimize ballistic guides with and without direct view on the source and beam splitters. All systems are designed for monochromatic beams with a given divergence. The case of multispectral beams with wavelength-dependent divergence distributions is addressed as well. - Highlights: • Form invariant volume transformation in phase space. • Geometrical approach. • Ballistic guide, beam splitter, beam bender

  5. Neutron guide geometries for homogeneous phase space volume transformation

    Energy Technology Data Exchange (ETDEWEB)

    Stüßer, N., E-mail: stuesser@helmholtz-berlin.de; Bartkowiak, M.; Hofmann, T.

    2014-06-01

    We extend geometries for recently developed optical guide systems that perform homogeneous phase space volume transformations on neutron beams. These modules allow rotating beam directions and can simultaneously compress or expand the beam cross-section. Guide systems combining these modules offer the possibility to optimize ballistic guides with and without direct view on the source and beam splitters. All systems are designed for monochromatic beams with a given divergence. The case of multispectral beams with wavelength-dependent divergence distributions is addressed as well. - Highlights: • Form invariant volume transformation in phase space. • Geometrical approach. • Ballistic guide, beam splitter, beam bender.

  6. Stimulus homogeneity enhances implicit learning: evidence from contextual cueing.

    Science.gov (United States)

    Feldmann-Wüstefeld, Tobias; Schubö, Anna

    2014-04-01

    Visual search for a target object is faster if the target is embedded in a repeatedly presented invariant configuration of distractors ('contextual cueing'). It has also been shown that the homogeneity of a context affects the efficiency of visual search: targets receive prioritized processing when presented in a homogeneous context compared to a heterogeneous context, presumably due to grouping processes at early stages of visual processing. The present study investigated in three experiments whether context homogeneity also affects contextual cueing. In Experiment 1, context homogeneity varied on three levels of the task-relevant dimension (orientation) and contextual cueing was most pronounced for context configurations with high orientation homogeneity. When context homogeneity varied on three levels of the task-irrelevant dimension (color) and orientation homogeneity was fixed, no modulation of contextual cueing was observed: high orientation homogeneity led to large contextual cueing effects (Experiment 2) and low orientation homogeneity led to low contextual cueing effects (Experiment 3), irrespective of color homogeneity. Enhanced contextual cueing for homogeneous context configurations suggests that grouping processes affect not only visual search but also implicit learning. We conclude that memory representations of context configurations are more easily acquired when context configurations can be processed as larger, grouped perceptual units. However, this form of implicit perceptual learning is only improved by stimulus homogeneity when stimulus homogeneity facilitates grouping processes on a dimension that is currently relevant in the task. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Fully 3D GPU PET reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Herraiz, J.L., E-mail: joaquin@nuclear.fis.ucm.es [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Espana, S. [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Cal-Gonzalez, J. [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Vaquero, J.J. [Departmento de Bioingenieria e Ingenieria Espacial, Universidad Carlos III, Madrid (Spain); Desco, M. [Departmento de Bioingenieria e Ingenieria Espacial, Universidad Carlos III, Madrid (Spain); Unidad de Medicina y Cirugia Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Udias, J.M. [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain)

    2011-08-21

    Fully 3D iterative tomographic image reconstruction is computationally very demanding. Graphics Processing Units (GPUs) have been proposed for many years as potential accelerators for complex scientific problems, but it is only with the recent advances in GPU programmability that the best available reconstruction codes have started to be implemented to run on GPUs. This work presents a GPU-based fully 3D PET iterative reconstruction software. This new code may reconstruct sinogram data from several commercially available PET scanners. The most important and time-consuming parts of the code, the forward and backward projection operations, are based on an accurate model of the scanner obtained with the Monte Carlo code PeneloPET and they have been massively parallelized on the GPU. For the PET scanners considered, the GPU-based code is more than 70 times faster than a similar code running on a single core of a fast CPU, obtaining in both cases the same images. The code has been designed to be easily adapted to reconstruct sinograms from any other PET scanner, including scanner prototypes.

  8. Fully 3D GPU PET reconstruction

    International Nuclear Information System (INIS)

    Herraiz, J.L.; Espana, S.; Cal-Gonzalez, J.; Vaquero, J.J.; Desco, M.; Udias, J.M.

    2011-01-01

    Fully 3D iterative tomographic image reconstruction is computationally very demanding. Graphics Processing Units (GPUs) have been proposed for many years as potential accelerators for complex scientific problems, but it is only with the recent advances in GPU programmability that the best available reconstruction codes have started to be implemented to run on GPUs. This work presents a GPU-based fully 3D PET iterative reconstruction software. This new code may reconstruct sinogram data from several commercially available PET scanners. The most important and time-consuming parts of the code, the forward and backward projection operations, are based on an accurate model of the scanner obtained with the Monte Carlo code PeneloPET and they have been massively parallelized on the GPU. For the PET scanners considered, the GPU-based code is more than 70 times faster than a similar code running on a single core of a fast CPU, obtaining in both cases the same images. The code has been designed to be easily adapted to reconstruct sinograms from any other PET scanner, including scanner prototypes.

  9. Fully populated VCM or the hidden parameter

    Directory of Open Access Journals (Sweden)

    Kermarrec G.

    2017-11-01

    Full Text Available Least-squares estimates are trustworthy with minimal variance if the correct stochastic model is used. Due to computational burden, diagonal models that neglect correlations are preferred to describe the elevation dependency of the variance of GPS observations. In this contribution, an improved stochastic model based on a parametric function to take correlations between GPS phase observations into account is presented. Built on an adapted and flexible Matérn function accounting for spatiotemporal variabilities, its parameters can be fixed by Maximum Likelihood Estimation or chosen a priori to model turbulent tropospheric refractivity fluctuations. In this contribution, we will show in which cases and under which conditions the corresponding fully populated variance-covariance matrices (VCM) replace the estimation of a tropospheric parameter. For this equivalence “augmented functional versus augmented stochastic model” to hold, the VCM should be made sufficiently large, which corresponds to computing small batches of observations. A case study with observations from a medium baseline of 80 km divided into batches of 600 s shows improvement of up to 100 mm for the 3D rms when fully populated VCM are used compared with an elevation-dependent diagonal model. It confirms the strong potential of such matrices to improve the least-squares solution, particularly when the ambiguities are left float.
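
    A minimal sketch of the idea of a fully populated VCM built from a Matérn-type correlation function and used in a correlated weighted least-squares adjustment; parameter values, the toy design matrix and variable names are our assumptions and do not reproduce the GPS processing of the paper.

```python
# Hedged sketch: building a fully populated variance-covariance matrix (VCM) from
# a Matérn covariance function and using it in a weighted least-squares solve.
import numpy as np
from scipy.special import kv, gamma

def matern_cov(t, sigma=1.0, nu=1.5, rho=300.0):
    """Matérn covariance between observation epochs t (in seconds)."""
    d = np.abs(t[:, None] - t[None, :])
    C = np.full(d.shape, sigma**2)                 # variance on the diagonal (d = 0)
    nz = d > 0
    s = np.sqrt(2.0 * nu) * d[nz] / rho
    C[nz] = sigma**2 * (2.0**(1.0 - nu) / gamma(nu)) * s**nu * kv(nu, s)
    return C

t = np.arange(0.0, 600.0, 30.0)                    # one 600 s batch sampled every 30 s
Q = matern_cov(t)                                  # fully populated VCM (correlated noise)
A = np.column_stack([np.ones_like(t), t])          # toy design matrix (offset + drift)
rng = np.random.default_rng(1)
y = A @ np.array([0.05, 1e-4]) + np.linalg.cholesky(Q) @ rng.normal(size=t.size)

W = np.linalg.inv(Q)                               # weight matrix = inverse VCM
x_hat = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)  # correlated weighted least squares
print("estimated parameters:", x_hat)
```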

  10. Communication: Fully coherent quantum state hopping

    Energy Technology Data Exchange (ETDEWEB)

    Martens, Craig C., E-mail: cmartens@uci.edu [University of California, Irvine, California 92697-2025 (United States)

    2015-10-14

    In this paper, we describe a new and fully coherent stochastic surface hopping method for simulating mixed quantum-classical systems. We illustrate the approach on the simple but unforgiving problem of quantum evolution of a two-state quantum system in the limit of unperturbed pure state dynamics and for dissipative evolution in the presence of both stationary and nonstationary random environments. We formulate our approach in the Liouville representation and describe the density matrix elements by ensembles of trajectories. Population dynamics are represented by stochastic surface hops for trajectories representing diagonal density matrix elements. These are combined with an unconventional coherent stochastic hopping algorithm for trajectories representing off-diagonal quantum coherences. The latter generalizes the binary (0,1) “probability” of a trajectory to be associated with a given state to allow integers that can be negative or greater than unity in magnitude. Unlike existing surface hopping methods, the dynamics of the ensembles are fully entangled, correctly capturing the coherent and nonlocal structure of quantum mechanics.

  11. Topology of actions and homogeneous spaces

    International Nuclear Information System (INIS)

    Kozlov, Konstantin L

    2013-01-01

    Topologization of a group of homeomorphisms and its action provide additional possibilities for studying the topological space, the group of homeomorphisms, and their interconnections. The subject of the paper is the use of the property of d-openness of an action (introduced by Ancel under the name of weak micro-transitivity) in the study of spaces with various forms of homogeneity. It is proved that a d-open action of a Čech-complete group is open. A characterization of Polish SLH spaces using d-openness is given, and it is established that any separable metrizable SLH space has an SLH completion that is a Polish space. Furthermore, the completion is realized in coordination with the completion of the acting group with respect to the two-sided uniformity. A sufficient condition is given for extension of a d-open action to the completion of the space with respect to the maximal equiuniformity with preservation of d-openness. A result of van Mill is generalized, namely, it is proved that any homogeneous CDH metrizable compactum is the only G-compactification of the space of rational numbers for the action of some Polish group. Bibliography: 39 titles.

  12. TWO FERROMAGNETIC SPHERES IN HOMOGENEOUS MAGNETIC FIELD

    Directory of Open Access Journals (Sweden)

    Yury A. Krasnitsky

    2018-01-01

    Full Text Available The problem of two spherical conductors has been studied in considerable detail using bispherical coordinates and has numerous applications in electrostatics. Here, the boundary-value problem of two ferromagnetic spheres embedded in a homogeneous, infinite medium, in which the field in the absence of the spheres is a uniform magnetic field, is considered. The solution of Laplace's equation in the bispherical coordinate system allows us to find the potential and field distribution in all space, including the region between the spheres. The boundary conditions used are the continuity of the potential and of the normal component of the induction flux at the sphere surfaces. It is supposed that the spheres are identical and that the magnetic permeability of their material is much larger than that of free space (μ >> μ0). The problem of an electromagnetic plane wave incident on the two-sphere system, which is electrically small, can be treated as quasi-stationary. The scalar potentials obtained from the solution of Laplace's equation are represented by series containing Legendre polynomials. The concept of the effective permeability of the two-sphere system is introduced; it equals the gain in the magnitude of the magnetic induction flux through a given cross-section of the system that arises from its magnetic properties. The necessary relations for the effective permeability referred to the central cross-section of the system are obtained. In particular, the results can be used in the analysis of the gap in a ferroxcube core, which influences the properties of a magnetic antenna.

  13. Primary healthcare solo practices: homogeneous or heterogeneous?

    Science.gov (United States)

    Pineault, Raynald; Borgès Da Silva, Roxane; Provost, Sylvie; Beaulieu, Marie-Dominique; Boivin, Antoine; Couture, Audrey; Prud'homme, Alexandre

    2014-01-01

    Introduction. Solo practices have generally been viewed as forming a homogeneous group. However, they may differ on many characteristics. The objective of this paper is to identify different forms of solo practice and to determine the extent to which they are associated with patient experience of care. Methods. Two surveys were carried out in two regions of Quebec in 2010: a telephone survey of 9180 respondents from the general population and a postal survey of 606 primary healthcare (PHC) practices. Data from the two surveys were linked through the respondent's usual source of care. A taxonomy of solo practices was constructed (n = 213), using cluster analysis techniques. Bivariate and multilevel analyses were used to determine the relationship of the taxonomy with patient experience of care. Results. Four models were derived from the taxonomy. Practices in the "resourceful networked" model contrast with those of the "resourceless isolated" model to the extent that the experience of care reported by their patients is more favorable. Conclusion. Solo practice is not a homogeneous group. The four models identified have different organizational features and their patients' experience of care also differs. Some models seem to offer a better organizational potential in the context of current reforms.

  14. Cosmic Ray Hit Detection with Homogenous Structures

    Science.gov (United States)

    Smirnov, O. M.

    Cosmic ray (CR) hits can affect a significant number of pixels both on long-exposure ground-based CCD observations and on Space Telescope frames. Thus, methods of identifying the damaged pixels are an important part of the data preprocessing for practically any application. The paper presents an implementation of a CR hit detection algorithm based on a homogenous structure (also called a cellular automaton), a concept originating in artificial intelligence and discrete mathematics. Each pixel of the image is represented by a small automaton, which interacts with its neighbors and assumes a distinct state if it ``decides'' that a CR hit is present. On test data, the algorithm has shown a high detection rate (~0.7) and a low false alarm rate. A homogenous structure is extremely trainable, which can be very important for processing large batches of data obtained under similar conditions. Training and optimizing issues are discussed, as well as possible other applications of this concept to image processing.
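
    A minimal sketch, in the spirit of the record above, of a cellular-automaton-like CR hit detector in which every pixel inspects its neighbourhood and flags itself over a few update steps; the update rule and all parameter values are our own illustrative assumptions, not the trained automaton of the paper.

```python
# Hedged sketch of a cellular-automaton style cosmic-ray hit detector: every pixel
# is a small automaton that compares itself with its 8 neighbours and switches to
# the "hit" state when it sticks out of the local neighbourhood.
import numpy as np

def neighbour_stack(a):
    """Stack of the 8 (periodically wrapped) neighbours of every pixel."""
    return np.stack([np.roll(np.roll(a, dy, 0), dx, 1)
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if not (dy == 0 and dx == 0)])

def cr_hit_mask(image, k=5.0, n_iter=2):
    """Return a boolean mask of suspected cosmic-ray hits."""
    img = image.astype(float)
    hit = np.zeros(img.shape, dtype=bool)
    for _ in range(n_iter):
        neigh = neighbour_stack(img)
        local_med = np.median(neigh, axis=0)
        local_std = np.std(neigh, axis=0) + 1e-6
        outlier = img > local_med + k * local_std          # strong local outlier
        neigh_hits = neighbour_stack(hit).sum(axis=0)
        # keep strong outliers, and let moderately bright pixels join a hit
        # if at least two of their neighbours are already flagged (propagation)
        hit = outlier | ((img > local_med + 0.5 * k * local_std) & (neigh_hits >= 2))
    return hit

frame = np.random.default_rng(2).normal(100.0, 5.0, (64, 64))
frame[10, 10] += 200.0                      # an isolated cosmic-ray hit
frame[30, 30:33] += 150.0                   # a short CR track
print("flagged pixels:", int(cr_hit_mask(frame).sum()))
```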

  15. Photo-electret effects in homogenous semiconductors

    International Nuclear Information System (INIS)

    Nabiev, G.A.

    2004-01-01

    This work demonstrates the possibility of, and develops the theory for, a photo-electret state in semiconductors with the Dember mechanism of photo-voltage generation. Unlike the traditional type, such a photo-electret can be created without an external field, as a result of illumination alone. The polarizing factor in this case is the difference between the electron and hole mobilities. A multilayered structure is considered, with homogeneous photoactive micro-regions separated by layers that hinder the equalization of the carrier concentrations. It is assumed that the homogeneous photoactive regions contain deep trapping levels. Because the elementary photo-voltages of the separate micro photo-cells add up, an anomalously large photo-voltage is formed (APV effect). Note that the Dember photo-voltage in a separate micro photo-cell is ≤ kT/q. From the derived expressions, in the practically important special case where quasi-equilibrium between the valence band and the trapping levels is established in a time much shorter than the free-hole lifetime, it follows that the photo-voltage relaxes. Comparison of the derived expressions with the decay laws of the photo-voltage in p-n junction structures shows that they are identical; the difference is only in the absolute values of the photo-voltage. During illumination an excess concentration of charge carriers is created in the semiconductor and part of them is captured at the deep levels. After the light is switched off, the carriers located at these levels are gradually released

  16. Irregular Homogeneity Domains in Ternary Intermetallic Systems

    Directory of Open Access Journals (Sweden)

    Jean-Marc Joubert

    2015-12-01

    Full Text Available Ternary intermetallic A–B–C systems sometimes have unexpected behaviors. The present paper examines situations in which there is a tendency to simultaneously form the compounds ABx, ACx and BCx with the same crystal structure. This causes irregular shapes of the phase homogeneity domains and, from a structural point of view, a complete reversal of site occupancies for the B atom when crossing the homogeneity domain. This work reviews previous studies done in the systems Fe–Nb–Zr, Hf–Mo–Re, Hf–Re–W, Mo–Re–Zr, Re–W–Zr, Cr–Mn–Si, Cr–Mo–Re, and Mo–Ni–Re, and involving the topologically close-packed Laves, χ and σ phases. These systems have been studied using ternary isothermal section determination, DFT calculations, and site occupancy measurement using joint X-ray and neutron diffraction Rietveld refinement. Conclusions are drawn concerning this phenomenon. The paper also reports new experimental or calculated data on the Co–Cr–Re and Fe–Nb–Zr systems.

  17. WHAMP - waves in homogeneous, anisotropic, multicomponent plasmas

    International Nuclear Information System (INIS)

    Roennmark, K.

    1982-06-01

    In this report, a computer program which solves the dispersion relation of waves in a magnetized plasma is described. The dielectric tensor is derived using the kinetic theory of homogeneous plasmas with Maxwellian velocity distribution. Up to six different plasma components can be included in this version of the program, and each component is specified by its density, temperature, particle mass, anisotropy and drift velocity along the magnetic field. The program is thus applicable to a very wide class of plasmas, and the method should in general be useful whenever a homogeneous magnetized plasma can be approximated by a linear combination of Maxwellian components. The general theory underlying the program is outlined. It is shown that by introducing a Pade approximant for the plasma dispersion function Z, the infinite sums of modified Bessel functions which appear in the dielectric tensor may be reduced to a summable form. The Pade approximant is derived and the accuracy of the approximation is also discussed. The subroutines making up the program are described. (Author)
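
    For orientation, the plasma dispersion function Z that WHAMP approximates with a Padé approximant can be evaluated exactly through the Faddeeva function; the sketch below does only that and does not reproduce WHAMP's Padé fit or the Bessel-function summation.

```python
# Hedged illustration of the plasma dispersion function Z used in WHAMP-type
# dispersion solvers. Here Z is evaluated exactly via the Faddeeva function w(z);
# WHAMP itself replaces Z by a Pade approximant so that the infinite sums of
# modified Bessel functions in the dielectric tensor become summable.
import numpy as np
from scipy.special import wofz

def plasma_dispersion_Z(zeta):
    """Z(zeta) = i*sqrt(pi)*w(zeta); w is entire, so this is the analytic continuation."""
    return 1j * np.sqrt(np.pi) * wofz(zeta)

def plasma_dispersion_Zprime(zeta):
    """Z'(zeta) = -2*(1 + zeta*Z(zeta)), from the defining differential equation."""
    return -2.0 * (1.0 + zeta * plasma_dispersion_Z(zeta))

zeta = 1.5 + 0.1j
print("Z    =", plasma_dispersion_Z(zeta))
print("Z'   =", plasma_dispersion_Zprime(zeta))
# small-argument check: Z(0) = i*sqrt(pi)
print("Z(0) =", plasma_dispersion_Z(0.0), "expected", 1j * np.sqrt(np.pi))
```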

  18. Homogenization of food samples for gamma spectrometry using tetramethylammonium hydroxide and enzymatic digestion

    International Nuclear Information System (INIS)

    Kimi Nishikawa; Abdul Bari; Abdul Jabbar Khan; Xin Li; Traci Menia; Semkow, T.M.

    2017-01-01

    We have developed a method of food sample preparation for gamma spectrometry involving the use of tetramethylammonium hydroxide (TMAH) and/or enzymes such as α-amylase or cellulase for sample homogenization. We demonstrated the effectiveness of this method using food matrices spiked with 60Co, 131I, 134,137Cs, and 241Am radionuclides, homogenized with TMAH (mixed salad, parmesan cheese, and ground beef); enzymes (α-amylase for bread, and cellulase for baked beans); or α-amylase followed by TMAH (cheeseburgers). Procedures were developed which are best compromises between the degree of homogenization, accuracy, speed, and minimizing laboratory equipment contamination. Based on calculated sample biases and z-scores, our results suggest that homogenization using TMAH and enzymes would be a useful method of sample preparation for gamma spectrometry samples during radiological emergencies. (author)

  19. Nonlinear vibration of a traveling belt with non-homogeneous boundaries

    Science.gov (United States)

    Ding, Hu; Lim, C. W.; Chen, Li-Qun

    2018-06-01

    Free and forced nonlinear vibrations of a traveling belt with non-homogeneous boundary conditions are studied. The axially moving materials in operation are always externally excited and produce strong vibrations. The moving materials with the homogeneous boundary condition are usually considered. In this paper, the non-homogeneous boundaries are introduced by the support wheels. Equilibrium deformation of the belt is produced by the non-homogeneous boundaries. In order to solve the equilibrium deformation, the differential and integral quadrature methods (DIQMs) are utilized to develop an iterative scheme. The influence of the equilibrium deformation on free and forced nonlinear vibrations of the belt is explored. The DIQMs are applied to solve the natural frequencies and forced resonance responses of transverse vibration around the equilibrium deformation. The Galerkin truncation method (GTM) is utilized to confirm the DIQMs' results. The numerical results demonstrate that the non-homogeneous boundary conditions cause the transverse vibration to deviate from the straight equilibrium, increase the natural frequencies, and lead to coexistence of square nonlinear terms and cubic nonlinear terms. Moreover, the influence of non-homogeneous boundaries can be exacerbated by the axial speed. Therefore, non-homogeneous boundary conditions of axially moving materials especially should be taken into account.

  20. The fully integrated biomedical engineering programme at Eindhoven University of Technology

    NARCIS (Netherlands)

    Slaaf, D.W.; Genderen, van M.H.P.

    2009-01-01

    The development of a fully integrated biomedical engineering programme (life sciences included from the start) is described. Details are provided about background, implementation, and didactic concept: design centred learning combined with courses. The curriculum has developed into a

  1. Hydrogen Production by Homogeneous Catalysis: Alcohol Acceptorless Dehydrogenation

    DEFF Research Database (Denmark)

    Nielsen, Martin

    2015-01-01

    in hydrogen production from biomass using homogeneous catalysis. Homogeneous catalysis has the advantage of generally performing transformations under much milder conditions than traditional heterogeneous catalysis, and hence it constitutes a promising tool for future applications for a sustainable energy sector...

  2. Radiotracer investigation of cement raw meal homogenizers. Pt. 2

    International Nuclear Information System (INIS)

    Baranyai, L.

    1983-01-01

    Based on radioisotopic tracer technique a method has been worked out to study the homogenization and segregation processes of cement-industrial raw meal homogenizers. On-site measurements were carried out by this method in some Hungarian cement works to determine the optimal homogenization parameters of operating homogenizers. The motion and distribution of different raw meal fractions traced with 198Au radioisotope was studied in homogenization processes proceeding with different parameters. In the first part of the publication the change of charge homogeneity in time was discussed, which had been measured as the resultant of mixing and separating processes. In the second part the parameters and types of homogenizers influencing the efficiency of homogenization have been detailed. (orig.) [de

  3. Radiotracer investigation of cement raw meal homogenizers. Pt. 2

    Energy Technology Data Exchange (ETDEWEB)

    Baranyai, L

    1983-12-01

    Based on radioisotopic tracer technique a method has been worked out to study the homogenization and segregation processes of cement-industrial raw meal homogenizers. On-site measurements were carried out by this method in some Hungarian cement works to determine the optimal homogenization parameters of operating homogenizers. The motion and distribution of different raw meal fractions traced with 198Au radioisotope was studied in homogenization processes proceeding with different parameters. In the first part of the publication the change of charge homogeneity in time was discussed, which had been measured as the resultant of mixing and separating processes. In the second part the parameters and types of homogenizers influencing the efficiency of homogenization have been detailed.

  4. Systematic assembly homogenization and local flux reconstruction for nodal method calculations of fast reactor power distributions

    International Nuclear Information System (INIS)

    Dorning, J.J.

    1991-01-01

    A simultaneous pin lattice cell and fuel bundle homogenization theory has been developed for use with nodal diffusion calculations of practical reactors. The theoretical development of the homogenization theory, which is based on multiple-scales asymptotic expansion methods carried out through fourth order in a small parameter, starts from the transport equation and systematically yields: a cell-homogenized bundle diffusion equation with self-consistent expressions for the cell-homogenized cross sections and diffusion tensor elements; and a bundle-homogenized global reactor diffusion equation with self-consistent expressions for the bundle-homogenized cross sections and diffusion tensor elements. The continuity of the angular flux at cell and bundle interfaces also systematically yields jump conditions for the scalar flux, or so-called flux discontinuity factors, on the cell and bundle interfaces in terms of the two adjacent cell or bundle eigenfunctions. The expressions required for the reconstruction of the angular flux or the 'de-homogenization' theory were obtained as an integral part of the development; hence the leading order transport theory angular flux is easily reconstructed throughout the reactor, including the regions in the interior of the fuel bundles or computational nodes and in the interiors of the pin lattice cells. The theoretical development shows that the exact transport theory angular flux, obtained to first order from the whole-reactor nodal diffusion calculations done using the homogenized nuclear data and discontinuity factors, is a product of three computed quantities: a ''cell shape function''; a ''bundle shape function''; and a ''global shape function''. 10 refs
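
    In schematic notation (ours, not the paper's), the reconstruction described above writes the leading-order transport angular flux as a product of the three computed shape functions:

$$
\psi^{(0)}(\mathbf{r},\boldsymbol{\Omega}) \;\approx\; f_{\mathrm{cell}}(\mathbf{r},\boldsymbol{\Omega})\, f_{\mathrm{bundle}}(\mathbf{r},\boldsymbol{\Omega})\, \phi_{\mathrm{global}}(\mathbf{r}),
$$

    where f_cell and f_bundle are the cell and bundle shape functions from the homogenization step and phi_global is the global shape from the nodal diffusion solution; which arguments each factor actually carries is our simplifying assumption, not a statement taken from the abstract.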

  5. Fully Printed Flexible and Stretchable Electronics

    Science.gov (United States)

    Zhang, Suoming

    Through this thesis proposal, the author has demonstrated a series of flexible or stretchable sensors including strain gauges, pressure sensors, display arrays, thin film transistors and photodetectors fabricated by a direct printing process. By adopting the novel serpentine configuration with the conventional non-stretchable material silver nanoparticles, fully printed stretchable devices are successfully fabricated on an elastomeric substrate, with the demonstration of stretchable conductors that maintain their electrical properties under strain and a strain gauge that could be used to measure strain in desired locations and also to monitor an individual person's finger motion. By investigating the intrinsically stretchable material silver nanowires (AgNWs) with the conventional configuration, fully printed stretchable conductors are achieved on various substrates including Si, glass, polyimide, polydimethylsiloxane (PDMS) and Very High Bond (VHB) tape, with the illustration of a capacitive pressure sensor and stretchable electroluminescent displays. In addition, intrinsically stretchable thin-film transistors (TFTs) and integrated logic circuits are directly printed on elastomeric PDMS substrates. The printed devices utilize carbon nanotubes and a type of hybrid gate dielectric comprising PDMS and barium titanate (BaTiO3) nanoparticles. The BaTiO3/PDMS composite simultaneously provides high dielectric constant, superior stretchability, low leakage, as well as good printability and compatibility with the elastomeric substrate. Both TFTs and logic circuits can be stretched beyond 50% strain along either the channel length or channel width direction for thousands of cycles while showing no significant degradation in electrical performance. Finally, by applying SWNTs as the channel layer of the thin film transistor, we successfully fabricate the fully printed flexible photodetector, which exhibits good electrical characteristics and the transistors exhibit

  6. Layered Fiberconcrete with Non-Homogeneous Fibers Distribution

    OpenAIRE

    Lūsis, V; Krasņikovs, A

    2013-01-01

    The aim of the present research is to create a fiberconcrete structure with a non-homogeneous fiber distribution. Traditionally, fibers are dispersed homogeneously in the concrete. At the same time, in many situations fiberconcretes with homogeneously dispersed fibers are not optimal (the majority of the added fibers do not participate in the load-bearing process).

  7. Non-homogeneous dynamic Bayesian networks for continuous data

    NARCIS (Netherlands)

    Grzegorczyk, Marco; Husmeier, Dirk

    Classical dynamic Bayesian networks (DBNs) are based on the homogeneous Markov assumption and cannot deal with non-homogeneous temporal processes. Various approaches to relax the homogeneity assumption have recently been proposed. The present paper presents a combination of a Bayesian network with

  8. A Fully Automated Penumbra Segmentation Tool

    DEFF Research Database (Denmark)

    Nagenthiraja, Kartheeban; Ribe, Lars Riisgaard; Hougaard, Kristina Dupont

    2012-01-01

    Introduction: Perfusion- and diffusion-weighted MRI (PWI/DWI) is widely used to select patients who are likely to benefit from recanalization therapy. The visual identification of PWI-DWI-mismatch tissue depends strongly on the observer, prompting a need for software which estimates potentially ... salvageable tissue quickly and accurately. We present a fully Automated Penumbra Segmentation (APS) algorithm using PWI and DWI images. We compare the automatically generated PWI-DWI mismatch mask to masks outlined manually by experts, in 168 patients. Method: The algorithm initially identifies PWI lesions ...) at 600·10⁻⁶ mm²/s. Due to the nature of thresholding, the ADC mask overestimates the DWI lesion volume and consequently we initialized a level-set algorithm on the DWI image with the ADC mask as prior knowledge. Combining the PWI and inverted DWI mask then yields the PWI-DWI mismatch mask. Four expert raters...
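
    A minimal numpy sketch of the mismatch logic described above (threshold the ADC map, invert the resulting core mask, intersect with the perfusion-lesion mask); the threshold, array shapes and names are illustrative, and the level-set refinement step of the APS tool is not reproduced.

```python
# Hedged sketch of the PWI-DWI mismatch idea: an ADC threshold gives an
# (over-inclusive) diffusion core mask, which is inverted and combined with
# the perfusion lesion mask to yield the mismatch (penumbra) mask.
import numpy as np

def mismatch_mask(pwi_lesion, adc_map, adc_threshold=600e-6):
    """Boolean penumbra (mismatch) mask from a PWI lesion mask and an ADC map [mm^2/s]."""
    dwi_core = adc_map < adc_threshold          # ADC below threshold ~ diffusion lesion
    return pwi_lesion & ~dwi_core               # perfusion lesion minus the core

rng = np.random.default_rng(3)
adc = rng.normal(800e-6, 100e-6, (128, 128))    # synthetic ADC map
adc[40:60, 40:60] = 450e-6                      # synthetic diffusion core
pwi = np.zeros((128, 128), dtype=bool)
pwi[30:80, 30:80] = True                        # larger synthetic perfusion lesion
print("mismatch voxels:", int(mismatch_mask(pwi, adc).sum()))
```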

  9. How the Spectre of Societal Homogeneity Undermines Equitable Healthcare for Refugees

    Science.gov (United States)

    Razum, Oliver; Wenner, Judith; Bozorgmehr, Kayvan

    2017-01-01

    Recourse to a purported ideal of societal homogeneity has become common in the context of the refugee reception crisis – not only in Japan, as Leppold et al report, but also throughout Europe. Calls for societal homogeneity in Europe originate from populist movements as well as from some governments. Often, they go along with reduced social support for refugees and asylum seekers, for example in healthcare provision. The fundamental right to health is then reduced to a citizens’ right, granted fully only to nationals. Germany, in spite of welcoming many refugees in 2015, is a case in point: entitlement and access to healthcare for asylum seekers are restricted during the first 15 months of their stay. We show that arguments brought forward to defend such restrictions do not hold, particularly not those which relate to maintaining societal homogeneity. European societies are not homogeneous, irrespective of migration. But as migration will continue, societies need to invest in what we call "globalization within." Removing entitlement restrictions and access barriers to healthcare for refugees and asylum seekers is one important element thereof. PMID:28812828

  10. Fully implicit 1D radiation hydrodynamics: Validation and verification

    International Nuclear Information System (INIS)

    Ghosh, Karabi; Menon, S.V.G.

    2010-01-01

    A fully implicit finite difference scheme has been developed to solve the hydrodynamic equations coupled with radiation transport. The solution of the time-dependent radiation transport equation is obtained using the discrete ordinates method, and the energy flow into the Lagrangian meshes as a result of radiation interaction is fully accounted for. A tridiagonal matrix system is solved at each time step to determine the hydrodynamic variables implicitly. The results obtained from this fully implicit radiation hydrodynamics code in planar geometry agree well with the scaling law for radiation-driven strong shock propagation in aluminium. For the point explosion problem the self-similar solutions are compared with results for the pure hydrodynamic case in spherical geometry. Results obtained when radiation interaction is also accounted for agree with those of a point explosion with heat conduction at lower input energies. Having thus benchmarked the code, self-convergence of the method with respect to the time step is studied in detail for both the planar and spherical problems. Spatial as well as temporal convergence rates are ≅1, as expected from the difference forms of the mass, momentum and energy conservation equations. This shows that the asymptotic convergence rate of the code is realized properly.
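
    The tridiagonal solve performed at every implicit time step can be carried out with the Thomas algorithm; the sketch below shows a generic version with placeholder coefficients, not the actual radiation-hydrodynamics discretization of the paper.

```python
# Hedged sketch of the kind of tridiagonal solve done at every implicit time step
# (Thomas algorithm). Coefficient arrays here are generic placeholders.
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b, super-diagonal c, rhs d.
    a[0] and c[-1] are unused."""
    n = len(d)
    cp = np.empty(n); dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                        # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):               # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# quick check against a dense solve on a small implicit-diffusion-like system
n = 6
a = np.full(n, -1.0); b = np.full(n, 2.5); c = np.full(n, -1.0)
d = np.arange(1.0, n + 1.0)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(thomas_solve(a, b, c, d), np.linalg.solve(A, d)))  # -> True
```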

  11. Fully integrated digital GAMMA camera-computer system

    International Nuclear Information System (INIS)

    Berger, H.J.; Eisner, R.L.; Gober, A.; Plankey, M.; Fajman, W.

    1985-01-01

    Although most of the new non-nuclear imaging techniques are fully digital, there has been a reluctance in nuclear medicine to abandon traditional analog planar imaging in favor of digital acquisition and display. The authors evaluated a prototype digital camera system (GE STARCAM) in which all of the analog acquisition components are replaced by microprocessor controls and digital circuitry. To compare the relative effects of acquisition matrix size on image quality and to ascertain whether digital techniques could be used in place of analog imaging, Tc-99m bone scans were obtained on this digital system and on a comparable analog camera in 10 patients. The dedicated computer is used for camera setup including definition of the energy window, spatial energy correction, and spatial distortion correction. The display monitor, which is used for patient positioning and image analysis, is 512² non-interlaced, allowing high resolution imaging. Data acquisition and processing can be performed simultaneously. Thus, the development of a fully integrated digital camera-computer system with optimized display should allow routine utilization of non-analog studies in nuclear medicine and the ultimate establishment of fully digital nuclear imaging laboratories

  12. Lidar Cloud Detection with Fully Convolutional Networks

    Science.gov (United States)

    Cromwell, E.; Flynn, D.

    2017-12-01

    The vertical distribution of clouds from active remote sensing instrumentation is a widely used data product from global atmospheric measuring sites. The presence of clouds can be expressed as a binary cloud mask and is a primary input for climate modeling efforts and cloud formation studies. Current cloud detection algorithms producing these masks do not accurately identify the cloud boundaries and tend to oversample or over-represent the cloud. This translates into uncertainty for assessing the radiative impact of clouds and tracking changes in cloud climatologies. The Atmospheric Radiation Measurement (ARM) program has over 20 years of micro-pulse lidar (MPL) and High Spectral Resolution Lidar (HSRL) instrument data and companion automated cloud mask products at the mid-latitude Southern Great Plains (SGP) and the polar North Slope of Alaska (NSA) atmospheric observatories. Using this data, we train a fully convolutional network (FCN) with semi-supervised learning to segment lidar imagery into geometric time-height cloud locations for the SGP site and MPL instrument. We then use transfer learning to train a FCN for (1) the MPL instrument at the NSA site and (2) the HSRL. In our semi-supervised approach, we pre-train the classification layers of the FCN with weakly labeled lidar data. Then, we facilitate end-to-end unsupervised pre-training and transition to fully supervised learning with ground truth labeled data. Our goal is to improve the cloud mask accuracy and precision for the MPL instrument to 95% and 80%, respectively, compared to 89% and 50% for the current cloud mask algorithms. For the transfer-learning-based FCN for the HSRL instrument, our goal is to achieve a cloud mask accuracy of 90% and a precision of 80%.
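
    A minimal sketch of a fully convolutional network for per-pixel cloud masking of lidar time-height images, assuming PyTorch; the layer sizes, channel counts and training snippet are illustrative and are not the architecture or framework reported by the authors.

```python
# Hedged sketch of a tiny FCN that maps a lidar time-height patch to per-pixel
# cloud logits; a real cloud-mask model would be deeper and trained on ARM data.
import torch
import torch.nn as nn

class TinyCloudFCN(nn.Module):
    def __init__(self, in_ch=1):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),   # downsample x2
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),     # upsample back
            nn.Conv2d(16, 1, 1),                                    # per-pixel cloud logit
        )

    def forward(self, x):                          # x: (batch, 1, time, height)
        return self.decoder(self.encoder(x))

model = TinyCloudFCN()
backscatter = torch.randn(2, 1, 64, 128)           # fake lidar time-height patches
logits = model(backscatter)
target = torch.randint(0, 2, logits.shape).float() # fake binary cloud mask labels
loss = nn.BCEWithLogitsLoss()(logits, target)
print(logits.shape, float(loss))
```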

  13. Microstructural evolution during the homogenization heat treatment of 6XXX and 7XXX aluminum alloys

    Science.gov (United States)

    Priya, Pikee

    Homogenization heat treatment of as-cast billets is an important step in the processing of aluminum extrusions. Microstructural evolution during homogenization involves elimination of the eutectic morphology by spheroidisation of the interdendritic phases, minimization of the microsegregation across the grains through diffusion, dissolution of the low-melting phases, which enhances the surface finish of the extrusions, and precipitation of nano-sized dispersoids (for Cr-, Zr-, Mn-, Sc-containing alloys), which inhibit grain boundary motion to prevent recrystallization. Post-homogenization cooling reprecipitates some of the phases, changing the flow stress required for subsequent extrusion. These precipitates, however, are deleterious for the mechanical properties of the alloy and also hamper the age-hardenability, and are hence dissolved during solution heat treatment. Microstructural development during homogenization and subsequent cooling occurs both at the length scale of the Secondary Dendrite Arm Spacing (SDAS), in micrometers, and at that of the dispersoids, in nanometers. Numerical tools to simulate microstructural development at both length scales have been developed and validated against experiments. These tools provide easy and convenient means to study the process. A Cellular Automaton-Finite Volume-based model for the evolution of interdendritic phases is coupled with a Particle Size Distribution-based model for the precipitation of dispersoids across the grain. This comprehensive model has been used to study the effect of temperature, composition, as-cast microstructure, and cooling rates during post-homogenization quenching on microstructural evolution. The numerical study has been complemented with experiments involving Scanning Electron Microscopy, Energy Dispersive Spectroscopy, X-Ray Diffraction and Differential Scanning Calorimetry, and good agreement with the numerical results has been found. The current work aims to study the microstructural evolution during

  14. The structure and homogeneity of Psalm 32

    Directory of Open Access Journals (Sweden)

    J. Henk Potgieter

    2014-11-01

    Full Text Available Psalm 32 is widely regarded as a psalm of thanksgiving with elements of wisdom poetry intermingled into it. The wisdom elements are variously explained as having been present from the beginning, or as having been added to a foundational composition. Such views of the Gattung have had a decisive influence on the interpretation of the psalm. This article argues, on the basis of a structural analysis, that Psalm 32 should be understood as a homogeneous wisdom composition. The parallel and inverse structure of its two stanzas demonstrate that the aim of its author was to encourage the upright to foster an open, intimate relationship with Yahweh in which transgressions are confessed and Yahweh’s benevolent guidance on the way of life is wisely accepted.

  15. Precipitation of plutonium oxalate from homogeneous solutions

    International Nuclear Information System (INIS)

    Rao, V.K.; Pius, I.C.; Subbarao, M.; Chinnusamy, A.; Natarajan, P.R.

    1986-01-01

    A method for the precipitation of plutonium(IV) oxalate from homogeneous solutions using diethyl oxalate is reported. The precipitate obtained is crystalline and easily filterable with yields in the range of 92-98% for precipitations involving a few mg to g quantities of plutonium. Decontamination factors for common impurities such as U(VI), Am(III) and Fe(III) were determined. TGA and chemical analysis of the compound indicate its composition as Pu(C2O4)2·6H2O. Data are obtained on the solubility of the oxalate in nitric acid and in mixtures of nitric acid and oxalic acid of varying concentrations. Green PuO2 obtained by calcination of the oxalate has specifications within the recommended values for trace foreign substances such as chlorine, fluorine, carbon and nitrogen. (author)

  16. Modelling of an homogeneous equilibrium mixture model

    International Nuclear Information System (INIS)

    Bernard-Champmartin, A.; Poujade, O.; Mathiaud, J.; Mathiaud, J.; Ghidaglia, J.M.

    2014-01-01

    We present here a model for two-phase flows which is simpler than the 6-equation models (with two densities, two velocities, two temperatures) but more accurate than the standard mixture models with 4 equations (with two densities, one velocity and one temperature). We are interested in the case when the two phases have been interacting long enough for the drag force to be small but still not negligible. The so-called Homogeneous Equilibrium Mixture Model (HEM) that we present deals with both mixture and relative quantities, allowing in particular to follow both a mixture velocity and a relative velocity. This relative velocity is not tracked by a conservation law but by a closure law (drift relation), whose expression is related to the drag force terms of the two-phase flow. After the derivation of the model, a stability analysis and numerical experiments are presented. (authors)

  17. Development of a fully injectable calcium phosphate cement for ...

    Indian Academy of Sciences (India)

    Unknown

    2003-01-27

    Jan 27, 2003 ... excellent alloplastic material for osseous augmentation because of the ... and basic calcium phosphate compounds on wetting with an aqueous ... ment of acute fracture of the radius through percutaneous administration of ...

  18. Development of Fully-Integrated Micromagnetic Actuator Technologies

    Science.gov (United States)

    2015-07-13

    Electromechanical actuation schemes are ubiquitous in macroscale systems such as audio speakers, relays, solenoids, and electrical motors. However, implementation of these transduction schemes at the microscale is nearly... the performance of electrodynamic vibrational energy harvesters, Smart Materials and Structures, (02 2013): 0. doi: 10.1088/0964-1726/22/2/025005

  19. Two-scale analysis of intermittency in fully developed turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Badii, R; Talkner, P [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1999-08-01

    A self-affinity test for turbulent time series is applied to experimental data for the estimation of intermittency exponents. The method employs exact relations satisfied by joint expectations of observables computed across two different length scales. One of these constitutes a verification tool for the existence and the extent of the inertial range. (author) 2 figs., 13 refs.

  20. Fully developed magnetohydrodynamic flows in rectangular ducts with insulating walls

    International Nuclear Information System (INIS)

    Molokov, S.; Kernforschungszentrum Karlsruhe GmbH; Shishko, A.

    1993-10-01

    In the first part the effect of magnetic field inclination on the flow structure and the pressure drop is considered. The duct walls are insulating. An asymptotic solution to the problem at high Hartmann numbers is obtained. The results show that for a square duct the increase of the pressure gradient due to the field inclination is negligible (less than 10% for any angle). For blanket-relevant values of inclination of up to 10° the deviation of the velocity profile from the slug profile is insignificant. The second part studies the flow in a duct with insulating walls parallel to the magnetic field, while the Hartmann walls are covered by an insulating coating. A new type of boundary condition is derived, which takes into account the finite coating resistance. The effect of the latter on the flow characteristics is studied. An exact solution to the problem is obtained and several approximate formulas for the pressure drop at high Hartmann numbers are presented. (orig./HP) [de

  1. Behaviour of organised disturbances in fully developed turbulent ...

    Indian Academy of Sciences (India)

    Here we wish to study the evolution of organised disturbances in turbulent channel .... Equation (12) may also be written in inner variables using the friction velocity vτ .... This is in qualitative agreement with the other two curves but, as is to be.

  2. Homogeneous wave turbulence driven by tidal flows

    Science.gov (United States)

    Favier, B.; Le Reun, T.; Barker, A.; Le Bars, M.

    2017-12-01

    When a moon orbits around a planet, the rotation of the induced tidal bulge drives a homogeneous, periodic, large-scale flow. The combination of such an excitation with the rotating motion of the planet has been shown to drive parametric resonance of a pair of inertial waves in a mechanism called the elliptical instability. Geophysical fluid layers can also be stratified: this is the case for instance of the Earth's oceans and, as suggested by several studies, of the upper part of the Earth's liquid Outer Core. We thus investigate the stability of a rotating and stratified layer undergoing tidal distortion in the limit where either rotation or stratification is dominant. We show that the periodic tidal flow drives a parametric subharmonic resonance of inertial (resp. internal) waves in the rotating (resp. stratified) case. The instability saturates into a wave turbulence pervading the whole fluid layer. In such a state, the instability mechanism conveys the tidal energy from the large scale tidal flow to the resonant modes, which then feed a succession of triadic resonances also generating small spatial scales. In the rotating case, we observe a kinetic energy spectrum with a k-2 slope for which the Coriolis force is dominant at all spatial scales. In the stratified case, where the timescale separation is increased between the tidal excitation and the Brunt-Väisälä frequencies, the temporal spectrum decays with a ω-2 power law up to the cut-off frequency beyond which waves do not exist. This result is reminiscent of the Garrett and Munk spectrum measured in the oceans and theoretically described as a manifestation of internal wave turbulence. In addition to revealing an instability driving homogeneous turbulence in geophysical fluid layers, our approach is also an efficient numerical tool to investigate the possibly universal properties of wave turbulence in a geophysical context.

  3. Conformally compactified homogeneous spaces (Possible Observable Consequences)

    International Nuclear Information System (INIS)

    Budinich, P.

    1995-01-01

    Some arguments based on the possible spontaneous violation of the Cosmological Principles (represented by the observed large-scale structures of galaxies), the Cartan-geometry of simple spinors and on the Fock-formulation of hydrogen-atom wave-equation in momentum-space, are presented in favour of the hypothesis that space-time and momentum-space should be both conformally compactified and represented by the two four-dimensional homogeneous spaces of the conformal group, both isomorphic to (S³ × S¹)/Z₂ and correlated by conformal inversion. Within this framework, the possible common origin for the SO(4) symmetry underlying the geometrical structure of the Universe, of Kepler orbits and of the H-atom is discussed. One of the consequences of the proposed hypothesis could be that any quantum field theory should be naturally free from both infrared and ultraviolet divergences. But then physical spaces, defined as those where physical phenomena may be best described, could be different from those homogeneous spaces. A simple, exactly soluble, toy model, valid for a two-dimensional space-time, is presented where the conjectured conformally compactified space-time and momentum-space are both isomorphic to (S¹ × S¹)/Z₂, while the physical spaces are two finite lattices, which are dual since Fourier transforms, represented by finite discrete sums, may be well defined on them. Furthermore, a q-deformed SU_q(1,1) may be represented on them if q is a root of unity. (author). 22 refs, 3 figs

  4. Value distribution of meromorphic solutions of homogeneous and non-homogeneous complex linear differential-difference equations

    Directory of Open Access Journals (Sweden)

    Luo Li-Qin

    2016-01-01

    Full Text Available In this paper, we investigate the value distribution of meromorphic solutions of homogeneous and non-homogeneous complex linear differential-difference equations, and obtain the results on the relations between the order of the solutions and the convergence exponents of the zeros, poles, a-points and small function value points of the solutions, which show the relations in the case of non-homogeneous equations are sharper than the ones in the case of homogeneous equations.

  5. Technical Note: Homogeneity of Gafchromic EBT2 film

    International Nuclear Information System (INIS)

    Hartmann, Bernadette; Martisikova, Maria; Jaekel, Oliver

    2010-01-01

    Purpose: The self-developing Gafchromic EBT film is a radiochromic film, widely used for relative photon dosimetry. Recently, the manufacturer has replaced the well-investigated EBT film by the new Gafchromic EBT2 film. It has the same sensitive component and, in addition, it contains a yellow marker dye in order to protect the film against ambient light exposure and to serve as a base for corrections of small differences in film response. Furthermore, the configuration of the film layers as well as the binder material have been changed in comparison to the EBT film. When investigating the properties of EBT2 film, all characteristics were found to be similar to those of EBT film, except for the film response homogeneity. Thus, in this article special focus was put on examining the homogeneity of EBT2 film. Methods: A scan protocol established for EBT film and published previously was used. The uniformity of the film coloration was investigated for unirradiated and irradiated EBT2 film sheets. The dose response of EBT2 film was measured and the influence of film inhomogeneities on dose determination was evaluated. Results: Inhomogeneities in pixel values of up to ±3.7% within one film were detected. The relative inhomogeneities were found to be approximately independent of the dose. Nonuniformities of the film response lead to uncertainties in dose determination of ±8.7% at 1 Gy. When using net optical densities for dose calibration, uncertainties in dose determination amount to more than ±6%. Conclusions: EBT2 films from the lot investigated in this study show response inhomogeneities, which lead to uncertainties in dose determination exceeding the commonly accepted tolerance levels. It is important to test further EBT2 lots regarding homogeneity before using the film in clinical routine.
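
    As a rough illustration of how such film inhomogeneities propagate into dose, the sketch below converts scanner pixel values into net optical density (netOD) and shows that a ±3.7% pixel-value spread already produces a larger relative spread in netOD; the pixel values and exposure level in the example are assumptions for illustration only and are not taken from the study.

```python
# Minimal sketch (not the authors' protocol): map a +/-3.7% pixel-value
# inhomogeneity to the relative spread in net optical density, which is the
# quantity that enters the dose calibration. All pixel values are illustrative.
import numpy as np

def net_od(pv_unexposed, pv_exposed):
    """Net optical density from scanner pixel values (single colour channel)."""
    return np.log10(pv_unexposed / pv_exposed)

pv0 = 40000.0                                           # unexposed pixel value (assumed 16-bit scan)
pv_mean = 30000.0                                       # mean exposed pixel value (assumed, ~1 Gy level)
pv = pv_mean * np.array([1 - 0.037, 1.0, 1 + 0.037])    # +/-3.7% film inhomogeneity

nod = net_od(pv0, pv)
rel_spread = (nod[0] - nod[2]) / (2 * nod[1])
print(f"netOD = {nod[1]:.4f}, relative spread ~ +/-{100 * rel_spread:.1f}%")
# A calibration curve dose(netOD) then converts this spread into a dose
# uncertainty; with a super-linear calibration the resulting dose uncertainty
# exceeds the raw pixel-value inhomogeneity, consistent with the trend above.
```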

  6. Comparison of different homogenization approaches for elastic–viscoplastic materials

    International Nuclear Information System (INIS)

    Mercier, S; Molinari, A; Berbenni, S; Berveiller, M

    2012-01-01

    Homogenization of linear viscoelastic and non-linear viscoplastic composite materials is considered in this paper. First, we compare two homogenization schemes based on the Mori–Tanaka method coupled with the additive interaction (AI) law proposed by Molinari et al (1997 Mech. Mater. 26 43–62) or coupled with a concentration law based on translated fields (TF) originally proposed for the self-consistent scheme by Paquin et al (1999 Arch. Appl. Mech. 69 14–35). These methods are also evaluated against (i) full-field calculations of the literature based on the finite element method and on fast Fourier transform, (ii) available analytical exact solutions obtained in linear viscoelasticity and (iii) homogenization methods based on variational approaches. Developments of the AI model are obtained for linear and non-linear material responses while results for the TF method are shown for the linear case. Various configurations are considered: spherical inclusions, aligned fibers, hard and soft inclusions, large material contrasts between phases, volume-preserving versus dilatant anelastic flow, non-monotonic loading. The agreement between the AI and TF methods is excellent and the correlation with full field calculations is in general of quite good quality (with some exceptions for non-linear composites with a large volume fraction of very soft inclusions for which a discrepancy of about 15% was found for macroscopic stress). Description of the material behavior with internal variables can be accounted for with the AI and TF approaches and therefore complex loadings can be easily handled in contrast with most hereditary approaches. (paper)
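
    For readers unfamiliar with the Mori–Tanaka scheme underlying the AI and TF couplings, the sketch below gives its classical linear-elastic form for a two-phase composite with spherical inclusions; the moduli are illustrative assumptions, and the viscoplastic extensions discussed in the paper are not reproduced.

```python
# Minimal sketch: the classical elastic Mori-Tanaka estimate for a two-phase
# composite with spherical, isotropic inclusions. This is only the
# linear-elastic limit of the schemes discussed above, not the AI/TF
# elastic-viscoplastic formulations themselves.

def mori_tanaka_spherical(K_m, G_m, K_i, G_i, f):
    """Effective bulk and shear moduli; matrix (m), inclusions (i), volume fraction f."""
    # Auxiliary modulus entering the shear estimate for spherical inclusions
    F_m = G_m * (9.0 * K_m + 8.0 * G_m) / (6.0 * (K_m + 2.0 * G_m))
    K_eff = K_m + f * (K_i - K_m) / (1.0 + (1.0 - f) * (K_i - K_m) / (K_m + 4.0 * G_m / 3.0))
    G_eff = G_m + f * (G_i - G_m) / (1.0 + (1.0 - f) * (G_i - G_m) / (G_m + F_m))
    return K_eff, G_eff

# Example: soft matrix with 30% stiff spherical inclusions (moduli in GPa, assumed values)
K_eff, G_eff = mori_tanaka_spherical(K_m=3.0, G_m=1.0, K_i=40.0, G_i=30.0, f=0.3)
print(f"K_eff = {K_eff:.2f} GPa, G_eff = {G_eff:.2f} GPa")
```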

  7. Homogenization of Doppler broadening in spin-noise spectroscopy

    Science.gov (United States)

    Petrov, M. Yu.; Ryzhov, I. I.; Smirnov, D. S.; Belyaev, L. Yu.; Potekhin, R. A.; Glazov, M. M.; Kulyasov, V. N.; Kozlov, G. G.; Aleksandrov, E. B.; Zapasskii, V. S.

    2018-03-01

    The spin-noise spectroscopy, being a nonperturbative linear optics tool, is still reputed to reveal a number of capabilities specific to nonlinear optics techniques. The effect of the Doppler broadening homogenization discovered in this work essentially widens these unique properties of spin-noise spectroscopy. We investigate spin noise of a classical system—cesium atoms vapor with admixture of buffer gas—by measuring the spin-induced Faraday rotation fluctuations in the region of D 2 line. The line, under our experimental conditions, is strongly inhomogeneously broadened due to the Doppler effect. Despite that, optical spectrum of the spin-noise power has the shape typical for the homogeneously broadened line with a dip at the line center. This fact is in stark contrast with the results of previous studies of inhomogeneous quantum dot ensembles and Doppler broadened atomic systems. In addition, the two-color spin-noise measurements have shown, in a highly spectacular way, that fluctuations of the Faraday rotation within the line are either correlated or anticorrelated depending on whether the two wavelengths lie on the same side or on different sides of the resonance. The experimental data are interpreted in the frame of the developed theoretical model which takes into account both kinetics and spin dynamics of Cs atoms. It is shown that the unexpected behavior of the Faraday rotation noise spectra and effective homogenization of the optical transition in the spin-noise measurements are related to smallness of the momentum relaxation time of the atoms as compared with their spin-relaxation time. Our findings demonstrate abilities of spin-noise spectroscopy for studying dynamic properties of inhomogeneously broadened ensembles of randomly moving spins.

  8. Homogeneous slowpoke reactor for the production of radio-isotope. A feasibility study

    International Nuclear Information System (INIS)

    Busatta, P.; Bonin, H.

    2005-01-01

    The purpose of this research is to study the feasibility of replacing the actual heterogeneous fuel core of the present SLOWPOKE-2 by a reservoir containing a homogeneous fuel for the production of Mo-99. The study looked at three items: by using the MCNP 5 simulation code, develop a series of parameters required for an homogeneous fuel and evaluate the uranyl sulfate concentration of the aqueous solution fuel in order to keep a similar excess reactivity; verify if the homogeneous reactor will retain its inherent safety attributes; and with the new dimensions and geometry of the fuel core, observe whether the natural convection will still effectively cool the reactor using the modeling software FEMLAB. The MCNP 5 simulation code was validated by using a simulation with WIMS-AECL code. It was found that it is indeed feasible to modify the SLOWPOKE-2 reactor for a homogeneous reactor using a solution of uranyl sulfate and water. (author)

  9. Homogeneous Slowpoke reactor for the production of radio-isotope: a feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Busetta, P.; Bonin, H.W. [Royal Military College of Canada, Kingston, Ontario (Canada)

    2006-09-15

    The purpose of this research is to study the feasibility of replacing the actual heterogeneous fuel core of the present SLOWPOKE-2 by a reservoir containing a homogeneous fuel for the production of Mo-99. The study looked at three items: by using the MCNP Monte Carlo reactor calculation code, develop a series of parameters required for an homogeneous fuel and evaluate the uranyl sulfate concentration of the aqueous solution fuel in order to keep a similar excess reactivity; verify if the homogeneous reactor will retain its inherent safety attributes; and with the new dimensions and geometry of the fuel core, observe whether natural convection can still effectively cool the reactor using the modeling software FEMLAB®. It was found that it is indeed feasible to modify the SLOWPOKE-2 reactor for a homogeneous reactor using a solution of uranyl sulfate and water. (author)

  10. Homogeneous slowpoke reactor for the production of radio-isotope. A feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Busatta, P.; Bonin, H. [Royal Military College of Canada, Kingston, Ontario (Canada)]. E-mail: paul.busatta@rmc.ca; bonin-h@rmc.ca

    2005-07-01

    The purpose of this research is to study the feasibility of replacing the actual heterogeneous fuel core of the present SLOWPOKE-2 by a reservoir containing a homogeneous fuel for the production of Mo-99. The study looked at three items: by using the MCNP 5 simulation code, develop a series of parameters required for an homogeneous fuel and evaluate the uranyl sulfate concentration of the aqueous solution fuel in order to keep a similar excess reactivity; verify if the homogeneous reactor will retain its inherent safety attributes; and with the new dimensions and geometry of the fuel core, observe whether the natural convection will still effectively cool the reactor using the modeling software FEMLAB. The MCNP 5 simulation code was validated by using a simulation with WIMS-AECL code. It was found that it is indeed feasible to modify the SLOWPOKE-2 reactor for a homogeneous reactor using a solution of uranyl sulfate and water. (author)

  11. Homogeneous SLOWPOKE reactor for the production of radio-isotope. A feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Busatta, P.; Bonin, H.W. [Royal Military College of Canada, Kingston, Ontario (Canada)]. E-mail: paul.busatta@rmc.ca; bonin-h@rmc.ca

    2006-07-01

    The purpose of this research is to study the feasibility of replacing the actual heterogeneous fuel core of the present SLOWPOKE-2 by a reservoir containing a homogeneous fuel for the production of Mo-99. The study looked at three items: by using the MCNP Monte Carlo reactor calculation code, develop a series of parameters required for an homogeneous fuel and evaluate the uranyl sulfate concentration of the aqueous solution fuel in order to keep a similar excess reactivity; verify if the homogeneous reactor will retain its inherent safety attributes; and with the new dimensions and geometry of the fuel core, observe whether natural convection can still effectively cool the reactor using the modeling software FEMLAB. It was found that it is indeed feasible to modify the SLOWPOKE-2 reactor for a homogeneous reactor using a solution of uranyl sulfate and water. (author)

  12. Homogeneous SLOWPOKE reactor for the production of radio-isotope. A feasibility study

    International Nuclear Information System (INIS)

    Busatta, P.; Bonin, H.W.

    2006-01-01

    The purpose of this research is to study the feasibility of replacing the actual heterogeneous fuel core of the present SLOWPOKE-2 by a reservoir containing a homogeneous fuel for the production of Mo-99. The study looked at three items: by using the MCNP Monte Carlo reactor calculation code, develop a series of parameters required for an homogeneous fuel and evaluate the uranyl sulfate concentration of the aqueous solution fuel in order to keep a similar excess reactivity; verify if the homogeneous reactor will retain its inherent safety attributes; and with the new dimensions and geometry of the fuel core, observe whether natural convection can still effectively cool the reactor using the modeling software FEMLAB. It was found that it is indeed feasible to modify the SLOWPOKE-2 reactor for a homogeneous reactor using a solution of uranyl sulfate and water. (author)

  13. Method to study the effect of blend flowability on the homogeneity of acetaminophen.

    Science.gov (United States)

    Llusá, Marcos; Pingali, Kalyana; Muzzio, Fernando J

    2013-02-01

    Excipient selection is key to product development because it affects processability and physical properties, which ultimately affect the quality attributes of the pharmaceutical product. The objective was to study how the flowability of lubricated formulations affects acetaminophen (APAP) homogeneity. The formulations studied here contain one of two types of cellulose (Avicel 102 or Ceollus KG-802), one of three grades of Mallinckrodt APAP (fine, semi-fine, or micronized), lactose (Fast-Flo) and magnesium stearate. These components are mixed in a 300-liter bin blender. Blend flowability is assessed with the Gravitational Displacement Rheometer. APAP homogeneity is assessed with off-line NIR. Excluding blends dominated by segregation, there is a trend between APAP homogeneity and blend flow index. Blend flowability is affected by the type of microcrystalline cellulose and by the APAP grade. The preliminary results suggest that the methodology used in this paper is adequate to study the effect of blend flow index on APAP homogeneity.

  14. Converting Homogeneous to Heterogeneous in Electrophilic Catalysis using Monodisperse Metal Nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Witham, Cole A.; Huang, Wenyu; Tsung, Chia-Kuang; Kuhn, John N.; Somorjai, Gabor A.; Toste, F. Dean

    2009-10-15

    A continuing goal in catalysis is the transformation of processes from homogeneous to heterogeneous. To this end, nanoparticles represent a new frontier in heterogeneous catalysis, where this conversion is supplemented by the ability to obtain new or divergent reactivity and selectivity. We report a novel method for applying heterogeneous catalysts to known homogeneous catalytic reactions through the design and synthesis of electrophilic platinum nanoparticles. These nanoparticles are selectively oxidized by the hypervalent iodine species PhICl2, and catalyze a range of π-bond activation reactions previously only homogeneously catalyzed. Multiple experimental methods are utilized to unambiguously verify the heterogeneity of the catalytic process. The discovery of treatments for nanoparticles that induce the desired homogeneous catalytic activity should lead to the further development of reactions previously inaccessible in heterogeneous catalysis. Furthermore, our size and capping agent study revealed that Pt PAMAM dendrimer-capped nanoparticles demonstrate superior activity and recyclability compared to larger, polymer-capped analogues.

  15. Dissolution test for homogeneity of mixed oxide fuel pellets

    International Nuclear Information System (INIS)

    Lerch, R.E.

    1979-08-01

    Experiments were performed to determine the relationship between fuel pellet homogeneity and pellet dissolubility. Although, in general, the amount of pellet residue decreased with increased homogeneity, as measured by the pellet figure of merit, the relationship was not absolute. Thus, all pellets with high figure of merit (excellent homogeneity) do not necessarily dissolve completely and all samples that dissolve completely do not necessarily have excellent homogeneity. It was therefore concluded that pellet dissolubility measurements could not be substituted for figure of merit determinations as a measurement of pellet homogeneity. 8 figures, 3 tables

  16. A fully automated temperature-dependent resistance measurement setup using van der Pauw method

    Science.gov (United States)

    Pandey, Shivendra Kumar; Manivannan, Anbarasu

    2018-03-01

    The van der Pauw (VDP) method is widely used to identify the resistance of planar homogeneous samples with four contacts placed on their periphery. We have developed a fully automated thin film resistance measurement setup using the VDP method with the capability of precisely measuring a wide range of thin film resistances from a few mΩ up to 10 GΩ under controlled temperatures from room temperature up to 600 °C. The setup utilizes a robust, custom-designed switching network board (SNB) for measuring current-voltage characteristics automatically at four different source-measure configurations based on the VDP method. Moreover, the SNB is connected with low noise shielded coaxial cables that reduce the effect of leakage current as well as the capacitance in the circuit, thereby enhancing the accuracy of measurement. In order to enable precise and accurate resistance measurement of the sample, a wide range of sourcing currents/voltages is pre-determined with the capability of auto-tuning for ˜12 orders of variation in the resistances. Furthermore, the setup has been calibrated with standard samples and also employed to investigate temperature-dependent resistance (a few Ω to 10 GΩ) measurements for various chalcogenide based phase change thin films (Ge2Sb2Te5, Ag5In5Sb60Te30, and In3SbTe2). This setup would be highly helpful for measurement of the temperature-dependent resistance of a wide range of materials, i.e., metals, semiconductors, and insulators, providing information about structural changes with temperature as reflected by changes in resistance, which is useful for numerous applications.
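
    The data-reduction step behind such a setup is the van der Pauw equation exp(−πR_A/R_s) + exp(−πR_B/R_s) = 1; the sketch below solves it for the sheet resistance by bisection. This is only the standard VDP reduction, not the authors' instrument-control software, and the example resistance value is assumed.

```python
# Minimal sketch of the van der Pauw reduction: solve
#     exp(-pi * R_A / R_s) + exp(-pi * R_B / R_s) = 1
# for the sheet resistance R_s, given the two measured four-terminal
# resistances R_A and R_B.
import math

def vdp_sheet_resistance(R_A, R_B, tol=1e-12):
    # The left-hand side minus 1 is monotonically increasing in R_s,
    # so a simple bisection between a tiny and a huge bracket converges.
    f = lambda Rs: math.exp(-math.pi * R_A / Rs) + math.exp(-math.pi * R_B / Rs) - 1.0
    lo = 1e-9 * (R_A + R_B)          # f(lo) < 0 (both exponentials ~ 0)
    hi = 1e9 * (R_A + R_B)           # f(hi) > 0 (both exponentials ~ 1)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol * hi:
            break
    return 0.5 * (lo + hi)

# Example: symmetric sample, R_A = R_B = R  ->  R_s = pi * R / ln(2)
R = 100.0                            # ohms (illustrative)
Rs = vdp_sheet_resistance(R, R)
print(f"R_s = {Rs:.3f} ohm/sq (analytic {math.pi * R / math.log(2):.3f})")
# Resistivity then follows as rho = R_s * t for a film of thickness t.
```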

  17. Applications of High and Ultra High Pressure Homogenization for Food Safety.

    Science.gov (United States)

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low temperature long time and high temperature short time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, the thermal treatments can reduce the product quality and freshness. Consequently, some non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have a great potential to provide "fresh-like" products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350-400 MPa have opened new opportunities to homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. Accordingly, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on foodborne pathogenic species inactivation in relation to the food matrix and food chemico-physical and process variables will be reviewed. The combined use of this alternative technology with other non-thermal technologies will also be considered.

  18. The Latin-American region and the challenges to develop one homogeneous and harmonized hazard model: preliminary results for the Caribbean and Central America regions in the GEM context

    Science.gov (United States)

    Garcia, J.; Arcila, M.; Benito, B.; Eraso, J.; García, R.; Gomez Capera, A.; Pagani, M.; Pinho, R.; Rendon, H.; Torres, Y.

    2013-05-01

    Latin America is a seismically active region with complex tectonic settings that make the creation of hazard models challenging. Over the past two decades PSHA studies have been completed for this region in the context of global (Shedlock, 1999), regional (Dimaté et al., 1999) and national initiatives. Currently, different research groups are developing new models for various nations. The Global Earthquake Model (GEM), an initiative aiming at the creation of a large global community working collaboratively on building hazard and risk models using open standards and tools, is promoting the collaboration between different national projects and groups so as to facilitate the creation of harmonized regional models. The creation of a harmonized hazard model can follow different approaches, varying from a simple patching of available models to a complete homogenisation of basic information and the subsequent creation of a completely new PSHA model. In this contribution we describe the process and results of a first attempt aiming at the creation of a community based model covering the Caribbean and Central America regions. It consists of five main steps: 1- identification and collection of available PSHA input models; 2- analysis of the consistency, transparency and reproducibility of each model; 3- selection (if more than one model exists for the same region); 4- representation of the models in a standardized format and incorporation of new knowledge from recent studies; 5- proposal(s) of harmonization. We consider some PSHA studies completed over the last twenty years in the region comprising the Caribbean (CAR), Central America (CAM) and northern South America (SA), we illustrate a tentative harmonization of the seismic source geometry models and we discuss the steps needed toward a complete harmonisation of the models. Our aim is to have a model based on best practices and high standards created through a combination of knowledge and competences coming from the

  19. A fully adaptive hybrid optimization of aircraft engine blades

    Science.gov (United States)

    Dumas, L.; Druez, B.; Lecerf, N.

    2009-10-01

    A new fully adaptive hybrid optimization method (AHM) has been developed and applied to an industrial problem in the field of the aircraft engine industry. The adaptivity of the coupling between a global search by a population-based method (Genetic Algorithms or Evolution Strategies) and the local search by a descent method has been particularly emphasized. On various analytical test cases, the AHM method outperforms the original global search method in terms of computational time and accuracy. The results obtained on the industrial case have also confirmed the interest of AHM for the design of new and original solutions in an affordable time.
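
    As a minimal illustration of the general idea (a global population-based search coupled to a local descent), the sketch below runs a small (μ+λ) evolution strategy and periodically refines its best individual with a pattern search on an analytical test function; it is not the AHM algorithm itself, and the adaptive coupling emphasized by the authors is not reproduced.

```python
# Minimal sketch of a hybrid global/local optimizer (not AHM): a (mu+lambda)
# evolution strategy with a periodic coordinate pattern-search refinement.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):                       # illustrative analytical test case (sphere)
    return float(np.sum(x**2))

def local_pattern_search(x, f, step=0.1, iters=50):
    x = x.copy()
    for _ in range(iters):
        improved = False
        for i in range(x.size):
            for s in (+step, -step):
                y = x.copy(); y[i] += s
                if f(y) < f(x):
                    x, improved = y, True
        if not improved:
            step *= 0.5                 # shrink step when no axis move helps
    return x

def hybrid_optimize(f, dim=5, pop=20, gens=40, sigma=0.5):
    P = rng.normal(size=(pop, dim))
    for g in range(gens):
        children = P + sigma * rng.normal(size=P.shape)
        all_ = np.vstack([P, children])
        all_ = all_[np.argsort([f(x) for x in all_])]
        P = all_[:pop]                  # (mu+lambda) selection
        if g % 10 == 9:                 # periodic local refinement of the best
            P[0] = local_pattern_search(P[0], f)
    return P[0], f(P[0])

x_best, f_best = hybrid_optimize(objective)
print(f"best f = {f_best:.3e}")
```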

  20. Uniformity of fully gravure printed organic field-effect transistors

    International Nuclear Information System (INIS)

    Hambsch, M.; Reuter, K.; Stanel, M.; Schmidt, G.; Kempa, H.; Fuegmann, U.; Hahn, U.; Huebler, A.C.

    2010-01-01

    Fully mass-printed organic field-effect transistors were made completely by means of gravure printing. To this end, a special printing layout was developed in order to avoid register problems in the print direction. With this layout, contact pads for the source-drain electrodes of the transistors are printed together with the gate electrodes in one and the same printing run. More than 50,000 transistors have been produced and by random tests a yield of approximately 75% has been determined. The suitability in principle of the gravure-printed transistors for integrated circuits has been shown by the realization of ring oscillators.

  1. A fully robust PARAFAC method for analyzing fluorescence data

    DEFF Research Database (Denmark)

    Engelen, Sanne; Frosch, Stina; Jørgensen, Bo

    2009-01-01

    and Rayleigh scatter. Recently, a robust PARAFAC method that circumvents the harmful effects of outlying samples has been developed. For removing the scatter effects on the final PARAFAC model, different techniques exist. More recently, an automated scatter identification tool has been constructed. However......, there still exists no robust method for handling fluorescence data encountering both outlying EEM landscapes and scatter. In this paper, we present an iterative algorithm where the robust PARAFAC method and the scatter identification tool are alternately performed. A fully automated robust PARAFAC method...

  2. Moving towards a Competitive Fully Enzymatic Biodiesel Process

    Directory of Open Access Journals (Sweden)

    Silvia Cesarini

    2015-06-01

    Full Text Available Enzymatic biodiesel synthesis can solve several problems posed by the alkaline-catalyzed transesterification, but it has the drawback of being too expensive to be considered competitive. Costs can be reduced by lipase improvement, use of unrefined oils, evaluation of soluble/immobilized lipase preparations, and by combination of phospholipases with a soluble lipase for biodiesel production in a single step. As shown here, convenient natural tools have been developed that allow synthesis of high quality FAMEs (EN 14214) from unrefined oils in a completely enzymatic single-step process, making it fully competitive.

  3. On the critical point of the fully-anisotropic quenched bond-random Potts ferromagnet in triangular and honeycomb lattices

    International Nuclear Information System (INIS)

    Tsallis, C.; Santos, R.J.V. dos

    1983-01-01

    On conjectural grounds an equation that provides a very good approximation for the critical temperature of the fully-anisotropic homogeneous quenched bond-random q-state Potts ferromagnet in triangular and honeycomb lattices is presented. Almost all the exact particular results presently known for the square, triangular and honeycomb lattices are recovered; the numerical discrepancy is quite small for the few exceptions. Some predictions that we believe to be exact are made explicit as well. (Author) [pt

  4. Fully inkjet-printed microwave passive electronics

    KAUST Repository

    McKerricher, Garret

    2017-01-30

    Fully inkjet-printed three-dimensional (3D) objects with integrated metal provide exciting possibilities for on-demand fabrication of radio frequency electronics such as inductors, capacitors, and filters. To date, there have been several reports of printed radio frequency components metallized via the use of plating solutions, sputtering, and low-conductivity pastes. These metallization techniques require rather complex fabrication, and do not provide an easily integrated or versatile process. This work utilizes a novel silver ink cured with a low-cost infrared lamp at only 80 °C, and achieves a high conductivity of 1×10⁷ S m⁻¹. By inkjet printing the infrared-cured silver together with a commercial 3D inkjet ultraviolet-cured acrylic dielectric, a multilayer process is demonstrated. By using a smoothing technique, both the conductive ink and dielectric provide surface roughness values of <500 nm. A radio frequency inductor and capacitor exhibit state-of-the-art quality factors of 8 and 20, respectively, and match well with electromagnetic simulations. These components are implemented in a lumped element radio frequency filter with an impressive insertion loss of 0.8 dB at 1 GHz, proving the utility of the process for sensitive radio frequency applications.

  5. The first LHC sector is fully interconnected

    CERN Multimedia

    2006-01-01

    Sector 7-8 is the first sector of the LHC to become fully operational. All the magnets, cryogenic line, vacuum chambers and services are interconnected. The cool down of this sector can soon commence. LHC project leader Lyn Evans, the teams from CERN's AT/MCS, AT/VAC and AT/MEL groups, and the members of the IEG consortium celebrate the completion of the first LHC sector. The 10th of November was a red letter day for the LHC accelerator teams, marking the completion of the first sector of the machine. The magnets of sector 7-8, together with the cryogenic line, the vacuum chambers and the distribution feedboxes (DFBs) are now all completely interconnected. Sector 7-8 has thus been closed and is the first LHC sector to become operational. The interconnection work required several thousand electrical, cryogenic and insulating connections to be made on the 210 interfaces between the magnets in the arc, the 30 interfaces between the special magnets and the interfaces with the cryogenic line. 'This represent...

  6. Fully Resolved Simulations of 3D Printing

    Science.gov (United States)

    Tryggvason, Gretar; Xia, Huanxiong; Lu, Jiacai

    2017-11-01

    Numerical simulations of Fused Deposition Modeling (FDM) (or Fused Filament Fabrication) where a filament of hot, viscous polymer is deposited to ``print'' a three-dimensional object, layer by layer, are presented. A finite volume/front tracking method is used to follow the injection, cooling, solidification and shrinking of the filament. The injection of the hot melt is modeled using a volume source, combined with a nozzle, modeled as an immersed boundary, that follows a prescribed trajectory. The viscosity of the melt depends on the temperature and the shear rate and the polymer becomes immobile as its viscosity increases. As the polymer solidifies, the stress is found by assuming a hyperelastic constitutive equation. The method is described and its accuracy and convergence properties are tested by grid refinement studies for a simple setup involving two short filaments, one on top of the other. The effect of the various injection parameters, such as nozzle velocity and injection velocity are briefly examined and the applicability of the approach to simulate the construction of simple multilayer objects is shown. The role of fully resolved simulations for additive manufacturing and their use for novel processes and as the ``ground truth'' for reduced order models is discussed.

  7. Microaneurysm detection using fully convolutional neural networks.

    Science.gov (United States)

    Chudzik, Piotr; Majumdar, Somshubra; Calivá, Francesco; Al-Diri, Bashir; Hunter, Andrew

    2018-05-01

    Diabetic retinopathy is a microvascular complication of diabetes that can lead to sight loss if not treated early enough. Microaneurysms are the earliest clinical signs of diabetic retinopathy. This paper presents an automatic method for detecting microaneurysms in fundus photographs. A novel patch-based fully convolutional neural network with batch normalization layers and Dice loss function is proposed. Compared to other methods that require up to five processing stages, it requires only three. Furthermore, to the best of the authors' knowledge, this is the first paper that shows how to successfully transfer knowledge between datasets in the microaneurysm detection domain. The proposed method was evaluated using three publicly available and widely used datasets: E-Ophtha, DIARETDB1, and ROC. It achieved better results than state-of-the-art methods using the FROC metric. The proposed algorithm accomplished the highest sensitivities for low false positive rates, which is particularly important for screening purposes. The performance, simplicity, and robustness of the proposed method demonstrate its suitability for diabetic retinopathy screening applications. Copyright © 2018 Elsevier B.V. All rights reserved.
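
    The Dice loss mentioned above can be sketched in a few lines; the NumPy version below shows the soft Dice loss on a toy mask and is not the authors' network or training code.

```python
# Minimal sketch of the soft Dice loss used for such segmentation networks
# (NumPy only; the patch-based FCN and its training loop are not shown).
import numpy as np

def soft_dice_loss(pred, target, eps=1e-7):
    """pred: predicted probabilities in [0, 1]; target: binary ground-truth mask."""
    pred = pred.ravel().astype(np.float64)
    target = target.ravel().astype(np.float64)
    intersection = np.sum(pred * target)
    dice = (2.0 * intersection + eps) / (np.sum(pred) + np.sum(target) + eps)
    return 1.0 - dice                      # perfect overlap -> loss 0

# Example on a toy 4x4 "microaneurysm" mask
target = np.zeros((4, 4)); target[1:3, 1:3] = 1.0
pred = np.clip(target + 0.1 * np.random.default_rng(1).normal(size=target.shape), 0, 1)
print(f"Dice loss = {soft_dice_loss(pred, target):.3f}")
```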

  8. Fully inkjet-printed microwave passive electronics

    KAUST Repository

    McKerricher, Garret; Vaseem, Mohammad; Shamim, Atif

    2017-01-01

    Fully inkjet-printed three-dimensional (3D) objects with integrated metal provide exciting possibilities for on-demand fabrication of radio frequency electronics such as inductors, capacitors, and filters. To date, there have been several reports of printed radio frequency components metallized via the use of plating solutions, sputtering, and low-conductivity pastes. These metallization techniques require rather complex fabrication, and do not provide an easily integrated or versatile process. This work utilizes a novel silver ink cured with a low-cost infrared lamp at only 80 °C, and achieves a high conductivity of 1×10⁷ S m⁻¹. By inkjet printing the infrared-cured silver together with a commercial 3D inkjet ultraviolet-cured acrylic dielectric, a multilayer process is demonstrated. By using a smoothing technique, both the conductive ink and dielectric provide surface roughness values of <500 nm. A radio frequency inductor and capacitor exhibit state-of-the-art quality factors of 8 and 20, respectively, and match well with electromagnetic simulations. These components are implemented in a lumped element radio frequency filter with an impressive insertion loss of 0.8 dB at 1 GHz, proving the utility of the process for sensitive radio frequency applications.

  9. Fully Automated Lipid Pool Detection Using Near Infrared Spectroscopy

    Directory of Open Access Journals (Sweden)

    Elżbieta Pociask

    2016-01-01

    Full Text Available Background. Detecting and identifying vulnerable plaque, which is prone to rupture, is still a challenge for cardiologists. Such lipid core-containing plaque is still not identifiable by everyday angiography, which triggered the need for a new tool, NIRS-IVUS, that can visualize plaque characterization in terms of its chemical and morphologic characteristics. The new tool can lead to the development of new methods of interpreting the newly obtained data. In this study, an algorithm for fully automated lipid pool detection on NIRS images is proposed. Method. The designed algorithm is divided into four stages: preprocessing (image enhancement), segmentation of artifacts, detection of lipid areas, and calculation of the Lipid Core Burden Index. Results. A total of 31 NIRS chemograms were analyzed by two methods. The metrics, total LCBI, maximal LCBI in 4 mm blocks, and maximal LCBI in 2 mm blocks, were calculated to compare the presented algorithm with a commercially available system. Both intraclass correlation (ICC) and Bland-Altman plots showed good agreement and correlation between the used methods. Conclusions. The proposed algorithm provides fully automated lipid pool detection on near infrared spectroscopy images. It is a tool developed for offline data analysis, which could be easily augmented with newer functions and projects.
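
    For orientation, the sketch below computes the Lipid Core Burden Index metrics (total LCBI and a maximal LCBI over a fixed-length axial window) from a binary chemogram; the lipid-detection stages themselves are not shown, and the 0.1 mm pullback sampling assumed here is illustrative only.

```python
# Minimal sketch of the LCBI metrics computed from a binary chemogram
# (rows = pullback positions, columns = circumferential bins). The four-stage
# detection algorithm described above is not reproduced; the 0.1 mm/row
# sampling is an assumption for illustration.
import numpy as np

def lcbi(lipid_mask, valid_mask=None):
    """LCBI = 1000 * (lipid-positive pixels) / (valid pixels)."""
    if valid_mask is None:
        valid_mask = np.ones_like(lipid_mask, dtype=bool)
    n_valid = valid_mask.sum()
    return 1000.0 * np.logical_and(lipid_mask, valid_mask).sum() / max(n_valid, 1)

def max_lcbi_window(lipid_mask, window_mm=4.0, mm_per_row=0.1):
    """Maximum LCBI over any axial window of the given length (e.g. maxLCBI_4mm)."""
    rows = max(int(round(window_mm / mm_per_row)), 1)
    best = 0.0
    for start in range(0, lipid_mask.shape[0] - rows + 1):
        best = max(best, lcbi(lipid_mask[start:start + rows]))
    return best

# Example: synthetic 60 mm pullback with one 5 mm lipid-rich region
rng = np.random.default_rng(0)
chem = rng.random((600, 64)) > 0.95                 # sparse background "lipid" pixels
chem[200:250, 10:40] = True                         # concentrated lipid pool
print(f"total LCBI = {lcbi(chem):.0f}, maxLCBI_4mm = {max_lcbi_window(chem):.0f}")
```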

  10. Differential Effects of Literacy Instruction Time and Homogeneous Ability Grouping in Kindergarten Classrooms: Who Will Benefit? Who Will Suffer?

    Science.gov (United States)

    Hong, Guanglei; Corter, Carl; Hong, Yihua; Pelletier, Janette

    2012-01-01

    This study challenges the belief that homogeneous ability grouping benefits high-ability students in cognitive and social-emotional development at the expense of their low-ability peers. From a developmental point of view, the authors hypothesize that homogeneous grouping may improve the learning behaviors and may benefit the literacy learning of…

  11. Homogeneous Ir-192 afterloading-flab-irradiation of plane surfaces

    International Nuclear Information System (INIS)

    Bratengeier, K.; Krieger, T.

    2002-01-01

    Homogeneous irradiation of plane targets by Ir-192 afterloading flabs made of a parallel series of linear applicators can be time-consuming even with modern planning systems. The aim of the present study was to develop an algorithm that supplies homogeneous dose distributions in an arbitrarily given plane parallel to the equipped plane of a flab. The edge and corner positions of the flab are of particular importance. Requiring identical dose at the optimisation distance above the flab centre, the corners, and the middle of the flab edges leads to a strict relation between the respective dwell weights. Formulas can be derived that allow the calculation of the dwell times. The dimensioning of the flab can be rapidly adapted to new conditions. A comparison with the results of Nucletron PLATO-BPS for applicator-applicator distances and step sizes of 1 cm at optimisation distances of 10, 20, 30, and 40 mm and various flab sizes (3 x 3, 9 x 9, and 15 x 15 cm²) shows the following results: The standard deviation of the proposed algorithm is sometimes slightly higher than the results of the commercial planning system, whereas the underdosage at the flab edges is usually smaller. The effort for planning and preparation of the irradiation, for example using a Nucletron HDR, is below 5 minutes - a considerable reduction of planning time. (orig.) [de
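
    The idea of fixing the dwell weights from the dose-equality condition can be sketched numerically: below, each dwell position of a 9 x 9 cm flab is treated as an ideal point source, the positions are grouped into corner, edge and interior classes, and a 3 x 3 linear system yields the group weights that equalise the dose above the centre, a corner and an edge midpoint. This is only a schematic of the approach, not the authors' analytical formulas; the point-source kernel and the 2 cm optimisation distance are assumptions.

```python
# Schematic sketch (not the published formulas): equalise the dose at the
# optimisation distance above the flab centre, a corner and an edge midpoint
# by solving for three dwell-weight groups (corner / edge / interior).
import numpy as np

step = 1.0                                   # cm, applicator and dwell spacing
n = 10                                       # 10 x 10 dwell positions -> 9 x 9 cm flab
h = 2.0                                      # cm, assumed optimisation distance above the flab
xs = ys = np.arange(n) * step
X, Y = np.meshgrid(xs, ys, indexing="ij")
pos = np.column_stack([X.ravel(), Y.ravel()])

on_x = np.isin(pos[:, 0], [xs[0], xs[-1]])
on_y = np.isin(pos[:, 1], [ys[0], ys[-1]])
group = np.where(on_x & on_y, 0, np.where(on_x | on_y, 1, 2))   # corner / edge / interior

centre = [xs.mean(), ys.mean()]
refs = np.array([centre, [xs[0], ys[0]], [xs.mean(), ys[0]]])    # centre, corner, edge middle

def dose(point, weights):
    r2 = (pos[:, 0] - point[0])**2 + (pos[:, 1] - point[1])**2 + h**2
    return np.sum(weights / r2)              # relative units, point-source kernel only

# 3x3 system: dose at each reference point produced by each group at unit weight
A = np.array([[dose(p, (group == g).astype(float)) for g in range(3)] for p in refs])
w_group = np.linalg.solve(A, np.ones(3))     # equal (unit) dose at all three points
weights = w_group[group]

doses = np.array([dose(p, weights) for p in refs])
print("relative group weights (corner/edge/interior):", np.round(w_group / w_group[2], 2))
print("doses at centre/corner/edge reference points:", np.round(doses, 3))
```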

  12. Homogenization techniques for population dynamics in strongly heterogeneous landscapes.

    Science.gov (United States)

    Yurk, Brian P; Cobbold, Christina A

    2018-12-01

    An important problem in spatial ecology is to understand how population-scale patterns emerge from individual-level birth, death, and movement processes. These processes, which depend on local landscape characteristics, vary spatially and may exhibit sharp transitions through behavioural responses to habitat edges, leading to discontinuous population densities. Such systems can be modelled using reaction-diffusion equations with interface conditions that capture local behaviour at patch boundaries. In this work we develop a novel homogenization technique to approximate the large-scale dynamics of the system. We illustrate our approach, which also generalizes to multiple species, with an example of logistic growth within a periodic environment. We find that population persistence and the large-scale population carrying capacity are influenced by patch residence times that depend on patch preference, as well as movement rates in adjacent patches. The forms of the homogenized coefficients yield key theoretical insights into how large-scale dynamics arise from the small-scale features.
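
    In the simplest setting, homogenization of movement in a 1-D periodic landscape with piecewise-constant diffusion reduces to a harmonic mean, as sketched below; the patch-preference interface conditions that produce the residence-time weighting discussed in the paper are deliberately omitted, and the coefficients are illustrative.

```python
# Minimal sketch: the classical homogenized (harmonic-mean) diffusion
# coefficient for a 1-D periodic landscape with piecewise-constant diffusion.
# Interface conditions with discontinuous densities add extra weighting
# factors that are not included here.
import numpy as np

def harmonic_mean_diffusion(D, fractions):
    """D[i]: diffusion coefficient in patch type i; fractions[i]: its area fraction."""
    D = np.asarray(D, dtype=float)
    f = np.asarray(fractions, dtype=float)
    f = f / f.sum()
    return 1.0 / np.sum(f / D)

# Example: fast movement in matrix (D = 1.0), slow movement in habitat (D = 0.1),
# habitat occupying 30% of each period of the landscape
D_hom = harmonic_mean_diffusion([0.1, 1.0], [0.3, 0.7])
print(f"homogenized diffusion coefficient ~ {D_hom:.3f}")
# The arithmetic mean (0.73) would overestimate large-scale spread; the
# harmonic mean (~0.27) reflects that slow patches act as bottlenecks.
```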

  13. Numerical Studies of Homogenization under a Fast Cellular Flow

    KAUST Repository

    Iyer, Gautam

    2012-09-13

    We consider a two-dimensional particle diffusing in the presence of a fast cellular flow confined to a finite domain. If the flow amplitude A is held fixed and the number of cells L² → ∞, then the problem homogenizes; this has been well studied. Also well studied is the limit when L is fixed and A → ∞. In this case the solution averages along stream lines. The double limit as both the flow amplitude A → ∞ and the number of cells L² → ∞ was recently studied [G. Iyer et al., preprint, arXiv:1108.0074]; one observes a sharp transition between the homogenization and averaging regimes occurring at A = L². This paper numerically studies a few theoretically unresolved aspects of this problem when both A and L are large that were left open in [G. Iyer et al., preprint, arXiv:1108.0074] using the numerical method devised in [G. A. Pavliotis, A. M. Stewart, and K. C. Zygalakis, J. Comput. Phys., 228 (2009), pp. 1030-1055]. Our treatment of the numerical method uses recent developments in the theory of modified equations for numerical integrators of stochastic differential equations [K. C. Zygalakis, SIAM J. Sci. Comput., 33 (2001), pp. 102-130]. © 2012 Society for Industrial and Applied Mathematics.
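
    A bare-bones version of the underlying stochastic problem can be simulated with a plain Euler–Maruyama scheme, as sketched below for the cellular stream function ψ(x, y) = A sin(x) sin(y) without the finite-domain confinement; this is not the modified-equation integrator analysed in the cited works, and the parameter values are illustrative.

```python
# Minimal sketch: Euler-Maruyama simulation of a particle diffusing in the
# cellular flow with stream function psi(x, y) = A * sin(x) * sin(y),
# i.e. dX = v(X) dt + sqrt(2) dW with v = (d(psi)/dy, -d(psi)/dx).
import numpy as np

def simulate(A=100.0, T=1.0, dt=1e-4, n_paths=500, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros((n_paths, 2))                         # all paths start at the origin
    n_steps = int(T / dt)
    for _ in range(n_steps):
        u = A * np.sin(x[:, 0]) * np.cos(x[:, 1])      # velocity from the stream function
        v = -A * np.cos(x[:, 0]) * np.sin(x[:, 1])
        x[:, 0] += u * dt + np.sqrt(2 * dt) * rng.normal(size=n_paths)
        x[:, 1] += v * dt + np.sqrt(2 * dt) * rng.normal(size=n_paths)
    return x

# Effective (enhanced) diffusivity estimate from the mean square displacement
X = simulate()
msd = np.mean(np.sum(X**2, axis=1))
print(f"effective diffusivity estimate ~ {msd / 4.0:.2f}  (for T = 1)")
```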

  14. Numerical Studies of Homogenization under a Fast Cellular Flow

    KAUST Repository

    Iyer, Gautam; Zygalakis, Konstantinos C.

    2012-01-01

    We consider a two-dimensional particle diffusing in the presence of a fast cellular flow confined to a finite domain. If the flow amplitude A is held fixed and the number of cells L² → ∞, then the problem homogenizes; this has been well studied. Also well studied is the limit when L is fixed and A → ∞. In this case the solution averages along stream lines. The double limit as both the flow amplitude A → ∞ and the number of cells L² → ∞ was recently studied [G. Iyer et al., preprint, arXiv:1108.0074]; one observes a sharp transition between the homogenization and averaging regimes occurring at A = L². This paper numerically studies a few theoretically unresolved aspects of this problem when both A and L are large that were left open in [G. Iyer et al., preprint, arXiv:1108.0074] using the numerical method devised in [G. A. Pavliotis, A. M. Stewart, and K. C. Zygalakis, J. Comput. Phys., 228 (2009), pp. 1030-1055]. Our treatment of the numerical method uses recent developments in the theory of modified equations for numerical integrators of stochastic differential equations [K. C. Zygalakis, SIAM J. Sci. Comput., 33 (2001), pp. 102-130]. © 2012 Society for Industrial and Applied Mathematics.

  15. Homogeneity of blended nuclear fuel powders after pneumatic transport

    International Nuclear Information System (INIS)

    Smeltzer, E.E.; Skriba, M.C.; Lyon, W.L.

    1982-01-01

    A study of the pneumatic transport of fine (approx. 1μm) cohesive nuclear fuel powders was conducted for the U.S. Department of Energy to demonstrate the feasibility of this method of transport and to develop a design data base for use in a large scale nuclear fuel production facility. As part of this program, a considerable effort was directed at following the homogeneity of blended powders. Since different reactors require different enrichments, blending and subsequent transport are critical parts of the fabrication sequence. The various materials used represented analogs of a wide range of powders and blends that could be expected in a commercial mixed oxide fabrication facility. All UO 2 powders used were depleted and a co-precipitated master mix of (U, Th)O 2 was made specifically for this program, using thorium as an analog for plutonium. In order to determine the effect of pneumatic transport on a blended powder, samples were taken from a feeder vessel before each test, and from a receiver vessel and a few line sections after each transfer test. The average difference between the before and after degree of non-homogeneity was < 1%, for the 21 tests considered. This shows that overall, the pneumatic transport of blended, fine nuclear fuel powders is possible, with only minor unblending occurring

  16. Changes of the Temperature and Precipitation Extremes on Homogenized Data

    Directory of Open Access Journals (Sweden)

    LAKATOS, Mónika

    2007-01-01

    Full Text Available Climate indices to detect changes have been defined in several international projects on climate change. Climate index calculations require at least daily resolution of time series without inhomogeneities, such as transfer of stations or changes in observation practice. In many cases the characteristics of the estimated linear trends, calculated from the original and from the homogenized time series, are significantly different. The ECA&D (European Climate Assessment & Dataset) indices and some other special temperature and precipitation indices of our own development were applied to the Climate Database of the Hungarian Meteorological Service. Long term daily maximum, minimum and daily mean temperature data series and daily precipitation sums were examined. The climate index calculation processes were tested on original observations and on homogenized daily data for temperature; in the case of precipitation a complementation process was performed to fill in the gaps of missing data. Experiences of comparing the climate index calculation results, based on original and complemented-homogenized data, are reported in this paper. We present the preliminary results of climate index calculations also on gridded (interpolated) daily data.
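
    Two of the standard ECA&D-style indices mentioned above can be computed directly from homogenized daily series, as sketched below with purely synthetic data; the index names SU and R10mm follow the common ECA&D definitions (days with Tmax > 25 °C and days with precipitation ≥ 10 mm, respectively).

```python
# Minimal sketch of two ECA&D-style indices computed from daily series:
# SU (summer days, Tmax > 25 degC) and R10mm (heavy precipitation days,
# daily sum >= 10 mm). The synthetic input data are purely illustrative.
import numpy as np

def summer_days(tmax_daily):
    """SU: number of days in the series with daily maximum temperature > 25 degC."""
    return int(np.sum(np.asarray(tmax_daily) > 25.0))

def heavy_precip_days(precip_daily, threshold=10.0):
    """R10mm: number of days with daily precipitation sum >= threshold (mm)."""
    return int(np.sum(np.asarray(precip_daily) >= threshold))

# One synthetic "year" (365 days) standing in for a homogenized station series
rng = np.random.default_rng(42)
doy = np.arange(365)
tmax = 15.0 + 12.0 * np.sin(2 * np.pi * (doy - 105) / 365) + rng.normal(0, 3, 365)
precip = rng.gamma(shape=0.4, scale=6.0, size=365)

print(f"SU = {summer_days(tmax)} days, R10mm = {heavy_precip_days(precip)} days")
```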

  17. Japanese Fast Reactor Program for Homogeneous Actinide Recycling

    International Nuclear Information System (INIS)

    Ishikawa, Makoto; Nagata, Takashi; Kondo, Satoru

    2008-01-01

    In the present report, the homogeneous actinide recycling scenario of the Fast Reactor (FR) Cycle Technology Development Project (FaCT) is summarized. First, the scenario of nuclear energy policy in Japan is briefly reviewed. Second, the basic plan of Japan to manage all minor actinides (MA) by recycling is summarized; its objectives are more efficient use of uranium resources, reduction of the environmental burden, and an increase of the nuclear non-proliferation potential. Third, recent results of reactor physics studies related to MA-loaded FR cores are briefly described. Fourth, typical nuclear designs of MA-loaded FR cores in the FaCT project and their main features are presented, together with the feasibility of recycling all MA in a future FR equilibrium society. Finally, the research and development program to realize MA recycling in Japan is introduced, including international cooperation projects. (authors)

  18. Efisiensi Energi Jaringan Homogeneous Wcdma/3g Pada Lingkungan Outdoor

    Directory of Open Access Journals (Sweden)

    Linawati Linawati

    2013-06-01

    Full Text Available Telecommunication technology and applications have developed rapidly in recent years, and this development significantly increases energy consumption. Many studies have therefore been devoted to energy efficiency in cellular networks, focusing mostly on the energy usage of the base station, as the base station is the component of a cellular network that consumes the most energy. This study analyzes the energy efficiency of a homogeneous WCDMA/3G network in an outdoor environment. The energy consumption of three macro base stations is compared with the energy consumption of 12 micro base stations. This comparison has been conducted at the same Area Spectral Efficiency (ASE). The results show that the macro base stations are more energy efficient than the micro base stations. However, based on ASE requirements, the micro base stations are more efficient than the macro base stations during both busy and non-busy hours.

  19. On the decay of homogeneous isotropic turbulence

    Science.gov (United States)

    Skrbek, L.; Stalp, Steven R.

    2000-08-01

    Decaying homogeneous, isotropic turbulence is investigated using a phenomenological model based on the three-dimensional turbulent energy spectra. We generalize the approach first used by Comte-Bellot and Corrsin [J. Fluid Mech. 25, 657 (1966)] and revised by Saffman [J. Fluid Mech. 27, 581 (1967); Phys. Fluids 10, 1349 (1967)]. At small wave numbers we assume the spectral energy is proportional to the wave number to an arbitrary power. The specific case of power 2, which follows from the Saffman invariant, is discussed in detail and is later shown to best describe experimental data. For the spectral energy density in the inertial range we apply both the Kolmogorov -5/3 law, E(k) = Cε^(2/3)k^(-5/3), and the refined Kolmogorov law by taking into account intermittency. We show that intermittency affects the energy decay mainly by shifting the position of the virtual origin rather than altering the power law of the energy decay. Additionally, the spectrum is naturally truncated due to the size of the wind tunnel test section, as eddies larger than the physical size of the system cannot exist. We discuss effects associated with the energy-containing length scale saturating at the size of the test section and predict a change in the power law decay of both energy and vorticity. To incorporate viscous corrections to the model, we truncate the spectrum at an effective Kolmogorov wave number k_η = γ(ε/ν³)^(1/4), where γ is a dimensionless parameter of order unity. We show that as the turbulence decays, viscous corrections gradually become more important and a simple power law can no longer describe the decay. We discuss the final period of decay within the framework of our model, and show that care must be taken to distinguish between the final period of decay and the change of the character of decay due to the saturation of the energy containing length scale. The model is applied to a number of experiments on decaying turbulence. These include the downstream decay of turbulence in

  20. Homogeneous Thorium Fuel Cycles in Candu Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hyland, B.; Dyck, G.R.; Edwards, G.W.R.; Magill, M. [Chalk River Laboratories, Atomic Energy of Canada Limited (Canada)

    2009-06-15

    The CANDU® reactor has an unsurpassed degree of fuel-cycle flexibility, as a consequence of its fuel-channel design, excellent neutron economy, on-power refueling, and simple fuel bundle [1]. These features facilitate the introduction and full exploitation of thorium fuel cycles in Candu reactors in an evolutionary fashion. Because thorium itself does not contain a fissile isotope, neutrons must be provided by adding a fissile material, either within or outside of the thorium-based fuel. Those same Candu features that provide fuel-cycle flexibility also make possible many thorium fuel-cycle options. Various thorium fuel cycles can be categorized by the type and geometry of the added fissile material. The simplest of these fuel cycles are based on homogeneous thorium fuel designs, where the fissile material is mixed uniformly with the fertile thorium. These fuel cycles can be competitive in resource utilization with the best uranium-based fuel cycles, while building up a 'mine' of U-233 in the spent fuel, for possible recycle in thermal reactors. When U-233 is recycled from the spent fuel, thorium-based fuel cycles in Candu reactors can provide substantial improvements in the efficiency of energy production from existing fissile resources. The fissile component driving the initial fuel could be enriched uranium, plutonium, or uranium-233. Many different thorium fuel cycle options have been studied at AECL [2,3]. This paper presents the results of recent homogeneous thorium fuel cycle calculations using plutonium and enriched uranium as driver fuels, with and without U-233 recycle. High and low burnup cases have been investigated for both the once-through and U-233 recycle cases. CANDU® is a registered trademark of Atomic Energy of Canada Limited (AECL). 1. Boczar, P.G. 'Candu Fuel-Cycle Vision', Presented at IAEA Technical Committee Meeting on 'Fuel Cycle Options for LWRs and HWRs', 1998 April 28 - May 01, also Atomic Energy

  1. Improvement of the field homogeneity with a permanent magnet assembly for MRI

    International Nuclear Information System (INIS)

    Sakurai, H.; Aoki, M.; Miyamoto, T.

    1990-01-01

    In the last few years, MRI (Magnetic Resonance Imaging) has become one of the most excellent and important radiological and diagnostic methods. For this application, a strong and uniform magnetic field is required in the area where the patient is examined. This requirement for a high order of homogeneity is increasing with the rapid progress of tomographic technology. On the other hand, cost reduction for the magnet is also strongly required. As reported in the last paper, we developed and mass-produced a permanent type magnet using high energy Nd-Fe-B material. This paper presents a newly developed 15-plane measuring method, instead of a 7-plane method, to evaluate the homogeneous field precisely. By using this analytical method and a linear programming method, a new-shaped pole piece has been developed. In consequence, homogeneity was improved by a factor of two and the magnet weight was reduced by 10% as compared with the formerly developed pole piece. (author)

  2. Persymmetric Adaptive Detectors of Subspace Signals in Homogeneous and Partially Homogeneous Clutter

    Directory of Open Access Journals (Sweden)

    Ding Hao

    2015-08-01

    Full Text Available In the field of adaptive radar detection, an effective strategy to improve the detection performance is to exploit the structural information of the covariance matrix, especially in the case of insufficient reference cells. Thus, in this study, the problem of detecting multidimensional subspace signals is discussed by considering the persymmetric structure of the clutter covariance matrix, which implies that the covariance matrix is persymmetric about its cross diagonal. Persymmetric adaptive detectors are derived on the basis of the one-step principle as well as the two-step Generalized Likelihood Ratio Test (GLRT in homogeneous and partially homogeneous clutter. The proposed detectors consider the structural information of the covariance matrix at the design stage. Simulation results suggest performance improvement compared with existing detectors when reference cells are insufficient. Moreover, the detection performance is assessed with respect to the effects of the covariance matrix, signal subspace dimension, and mismatched performance of signal subspace as well as signal fluctuations.
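
    The structural constraint exploited by these detectors can be illustrated by the covariance-estimation step alone: the sketch below projects a sample covariance matrix onto the set of persymmetric matrices, which is what allows acceptable performance with few reference cells; the full one-step and two-step GLRT statistics of the paper are not reproduced.

```python
# Minimal sketch of the persymmetric constraint R = J * conj(R) * J (with J the
# exchange matrix): enforce it on a sample covariance estimate, effectively
# doubling the training data. The GLRT detectors themselves are not shown.
import numpy as np

def persymmetric_covariance(Z):
    """Z: N x K matrix of K secondary (training) snapshots of dimension N."""
    N, K = Z.shape
    S = Z @ Z.conj().T / K                     # ordinary sample covariance
    J = np.fliplr(np.eye(N))                   # exchange (anti-identity) matrix
    return 0.5 * (S + J @ S.conj() @ J)        # project onto persymmetric matrices

# Example: 8-dimensional clutter snapshots, only 10 training cells
rng = np.random.default_rng(3)
N, K = 8, 10
Z = (rng.normal(size=(N, K)) + 1j * rng.normal(size=(N, K))) / np.sqrt(2)
R_hat = persymmetric_covariance(Z)
J = np.fliplr(np.eye(N))
print("persymmetry residual:", np.linalg.norm(R_hat - J @ R_hat.conj() @ J))
```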

  3. Impact of homogenization of pasteurized human milk on gastric digestion in the preterm infant: A randomized controlled trial.

    Science.gov (United States)

    de Oliveira, Samira C; Bellanger, Amandine; Ménard, Olivia; Pladys, Patrick; Le Gouar, Yann; Henry, Gwénaële; Dirson, Emelyne; Rousseau, Florence; Carrière, Frédéric; Dupont, Didier; Bourlieu, Claire; Deglaire, Amélie

    2017-08-01

    It has been suggested that homogenization of Holder-pasteurized human milk (PHM) could improve fat absorption and weight gain in preterm infants, but the impact on the PHM digestive kinetics has never been studied. Our objective was to determine the impact of PHM homogenization on gastric digestion in preterm infants. In a randomized controlled trial, eight hospitalized tube-fed preterm infants were their own control to compare the gastric digestion of PHM and of homogenized PHM (PHHM). PHM was obtained from donors and, for half of it, was homogenized by ultrasonication. Over a six-day sequence, gastric aspirates were collected twice a day, before and 35, 60 or 90 min after the start of PHM or PHHM ingestion. The impact of homogenization on PHM digestive kinetics and disintegration was tested using a general linear mixed model. Results were expressed as means ± SD. Homogenization led to a six-fold increase in the specific surface, increased the gastric lipolysis level, and enhanced the proteolysis of serum albumin. Homogenization of PHM increased the gastric lipolysis level. This could be a potential strategy to improve fat absorption, and thus growth and development in infants fed with PHM; however, its gastrointestinal tolerance needs to be investigated further. This trial was registered at clinicaltrials.gov as NCT02112331. Copyright © 2017 European Society for Clinical Nutrition and Metabolism. Published by Elsevier Ltd. All rights reserved.

  4. Numerical homogenization on approach for stokesian suspensions.

    Energy Technology Data Exchange (ETDEWEB)

    Haines, B. M.; Berlyand, L. V.; Karpeev, D. A. (Mathematics and Computer Science); (Department of Mathematics, Pennsylvania State Univ.)

    2012-01-20

    swimming resulting from bacterial alignment can significantly alter other macroscopic properties of the suspension, such as the oxygen diffusivity and mixing rates. In order to understand the unique macroscopic properties of active suspensions the connection between microscopic swimming and alignment dynamics and the mesoscopic pattern formation must be clarified. This is difficult to do analytically in the fully general setting of moderately dense suspensions, because of the large number of bacteria involved (approx. 10¹⁰ cm⁻³ in experiments) and the complex, time-dependent geometry of the system. Many reduced analytical models of bacterial suspensions have been proposed, but all of them require validation. While comparison with experiment is the ultimate test of a model's fidelity, it is difficult to conduct experiments matched to these models' assumptions. Numerical simulation of the microscopic dynamics is an acceptable substitute, but it runs into the problem of having to discretize the fluid domain with a fine-grained boundary (the bacteria) and update the discretization as the domain evolves (bacteria move). This leads to a prohibitively high number of degrees of freedom and prohibitively high setup costs per timestep of simulation. In this technical report we propose numerical methods designed to alleviate these two difficulties. We indicate how to (1) construct an optimal discretization in terms of the number of degrees of freedom per digit of accuracy and (2) optimally update the discretization as the simulation evolves. The technical tool here is the derivation of rigorous error bounds on the error in the numerical solution when using our proposed discretization at the initial time as well as after a given elapsed simulation time. These error bounds should guide the construction of practical discretization schemes and update strategies. Our initial construction is carried out by using a theoretically convenient, but practically prohibitive spectral basis

  5. Highly efficient fully transparent inverted OLEDs

    Science.gov (United States)

    Meyer, J.; Winkler, T.; Hamwi, S.; Schmale, S.; Kröger, M.; Görrn, P.; Johannes, H.-H.; Riedl, T.; Lang, E.; Becker, D.; Dobbertin, T.; Kowalsky, W.

    2007-09-01

    One of the unique selling propositions of OLEDs is their potential to realize devices that are highly transparent over the visible spectrum. This is because organic semiconductors provide a large Stokes shift and low intrinsic absorption losses. Hence, new areas of application for displays and ambient lighting become accessible, for instance the integration of OLEDs into the windshield or the ceiling of automobiles. The main challenge in the realization of fully transparent devices is the deposition of the top electrode. ITO is commonly used as the transparent bottom anode in a conventional OLED. To obtain uniform light emission over the entire viewing angle and a low series resistance, a TCO such as ITO is desirable as the top contact as well. However, sputter deposition of ITO on top of organic layers causes damage induced by high-energy particles and UV radiation. We have found an efficient process to protect the organic layers during the rf magnetron deposition of the ITO top electrode of an inverted OLED (IOLED). The inverted structure allows the integration of OLEDs with the more powerful n-channel transistors used in active-matrix backplanes. Employing the green electrophosphorescent material Ir(ppy)3 led to IOLEDs with a current efficiency of 50 cd/A and a power efficiency of 24 lm/W at 100 cd/m2. The average transmittance exceeds 80% in the visible region. The onset voltage for light emission is lower than 3 V. In addition, by vertical stacking we achieved a very high current efficiency of more than 70 cd/A for transparent IOLEDs.

  6. Towards a Fully Conservative Water Balance

    Science.gov (United States)

    Rodriguez, L. B.; Vionnet, C. A.; Younger, P. L.; Parkin, G.

    2001-12-01

    Hydrological modeling is nowadays an essential tool in many aspects of water resources assessment and management. For practical purposes, hydrological models may be defined as mathematical procedures which transform meteorological input data, such as precipitation and evapotranspiration, into hydrological output values such as riverflows. Conceptual water balance models are one kind of hydrological model still quite popular among engineers and scientists for three main reasons: firstly, the "book-keeping" procedure they are based upon makes them computationally inexpensive; secondly, they require far less data than any physically based model; and thirdly, once calibrated and validated, they can yield the proper order of magnitude of the water cycle components in the basin under investigation. A common criticism of water balance models is their lack of a sound theoretical basis. In this work a fully conservative water balance model for basin applications, which takes physical processes into account, is presented. The two-storage-level model contains four calibration parameters: a, b, l and Umax. The saturated storage component resembles the abcd model by Thomas, corrected by the presence of the aquifer storativity coefficient s and the river-aquifer interface conductance l. The resulting model is capable of estimating monthly basin-averaged actual evapotranspiration, soil moisture, effective groundwater recharge, groundwater level fluctuations, baseflows and direct runoff using an integral form of the mass conservation law in the saturated/unsaturated layers. The model was applied to a 600 km2 catchment in the United Kingdom. An eight-year record was used for calibration, while a similar record was reserved for validation of model results. Total streamflows as well as baseflows calculated by the model were compared with observed and estimated data. Quite good agreement was obtained. Finally, simulated groundwater levels were compared with observation data collected at
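
    As a rough illustration of the monthly "book-keeping" such a two-storage model performs, the Python sketch below routes precipitation through a soil store and a groundwater store. The parameter names a, b, l, Umax and s follow the abstract, but the equations assigned to them here are simplifying assumptions, not the authors' calibrated formulation.

    # Illustrative two-storage monthly water balance (assumed equations).
    def monthly_water_balance(precip, pet, a=0.9, b=0.4, l=0.05, umax=150.0, s=0.1):
        U, G = 0.0, 0.0                        # soil moisture, groundwater storage [mm]
        records = []
        for P, E in zip(precip, pet):
            water = U + P                      # water available this month [mm]
            aet = min(a * E, water)            # actual evapotranspiration
            U = min(water - aet, umax)         # refill soil store up to capacity Umax
            surplus = water - aet - U          # excess leaving the soil column
            direct_runoff = b * surplus        # assumed partition of the surplus
            recharge = (1.0 - b) * surplus     # effective groundwater recharge
            G += recharge
            baseflow = l * G                   # river-aquifer exchange (conductance l)
            G -= baseflow
            records.append({"aet": aet, "recharge": recharge, "head": G / s,
                            "streamflow": direct_runoff + baseflow})
        return records

    Each month the identity P = AET + change in U + change in G + direct runoff + baseflow holds exactly, which is the conservative book-keeping property the abstract emphasizes.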

  7. Theoretical studies of homogeneous catalysts mimicking nitrogenase.

    Science.gov (United States)

    Sgrignani, Jacopo; Franco, Duvan; Magistrato, Alessandra

    2011-01-10

    The conversion of molecular nitrogen to ammonia is a key biological and chemical process and represents one of the most challenging topics in chemistry and biology. In Nature the Mo-containing nitrogenase enzymes perform nitrogen 'fixation' via an iron molybdenum cofactor (FeMo-co) under ambient conditions. In contrast, industrially, the Haber-Bosch process reduces molecular nitrogen and hydrogen to ammonia with a heterogeneous iron catalyst under drastic conditions of temperature and pressure. This process accounts for the production of millions of tons of nitrogen compounds used for agricultural and industrial purposes, but the high temperature and pressure required result in a large energy loss, leading to several economic and environmental issues. During the last 40 years many attempts have been made to synthesize simple homogeneous catalysts that can activate dinitrogen under the same mild conditions of the nitrogenase enzymes. Several compounds, almost all containing transition metals, have been shown to bind and activate N₂ to various degrees. However, to date Mo(N₂)(HIPTN)₃N with (HIPTN)₃N= hexaisopropyl-terphenyl-triamidoamine is the only compound performing this process catalytically. In this review we describe how Density Functional Theory calculations have been of help in elucidating the reaction mechanisms of the inorganic compounds that activate or fix N₂. These studies provided important insights that rationalize and complement the experimental findings about the reaction mechanisms of known catalysts, predicting the reactivity of new potential catalysts and helping in tailoring new efficient catalytic compounds.

  8. Theoretical Studies of Homogeneous Catalysts Mimicking Nitrogenase

    Directory of Open Access Journals (Sweden)

    Alessandra Magistrato

    2011-01-01

    Full Text Available The conversion of molecular nitrogen to ammonia is a key biological and chemical process and represents one of the most challenging topics in chemistry and biology. In Nature the Mo-containing nitrogenase enzymes perform nitrogen ‘fixation’ via an iron molybdenum cofactor (FeMo-co) under ambient conditions. In contrast, industrially, the Haber-Bosch process reduces molecular nitrogen and hydrogen to ammonia with a heterogeneous iron catalyst under drastic conditions of temperature and pressure. This process accounts for the production of millions of tons of nitrogen compounds used for agricultural and industrial purposes, but the high temperature and pressure required result in a large energy loss, leading to several economic and environmental issues. During the last 40 years many attempts have been made to synthesize simple homogeneous catalysts that can activate dinitrogen under the same mild conditions of the nitrogenase enzymes. Several compounds, almost all containing transition metals, have been shown to bind and activate N2 to various degrees. However, to date Mo(N2)(HIPTN3N) with (HIPTN3N) = hexaisopropyl-terphenyl-triamidoamine is the only compound performing this process catalytically. In this review we describe how Density Functional Theory calculations have been of help in elucidating the reaction mechanisms of the inorganic compounds that activate or fix N2. These studies provided important insights that rationalize and complement the experimental findings about the reaction mechanisms of known catalysts, predicting the reactivity of new potential catalysts and helping in tailoring new efficient catalytic compounds.

  9. Elastic metamaterials and dynamic homogenization: a review

    Directory of Open Access Journals (Sweden)

    Ankit Srivastava

    2015-01-01

    Full Text Available In this paper, we review the recent advances which have taken place in the understanding and applications of acoustic/elastic metamaterials. Metamaterials are artificially created composite materials which exhibit unusual properties that are not found in nature. We begin with presenting arguments from discrete systems which support the case for the existence of unusual material properties such as tensorial and/or negative density. The arguments are then extended to elastic continuums through coherent averaging principles. The resulting coupled and nonlocal homogenized relations, called the Willis relations, are presented as the natural description of inhomogeneous elastodynamics. They are specialized to Bloch waves propagating in periodic composites and we show that the Willis properties display the unusual behavior which is often required in metamaterial applications such as the Veselago lens. We finally present the recent advances in the area of transformation elastodynamics, charting its inspirations from transformation optics, clarifying its particular challenges, and identifying its connection with the constitutive relations of the Willis and the Cosserat types.

  10. Homogeneous cosmology with aggressively expanding civilizations

    International Nuclear Information System (INIS)

    Jay Olson, S

    2015-01-01

    In the context of a homogeneous Universe, we note that the appearance of aggressively expanding advanced life is geometrically similar to the process of nucleation and bubble growth in a first-order cosmological phase transition. We exploit this similarity to describe the dynamics of life saturating the Universe on a cosmic scale, adapting the phase transition model to incorporate probability distributions of expansion and resource consumption strategies. Through a series of numerical solutions spanning several orders of magnitude in the input assumption parameters, the resulting cosmological model is used to address basic questions related to the intergalactic spreading of life, dealing with issues such as timescales, observability, competition between strategies, and first-mover advantage. Finally, we examine physical effects on the Universe itself, such as reheating and the backreaction on the evolution of the scale factor, if such life is able to control and convert a significant fraction of the available pressureless matter into radiation. We conclude that the existence of life, if certain advanced technologies are practical, could have a significant influence on the future large-scale evolution of the Universe. (paper)

  11. Numerical computation of homogeneous slope stability.

    Science.gov (United States)

    Xiao, Shuangshuang; Li, Kemin; Ding, Xiaohua; Liu, Tong

    2015-01-01

    To simplify the computational process of homogeneous slope stability analysis, improve computational accuracy, and find multiple potential slip surfaces of a complex geometric slope, this study utilized the limit equilibrium method to derive expressions for the overall and partial factors of safety. The search for the minimum factor of safety (FOS) was transformed into a constrained nonlinear programming problem, to which an exhaustive method (EM) and a particle swarm optimization algorithm (PSO) were applied. In simple slope examples, the computational results using the EM and PSO were close to those obtained using other methods. Compared to the EM, the PSO had a small computation error and a significantly shorter computation time. As a result, the PSO could precisely calculate the slope FOS with high efficiency. The example of the multistage slope analysis indicated that this slope had two potential slip surfaces, with factors of safety of 1.1182 and 1.1560, respectively. The differences between these and the minimum FOS (1.0759) were small, but the positions of the slip surfaces were completely different from that of the critical slip surface (CSS).
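
    To make the optimization step concrete, the sketch below implements a generic particle swarm search for the minimum factor of safety over slip-surface parameters. The fos() objective used in the example is a dummy quadratic surface standing in for the limit-equilibrium expression derived in the paper, and the swarm settings are illustrative assumptions.

    import random

    # Generic PSO minimizer over box-constrained slip-surface parameters.
    def pso_min(fos, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        dim = len(bounds)
        pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]                      # personal best positions
        pbest_val = [fos(p) for p in pos]
        gi = min(range(n_particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[gi][:], pbest_val[gi]   # global best
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
                val = fos(pos[i])
                if val < pbest_val[i]:
                    pbest_val[i], pbest[i] = val, pos[i][:]
                    if val < gbest_val:
                        gbest_val, gbest = val, pos[i][:]
        return gbest_val, gbest

    # Dummy FOS surface over (x, y) slip-circle centre coordinates; minimum is about 1.07.
    fos_min, surface = pso_min(
        lambda p: 1.07 + 0.01 * ((p[0] - 3.0) ** 2 + (p[1] - 8.0) ** 2),
        bounds=[(0.0, 10.0), (0.0, 20.0)])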

  12. Numerical Computation of Homogeneous Slope Stability

    Directory of Open Access Journals (Sweden)

    Shuangshuang Xiao

    2015-01-01

    Full Text Available To simplify the computational process of homogeneous slope stability analysis, improve computational accuracy, and find multiple potential slip surfaces of a complex geometric slope, this study utilized the limit equilibrium method to derive expressions for the overall and partial factors of safety. The search for the minimum factor of safety (FOS) was transformed into a constrained nonlinear programming problem, to which an exhaustive method (EM) and a particle swarm optimization algorithm (PSO) were applied. In simple slope examples, the computational results using the EM and PSO were close to those obtained using other methods. Compared to the EM, the PSO had a small computation error and a significantly shorter computation time. As a result, the PSO could precisely calculate the slope FOS with high efficiency. The example of the multistage slope analysis indicated that this slope had two potential slip surfaces, with factors of safety of 1.1182 and 1.1560, respectively. The differences between these and the minimum FOS (1.0759) were small, but the positions of the slip surfaces were completely different from that of the critical slip surface (CSS).

  13. Thermal neutron diffusion parameters in homogeneous mixtures

    Energy Technology Data Exchange (ETDEWEB)

    Drozdowicz, K.; Krynicka, E. [Institute of Nuclear Physics, Cracow (Poland)

    1995-12-31

    A physical background is presented for a computer program which calculates the thermal neutron diffusion parameters for homogeneous mixtures of any compounds. The macroscopic absorption, scattering and transport cross sections of the mixture are defined, which are in general functions of the incident neutron energy. The energy-averaged neutron parameters are available when these energy dependences and the thermal neutron energy distribution are assumed. Then the averaged diffusion coefficient and the pulsed thermal neutron parameters (the absorption rate and the diffusion constant) are also defined. The absorption cross section is described by the 1/v law and deviations from this behaviour are considered. The scattering cross section can be assumed to be almost constant in the thermal neutron region (which results from the free gas model). Serious deviations are observed for hydrogen atoms bound in molecules, and a special study in the paper is devoted to this problem. A certain effective scattering cross section is found in this case on the basis of individual exact data for a few hydrogenous media. Approximations assumed for the average cosine of the scattering angle are also discussed. The calculated macroscopic parameters are averaged over the Maxwellian energy distribution of the thermal neutron flux. Information on the input data for the computer program is included. (author). 10 refs, 4 figs, 5 tabs.
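
    As a small numerical illustration of the Maxwellian averaging described above, the sketch below flux-averages a 1/v absorption cross section over the thermal spectrum and checks the result against the analytical factor sqrt(pi)/2. The 0.0253 eV reference energy and the hydrogen-like value of sigma0 are standard conventions assumed here, not values taken from the program documented in the paper.

    import numpy as np

    # Flux-weighted Maxwellian average of a 1/v absorption cross section.
    def maxwellian_flux_average(sigma0, E0=0.0253, kT=0.0253, n=200000):
        E = np.linspace(1e-7, 50.0 * kT, n)        # energy grid [eV]
        phi = E * np.exp(-E / kT)                  # thermal (Maxwellian) flux spectrum
        sigma = sigma0 * np.sqrt(E0 / E)           # 1/v absorption cross section [barn]
        return np.sum(sigma * phi) / np.sum(phi)   # average on a uniform grid

    sigma0 = 0.332                                 # e.g. hydrogen at 2200 m/s [barn]
    numeric = maxwellian_flux_average(sigma0)
    analytic = np.sqrt(np.pi) / 2.0 * sigma0       # exact result for a 1/v law at kT = E0
    # numeric and analytic agree to well below 1 %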

  14. Forming homogeneous clusters for differential risk information

    International Nuclear Information System (INIS)

    Maardberg, B.

    1996-01-01

    Latent risk situations are always present in society. General information on these risk situations is supposed to be received differently by different groups of people in the population. In the aftermath of specific accidents different groups presumably have need of specific information about how to act to survive, to avoid injuries, to find more information, to obtain facts about the accidents etc. As targets for information these different groups could be defined in different ways. The conventional way is to divide the population according to demographic variables, such as age, sex, occupation etc. Another way would be to structure the population according to dependent variables measured in different studies. They may concern risk perception, emotional reactions, specific technical knowledge of the accidents, and belief in the information sources. One procedure for forming such groupings of people into homogeneous clusters would be by statistical clustering methods on dependent variables. Examples of such clustering procedures are presented and discussed. Data are from a Norwegian study on the perception of radiation from nuclear accidents and other radiation sources. Speculations are made on different risk information strategies. Elements of a research programme are proposed. (author)

  15. Homogeneous purely buoyancy driven turbulent flow

    Science.gov (United States)

    Arakeri, Jaywant; Cholemari, Murali; Pawar, Shashikant

    2010-11-01

    An unstable density difference across a long vertical tube open at both ends leads to convection that is axially homogeneous with a linear density gradient. We report results from such tube convection experiments, with the driving density difference caused by a salt concentration difference or a temperature difference. At high enough Rayleigh numbers (Ra) the convection is turbulent with zero mean flow and zero mean Reynolds shear stresses; thus turbulent production is purely by buoyancy. We observe different regimes of turbulent convection. At very high Ra the Nusselt number scales as the square root of the Rayleigh number, giving the so-called "ultimate regime" of convection predicted for Rayleigh-Benard convection in the limit of infinite Ra. For turbulent convection at intermediate Ra, the Nusselt number scales as Ra^0.3. In both regimes, the flux and the Taylor-scale Reynolds number are more than an order of magnitude larger than those obtained in Rayleigh-Benard convection. The absence of a mean flow makes this an ideal flow for studying shear-free turbulence near a wall.
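
    The two scaling regimes reported above can be written compactly as

    \[
      \mathrm{Nu} \sim \mathrm{Ra}^{1/2} \quad \text{(very high Ra, ``ultimate regime'')},
      \qquad
      \mathrm{Nu} \sim \mathrm{Ra}^{0.3} \quad \text{(intermediate Ra)},
    \]

    where Nu and Ra are the Nusselt and Rayleigh numbers of the tube convection.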

  16. Thermal neutron diffusion parameters in homogeneous mixtures

    International Nuclear Information System (INIS)

    Drozdowicz, K.; Krynicka, E.

    1995-01-01

    A physical background is presented for a computer program which calculates the thermal neutron diffusion parameters for homogeneous mixtures of any compounds. The macroscopic absorption, scattering and transport cross sections of the mixture are defined, which are in general functions of the incident neutron energy. The energy-averaged neutron parameters are available when these energy dependences and the thermal neutron energy distribution are assumed. Then the averaged diffusion coefficient and the pulsed thermal neutron parameters (the absorption rate and the diffusion constant) are also defined. The absorption cross section is described by the 1/v law and deviations from this behaviour are considered. The scattering cross section can be assumed to be almost constant in the thermal neutron region (which results from the free gas model). Serious deviations are observed for hydrogen atoms bound in molecules, and a special study in the paper is devoted to this problem. A certain effective scattering cross section is found in this case on the basis of individual exact data for a few hydrogenous media. Approximations assumed for the average cosine of the scattering angle are also discussed. The calculated macroscopic parameters are averaged over the Maxwellian energy distribution of the thermal neutron flux. Information on the input data for the computer program is included. (author). 10 refs, 4 figs, 5 tabs

  17. Radiation statistics in homogeneous isotropic turbulence

    International Nuclear Information System (INIS)

    Da Silva, C B; Coelho, P J; Malico, I

    2009-01-01

    An analysis of turbulence-radiation interaction (TRI) in statistically stationary (forced) homogeneous and isotropic turbulence is presented. A direct numerical simulation code was used to generate instantaneous turbulent scalar fields, and the radiative transfer equation (RTE) was solved to provide statistical data relevant in TRI. The radiation intensity is non-Gaussian and is not spatially correlated with any of the other turbulence or radiation quantities. Its power spectrum exhibits a power-law region with a slope steeper than the classical -5/3 law. The moments of the radiation intensity, Planck-mean and incident-mean absorption coefficients, and emission and absorption TRI correlations are calculated. The influence of the optical thickness of the medium, mean and variance of the temperature and variance of the molar fraction of the absorbing species is studied. Predictions obtained from the time-averaged RTE are also included. It was found that while turbulence yields an increase of the mean blackbody radiation intensity, it causes a decrease of the time-averaged Planck-mean absorption coefficient. The absorption coefficient self-correlation is small in comparison with the temperature self-correlation, and the role of TRI in radiative emission is more important than in radiative absorption. The absorption coefficient-radiation intensity correlation is small, which supports the optically thin fluctuation approximation, and justifies the good predictions often achieved using the time-averaged RTE.

  18. Radiation statistics in homogeneous isotropic turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Da Silva, C B; Coelho, P J [Mechanical Engineering Department, IDMEC/LAETA, Instituto Superior Tecnico, Technical University of Lisbon, Av. Rovisco Pais, 1049-001 Lisboa (Portugal); Malico, I [Physics Department, University of Evora, Rua Romao Ramalho, 59, 7000-671 Evora (Portugal)], E-mail: carlos.silva@ist.utl.pt, E-mail: imbm@uevora.pt, E-mail: pedro.coelho@ist.utl.pt

    2009-09-15

    An analysis of turbulence-radiation interaction (TRI) in statistically stationary (forced) homogeneous and isotropic turbulence is presented. A direct numerical simulation code was used to generate instantaneous turbulent scalar fields, and the radiative transfer equation (RTE) was solved to provide statistical data relevant in TRI. The radiation intensity is non-Gaussian and is not spatially correlated with any of the other turbulence or radiation quantities. Its power spectrum exhibits a power-law region with a slope steeper than the classical -5/3 law. The moments of the radiation intensity, Planck-mean and incident-mean absorption coefficients, and emission and absorption TRI correlations are calculated. The influence of the optical thickness of the medium, mean and variance of the temperature and variance of the molar fraction of the absorbing species is studied. Predictions obtained from the time-averaged RTE are also included. It was found that while turbulence yields an increase of the mean blackbody radiation intensity, it causes a decrease of the time-averaged Planck-mean absorption coefficient. The absorption coefficient self-correlation is small in comparison with the temperature self-correlation, and the role of TRI in radiative emission is more important than in radiative absorption. The absorption coefficient-radiation intensity correlation is small, which supports the optically thin fluctuation approximation, and justifies the good predictions often achieved using the time-averaged RTE.

  19. Towards Green Cyclic Carbonate Synthesis : Heterogeneous and Homogeneous Catalyst Development

    NARCIS (Netherlands)

    Stewart, J.A.

    2015-01-01

    This PhD research serves to implement both known and novel catalytic systems for the purpose of cyclic carbonate synthesis from biomass-derived substrates. Such products have been earmarked as potential monomers for non-isocyanate polyurethanes (NIPUs), amongst other uses. Particular attention has

  20. Hardness and microstructure homogeneity of pure copper processed by accumulative back extrusion

    International Nuclear Information System (INIS)

    Bazaz, B.; Zarei-Hanzaki, A.; Fatemi-Varzaneh, S.M.

    2013-01-01

    The present work deals with the microstructure evolution of pure copper processed by a new severe plastic deformation method. A set of pure copper (99.99%) work-pieces with coarse-grained microstructures was processed by the accumulative back extrusion (ABE) method at room temperature. Optical and scanning electron microscopy (SEM) and hardness measurements were utilized to study the microstructural evolution and hardness homogeneity. The results indicated that ABE is capable of providing a homogeneous, grain-refined microstructure in pure copper. The observed grain refinement was discussed in terms of the occurrence of dynamic restoration processes. The analysis of microstructure and hardness showed an outstanding improvement in homogeneity throughout the work-pieces as consecutive ABE passes were applied. The homogeneity improvement was attributed to the propagation of the shear bands and also of the heavily deformed regions. A reversing route was also applied in the ABE processing to investigate its effect on the development of microstructural homogeneity. Compared to the conventional route, the reversing route was found to yield better homogeneity after fewer passes of the process.

  1. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation

    Science.gov (United States)

    Tang, Liang; Cheng, Pengle

    2017-01-01

    Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied to it. An elastic–plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for the battery module paves the way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis. PMID:28746390
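
    The elastic-plastic constitutive update mentioned above can be illustrated with a minimal one-dimensional return-mapping sketch. The bilinear law and the modulus, hardening and yield values below are illustrative placeholders, not the calibrated homogenized properties identified in the paper.

    # 1-D bilinear elastic-plastic stress update with isotropic hardening (sketch).
    def bilinear_stress(strain_history, E=500.0, H=50.0, sigma_y=5.0):
        stress, eps_p, alpha = [], 0.0, 0.0          # plastic strain, hardening variable
        for eps in strain_history:
            trial = E * (eps - eps_p)                # elastic trial stress
            f = abs(trial) - (sigma_y + H * alpha)   # yield function
            if f > 0.0:                              # plastic correction (return mapping)
                d_gamma = f / (E + H)
                eps_p += d_gamma * (1.0 if trial > 0.0 else -1.0)
                alpha += d_gamma
                trial = E * (eps - eps_p)
            stress.append(trial)
        return stress

    # e.g. monotonic loading up to 12% nominal strain in 0.5% increments:
    path = [0.005 * k for k in range(25)]
    sigma = bilinear_stress(path)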

  2. The Application of Homogenate and Filtrate from Baltic Seaweeds in Seedling Growth Tests

    Directory of Open Access Journals (Sweden)

    Izabela Michalak

    2017-02-01

    Full Text Available Algal filtrate and homogenate, obtained from Baltic seaweeds, were applied in seedling growth tests. Radish seeds were used in order to assess the phytotoxicity of the algal products and their biostimulant effect on growth and nutrient uptake. Algal filtrate, at concentrations ranging from 5.0% to 100%, was used for seed soaking and as a liquid biostimulant (soil and foliar application). Algal homogenate was developed for seed coating. Algal filtrate and homogenate were also enriched with Zn(II) ions in order to examine the influence on metal ion complexation. The optimal doses of algal filtrate and homogenate, as well as the soaking time, were established. Multi-elemental analyses of the raw biomass, filtrate, homogenate, and radish were also performed using ICP-OES (Inductively Coupled Plasma—Optical Emission Spectrometry). The best results in terms of seedlings' length and weight were obtained using clear filtrate at a concentration of 50% applied to the soil and using homogenate applied at a dose of 50 mg/g of seeds. Clear filtrate at a concentration of 50% used for seed soaking for one hour showed the best results. The applied algal products increased the content of elements in seedlings. Among the tested products, a concentration of 50% algal filtrate is recommended for future pot and field experiments.

  3. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    Science.gov (United States)

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study investigated first the homogenization kinetics of the different systems to focus then on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was equally sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for development and production of pharmaceutical suspensions in the future. Copyright © 2011 Wiley-Liss, Inc.
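
    As an illustration of how a characteristic homogenization time can be extracted from an online ultrasonic signal, the sketch below fits a simple first-order (exponential-approach) law to synthetic sound-velocity data. The paper's own heuristic model is not reproduced here; the functional form, parameter values and noise level are assumptions.

    import numpy as np
    from scipy.optimize import curve_fit

    def first_order(t, v0, dv, tau):
        return v0 + dv * (1.0 - np.exp(-t / tau))    # sound velocity vs. processing time

    t = np.linspace(0.0, 600.0, 61)                                      # time [s]
    v = first_order(t, 1490.0, 6.0, 120.0) + np.random.normal(0.0, 0.2, t.size)
    (v0_fit, dv_fit, tau_fit), _ = curve_fit(first_order, t, v, p0=[1490.0, 5.0, 100.0])
    # tau_fit recovers the characteristic homogenization time (about 120 s here)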

  4. Elastic full waveform inversion based on the homogenization method: theoretical framework and 2-D numerical illustrations

    Science.gov (United States)

    Capdeville, Yann; Métivier, Ludovic

    2018-05-01

    Seismic imaging is an efficient tool to investigate the Earth interior. Many of the different imaging techniques currently used, including the so-called full waveform inversion (FWI), are based on limited frequency band data. Such data are not sensitive to the true earth model, but to a smooth version of it. This smooth version can be related to the true model by the homogenization technique. Homogenization for wave propagation in deterministic media with no scale separation, such as geological media, has been recently developed. With such an asymptotic theory, it is possible to compute an effective medium valid for a given frequency band such that effective waveforms and true waveforms are the same up to a controlled error. In this work we make the link between limited frequency band inversion, mainly FWI, and homogenization. We establish the relation between a true model and an FWI result model. This relation is important for a proper interpretation of FWI images. We numerically illustrate, in the 2-D case, that an FWI result is at best the homogenized version of the true model. Moreover, it appears that the homogenized FWI model is quite independent of the FWI parametrization, as long as it has enough degrees of freedom. In particular, inverting for the full elastic tensor is, in each of our tests, always a good choice. We show how the homogenization can help to understand FWI behaviour and help to improve its robustness and convergence by efficiently constraining the solution space of the inverse problem.

  5. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation.

    Directory of Open Access Journals (Sweden)

    Liang Tang

    Full Text Available Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied to it. An elastic-plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for the battery module paves the way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis.

  6. Deep Fully Convolutional Networks for the Detection of Informal Settlements in VHR Images

    NARCIS (Netherlands)

    Persello, Claudio; Stein, Alfred

    2017-01-01

    This letter investigates fully convolutional networks (FCNs) for the detection of informal settlements in very high resolution (VHR) satellite images. Informal settlements or slums are proliferating in developing countries and their detection and classification provides vital information for

  7. Lunar Navigator - A Miniature, Fully Autonomous, Lunar Navigation, Surveyor, and Range Finder System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Microcosm proposes to design and develop a fully autonomous Lunar Navigator based on our MicroMak miniature star sensor and a gravity gradiometer similar to one on a...

  8. 7 CFR 718.304 - Failure to fully comply.

    Science.gov (United States)

    2010-01-01

    ... authorized in accordance with § 718.305 if the participant made a good faith effort to comply fully with the... FSA approval official to have made a good faith effort to comply fully with the terms and conditions...

  9. Tolerance, immunocompetence, and secondary disease in fully allogeneic radiation chimeras

    International Nuclear Information System (INIS)

    Rayfield, L.S.; Brent, L.

    1983-01-01

    The aim of this study was to ascertain the extent to which secondary disease and mortality in fully allogeneic chimeras (C57BL → CBA) is caused (if at all) by a delayed graft-versus-host reaction. Adult CBA males were thymectomized, irradiated, and reconstituted with T-lymphocyte-depleted C57BL or CBA bone marrow cells (BMC), followed three weeks after irradiation by implantation under the kidney capsule of thymic lobes from C57BL or CBA fetal or adult donors. These mice were observed for the development of secondary disease for periods in excess of 250 days, and they were examined at 5 weeks or 4 months for T lymphocyte reactivity and tolerance to alloantigens, using the cell-mediated lympholysis assay (CML). The following results were obtained. First, removal of T lymphocytes with anti-Thy 1 antibody and complement from allogeneic bone marrow did not prevent wasting and eventual death, although it prolonged the lifespan of mice substantially. Second, T lymphocytes generated from bone marrow-derived precursor cells became tolerant of the histocompatibility antigens of the thymus donor strain but remained normally reactive to third-party antigens. Third, allogeneic radiation chimeras did not survive as well as animals reconstituted with syngeneic cells, even when they were demonstrably tolerant in CML. Fourth, C57BL BMC maturing in a CBA host equipped with a C57BL thymus graft did not become tolerant of host antigens, indicating that extra-thymic tolerance does not occur in fully allogeneic--as opposed to semiallogeneic--chimeras. It is argued that the function of B lymphocytes and/or accessory cells is impaired in fully allogeneic radiation chimeras, and that the mortality observed was directly related to the resulting immunodeficiency. The relevance of the results described in this paper to clinical bone marrow transplantation is discussed

  10. New approach to solve symmetric fully fuzzy linear systems

    Indian Academy of Sciences (India)

    concepts of fuzzy set theory and then define a fully fuzzy linear system of equations. .... To represent the above problem as fully fuzzy linear system, we represent x .... Fully fuzzy linear systems can be solved by Linear programming approach, ...

  11. Ceria powders by homogeneous precipitation technique

    International Nuclear Information System (INIS)

    Ramanathan, S.; Roy, S.K.

    2003-01-01

    The formation of precursors for ceria by two homogeneous precipitation reactions (cerium chloride + urea at 95 °C, called reaction A, and cerium chloride + hexamethylenetetramine at 85 °C, called reaction B) has been studied. The variation of the size of the colloidal particles formed and of the zeta potential of the suspensions with the progress of the reactions exhibited similar trends for both precipitation processes. Particle size increased from 100 to 300 nm with increasing temperature and extent of reaction. The zeta potential was found to decrease with increasing extent of precipitation in the pH range of 5 to 7. Filtration and drying led to agglomeration of the fine particles in the case of the precursor from reaction B. The as-formed precursors were crystalline - a basic carbonate in the case of reaction A and a hydrous oxide in the case of reaction B. It was found that nano-crystalline ceria powders (average crystallite size ~10 nm) formed above 400 °C from both these precursors. The agglomerate size (D50) of the precursors and of the ceria powders formed after calcination at 600 °C varied from 0.7 to 3 μm. Increasing the calcination temperature up to 800 °C increased the crystallite size (50 nm). The variation of the zeta potential with pH and with the concentration of an anionic dispersant (Calgon) for the ceria powders formed was studied to determine the ideal conditions for suspension stability. Stability was found to be maximum in the pH range of 3 to 4 or at a Calgon concentration of 0.01 to 0.1 weight percent. (author)

  12. A Modified Homogeneous Balance Method and Its Applications

    International Nuclear Information System (INIS)

    Liu Chunping

    2011-01-01

    A modified homogeneous balance method is proposed by improving some key steps in the homogeneous balance method. Bilinear equations of some nonlinear evolution equations are derived by using the modified homogeneous balance method. Generalized Boussinesq equation, KP equation, and mKdV equation are chosen as examples to illustrate our method. This approach is also applicable to a large variety of nonlinear evolution equations. (general)

  13. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis

    2011-04-01

    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.

  14. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis; Mouhot, Clément

    2011-01-01

    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.

  15. Elastic waves trapped by a homogeneous anisotropic semicylinder

    Energy Technology Data Exchange (ETDEWEB)

    Nazarov, S A [Institute of Problems of Mechanical Engineering, Russian Academy of Sciences, St.-Petersburg (Russian Federation)

    2013-11-30

    It is established that the problem of elastic oscillations of a homogeneous anisotropic semicylinder (console) with traction-free lateral surface (Neumann boundary condition) has no eigenvalues when the console is clamped at one end (Dirichlet boundary condition). If the end is free, under additional requirements of elastic and geometric symmetry, simple sufficient conditions are found for the existence of an eigenvalue embedded in the continuous spectrum and generating a trapped elastic wave, that is, one which decays at infinity at an exponential rate. The results are obtained by generalizing the methods developed for scalar problems, which however require substantial modification for the vector problem in elasticity theory. Examples are given and open questions are stated. Bibliography: 53 titles.

  16. Histological and pathological characteristics of the homogeneous oral leucoplakia

    International Nuclear Information System (INIS)

    Izaguirre Bordelois, Marioneya; Soriano Gonzalez, Blanca Ines

    2011-01-01

    A descriptive, retrospective, cross-sectional and analytical study of 35 patients with homogeneous oral leucoplakia associated with the smoking habit, seen at the Outpatient Department of Maxillofacial Surgery of the 'Dr. Juan Bruno Zayas Alfonso' Teaching General Hospital in Santiago de Cuba, was carried out from January 2007 to January 2009, aimed at determining the presence not only of keratinisation changes such as parakeratosis, orthokeratosis and dyskeratosis, but also of epithelial dysplasia. The primary information was obtained through the histological and pathological study of the biopsies processed in the Pathology Department of the aforementioned institution. The distinctive microscopic characteristic was hyperparakeratosis, with or without abnormal development of the tissue. (author)

  17. Economical preparation of extremely homogeneous nuclear accelerator targets

    International Nuclear Information System (INIS)

    Maier, H.J.

    1983-01-01

    Techniques for target preparation with a minimum consumption of isotopic material are described. The rotating substrate method, which generates extremely homogeneous targets, is discussed in some detail

  18. Structural changes in heat resisting high nickel alloys during homogenization

    International Nuclear Information System (INIS)

    Kleshchev, A.S.; Korneeva, N.N.; Yurina, O.M.; Guzej, L.S.

    1981-01-01

    The effect of homogenization on the structure and technological plasticity of the KhN73MBTYu and KhN62BMKTYu alloys during pressure treatment is investigated, taking into account the peculiarities of the phase composition. It is shown that homogenization of the KhN73MBTYu and KhN62BMKTYu alloys increases the technological plasticity. The homogenization efficiency is conditioned by the change in grain boundary and carbide morphology, as well as by the homogeneous distribution of the large γ'-phase [ru

  19. Sewage sludge solubilization by high-pressure homogenization.

    Science.gov (United States)

    Zhang, Yuxuan; Zhang, Panyue; Guo, Jianbin; Ma, Weifang; Fang, Wei; Ma, Boqiang; Xu, Xiangzhe

    2013-01-01

    The behavior of sludge solubilization using high-pressure homogenization (HPH) treatment was examined by investigating the sludge solid reduction and organics solubilization. The sludge volatile suspended solids (VSS) decreased from 10.58 to 6.67 g/L for the sludge sample with a total solids content (TS) of 1.49% after HPH treatment at a homogenization pressure of 80 MPa with four homogenization cycles; total suspended solids (TSS) correspondingly decreased from 14.26 to 9.91 g/L. About 86.15% of the TSS reduction was attributed to the VSS reduction. The increase of homogenization pressure from 20 to 80 MPa or homogenization cycle number from 1 to 4 was favorable to the sludge organics solubilization, and the protein and polysaccharide solubilization linearly increased with the soluble chemical oxygen demand (SCOD) solubilization. More proteins were solubilized than polysaccharides. The linear relationship between SCOD solubilization and VSS reduction had no significant change under different homogenization pressures, homogenization cycles and sludge solid contents. The SCOD of 1.65 g/L was solubilized for the VSS reduction of 1.00 g/L for the three experimental sludge samples with a TS of 1.00, 1.49 and 2.48% under all HPH operating conditions. The energy efficiency results showed that the HPH treatment at a homogenization pressure of 30 MPa with a single homogenization cycle for the sludge sample with a TS of 2.48% was the most energy efficient.
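
    Using the reported linear relation (about 1.65 g/L of SCOD solubilized per 1.00 g/L of VSS reduction, roughly independent of pressure, cycle number and solids content), a quick estimate of organics solubilization from a measured VSS reduction can be sketched as follows; the helper function and its name are illustrative only.

    # Rough estimate from the reported SCOD/VSS ratio (illustrative helper).
    def scod_from_vss_reduction(vss_before, vss_after, ratio=1.65):
        return ratio * (vss_before - vss_after)      # g/L of SCOD solubilized

    # e.g. the 1.49% TS sludge at 80 MPa and four cycles: VSS 10.58 -> 6.67 g/L
    print(scod_from_vss_reduction(10.58, 6.67))      # about 6.5 g/L SCOD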

  20. A non-asymptotic homogenization theory for periodic electromagnetic structures.

    Science.gov (United States)

    Tsukerman, Igor; Markel, Vadim A

    2014-08-08

    Homogenization of electromagnetic periodic composites is treated as a two-scale problem and solved by approximating the fields on both scales with eigenmodes that satisfy Maxwell's equations and boundary conditions as accurately as possible. Built into this homogenization methodology is an error indicator whose value characterizes the accuracy of homogenization. The proposed theory allows one to define not only bulk, but also position-dependent material parameters (e.g. in proximity to a physical boundary) and to quantify the trade-off between the accuracy of homogenization and its range of applicability to various illumination conditions.