WorldWideScience

Sample records for mc-based homogenous central

  1. Monte Carlo-based validation of the ENDF/MC2-II/SDX cell homogenization path

    International Nuclear Information System (INIS)

    Wade, D.C.

    1979-04-01

    The results are presented of a program of validation of the unit cell homogenization prescriptions and codes used for the analysis of Zero Power Reactor (ZPR) fast breeder reactor critical experiments. The ZPR drawer loading patterns comprise both plate type and pin-calandria type unit cells. A prescription is used to convert the three-dimensional physical geometry of the drawer loadings into one-dimensional calculational models. The ETOE-II/MC2-II/SDX code sequence is used to transform ENDF/B basic nuclear data into unit cell average broad group cross sections based on the 1D models. Cell average, broad group anisotropic diffusion coefficients are generated using the methods of Benoist or of Gelbard. The resulting broad (approx. 10 to 30) group parameters are used in multigroup diffusion and S_n transport calculations of full core XY or RZ models which employ smeared atom densities to represent the contents of the unit cells.

  2. Monte Carlo-based validation of the ENDF/MC2-II/SDX cell homogenization path

    International Nuclear Information System (INIS)

    Wade, D.C.

    1978-11-01

    The results are summarized of a program of validation of the unit cell homogenization prescriptions and codes used for the analysis of Zero Power Reactor (ZPR) fast breeder reactor critical experiments. The ZPR drawer loading patterns comprise both plate type and pin-calandria type unit cells. A prescription is used to convert the three-dimensional physical geometry of the drawer loadings into one-dimensional calculational models. The ETOE-II/MC2-II/SDX code sequence is used to transform ENDF/B basic nuclear data into unit cell average broad group cross sections based on the 1D models. Cell average, broad group anisotropic diffusion coefficients are generated using the methods of Benoist or of Gelbard. The resulting broad (approx. 10 to 30) group parameters are used in multigroup diffusion and S_n transport calculations of full core XY or RZ models which employ smeared atom densities to represent the contents of the unit cells.

  3. Quantum-dot-based homogeneous time-resolved fluoroimmunoassay of alpha-fetoprotein

    Energy Technology Data Exchange (ETDEWEB)

    Chen Meijun; Wu Yingsong; Lin Guanfeng; Hou Jingyuan; Li Ming [Institute of Antibody Engineering, School of Biotechnology, Southern Medical University, Guangzhou, 510515 (China); Liu Tiancai, E-mail: liutc@smu.edu.cn [Institute of Antibody Engineering, School of Biotechnology, Southern Medical University, Guangzhou, 510515 (China)

    2012-09-05

    Highlights: • QDs-based homogeneous time-resolved fluoroimmunoassay was developed to detect AFP. • The conjugates were prepared with QDs-doped microspheres and anti-AFP McAb. • The conjugates were prepared with LTCs and another anti-AFP McAb. • Excess amounts of conjugates were used for detecting AFP without rinsing. • The combination of QPs and LTCs was suitable for HTRFIA to detect AFP. - Abstract: Quantum dots (QDs) with novel photoproperties are not widely used in clinical diagnosis, and homogeneous time-resolved fluorescence assays possess many advantages over current methods for alpha-fetoprotein (AFP) detection. A novel QD-based homogeneous time-resolved fluorescence assay was developed and used for detection of AFP, a primary marker for many cancers and diseases. QD-doped carboxyl-modified polystyrene microparticles (QPs) were prepared by doping oil-soluble QDs possessing a 605 nm emission peak. The antibody conjugates (QPs-E014) were prepared from QPs and an anti-AFP monoclonal antibody, and luminescent terbium chelates (LTCs) were prepared and conjugated to a second anti-AFP monoclonal antibody (LTCs-E010). In a double-antibody sandwich structure, QPs-E014 and LTCs-E010 were used for detection of AFP, serving as energy acceptor and donor, respectively, with an AFP bridge. The results demonstrated that the luminescence lifetime of these QPs was sufficiently long for use in a time-resolved fluoroassay, with the efficiency of time-resolved Förster resonance energy transfer (TR-FRET) at 67.3% and the spatial distance of the donor to acceptor calculated to be 66.1 Å. Signals from TR-FRET were found to be proportional to AFP concentrations. The resulting standard curve was log Y = 3.65786 + 0.43863·log X (R = 0.996), with Y the QPs fluorescence intensity and X the AFP concentration; the calculated sensitivity was 0
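
    The calibration line and FRET numbers quoted above are enough to reproduce the basic read-out arithmetic. A minimal Python sketch, assuming the standard Förster relation E = 1/(1 + (r/R0)^6); the intensity value is illustrative, and R0 is back-calculated from the quoted E and r rather than taken from the paper:

        import math

        # Standard curve quoted above: log Y = 3.65786 + 0.43863 * log X,
        # with Y the QP fluorescence intensity and X the AFP concentration.
        A, B = 3.65786, 0.43863

        def afp_concentration(intensity):
            """Invert the log-log calibration: X = 10**((log10(Y) - A) / B)."""
            return 10.0 ** ((math.log10(intensity) - A) / B)

        # Foerster relation E = 1 / (1 + (r/R0)**6). With the quoted transfer
        # efficiency E = 67.3% and donor-acceptor distance r = 66.1 Angstrom,
        # the implied Foerster distance R0 follows by inversion:
        E, r = 0.673, 66.1
        R0 = r / (1.0 / E - 1.0) ** (1.0 / 6.0)  # ~74.5 Angstrom (back-calculated)

        print(afp_concentration(1.0e4), R0)      # 1e4 is an illustrative intensity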

  4. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC)

    Science.gov (United States)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B.; Jia, Xun

    2015-09-01

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphics processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia’s CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in the EGSnrc MC package and PENELOPE’s random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by
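
    The agreement figure quoted in this and the following two records ("average dose difference in the regions that received a dose higher than 10% of the maximum dose") amounts to a masked comparison of two dose grids. A minimal sketch, assuming co-registered arrays and normalization to the maximum dose (the excerpt does not spell out the exact normalization convention):

        import numpy as np

        def mean_dose_difference(dose_a, dose_b, threshold=0.10):
            """Mean absolute dose difference, in % of the maximum dose,
            restricted to the region receiving more than `threshold` of the
            maximum dose of the reference distribution `dose_a`."""
            dmax = dose_a.max()
            mask = dose_a > threshold * dmax
            return 100.0 * np.mean(np.abs(dose_a[mask] - dose_b[mask])) / dmax

        # Illustrative use with phantom-like random dose grids:
        rng = np.random.default_rng(0)
        d_ref = rng.random((50, 50, 50))
        d_new = d_ref + 0.002 * rng.standard_normal(d_ref.shape)
        print(mean_dose_difference(d_ref, d_new))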

  5. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC).

    Science.gov (United States)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun

    2015-10-07

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphics processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia's CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in the EGSnrc MC package and PENELOPE's random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by

  6. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC)

    International Nuclear Information System (INIS)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun

    2015-01-01

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphics processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia’s CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon–electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783–97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in the EGSnrc MC package and PENELOPE’s random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48–0.53% for the electron beam cases and 0.15–0.17% for the photon beam cases. In terms of efficiency, goMC was ∼4–16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was

  7. Central Andean temperature and precipitation measurements and its homogenization

    Science.gov (United States)

    Hunziker, Stefan; Gubler, Stefanie

    2015-04-01

    Observations of climatological parameters and the homogenization of these time series have a well-established history in western countries. This is not the case for many other countries, such as Bolivia and Peru, where the organization of measurements, the quality of measurement equipment, equipment maintenance, the training of staff and data management differ fundamentally from the western standard. The data need special attention, because many problems are not detected by standard quality control procedures. Information about the weather stations, best obtained by station visits, is very beneficial: if the cause of a problem is known, some of the data may be corrected. In this study, cases of typical problems and measurement errors are demonstrated. Much research on homogenization techniques (up to the subdaily scale) has been completed in recent years. However, these studies used data sets with the quality of western station networks, and little is known about the performance of homogenization methods on data sets from countries such as Bolivia and Peru. HOMER (HOMogenization softwarE in R) is one of the most recent and widely used homogenization packages. Its performance is tested on Peruvian-like data sourced from Swiss stations (similar station density and metadata availability). The Swiss station network is a suitable test bed, because climate gradients are strong and the terrain is complex, as in the Central Andes; at the same time, the network is dense, and long time series and extensive metadata are available. By subsampling the station network and omitting the metadata, the conditions of a Peruvian test region are mimicked. Results are compared to a dataset homogenized by THOMAS (Tool for Homogenization of Monthly Data Series), the homogenization tool used by MeteoSwiss.
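
    Break detection of the kind these homogenization tools perform can be illustrated with the classical standard normal homogeneity test (SNHT). This is a generic single-break sketch in Python, not the algorithm used by HOMER or THOMAS:

        import numpy as np

        def snht(series):
            """SNHT statistic for a single break: T(k) = k*z1^2 + (n-k)*z2^2,
            where z1 and z2 are the means of the standardized series before
            and after position k. Returns the most likely break position and
            the maximum test value."""
            z = (np.asarray(series, float) - np.mean(series)) / np.std(series, ddof=1)
            n = len(z)
            t = np.array([k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
                          for k in range(1, n)])
            return int(np.argmax(t)) + 1, float(t.max())

        # Synthetic 80-year series with an artificial 0.8-sigma shift at year 40:
        rng = np.random.default_rng(1)
        y = rng.standard_normal(80)
        y[40:] += 0.8
        print(snht(y))   # should locate the break near position 40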

  8. Orthogonality Measurement for Homogenous Projects-Bases

    Science.gov (United States)

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

    The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  9. ERSN-OpenMC, a Java-based GUI for OpenMC Monte Carlo code

    Directory of Open Access Journals (Sweden)

    Jaafar EL Bakkali

    2016-07-01

    OpenMC is a new Monte Carlo particle transport simulation code focused on solving two types of neutronic problems: k-eigenvalue criticality fission source problems and external fixed fission source problems. OpenMC does not have a graphical user interface, so one is provided by our Java-based application named ERSN-OpenMC. The main feature of this application is to give users an easy-to-use and flexible graphical interface for building better and faster simulations, with less effort and greater reliability. Additionally, this graphical tool was developed with several features, such as the ability to automate the building process of the OpenMC code and related libraries; users are also given the freedom to customize their installation of this Monte Carlo code. A full description of the ERSN-OpenMC application is presented in this paper.

  10. Qualification of McCARD/MASTER Code System for Yonggwang Unit 4

    International Nuclear Information System (INIS)

    Park, Ho Jin; Shim, Hyung Jin; Joo, Han Gyu; Kim, Chang Hyo

    2011-01-01

    Recently, we have developed a new two-step procedure based on Monte Carlo (MC) methods. In this procedure, one can generate the few-group constants, including the few-group diffusion constants, by the MC method augmented by the critical spectrum, which is provided by the solution to the homogeneous 0-dimensional B1 equation. In order to examine the qualification of the few-group constants generated by the MC method, we combined MASTER with McCARD to form the McCARD/MASTER code system for two-step core neutronics calculations. In fictitious PWR system problems, the core design parameters calculated by the two-step McCARD/MASTER analysis agree well with those from direct MC calculations. In this paper, a neutronics design analysis for the initial core of Yonggwang Nuclear Unit 4 (YGN4) is conducted using the McCARD/MASTER two-step procedure to examine the qualification of the two-group constants from McCARD for a real PWR core problem. For comparison, the nuclear design report and measured data are chosen as the reference solutions.

  11. Corporate communication or McCommunication? Considering a McDonaldization of corporate communication hypothesis

    NARCIS (Netherlands)

    Verhoeven, P.

    2015-01-01

    In this essay the perspective of Ritzer's McDonaldization of Society Thesis is the starting point for developing hypotheses about corporate communication (CorpCom). The central idea of McDonaldization is that increasing numbers of organizations are run as fast food restaurants, focusing on:

  12. Keeping an eye on the ring: COMS plaque loading optimization for improved dose conformity and homogeneity.

    Science.gov (United States)

    Gagne, Nolan L; Cutright, Daniel R; Rivard, Mark J

    2012-09-01

    To improve tumor dose conformity and homogeneity for COMS plaque brachytherapy by investigating the dosimetric effects of varying component source ring radionuclides and source strengths. The MCNP5 Monte Carlo (MC) radiation transport code was used to simulate plaque heterogeneity-corrected dose distributions for individually-activated source rings of 14, 16 and 18 mm diameter COMS plaques, populated with ¹⁰³Pd, ¹²⁵I and ¹³¹Cs sources. Ellipsoidal tumors were contoured for each plaque size and MATLAB programming was developed to generate tumor dose distributions for all possible ring weighting and radionuclide permutations for a given plaque size and source strength resolution, assuming a 75 Gy apical prescription dose. These dose distributions were analyzed for conformity and homogeneity and compared to reference dose distributions from uniformly-loaded ¹²⁵I plaques. The most conformal and homogeneous dose distributions were reproduced within a reference eye environment to assess organ-at-risk (OAR) doses in the Pinnacle³ treatment planning system (TPS). The gamma-index analysis method was used to quantitatively compare MC and TPS-generated dose distributions. Concentrating > 97% of the total source strength in a single or pair of central ¹⁰³Pd seeds produced the most conformal dose distributions, with tumor basal doses a factor of 2-3 higher and OAR doses a factor of 2-3 lower than those of corresponding uniformly-loaded ¹²⁵I plaques. Concentrating 82-86% of the total source strength in peripherally-loaded ¹³¹Cs seeds produced the most homogeneous dose distributions, with tumor basal doses 17-25% lower and OAR doses typically 20% higher than those of corresponding uniformly-loaded ¹²⁵I plaques. Gamma-index analysis found > 99% agreement between MC and TPS dose distributions. A method was developed to select intra-plaque ring radionuclide compositions and source strengths to deliver more conformal and homogeneous tumor dose distributions than
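
    The gamma-index criterion used for the MC-vs-TPS comparison folds a dose tolerance and a distance-to-agreement into a single pass/fail value per point. A 1D Python sketch with illustrative 3%/3 mm tolerances (the paper's analysis is over full 3D distributions):

        import numpy as np

        def gamma_index(ref, evl, coords, dose_tol=0.03, dist_tol=3.0):
            """1D gamma index of an evaluated dose profile against a reference.
            dose_tol is a fraction of the reference maximum; dist_tol is in mm.
            A point passes the criterion when gamma <= 1."""
            dmax = ref.max()
            gam = np.empty_like(ref)
            for i, (xr, dr) in enumerate(zip(coords, ref)):
                g_sq = ((coords - xr) / dist_tol) ** 2 \
                     + ((evl - dr) / (dose_tol * dmax)) ** 2
                gam[i] = np.sqrt(g_sq.min())
            return gam

        x = np.linspace(0.0, 20.0, 201)   # positions in mm
        d_ref = np.exp(-0.1 * x)          # toy depth-dose curve
        d_evl = 1.01 * d_ref              # evaluated dose, uniformly 1% high
        print((gamma_index(d_ref, d_evl, x) <= 1.0).mean())  # gamma pass rate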

  13. SU-E-T-112: An OpenCL-Based Cross-Platform Monte Carlo Dose Engine (oclMC) for Coupled Photon-Electron Transport

    International Nuclear Information System (INIS)

    Tian, Z; Shi, F; Folkerts, M; Qin, N; Jiang, S; Jia, X

    2015-01-01

    Purpose: Low computational efficiency of Monte Carlo (MC) dose calculation impedes its clinical applications. Although a number of MC dose packages have been developed over the past few years, enabling fast MC dose calculations, most of these packages were developed under NVidia’s CUDA environment. This limited their code portability to other platforms, hindering the introduction of GPU-based MC dose engines to clinical practice. To solve this problem, we developed a cross-platform fast MC dose engine named oclMC under the OpenCL environment for external photon and electron radiotherapy. Methods: Coupled photon-electron simulation was implemented with the standard analogue simulation scheme for photon transport and the Class II condensed history scheme for electron transport. We tested the accuracy and efficiency of oclMC by comparing the doses calculated using oclMC and gDPM, a previously developed GPU-based MC code on the NVidia GPU platform, for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. We also tested the code portability of oclMC on different devices, including an NVidia GPU, two AMD GPUs and an Intel CPU. Results: Satisfactory agreements were observed in all photon and electron cases, with ∼0.48%–0.53% average dose differences at regions within the 10% isodose line for electron beam cases and ∼0.15%–0.17% for photon beam cases. It took oclMC 3–4 sec to perform transport simulation for the electron beam on an NVidia Titan GPU and 35–51 sec for the photon beam, both with ∼0.5% statistical uncertainty. The computation was 6%–17% slower than gDPM due to the differences in both physics model and development environment, which is considered not significant for clinical applications. In terms of code portability, gDPM only runs on NVidia GPUs, while oclMC successfully runs on all the tested devices. Conclusion: oclMC is an accurate and fast MC dose engine. Its high cross

  14. SU-E-T-112: An OpenCL-Based Cross-Platform Monte Carlo Dose Engine (oclMC) for Coupled Photon-Electron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Tian, Z; Shi, F; Folkerts, M; Qin, N; Jiang, S; Jia, X [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2015-06-15

    Purpose: Low computational efficiency of Monte Carlo (MC) dose calculation impedes its clinical applications. Although a number of MC dose packages have been developed over the past few years, enabling fast MC dose calculations, most of these packages were developed under NVidia’s CUDA environment. This limited their code portability to other platforms, hindering the introduction of GPU-based MC dose engines to clinical practice. To solve this problem, we developed a cross-platform fast MC dose engine named oclMC under the OpenCL environment for external photon and electron radiotherapy. Methods: Coupled photon-electron simulation was implemented with the standard analogue simulation scheme for photon transport and the Class II condensed history scheme for electron transport. We tested the accuracy and efficiency of oclMC by comparing the doses calculated using oclMC and gDPM, a previously developed GPU-based MC code on the NVidia GPU platform, for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. We also tested the code portability of oclMC on different devices, including an NVidia GPU, two AMD GPUs and an Intel CPU. Results: Satisfactory agreements were observed in all photon and electron cases, with ∼0.48%–0.53% average dose differences at regions within the 10% isodose line for electron beam cases and ∼0.15%–0.17% for photon beam cases. It took oclMC 3–4 sec to perform transport simulation for the electron beam on an NVidia Titan GPU and 35–51 sec for the photon beam, both with ∼0.5% statistical uncertainty. The computation was 6%–17% slower than gDPM due to the differences in both physics model and development environment, which is considered not significant for clinical applications. In terms of code portability, gDPM only runs on NVidia GPUs, while oclMC successfully runs on all the tested devices. Conclusion: oclMC is an accurate and fast MC dose engine. Its high cross

  15. Statistical homogeneity tests applied to large data sets from high energy physics experiments

    Science.gov (United States)

    Trusina, J.; Franc, J.; Kůs, V.

    2017-12-01

    Homogeneity tests are used in high energy physics for the verification of simulated Monte Carlo samples, i.e., whether they have the same distribution as measured data from a particle detector. The Kolmogorov-Smirnov, χ², and Anderson-Darling tests are the techniques most often used to assess the samples' homogeneity. Since MC generators produce plenty of entries from different models, each entry has to be re-weighted to obtain the same sample size as the measured data. One way of homogeneity testing is through binning. If we do not want to lose any information, we can apply generalized tests based on weighted empirical distribution functions. In this paper, we propose such generalized weighted homogeneity tests and introduce some of their asymptotic properties. We present results based on a numerical analysis which focuses on estimation of the type-I error and power of the test. Finally, we present an application of our homogeneity tests to data from the DØ experiment at Fermilab.
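
    The core object here, a weighted empirical distribution function, is easy to state concretely. A minimal Python sketch of a two-sample Kolmogorov-Smirnov distance between weighted EDFs (a simple instance of the generalized tests described above, not the authors' exact statistic):

        import numpy as np

        def weighted_ecdf(data, weights, grid):
            """Weighted empirical distribution function evaluated on `grid`."""
            order = np.argsort(data)
            data, weights = data[order], weights[order]
            cum = np.cumsum(weights) / weights.sum()
            idx = np.searchsorted(data, grid, side="right")
            return np.where(idx > 0, cum[idx - 1], 0.0)

        def weighted_ks(x, wx, y, wy):
            """KS distance between the weighted EDFs of samples x and y."""
            grid = np.sort(np.concatenate([x, y]))
            return np.abs(weighted_ecdf(x, wx, grid)
                          - weighted_ecdf(y, wy, grid)).max()

        rng = np.random.default_rng(2)
        mc = rng.normal(size=5000)               # simulated (MC) sample
        w_mc = rng.uniform(0.5, 1.5, size=5000)  # per-event MC weights
        data = rng.normal(size=3000)             # "measured" sample
        print(weighted_ks(mc, w_mc, data, np.ones(3000)))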

  16. Deviation from equilibrium conditions in molecular dynamic simulations of homogeneous nucleation.

    Science.gov (United States)

    Halonen, Roope; Zapadinsky, Evgeni; Vehkamäki, Hanna

    2018-04-28

    We present a comparison between Monte Carlo (MC) results for homogeneous vapour-liquid nucleation of Lennard-Jones clusters and previously published values from molecular dynamics (MD) simulations. Both the MC and MD methods sample real cluster configuration distributions. In MD simulations, the extent of the temperature fluctuation is usually controlled with an artificial thermostat rather than with a more realistic carrier gas. In this study, not only the primarily used velocity-scaling thermostat is considered, but also the Nosé-Hoover, Berendsen, and stochastic Langevin thermostat methods. The nucleation rates based on a kinetic scheme and the canonical MC calculation serve as a point of reference, since they by definition describe an equilibrated system. The studied temperature range is from T = 0.3 to 0.65 ϵ/k. The kinetic scheme reproduces well the isothermal nucleation rates obtained by Wedekind et al. [J. Chem. Phys. 127, 064501 (2007)] using MD simulations with carrier gas. The nucleation rates obtained by artificially thermostatted MD simulations are consistently lower than the reference nucleation rates based on MC calculations. The discrepancy increases up to several orders of magnitude when the density of the nucleating vapour decreases. At low temperatures, the difference from the MC-based reference nucleation rates in some cases exceeds the maximal nonisothermal effect predicted by the classical theory of Feder et al. [Adv. Phys. 15, 111 (1966)].

  17. Homogeneous time-resolved fluoroimmunoassay of microcystin-LR using layered WS2 nanosheets as a transducer

    Science.gov (United States)

    Qin, Xiaodan; Wang, Yuanxiu; Song, Bo; Wang, Xin; Ma, Hua; Yuan, Jingli

    2017-06-01

    A homogeneous time-resolved fluoroimmunoassay method for rapid and sensitive detection of microcystin-LR (MC-LR) in water samples was developed based on the interaction between water-soluble WS2 nanosheets and the conjugate of MC-LR with a luminescent Eu3+ complex, BHHBCB-Eu3+ (BHHBCB: 1,2-bis[4′-(1″,1″,1″,2″,2″,3″,3″-heptafluoro-4″,6″-hexanedion-6″-yl)benzyl]-4-chlorosulfobenzene). The large lateral dimensions and high surface areas of the two-dimensional layered WS2 nanosheets enable easy adsorption of the MC-LR-BHHBCB-Eu3+ conjugate, which leads to efficient quenching of the luminescence of the Eu3+ complex via an energy transfer or electron transfer process. However, the addition of a monoclonal anti-MC-LR antibody induces the formation of the MC-LR-BHHBCB-Eu3+/antibody immune complex, which prevents the interaction between the WS2 nanosheets and MC-LR-BHHBCB-Eu3+, restoring the Eu3+ luminescence. This signal transduction mechanism makes it possible to analyze the target MC-LR in a homogeneous system. The present method has the advantages of rapidity and simplicity, since B/F (bound reagent/free reagent) separation steps, a solid-phase carrier, and antibody labeling or modification are not necessary. The proposed immunosensing system displayed a wide linear range, good precision and accuracy, and comparable sensitivity, with a detection limit of 0.3 μg l⁻¹, which satisfies the World Health Organization (WHO) provisional guideline limit of 1.0 μg l⁻¹ for MC-LR in drinking water.

  18. Experience in Collaboration: McDenver at McDonald's.

    Science.gov (United States)

    Combs, Clarice Sue

    2002-01-01

    The McDenver at McDonald's project provided a nontraditional, community-based teaching and learning environment for faculty and students in a health, physical education, and recreation (HPER) department and a school of nursing. Children and parents came to McDonald's, children received developmental screenings, and parents completed conferences…

  19. EVALUATING CHAMBERLAIN'S, McGREGOR'S, AND McRAE'S ...

    African Journals Online (AJOL)

    2012-08-08

    spine and base of skull radiographs, which however have diagnostic challenges due to the complexity of the ... McGregor's and McRae's using CT bone windows ... metastatic lesion were excluded from the study. RESULTS.

  20. Homogeneous Biosensing Based on Magnetic Particle Labels

    KAUST Repository

    Schrittwieser, Stefan

    2016-06-06

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation.

  1. Homogeneous Biosensing Based on Magnetic Particle Labels

    Science.gov (United States)

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J.; Lentijo-Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation. PMID:27275824

  2. Homogeneous Biosensing Based on Magnetic Particle Labels

    KAUST Repository

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang; Lentijo Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation.

  3. Karlin–McGregor-like formula in a simple time-inhomogeneous birth–death process

    International Nuclear Information System (INIS)

    Ohkubo, Jun

    2014-01-01

    Algebraic discussions are developed to derive transition probabilities for a simple time-inhomogeneous birth–death process. Algebraic probability theory and Lie algebraic treatments make it easy to treat the time-inhomogeneous cases. As a result, an expression based on the Charlier polynomials is obtained, which can be considered as an extension of a famous Karlin–McGregor representation for a time-homogeneous birth–death process. (paper)
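
    For orientation, the classical time-homogeneous representation being extended here reads, in textbook notation (not quoted from the paper):

        p_{ij}(t) = \pi_j \int_0^{\infty} e^{-xt} \, Q_i(x) \, Q_j(x) \, d\psi(x),
        \qquad \pi_0 = 1, \quad
        \pi_j = \frac{\lambda_0 \lambda_1 \cdots \lambda_{j-1}}{\mu_1 \mu_2 \cdots \mu_j},

    where \lambda_k and \mu_k are the birth and death rates, \psi is the associated spectral measure, and the Q_i are the orthogonal polynomials it induces. For the linear immigration-death process the Q_i reduce to Charlier polynomials, which is the family the time-inhomogeneous extension above builds on.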

  4. Sim1 Neurons Are Sufficient for MC4R-Mediated Sexual Function in Male Mice.

    Science.gov (United States)

    Semple, Erin; Hill, Jennifer W

    2018-01-01

    Sexual dysfunction is a poorly understood condition that affects up to one-third of men around the world. Existing treatments that target the periphery do not work for all men. Previous studies have shown that central melanocortins, which are released by pro-opiomelanocortin neurons in the arcuate nucleus of the hypothalamus, can lead to male erection and increased libido. Several studies specifically implicate the melanocortin 4 receptor (MC4R) in the central control of sexual function, but the specific neural circuitry involved is unknown. We hypothesized that single-minded homolog 1 (Sim1) neurons play an important role in the melanocortin-mediated regulation of male sexual behavior. To test this hypothesis, we examined the sexual behavior of mice expressing MC4R only on Sim1-positive neurons (tbMC4Rsim1 mice) in comparison with tbMC4R null mice and wild-type controls. In tbMC4Rsim1 mice, MC4R reexpression was found in the medial amygdala and paraventricular nucleus of the hypothalamus. These mice were paired with sexually experienced females, and their sexual function and behavior was scored based on mounting, intromission, and ejaculation. tbMC4R null mice showed a longer latency to mount, a reduced intromission efficiency, and an inability to reach ejaculation. Expression of MC4R only on Sim1 neurons reversed the sexual deficits seen in tbMC4R null mice. This study implicates melanocortin signaling via the MC4R on Sim1 neurons in the central control of male sexual behavior. Copyright © 2018 Endocrine Society.

  5. Holocene volcanism of the upper McKenzie River catchment, central Oregon Cascades, USA

    Science.gov (United States)

    Deligne, Natalia I.; Conrey, Richard M.; Cashman, Katharine V.; Champion, Duane E.; Amidon, William H.

    2016-01-01

    To assess the complexity of eruptive activity within mafic volcanic fields, we present a detailed geologic investigation of Holocene volcanism in the upper McKenzie River catchment in the central Oregon Cascades, United States. We focus on the Sand Mountain volcanic field, which covers 76 km² and consists of 23 vents, associated tephra deposits, and lava fields. We find that the Sand Mountain volcanic field was active for a few decades around 3 ka and involved at least 13 eruptive units. Despite the small total volume erupted (∼1 km³ dense rock equivalent [DRE]), Sand Mountain volcanic field lava geochemistry indicates that erupted magmas were derived from at least two, and likely three, different magma sources. Single units erupted from one or more vents, and field data provide evidence of both vent migration and reoccupation. Overall, our study shows that mafic volcanism was clustered in space and time, involved both explosive and effusive behavior, and tapped several magma sources. These observations provide important insights on possible future hazards from mafic volcanism in the central Oregon Cascades.

  6. Melanocortin MC4 receptor-mediated feeding and grooming in rodents.

    Science.gov (United States)

    Mul, Joram D; Spruijt, Berry M; Brakkee, Jan H; Adan, Roger A H

    2013-11-05

    Decades ago it was recognized that the pharmacological profile of melanocortin ligands that stimulated grooming behavior in rats was strikingly similar to that of Xenopus laevis melanophore pigment dispersion. After cloning of the melanocortin MC1 receptor, expressed in melanocytes, and the melanocortin MC4 receptor, expressed mainly in brain, the pharmacological profiles of these receptors appeared to be very similar and it was demonstrated that these receptors mediate melanocortin-induced pigmentation and grooming respectively. Grooming is a low priority behavior that is concerned with care of body surface. Activation of central melanocortin MC4 receptors is also associated with meal termination, and continued postprandial stimulation of melanocortin MC4 receptors may stimulate natural postprandial grooming behavior as part of the behavioral satiety sequence. Indeed, melanocortins fail to suppress food intake or induce grooming behavior in melanocortin MC4 receptor-deficient rats. This review will focus on how melanocortins affect grooming behavior through the melanocortin MC4 receptor, and how melanocortin MC4 receptors mediate feeding behavior. This review also illustrates how melanocortins were the most likely candidates to mediate grooming and feeding based on the natural behaviors they induced. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Supernova observations at McDonald Observatory

    International Nuclear Information System (INIS)

    Wheeler, J.C.

    1984-01-01

    The programs to obtain high quality spectra and photometry of supernovae at McDonald Observatory are reviewed. Spectra of recent Type I supernovae in NGC 3227, NGC 3625, and NGC 4419 are compared with those of SN 1981b in NGC 4536 to quantitatively illustrate both the homogeneity of Type I spectra at similar epochs and the differences in detail which will serve as a probe of the physical processes in the explosions. Spectra of the recent supernova in NGC 0991 give for the first time quantitative confirmation of a spectrally homogeneous, but distinct subclass of Type I supernovae which appears to be less luminous and to have lower excitation at maximum light than classical Type I supernovae

  8. Uncertainty propagation analysis for Yonggwang nuclear unit 4 by McCARD/MASTER core analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Park, Ho Jin [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lee, Dong Hyuk; Shim, Hyung Jin; Kim, Chang Hyo [Seoul National University, Seoul (Korea, Republic of)

    2014-06-15

    This paper concerns estimating uncertainties of the core neutronics design parameters of power reactors by direct sampling method (DSM) calculations based on the two-step McCARD/MASTER design system, in which McCARD is used to generate the fuel assembly (FA) homogenized few group constants (FGCs) while MASTER is used to conduct the core neutronics design computation. It presents an extended application of the uncertainty propagation analysis method originally designed for uncertainty quantification of the FA FGCs, as a way to produce the covariances between the FGCs of any pair of FAs comprising the core, or the covariance matrix of the FA FGCs, required for random sampling of the FA FGC input sets into direct sampling core calculations by MASTER. For illustrative purposes, the uncertainties of core design parameters such as the effective multiplication factor (k_eff), normalized FA power densities, power peaking factors, etc. for the beginning of life (BOL) core of Yonggwang nuclear unit 4 (YGN4) at hot zero power and all rods out are estimated by the McCARD/MASTER-based DSM computations. The results are compared with those from the uncertainty propagation analysis method based on the McCARD-predicted sensitivity coefficients of nuclear design parameters and the cross section covariance data.
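
    The direct sampling method itself is compact: draw correlated few-group-constant (FGC) sets from their covariance matrix, run the core model once per draw, and read the spread of the outputs as the design-parameter uncertainty. A toy Python sketch; the covariance values, the FGC means and the response function standing in for the MASTER core calculation are all invented for illustration:

        import numpy as np

        rng = np.random.default_rng(3)
        n_fgc, n_samples = 4, 1000

        # Assumed (illustrative) covariance of four FGCs, and placeholder means.
        cov = 1e-6 * (np.eye(n_fgc) + 0.3 * (np.ones((n_fgc, n_fgc)) - np.eye(n_fgc)))
        mean_fgc = np.array([1.42, 0.38, 0.012, 0.025])

        def core_model(fgc):
            # Stand-in for the core calculation: any smooth response works here.
            return fgc[0] / (1.0 + fgc[2] / fgc[1]) + 10.0 * fgc[3]

        # DSM: sample correlated FGC sets via Cholesky, propagate each through
        # the core model, and take the sample spread of the output.
        L = np.linalg.cholesky(cov)
        samples = mean_fgc + rng.standard_normal((n_samples, n_fgc)) @ L.T
        keff = np.array([core_model(s) for s in samples])
        print(keff.mean(), keff.std(ddof=1))

    The sensitivity-based alternative mentioned at the end corresponds to the first-order "sandwich" estimate var(k_eff) ≈ Sᵀ C S, with S the vector of sensitivity coefficients and C the cross section covariance matrix.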

  9. Nested MC-Based Risk Measurement of Complex Portfolios: Acceleration and Energy Efficiency

    Directory of Open Access Journals (Sweden)

    Sascha Desmettre

    2016-10-01

    Risk analysis and management currently have a strong presence in financial institutions, where high performance and energy efficiency are key requirements for acceleration systems, especially when it comes to intraday analysis. In this regard, we approach the estimation of the widely employed portfolio risk metrics value-at-risk (VaR) and conditional value-at-risk (cVaR) by means of nested Monte Carlo (MC) simulations. We do so by combining theory and software/hardware implementation. This allows us for the first time to investigate their performance on heterogeneous compute systems and across different compute platforms, namely central processing unit (CPU), many integrated core (MIC) architecture XeonPhi, graphics processing unit (GPU), and field-programmable gate array (FPGA). To this end, the OpenCL framework is employed to generate portable code, and the size of the simulations is scaled in order to evaluate variations in performance. Furthermore, we assess different parallelization schemes, and the targeted platforms are evaluated and compared in terms of runtime and energy efficiency. Our implementation also allowed us to derive a new algorithmic optimization regarding the generation of the required random number sequences. Moreover, we provide specific guidelines on how to properly handle these sequences in portable code, and on how to efficiently implement nested MC-based VaR and cVaR simulations on heterogeneous compute systems.
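
    The nested structure is the essential point: an outer loop over risk-factor scenarios at the risk horizon, and an inner re-valuation loop per scenario. A toy Python sketch under a geometric Brownian motion model with a single call option as the "portfolio"; every parameter value is illustrative, not the paper's setup:

        import numpy as np

        rng = np.random.default_rng(4)
        n_outer, n_inner, alpha = 2000, 200, 0.99
        s0, sigma, t_h, t_mat, strike = 100.0, 0.2, 10.0 / 250.0, 1.0, 100.0

        # Outer stage: underlying price at the risk horizon t_h.
        s_h = s0 * np.exp(-0.5 * sigma**2 * t_h
                          + sigma * np.sqrt(t_h) * rng.standard_normal(n_outer))

        # Inner stage: MC value of the option at t_h, conditional on each scenario.
        dt = t_mat - t_h
        z = rng.standard_normal((n_outer, n_inner))
        s_T = s_h[:, None] * np.exp(-0.5 * sigma**2 * dt + sigma * np.sqrt(dt) * z)
        value = np.maximum(s_T - strike, 0.0).mean(axis=1)

        loss = value.mean() - value      # toy P&L: loss relative to the mean value
        var = np.quantile(loss, alpha)   # value-at-risk at level alpha
        cvar = loss[loss >= var].mean()  # conditional value-at-risk
        print(var, cvar)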

  10. Homogeneity and internal defects detect of infrared Se-based chalcogenide glass

    Science.gov (United States)

    Li, Zupan; Wu, Ligang; Lin, Changgui; Song, Bao'an; Wang, Xunsi; Shen, Xiang; Dai, Shixun

    2011-10-01

    Ge-Sb-Se chalcogenide glasses are excellent infrared optical materials; they are environmentally friendly and widely used in infrared thermal imaging systems. However, due to the opacity of Se-based glasses in the visible spectral region, it is difficult to measure their homogeneity and internal defects as is done for common oxide glasses. In this study, a method was proposed to observe the homogeneity and internal defects of these glasses based on near-IR imaging, and an effective measurement system was constructed. The testing results indicated that the method gives information on the homogeneity and internal defects of infrared Se-based chalcogenide glass clearly and intuitively.

  11. Diversity and biotic homogenization of urban land-snail faunas in relation to habitat types and macroclimate in 32 central European cities.

    Science.gov (United States)

    Horsák, Michal; Lososová, Zdeňka; Čejka, Tomáš; Juřičková, Lucie; Chytrý, Milan

    2013-01-01

    The effects of non-native species invasions on community diversity and biotic homogenization have been described for various taxa in urban environments, but not for land snails. Here we relate the diversity of native and non-native land-snail urban faunas to urban habitat types and macroclimate, and analyse the homogenization effects of non-native species across cities and within the main urban habitat types. Land-snail species were recorded in seven 1-ha plots in 32 cities of ten countries of Central Europe and Benelux (224 plots in total). Each plot represented one urban habitat type characterized by different management and a specific disturbance regime. For each plot, we obtained January, July and mean annual temperature and annual precipitation. Snail species were classified as either native or non-native. The effects of habitat type and macroclimate on the number of native and non-native species were analysed using generalized estimating equations; the homogenization effect of non-native species was assessed using the Jaccard similarity index and a homogenization index. We recorded 67 native and 20 non-native species. Besides being more numerous, native species also had much higher beta diversity than non-natives. There were significant differences between the studied habitat types in the numbers of native and non-native species, both of which decreased from less to heavily urbanized habitats. Macroclimate was more important for the number of non-native than native species; however, in both cases the effect of climate on diversity was overridden by the effect of urban habitat type. This is the first study on urban land snails documenting that non-native land-snail species significantly contribute to homogenization among whole cities, but both homogenization and diversification effects occur when individual habitat types are compared among cities. This indicates that the spread of non-native snail species may cause biotic homogenization, but it depends on scale and
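
    The two similarity-based quantities named here are simple to compute. A minimal Python sketch: pairwise Jaccard similarity between city faunas, and homogenization measured as the change in mean similarity once shared non-native species are added (the paper's homogenization index may differ in detail):

        from itertools import combinations

        def jaccard(a, b):
            """Jaccard similarity of two species sets."""
            return len(a & b) / len(a | b)

        def mean_similarity(faunas):
            """Mean pairwise Jaccard similarity over a list of species sets."""
            pairs = list(combinations(faunas, 2))
            return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

        natives = [{"sp1", "sp2", "sp3"}, {"sp2", "sp4"}, {"sp1", "sp5"}]
        alien = {"spX", "spY"}                 # non-natives shared by all cities
        full = [f | alien for f in natives]

        # Positive difference = faunas became more similar = homogenization.
        print(mean_similarity(full) - mean_similarity(natives))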

  12. Metallographic Index-Based Quantification of the Homogenization State in Extrudable Aluminum Alloys

    Directory of Open Access Journals (Sweden)

    Panagiota I. Sarafoglou

    2016-05-01

    Extrudability of aluminum alloys of the 6xxx series is highly dependent on the microstructure of the homogenized billets. It is therefore very important to characterize quantitatively the state of homogenization of the as-cast billets. The quantification of the homogenization state was based on the measurement of specific microstructural indices, which describe the size and shape of the intermetallics and indicate the state of homogenization. The indices evaluated were the following: aspect ratio (AR), which is the ratio of the maximum to the minimum diameter of the particles; feret (F), which is the maximum caliper length; and circularity (C), which is a measure of how closely a particle resembles a circle in a 2D metallographic section. The method included extensive metallographic work and the measurement of a large number of particles, including a statistical analysis, in order to investigate the effect of homogenization time. Among the indices examined, the circularity index exhibited the most consistent variation with homogenization time. The lowest value of the circularity index coincided with the metallographic observation of necklace formation. Shorter homogenization times resulted in intermediate homogenization stages involving rounding of edges or particle pinching. The results indicated that the index-based quantification of the homogenization state could provide a credible method for the selection of homogenization process parameters towards enhanced extrudability.
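
    Of the three indices, circularity has a closed-form definition worth spelling out: C = 4·pi·A / P², for particle area A and perimeter P, equal to 1 for a perfect circle. A minimal Python sketch under that standard image-analysis definition (the paper's measurements come from metallographic image analysis, not synthetic outlines):

        import numpy as np

        def circularity(xy):
            """C = 4*pi*A / P**2 for a closed outline given as an (N, 2) array
            of boundary points; C = 1 for a circle and decreases for elongated
            or ragged intermetallic particles."""
            x, y = xy[:, 0], xy[:, 1]
            area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
            perim = np.hypot(np.diff(np.r_[x, x[:1]]), np.diff(np.r_[y, y[:1]])).sum()
            return 4.0 * np.pi * area / perim**2

        theta = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
        circle = np.c_[np.cos(theta), np.sin(theta)]
        ellipse = np.c_[3.0 * np.cos(theta), np.sin(theta)]  # aspect ratio 3
        print(circularity(circle), circularity(ellipse))     # ~1.0, well below 1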

  13. MC Carbide Characterization in High Refractory Content Powder-Processed Ni-Based Superalloys

    Science.gov (United States)

    Antonov, Stoichko; Chen, Wei; Huo, Jiajie; Feng, Qiang; Isheim, Dieter; Seidman, David N.; Sun, Eugene; Tin, Sammy

    2018-04-01

    Carbide precipitates in Ni-based superalloys are considered to be desirable phases that can contribute to improving high-temperature properties as well as aid in microstructural refinement of the material; however, they can also serve as crack initiation sites during fatigue. To date, most of the knowledge pertaining to carbide formation has originated from assessments of cast and wrought Ni-based superalloys. As powder-processed Ni-based superalloys are becoming increasingly widespread, understanding the different mechanisms by which they form becomes increasingly important. Detailed characterization of MC carbides present in two experimental high Nb-content powder-processed Ni-based superalloys revealed that Hf additions affect the resultant carbide morphologies. This morphology difference was attributed to a higher magnitude of elastic strain energy along the interface associated with Hf being soluble in the MC carbide lattice. The composition of the MC carbides was studied through atom probe tomography and consisted of a complex carbonitride core, which was rich in Nb and with slight Hf segregation, surrounded by an Nb carbide shell. The characterization results of the segregation behavior of Hf in the MC carbides and the subsequent influence on their morphology were compared to density functional theory calculations and found to be in good agreement, suggesting that computational modeling can successfully be used to tailor carbide features.

  14. CAD-based Monte Carlo program for integrated simulation of nuclear system SuperMC

    International Nuclear Information System (INIS)

    Wu, Y.; Song, J.; Zheng, H.; Sun, G.; Hao, L.; Long, P.; Hu, L.

    2013-01-01

    SuperMC is a Computer-Aided Design (CAD) based Monte Carlo (MC) program for integrated simulation of nuclear systems developed by the FDS Team (China), making use of the hybrid MC-deterministic method and advanced computer technologies. The design aim, architecture and main methodology of SuperMC are presented in this paper. The accounting for multi-physics processes and the use of advanced computer technologies such as automatic geometry modeling, intelligent data analysis and visualization, high performance parallel computing and cloud computing contribute to the efficiency of the code. SuperMC2.1, the latest version of the code for neutron, photon and coupled neutron and photon transport calculation, has been developed and validated by using a series of benchmarking cases such as the fusion reactor ITER model and the fast reactor BN-600 model.

  15. The relationship between continuum homogeneity and statistical homogeneity in cosmology

    International Nuclear Information System (INIS)

    Stoeger, W.R.; Ellis, G.F.R.; Hellaby, C.

    1987-01-01

    Although the standard Friedmann-Lemaitre-Robertson-Walker (FLRW) Universe models are based on the concept that the Universe is spatially homogeneous, up to the present time no definition of this concept has been proposed that could in principle be tested by observation. Such a definition is here proposed, based on a simple spatial averaging procedure, which relates observable properties of the Universe to the continuum homogeneity idea that underlies the FLRW models. It turns out that the statistical homogeneity often used to describe the distribution of matter on a large scale does not imply spatial homogeneity according to this definition, and so cannot be simply related to a FLRW Universe model. Values are proposed for the homogeneity parameter and length scale of homogeneity of the Universe. (author)

  16. Epilepsy and McArdle Disease in A Child

    Directory of Open Access Journals (Sweden)

    Faruk incecik

    2015-03-01

    McArdle's disease, defined by the lack of functional glycogen phosphorylase in striated muscle, is inherited as an autosomal recessive trait. Patients typically suffer from reduced exercise tolerance, with muscle cramps and pain provoked by exercise, along with easy fatigability and weakness after exercise. Following prolonged exertion, contractures, rhabdomyolysis, and myoglobinuria may occur. Central nervous system symptoms have rarely been reported in McArdle disease. In this case report, a 13-year-old boy with epilepsy and McArdle's disease is presented. [Cukurova Med J 2015; 40(Suppl 1): 5-7]

  17. MC-DS-CDMA System based on DWT and STBC in ITU Multipath Fading Channels Model

    Directory of Open Access Journals (Sweden)

    Nader Abdullah Khadam

    2018-03-01

    In this paper, the performance of multicarrier direct-sequence code division multiple access (MC-DS-CDMA) in fixed and mobile MC-DS-CDMA applications is improved by using space-time block coding (STBC) together with the fast Fourier transform (FFT) or the discrete wavelet transform (DWT). These MC-DS-CDMA systems were simulated using MATLAB 2015a. Through simulation of the proposed system, various parameters can be changed and tested. The bit error rates (BERs) of these systems are obtained over a wide range of signal-to-noise ratio. All simulation results were compared with each other using different subcarrier sizes of FFT or DWT with STBC for 1, 2, 3 and 4 antennas in the transmitter, under different ITU multipath fading channels and different Doppler frequencies (fd). The proposed STBC-MC-DS-CDMA structures based on DWT perform better than those based on FFT across the tested Doppler frequencies and subcarrier sizes. Likewise, the proposed system with STBC and 4 transmitters performs better than the systems with 1, 2 or 3 transmitters in all Doppler frequencies and subcarrier sizes in all simulation results.

  18. CAD-based Monte Carlo program for integrated simulation of nuclear system SuperMC

    International Nuclear Information System (INIS)

    Wu, Yican; Song, Jing; Zheng, Huaqing; Sun, Guangyao; Hao, Lijuan; Long, Pengcheng; Hu, Liqin

    2015-01-01

    Highlights: • The newly developed CAD-based Monte Carlo program SuperMC for integrated simulation of nuclear systems makes use of the hybrid MC-deterministic method and advanced computer technologies. SuperMC is designed to perform transport calculations of various types of particles; depletion and activation calculations including isotope burn-up, material activation and shutdown dose; and multi-physics coupling calculations including thermo-hydraulics, fuel performance and structural mechanics. Bi-directional automatic conversion between general CAD models and physical settings and calculation models can be well performed. Results and the simulation process can be visualized with dynamical 3D datasets and geometry models. Continuous-energy cross section, burnup, activation, irradiation damage and material data etc. are used to support the multi-process simulation. An advanced cloud computing framework makes computation- and storage-intensive simulations more attractive, offered as a network service to support design optimization and assessment. The modular design and generic interfaces promote flexible manipulation and coupling of external solvers. • The newly developed and incorporated advanced methods in SuperMC are introduced, including the hybrid MC-deterministic transport method, particle physical interaction treatment, multi-physics coupling calculation, automatic geometry modeling and processing, intelligent data analysis and visualization, elastic cloud computing technology and parallel calculation. • The functions of SuperMC2.1, integrating automatic modeling, neutron and photon transport calculation, and results and process visualization, are introduced. It has been validated by using a series of benchmarking cases such as the fusion reactor ITER model and the fast reactor BN-600 model. - Abstract: Monte Carlo (MC) method has distinct advantages to simulate complicated nuclear systems and is envisioned as a routine

  19. Developing and design a website for mc kalla oy

    OpenAIRE

    Bekele, Henok

    2013-01-01

    This bachelor's thesis is about website development and design. I had the chance to work with Mc kalla Oy, a construction company from Kempele founded in 2011, which has projects from Central Finland through Northern Finland to Lapland. The aim of this thesis is to develop and design a new website for Mc kalla Oy. WordPress is used to develop the new website. For the development process I use the school server (.opiskelijaprojektit.net). The thesis contains two main parts: designing the ...

  20. Biasing transition rate method based on direct MC simulation for probabilistic safety assessment

    Institute of Scientific and Technical Information of China (English)

    Xiao-Lei Pan; Jia-Qun Wang; Run Yuan; Fang Wang; Han-Qing Lin; Li-Qin Hu; Jin Wang

    2017-01-01

    Direct Monte Carlo (MC) simulation is a powerful probabilistic safety assessment method that accounts for the dynamics of a system, but it is not efficient at simulating rare events. A biasing transition rate method based on direct MC simulation is proposed in this paper to solve that problem. The method biases the transition rates of the components by adding virtual components to them in series, which increases the occurrence probability of the rare event and hence decreases the variance of the MC estimator. Several cases are used to benchmark this method. The results show that the method is effective at modeling system failure and is more efficient at collecting evidence of rare events than direct MC simulation. Overall performance is greatly improved by the biasing transition rate method.
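
    The variance-reduction idea lends itself to a compact illustration. The following is a minimal Python sketch of the general principle only (importance sampling on a single component's failure rate with likelihood-ratio reweighting); the paper's specific virtual-component construction is not reproduced, and all rates and times are hypothetical.

    ```python
    import math
    import random

    # Sketch: estimate P(component fails before t_miss) when the true failure
    # rate lam makes this a rare event. Failure times are sampled from a biased
    # (larger) rate lam_b, and each sample is reweighted by the likelihood
    # ratio of the true density to the biased density.
    lam = 1e-5        # true failure rate (per hour), hypothetical
    lam_b = 1e-3      # biased rate used for sampling, hypothetical
    t_miss = 100.0    # mission time (hours)
    n = 100_000

    random.seed(1)
    est = 0.0
    for _ in range(n):
        t = random.expovariate(lam_b)          # sample from the biased density
        if t < t_miss:                         # the rare event under lam
            # likelihood ratio f_true(t) / f_biased(t) for exponential densities
            w = (lam / lam_b) * math.exp(-(lam - lam_b) * t)
            est += w
    est /= n

    exact = 1.0 - math.exp(-lam * t_miss)      # analytic check
    print(f"biased-MC estimate: {est:.3e}, exact: {exact:.3e}")
    ```

    With the biased rate, roughly 10% of the histories hit the event instead of roughly 0.1%, so far more of the simulation budget produces evidence, while the reweighting keeps the estimator unbiased.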

  1. Optimal truss and frame design from projected homogenization-based topology optimization

    DEFF Research Database (Denmark)

    Larsen, S. D.; Sigmund, O.; Groen, J. P.

    2018-01-01

    In this article, we propose a novel method to obtain a near-optimal frame structure, based on the solution of a homogenization-based topology optimization model. The presented approach exploits the equivalence between Michell’s problem of least-weight trusses and a compliance minimization problem … using optimal rank-2 laminates in the low volume fraction limit. In a fully automated procedure, a discrete structure is extracted from the homogenization-based continuum model. This near-optimal structure is post-optimized as a frame, where the bending stiffness is continuously decreased, to allow…

  2. Purification and characterization of enterocin MC13 produced by a potential aquaculture probiont Enterococcus faecium MC13 isolated from the gut of Mugil cephalus.

    Science.gov (United States)

    Satish Kumar, R; Kanmani, P; Yuvaraj, N; Paari, K A; Pattukumar, V; Arul, V

    2011-12-01

    A bacteriocin producer strain MC13 was isolated from the gut of Mugil cephalus (grey mullet) and identified as Enterococcus faecium. The bacteriocin of E. faecium MC13 was purified to homogeneity, as confirmed by Tricine sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE). Reverse-phase high-performance liquid chromatography (HPLC) analysis showed a single active fraction eluted at 26 min, and matrix-assisted laser desorption ionization time-of-flight (MALDI-TOF) mass spectrometry analysis showed the molecular mass to be 2.148 kDa. The clear zone in native PAGE corresponding to the enterocin MC13 band further substantiated its molecular mass. A dialyzed sample (semicrude preparation) of enterocin MC13 was broad spectrum in its action and inhibited important seafood-borne pathogens: Listeria monocytogenes, Vibrio parahaemolyticus, and Vibrio vulnificus. This antibacterial substance was sensitive to the proteolytic enzymes trypsin, protease, and chymotrypsin but insensitive to catalase and lipase, confirming that inhibition was due to a proteinaceous molecule, i.e., a bacteriocin, and not due to hydrogen peroxide. Enterocin MC13 tolerated heat treatment (up to 90 °C for 20 min). Enterococcus faecium MC13 was effective in bile salt tolerance, acid tolerance, and adhesion to the HT-29 cell line. These properties reveal the potential of E. faecium MC13 to be a probiotic bacterium. Enterococcus faecium MC13 could be used as a potential fish probiotic against pathogens such as V. parahaemolyticus, Vibrio harveyi, and Aeromonas hydrophila in fisheries. It could also be a valuable seafood biopreservative against L. monocytogenes.

  3. Evidence for the involvement of MC4 receptors in the central mechanisms of opioid antinociception

    NARCIS (Netherlands)

    Starowicz, Katarzyna

    2005-01-01

    The data described in this thesis extend general knowledge of the involvement of the MC4 receptor in mechanisms of analgesia. The aspects outlined below constitute novel information. Firstly, the MC4R localization in the DRG is demonstrated. The MC4 receptor was assumed to exist

  4. SU-E-J-55: Dosimetric Evaluation of Centrally Located Lung Tumors: A Monte Carlo (MC) Study of Lung SBRT Planning

    Energy Technology Data Exchange (ETDEWEB)

    Pokhrel, D; Badkul, R; Jiang, H; Saleh, H; Estes, C; Park, J; Kumar, P; Wang, F [University Kansas Medical Center, Kansas City, KS (United States)

    2014-06-01

    Purpose: To compare dose distributions calculated using the iPlan XVMC algorithm and heterogeneity-corrected/uncorrected Pencil Beam (PB-hete/PB-homo) algorithms for SBRT treatments of lung tumors. Methods: Ten patients with centrally located solitary lung tumors were treated using MC-based SBRT to 60 Gy in 5 fractions for PTV V100%=95%. ITV was delineated on MIP images based on 4D-CT scans. PTVs (ITV+5 mm margins) ranged from 10.1–106.5 cc (mean=48.6 cc). MC-SBRT plans were generated with a combination of non-coplanar conformal arcs/beams using the iPlan XVMC algorithm (BrainLAB iPlan ver. 4.1.2) for a Novalis-TX consisting of HD-MLCs and a 6MV-SRS (1000 MU/min) mode, following RTOG 0813 dosimetric criteria. For comparison, the PB-hete/PB-homo algorithms were used to re-calculate dose distributions using the same beam configurations, MLCs, and monitor units. Plans were evaluated with isocenter/maximal/mean doses to PTV. Normal lung doses were evaluated with V5/V10/V20 and mean lung dose (MLD), excluding PTV. Other OAR doses, such as maximal spinal cord/2cc-esophagus/maximal bronchial tree (BT)/maximal heart doses, were tabulated. Results: Maximal/mean/isocenter doses to PTV calculated by PB-hete were uniformly larger than MC plans by factors of 1.09/1.13/1.07, on average, whereas they were consistently lower with PB-homo by factors of 0.9/0.84/0.9, respectively. The lung volumes covered by the 5 Gy/10 Gy/20 Gy isodose lines were comparable (average within ±3%) when calculated by PB-hete compared to XVMC, but consistently lower with PB-homo by factors of 0.90/0.88/0.85, respectively. MLD was higher with PB-hete by 1.05, but lower with PB-homo by 0.9, on average, compared to XVMC. XVMC max-cord/max-BT/max-heart and 2cc esophagus doses were comparable to PB-hete; however, PB-homo underestimates them by factors of 0.82/0.89/0.88/0.86, on average, respectively. Conclusion: PB-hete significantly overestimates dose to PTV relative to XVMC, hence underdosing the target. MC is more complex and accurate with

  5. Virtual reality-based simulation system for nuclear and radiation safety SuperMC/RVIS

    Energy Technology Data Exchange (ETDEWEB)

    He, T.; Hu, L.; Long, P.; Shang, L.; Zhou, S.; Yang, Q.; Zhao, J.; Song, J.; Yu, S.; Cheng, M.; Hao, L., E-mail: liqin.hu@fds.org.cn [Chinese Academy of Sciences, Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Hefei, Anhui (China)

    2015-07-01

    Suggested work scenarios in radiation environments need to be iteratively optimized according to the ALARA principle. Based on Virtual Reality (VR) technology and a high-precision whole-body computational voxel phantom, a virtual reality-based simulation system for nuclear and radiation safety named SuperMC/RVIS has been developed for organ dose assessment and ALARA evaluation of work scenarios in radiation environments. The system architecture, ALARA evaluation strategy, advanced visualization methods and virtual reality technology used in SuperMC/RVIS are described. A case is presented to show its dose assessment and interactive simulation capabilities. (author)

  6. Virtual reality-based simulation system for nuclear and radiation safety SuperMC/RVIS

    International Nuclear Information System (INIS)

    He, T.; Hu, L.; Long, P.; Shang, L.; Zhou, S.; Yang, Q.; Zhao, J.; Song, J.; Yu, S.; Cheng, M.; Hao, L.

    2015-01-01

    Suggested work scenarios in radiation environments need to be iteratively optimized according to the ALARA principle. Based on Virtual Reality (VR) technology and a high-precision whole-body computational voxel phantom, a virtual reality-based simulation system for nuclear and radiation safety named SuperMC/RVIS has been developed for organ dose assessment and ALARA evaluation of work scenarios in radiation environments. The system architecture, ALARA evaluation strategy, advanced visualization methods and virtual reality technology used in SuperMC/RVIS are described. A case is presented to show its dose assessment and interactive simulation capabilities. (author)

  7. Environmental Assessment: Demolition of McGuire Central Heat Plant at Joint Base McGuire-Dix-Lakehurst, New Jersey

    Science.gov (United States)

    2012-06-01

    insulation, boiler, holding tank and duct coverings, floor tiles, window caulking/glazing, and corrugated building siding. The asbestos insulation and...facility, with the Bulk Fuel Storage area and the golf course located between them. BOMARC is located several miles from the proposed solar sites...Architectural Resources: The Central Heat Plant was constructed in 1956. It is a flat-roofed building, originally rectangular in form, and is now L-shaped.

  8. Generalized eMC implementation for Monte Carlo dose calculation of electron beams from different machine types.

    Science.gov (United States)

    Fix, Michael K; Cygler, Joanna; Frei, Daniel; Volken, Werner; Neuenschwander, Hans; Born, Ernst J; Manser, Peter

    2013-05-07

    The electron Monte Carlo (eMC) dose calculation algorithm available in the Eclipse treatment planning system (Varian Medical Systems) is based on the macro MC method and uses a beam model applicable to Varian linear accelerators. This leads to limitations in accuracy if eMC is applied to non-Varian machines. In this work eMC is generalized to also allow accurate dose calculations for electron beams from Elekta and Siemens accelerators. First, changes made in the previous study to use eMC for low electron beam energies of Varian accelerators are applied. Then, a generalized beam model is developed using a main electron source and a main photon source representing electrons and photons from the scattering foil, respectively, an edge source of electrons, a transmission source of photons and a line source of electrons and photons representing the particles from the scrapers or inserts and head scatter radiation. Regarding the macro MC dose calculation algorithm, the transport code of the secondary particles is improved. The macro MC dose calculations are validated with corresponding dose calculations using EGSnrc in homogeneous and inhomogeneous phantoms. The validation of the generalized eMC is carried out by comparing calculated and measured dose distributions in water for Varian, Elekta and Siemens machines for a variety of beam energies, applicator sizes and SSDs. The comparisons are performed in units of cGy per MU. Overall, a general agreement between calculated and measured dose distributions for all machine types and all combinations of parameters investigated is found to be within 2% or 2 mm. The results of the dose comparisons suggest that the generalized eMC is now suitable to calculate dose distributions for Varian, Elekta and Siemens linear accelerators with sufficient accuracy in the range of the investigated combinations of beam energies, applicator sizes and SSDs.
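
    For orientation, the 2% or 2 mm agreement criterion quoted above can be illustrated with a simple one-dimensional gamma-style check. The sketch below is illustrative only, with uniform grids and hypothetical profiles; it is not the evaluation tool used in the study.

    ```python
    import numpy as np

    def gamma_1d(x, d_ref, d_eval, dd=0.02, dta=2.0):
        """Simplified 1-D gamma index: dd is the relative dose criterion and
        dta the distance-to-agreement in mm. Doses are normalized globally
        to the reference maximum."""
        d_max = d_ref.max()
        gam = np.empty_like(d_ref)
        for i, (xi, di) in enumerate(zip(x, d_ref)):
            dose_term = (d_eval - di) / (dd * d_max)
            dist_term = (x - xi) / dta
            gam[i] = np.sqrt(dose_term**2 + dist_term**2).min()
        return gam

    # Hypothetical depth-dose profiles on a 1 mm grid
    x = np.arange(0.0, 100.0, 1.0)                  # depth (mm)
    ref = np.exp(-((x - 20.0) / 25.0) ** 2)         # "measured" profile
    ev = np.exp(-((x - 20.5) / 25.0) ** 2) * 1.01   # "calculated" profile

    g = gamma_1d(x, ref, ev)
    print(f"gamma pass rate (gamma <= 1): {100.0 * (g <= 1).mean():.1f}%")
    ```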

  9. MC2-2, Calculation of Fast Neutron Spectra and Multigroup Cross-Sections from ENDF/B Data

    International Nuclear Information System (INIS)

    2001-01-01

    1 - Description of program or function: MC2-2 solves the neutron slowing-down equations using basic neutron data derived from ENDF/B data files to determine fundamental mode spectra for use in generating multigroup neutron cross sections. The current edition includes the ability to treat all ENDF/B-V and -VI data representations. It accommodates high-order P scattering representations and provides numerous capabilities such as isotope mixing, delayed neutron processing, free-format input, and flexibility in output data selection. This edition supersedes previous releases of the MC2-2 program and the earlier MC2 program. Improved physics algorithms and increased computational efficiency are incorporated. Input data files required by MC2-2 may be generated from ENDF/B data by the code ETOE-2. The hyper-fine-group integral transport theory module of MC2-2, RABANL, is an improved version of the RABBLE/RABID codes. Many of the MC2-2 modules are used in the SDX code. 2 - Methods: The extended transport P1, B1, consistent P1, and consistent B1 fundamental mode ultra-fine-group equations are solved using continuous slowing-down theory and multigroup methods. Fast and accurate resonance integral methods are used in the narrow resonance resolved and unresolved resonance treatments. A fundamental mode homogeneous unit cell calculation is performed using either a multigroup or a continuous slowing-down treatment. Multigroup neutron homogeneous cross sections are generated in an ISOTXS format for an arbitrary group structure. A hyper-fine-group integral transport slowing-down calculation (RABANL) is available as an option. RABANL performs a homogeneous or heterogeneous (pin or slab) unit cell calculation over the resonance region (resolved and unresolved) and generates multigroup neutron cross sections in an ISOTXS format. Neutron cross sections are generated by RABANL for the homogeneous unit cell and for each heterogeneous region in the pin or slab unit cell calculation

  10. spMC: an R-package for 3D lithological reconstructions based on spatial Markov chains

    Science.gov (United States)

    Sartore, Luca; Fabbri, Paolo; Gaetan, Carlo

    2016-09-01

    The paper presents the spatial Markov Chains (spMC) R-package and a case study of subsoil simulation/prediction at a plain site in Northeastern Italy. spMC is a fairly complete collection of advanced methods for data inspection; in addition, it implements Markov chain models to estimate experimental transition probabilities of categorical lithological data. Furthermore, simulation methods based on well-known prediction methods (such as indicator kriging and co-kriging) are implemented in the spMC package. Other more advanced methods are also available for simulation, e.g., path methods and Bayesian procedures that exploit the maximum entropy. Since the spMC package was developed for intensive geostatistical computations, part of the code is implemented for parallel computation via OpenMP constructs. A final analysis of computational efficiency compares the simulation/prediction algorithms using different numbers of CPU cores, considering the example data set of the case study included in the package.
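
    To illustrate the core idea behind such packages, though not the spMC API itself, the following hypothetical Python sketch estimates one-step transition probabilities of categorical lithology codes along a borehole; all data and class labels are invented for illustration.

    ```python
    import numpy as np

    # Hypothetical borehole log: lithology class observed every 0.5 m downhole.
    # 0 = clay, 1 = silt, 2 = sand, 3 = gravel (labels illustrative only).
    log = np.array([0, 0, 1, 1, 1, 2, 2, 3, 2, 2, 1, 0, 0, 1, 2, 2, 3, 3, 2, 1])

    k = 4                                   # number of lithology classes
    counts = np.zeros((k, k))
    for a, b in zip(log[:-1], log[1:]):     # count one-step transitions downhole
        counts[a, b] += 1

    # Row-normalize to empirical transition probabilities; rows with no
    # observations are left uniform so the matrix stays stochastic.
    row = counts.sum(axis=1, keepdims=True)
    probs = np.where(row > 0, counts / np.maximum(row, 1), 1.0 / k)
    print(np.round(probs, 2))
    ```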

  11. On Implementing a Homogeneous Interior-Point Algorithm for Nonsymmetric Conic Optimization

    DEFF Research Database (Denmark)

    Skajaa, Anders; Jørgensen, John Bagterp; Hansen, Per Christian

    Based on earlier work by Nesterov, an implementation of a homogeneous infeasible-start interior-point algorithm for solving nonsymmetric conic optimization problems is presented. Starting each iteration from (the vicinity of) the central path, the method computes (nearly) primal-dual symmetric … approximate tangent directions followed by a purely primal centering procedure to locate the next central primal-dual point. Features of the algorithm include that it makes use only of the primal barrier function, that it is able to detect infeasibilities in the problem and that no phase-I method is needed…

  12. McSustainability and McJustice: Certification, Alternative Food and Agriculture, and Social Change

    Directory of Open Access Journals (Sweden)

    Maki Hatanaka

    2014-11-01

    Full Text Available Alternative food and agriculture movements increasingly rely on market-based approaches, particularly voluntary standards and certification, to advance environmental sustainability and social justice. Using a case study of an ecological shrimp project in Indonesia that became certified organic, this paper raises concerns regarding the impacts of certification on alternative food and agriculture movements, and their aims of furthering sustainability and justice. Drawing on George Ritzer’s McDonaldization framework, I argue that the ecological shrimp project became McDonaldized with the introduction of voluntary standards and certification. Specifically, efficiency, calculability, predictability, and control became key characteristics of the shrimp project. While the introduction of such characteristics increased market access, it also entailed significant costs, including an erosion of trust and marginalization and alienation of farmers. Given such tradeoffs, in concluding I propose that certification is producing particular forms of environmental sustainability and social justice, what I term McSustainability and McJustice. While enabling the expansion of alternative food and agriculture, McSustainability and McJustice tend to allow little opportunity for farmer empowerment and food sovereignty, as well as exclude aspects of sustainable farming or ethical production that are not easily measured, standardized, and validated.

  13. Occurrence of the Microcystins MC-LW and MC-LF in Dutch Surface Waters and Their Contribution to Total Microcystin Toxicity

    Directory of Open Access Journals (Sweden)

    Elisabeth J. Faassen

    2013-07-01

    Full Text Available Microcystins (MCs) are the most frequently found cyanobacterial toxins in freshwater systems. Many MC variants have been identified, and variants differ in their toxicity. Recent studies showed that the variants MC-LW and MC-LF might be more toxic than MC-LR, the variant that is most abundant and mostly used for risk assessments. As little is known about the presence of these two variants in The Netherlands, we determined their occurrence by analyzing 88 water samples and 10 scum samples for eight MC variants ((dm-7-)MC-RR, MC-YR, (dm-7-)MC-LR, MC-LY, MC-LW and MC-LF) by liquid chromatography with tandem mass spectrometry detection. All analyzed MC variants were detected, and MC-LW and/or MC-LF were present in 32% of the MC-containing water samples. When MC-LW and MC-LF were present, they contributed nearly 10% of the total MC concentrations, but due to their suspected high toxicity, their average contribution to the total MC toxicity was estimated to be at least 45%. Given the frequent occurrence and possible high toxicity of MC-LW and MC-LF, it seems better to base health risk assessments on the toxicity contributions of the different MC variants than on MC-LR concentrations alone.

  14. H-Metric: Characterizing Image Datasets via Homogenization Based on KNN-Queries

    Directory of Open Access Journals (Sweden)

    Welington M da Silva

    2012-01-01

    Full Text Available Precision-Recall is one of the main metrics for evaluating content-based image retrieval techniques. However, it does not provide an ample perception of the properties of an image dataset immersed in a metric space. In this work, we describe an alternative metric named H-Metric, which is determined along a sequence of controlled modifications in the image dataset. The process is named homogenization and works by altering the homogeneity characteristics of the classes of images. The result is a process that measures how hard it is to deal with a set of images with respect to content-based retrieval, offering support in the task of analyzing configurations of distance functions and feature extractors.

  15. MC EMiNEM maps the interaction landscape of the Mediator.

    Directory of Open Access Journals (Sweden)

    Theresa Niederberger

    Full Text Available The Mediator is a highly conserved, large multiprotein complex that is involved essentially in the regulation of eukaryotic mRNA transcription. It acts as a general transcription factor by integrating regulatory signals from gene-specific activators or repressors to the RNA Polymerase II. The internal network of interactions between Mediator subunits that conveys these signals is largely unknown. Here, we introduce MC EMiNEM, a novel method for the retrieval of functional dependencies between proteins that have pleiotropic effects on mRNA transcription. MC EMiNEM is based on Nested Effects Models (NEMs), a class of probabilistic graphical models that extends the idea of hierarchical clustering. It combines mode-hopping Monte Carlo (MC) sampling with an Expectation-Maximization (EM) algorithm for NEMs to increase sensitivity compared to existing methods. A meta-analysis of four Mediator perturbation studies in Saccharomyces cerevisiae, three of which are unpublished, provides new insight into the Mediator signaling network. In addition to the known modular organization of the Mediator subunits, MC EMiNEM reveals a hierarchical ordering of its internal information flow, which is putatively transmitted through structural changes within the complex. We identify the N-terminus of Med7 as a peripheral entity, entailing only local structural changes upon perturbation, while the C-terminus of Med7 and Med19 appear to play a central role. MC EMiNEM associates Mediator subunits to most directly affected genes, which, in conjunction with gene set enrichment analysis, allows us to construct an interaction map of Mediator subunits and transcription factors.

  16. MC EMiNEM maps the interaction landscape of the Mediator.

    Science.gov (United States)

    Niederberger, Theresa; Etzold, Stefanie; Lidschreiber, Michael; Maier, Kerstin C; Martin, Dietmar E; Fröhlich, Holger; Cramer, Patrick; Tresch, Achim

    2012-01-01

    The Mediator is a highly conserved, large multiprotein complex that is involved essentially in the regulation of eukaryotic mRNA transcription. It acts as a general transcription factor by integrating regulatory signals from gene-specific activators or repressors to the RNA Polymerase II. The internal network of interactions between Mediator subunits that conveys these signals is largely unknown. Here, we introduce MC EMiNEM, a novel method for the retrieval of functional dependencies between proteins that have pleiotropic effects on mRNA transcription. MC EMiNEM is based on Nested Effects Models (NEMs), a class of probabilistic graphical models that extends the idea of hierarchical clustering. It combines mode-hopping Monte Carlo (MC) sampling with an Expectation-Maximization (EM) algorithm for NEMs to increase sensitivity compared to existing methods. A meta-analysis of four Mediator perturbation studies in Saccharomyces cerevisiae, three of which are unpublished, provides new insight into the Mediator signaling network. In addition to the known modular organization of the Mediator subunits, MC EMiNEM reveals a hierarchical ordering of its internal information flow, which is putatively transmitted through structural changes within the complex. We identify the N-terminus of Med7 as a peripheral entity, entailing only local structural changes upon perturbation, while the C-terminus of Med7 and Med19 appear to play a central role. MC EMiNEM associates Mediator subunits to most directly affected genes, which, in conjunction with gene set enrichment analysis, allows us to construct an interaction map of Mediator subunits and transcription factors.

  17. Position-dependency of Fuel Pin Homogenization in a Pressurized Water Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Woong; Kim, Yonghee [Korea Advanced Institute of Science and Technolgy, Daejeon (Korea, Republic of)

    2016-05-15

    By considering multi-physics effects more comprehensively, it is possible to acquire precise local parameters, which can result in a more accurate core design and safety assessment. A conventional approach to multi-physics neutronics calculation for the pressurized water reactor (PWR) is to apply nodal methods. Since the nodal methods are basically based on the use of assembly-wise homogenized parameters, additional pin power reconstruction processes are necessary to obtain local power information. In the past, pin-by-pin core calculation was impractical due to the limited capability of computational hardware. With the rapid advancement of computer technology, it is now quite practical to perform direct pin-by-pin core calculations. As such, fully heterogeneous transport solvers based on both stochastic and deterministic methods have been developed for the acquisition of exact local parameters. However, 3-D transport reactor analysis is still challenging because of the very high computational requirements. The position-dependency of the fuel pin homogenized cross sections in a small PWR core has been quantified via comparison of infinite FA and 2-D whole-core calculations with the use of high-fidelity MC simulations. It is found that the pin environment effect is especially obvious in FAs bordering the baffle reflector regions. It is also noted that the downscattering cross section is rather sensitive to the spectrum changes of the pins. It is expected that the pinwise homogenized cross sections need to be corrected somehow for accurate pin-by-pin core calculations in the peripheral region of the reactor core.

  18. Brown-McLean Syndrome in a Pediatric Patient

    Science.gov (United States)

    Tourkmani, Abdo Karim; Martinez, Jaime D.; Berrones, David; Juárez-Domínguez, Brenda Y.; Beltrán, Francisco; Galor, Anat

    2015-01-01

    The purpose of this manuscript is to report the case of a 12-year-old patient who presented for routine ophthalmic examination after congenital cataract surgery performed at 2 months of age. The patient was diagnosed with bilateral Brown-McLean syndrome by slit lamp examination. No treatment was required because the patient was asymptomatic and had a clear central cornea. This is the first described case of Brown-McLean syndrome in a pediatric patient, underscoring the importance of clinical examination in the pediatric age group after cataract surgery because of the risk of developing peripheral corneal edema. PMID:26034485

  19. Midlatitude Continental Convective Clouds Experiment (MC3E)

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, MP; Petersen, WA; Del Genio, AD; Giangrande, SE; Heymsfield, A; Heymsfield, G; Hou, AY; Kollias, P; Orr, B; Rutledge, SA; Schwaller, MR; Zipser, E

    2010-04-10

    The Midlatitude Continental Convective Clouds Experiment (MC3E) will take place in central Oklahoma during the April–May 2011 period. The experiment is a collaborative effort between the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility and the National Aeronautics and Space Administration’s (NASA) Global Precipitation Measurement (GPM) mission Ground Validation (GV) program. The field campaign leverages the unprecedented observing infrastructure currently available in the central United States, combined with an extensive sounding array, remote sensing and in situ aircraft observations, NASA GPM ground validation remote sensors, and new ARM instrumentation purchased with American Recovery and Reinvestment Act funding. The overarching goal is to provide the most complete characterization of convective cloud systems, precipitation, and the environment that has ever been obtained, providing constraints for model cumulus parameterizations and space-based rainfall retrieval algorithms over land that have never before been available.

  20. Continuous energy Monte Carlo method based homogenization multi-group constants calculation

    International Nuclear Information System (INIS)

    Li Mancang; Wang Kan; Yao Dong

    2012-01-01

    The efficiency of the standard two-step reactor physics calculation relies on the accuracy of the multi-group constants from the assembly-level homogenization process. In contrast to the traditional deterministic methods, generating the homogenized cross sections via the Monte Carlo method overcomes the difficulties in geometry and treats energy as a continuum, thus providing more accurate parameters. Besides, the same code and data bank can be used for a wide range of applications, resulting in great versatility when using Monte Carlo codes for homogenization. As the first stage in realizing Monte Carlo based lattice homogenization, the track length scheme is used as the foundation of cross section generation, which is straightforward. The scattering matrix and Legendre components, however, require special techniques; the Scattering Event method was proposed to solve this problem. There are no continuous-energy counterparts in the Monte Carlo calculation for neutron diffusion coefficients, so P1 cross sections were used to calculate the diffusion coefficients for diffusion reactor simulator codes. BN theory is applied to take the leakage effect into account when an infinite lattice of identical symmetric motives is assumed. The MCMC code was developed and applied to four assembly configurations to assess its accuracy and applicability. At the core level, a PWR prototype core is examined. The results show that the Monte Carlo based multi-group constants behave well on average. The method could be applied to nuclear reactor cores with complicated configurations to gain higher accuracy. (authors)
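
    The track-length scheme mentioned above has a simple core: every particle track of length ℓ in a region contributes ℓ to the flux tally and ℓ·σ(E) to the reaction-rate tally, so the group constant is the ratio of the two. The Python sketch below shows this for a toy one-region, one-group collapse; the cross-section model and "track data" are entirely hypothetical stand-ins for output of a real MC transport run.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sigma_t(E):
        """Toy continuous-energy total cross section (1/cm); illustrative only."""
        return 0.5 + 0.2 / np.sqrt(E)

    # Stand-ins for MC transport output: per-segment track lengths (cm) and
    # the neutron energy (MeV) on each segment, all inside one region.
    lengths = rng.exponential(1.0, size=10_000)
    energies = rng.uniform(0.01, 2.0, size=10_000)

    # Collapse to one broad group over [0.01, 2.0] MeV:
    #   sigma_g = sum(l_i * sigma(E_i)) / sum(l_i)   (flux-weighted average)
    flux_tally = lengths.sum()
    rate_tally = (lengths * sigma_t(energies)).sum()
    sigma_g = rate_tally / flux_tally
    print(f"collapsed one-group cross section: {sigma_g:.4f} 1/cm")
    ```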

  1. MCNPX simulation of proton dose distribution in homogeneous and CT phantoms

    International Nuclear Information System (INIS)

    Lee, C.C.; Lee, Y.J.; Tung, C.J.; Cheng, H.W.; Chao, T.C.

    2014-01-01

    A dose simulation system was constructed based on the MCNPX Monte Carlo package to simulate proton dose distributions in homogeneous and CT phantoms. Conversion from the Hounsfield units of a patient CT image set to the material information necessary for Monte Carlo simulation is based on Schneider's approach. In order to validate this simulation system, an inter-comparison of depth-dose distributions among those obtained from the MCNPX, GEANT4 and FLUKA codes for a 160 MeV monoenergetic proton beam incident normally on the surface of a homogeneous water phantom was performed. For dose validation within the CT phantom, direct comparison with measurement is infeasible. Instead, this study took the approach of indirectly comparing the 50% ranges (R50%) along the central axis obtained by our system to the NIST CSDA ranges for beams with 160 and 115 MeV energies. The comparison within the homogeneous phantom shows good agreement: differences in simulated R50% among the three codes are less than 1 mm. For results within the CT phantom, the MCNPX-simulated water-equivalent Req,50% are compatible with the CSDA water-equivalent ranges from the NIST database, with differences of 0.7 and 4.1 mm for the 160 and 115 MeV beams, respectively. - Highlights: ► Proton dose simulation based on MCNPX 2.6.0 in homogeneous and CT phantoms. ► CT number (HU) conversion to electron density based on Schneider's approach. ► Good agreement among MCNPX, GEANT4 and FLUKA codes in a homogeneous water phantom. ► Water-equivalent R50% in CT phantoms are compatible with those of the NIST database
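
    The HU-to-material step can be sketched as a piecewise-linear mapping in the spirit of a Schneider-type conversion. The breakpoints below are hypothetical placeholders, not the calibration used in the study.

    ```python
    import numpy as np

    # Hypothetical piecewise-linear HU -> mass density (g/cm^3) calibration,
    # in the spirit of a Schneider-type conversion (values are placeholders).
    hu_nodes = np.array([-1000.0, -100.0, 0.0, 100.0, 1000.0, 3000.0])
    rho_nodes = np.array([0.001, 0.93, 1.00, 1.07, 1.60, 2.80])

    def hu_to_density(hu):
        """Interpolate density from HU; values outside the table are clamped."""
        return np.interp(hu, hu_nodes, rho_nodes)

    # Each density interval would then be assigned an elemental composition
    # (air, adipose, soft tissue, bone, ...) for the Monte Carlo material card.
    for hu in (-1000, -50, 40, 800):
        print(f"HU {hu:5d} -> {hu_to_density(hu):.3f} g/cm^3")
    ```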

  2. A new DWT/MC/DPCM video compression framework based on EBCOT

    Science.gov (United States)

    Mei, L. M.; Wu, H. R.; Tan, D. M.

    2005-07-01

    A novel Discrete Wavelet Transform (DWT)/Motion Compensation (MC)/Differential Pulse Code Modulation (DPCM) video compression framework is proposed in this paper. Although Discrete Cosine Transform (DCT)/MC/DPCM is the mainstream framework for video coders in industry and international standards, the idea of DWT/MC/DPCM has existed for more than a decade in the literature and its investigation is still ongoing. The contribution of this work is twofold. Firstly, the Embedded Block Coding with Optimal Truncation (EBCOT) is used here as the compression engine for both intra- and inter-frame coding, which provides a good compression ratio and an embedded rate-distortion (R-D) optimization mechanism. This is an extension of the EBCOT application from still images to videos. Secondly, this framework offers a good interface for the Perceptual Distortion Measure (PDM) based on the Human Visual System (HVS), where the Mean Squared Error (MSE) can be easily replaced with the PDM in the R-D optimization. Some preliminary results are reported here and compared with benchmarks such as MPEG-2 and MPEG-4 version 2. The results demonstrate that under the specified conditions the proposed coder outperforms the benchmarks in terms of rate vs. distortion.

  3. The Midlatitude Continental Convective Clouds Experiment (MC3E)

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, Mark P.; Petersen, Walt A.; Bansemer, Aaron; Bharadwaj, Nitin; Carey, Larry; Cecil, D. J.; Collis, Scott M.; Del Genio, Anthony D.; Dolan, Brenda A.; Gerlach, J.; Giangrande, Scott; Heymsfield, Andrew J.; Heymsfield, Gerald; Kollias, Pavlos; Lang, T. J.; Nesbitt, Steve W.; Neumann, Andrea; Poellot, M. R.; Rutledge, Steven A.; Schwaller, Mathew R.; Tokay, Ali; Williams, C. R.; Wolff, D. B.; Xie, Shaocheng; Zipser, Edward J.

    2016-10-18

    The Midlatitude Continental Convective Clouds Experiment (MC3E), a field program jointly led by the U.S. Department of Energy's Atmospheric Radiation Measurement program and the NASA Global Precipitation Measurement (GPM) Mission, was conducted in south-central Oklahoma during April–May 2011. MC3E science objectives were motivated by the need to improve understanding of midlatitude continental convective cloud system lifecycles, microphysics, and GPM precipitation retrieval algorithms. To achieve these objectives, a multi-scale surface- and aircraft-based in situ and remote sensing observing strategy was employed. A variety of cloud and precipitation events were sampled during MC3E, of which results from three deep convective events are highlighted. Vertical structure, air motions, precipitation drop-size distributions and ice properties were retrieved from multi-wavelength radar, profiler, and aircraft observations for an MCS on 11 May. Aircraft observations for another MCS observed on 20 May were used to test agreement between observed radar reflectivities and those calculated with forward-modeled reflectivity and microwave brightness temperatures using in situ particle size distributions and ice water content. Multi-platform observations of a supercell that occurred on 23 May allowed for an integrated analysis of kinematic and microphysical interactions. A core updraft of 25 m/s supported growth of hail and large rain drops. Data collected during the MC3E campaign are being used in a number of current and ongoing research projects and are available through the DOE ARM and NASA data archives.

  4. Spatial homogenization method based on the inverse problem

    International Nuclear Information System (INIS)

    Tóta, Ádám; Makai, Mihály

    2015-01-01

    Highlights: • We derive a spatial homogenization method in slab and cylindrical geometries. • The fluxes and the currents on the boundary are preserved. • The reaction rates and the integral of the fluxes are preserved. • We present verification computations utilizing two and four energy groups. - Abstract: We present a method for deriving homogeneous multi-group cross sections to replace a heterogeneous region's multi-group cross sections, provided that the fluxes, the currents on the external boundary, the reaction rates and the integral of the fluxes are preserved. We consider one-dimensional geometries: a symmetric slab and a homogeneous cylinder. Assuming that the boundary fluxes are given, two response matrices (RMs) can be defined concerning the current and the flux integral. The first one derives the boundary currents from the boundary fluxes, while the second one derives the flux integrals from the boundary fluxes. Further RMs can be defined that connect reaction rates to the boundary fluxes. Assuming that these matrices are known, we present formulae that reconstruct the multi-group diffusion cross-section matrix, the diffusion coefficients and the reaction cross sections for one-dimensional (1D) homogeneous regions. We apply these formulae to 1D heterogeneous regions and thus obtain a homogenization method. This method produces an equivalent homogeneous material such that the fluxes and the currents on the external boundary, the reaction rates and the integral of the fluxes are preserved for any boundary fluxes. We carry out the exact derivations in 1D slab and cylindrical geometries. Verification computations for the presented homogenization method were performed using two- and four-group material cross sections, both in slab and in cylindrical geometry
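
    A one-group, source-free specialization of the slab case makes the reconstruction concrete. For a symmetric homogeneous slab of half-width a with boundary flux φ_b, diffusion theory gives |J|/φ_b = Dκ·tanh(κa) and ∫φ dx / φ_b = 2·tanh(κa)/κ with κ = √(Σa/D), so the two response entries determine κ (one nonlinear equation) and then D and Σa. The sketch below, with hypothetical response values, is our one-group illustration of the idea, not the authors' multi-group formulation.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    # One-group, source-free symmetric slab of half-width a (cm). Inputs are the
    # two "response" values produced by a heterogeneous reference calculation:
    #   r1 = |boundary current| / boundary flux
    #   r2 = flux integral      / boundary flux
    # Both numbers are hypothetical, chosen for illustration.
    a = 5.0
    r1 = 0.12
    r2 = 7.5

    # Diffusion theory for the homogeneous equivalent gives
    #   r1 = D * kappa * tanh(kappa * a),  r2 = 2 * tanh(kappa * a) / kappa,
    # with kappa = sqrt(Sigma_a / D). Solve r2 for kappa, then recover D, Sigma_a.
    f = lambda k: 2.0 * np.tanh(k * a) / k - r2
    kappa = brentq(f, 1e-6, 10.0)
    D = r1 / (kappa * np.tanh(kappa * a))
    sigma_a = D * kappa**2
    print(f"kappa={kappa:.4f} 1/cm, D={D:.4f} cm, Sigma_a={sigma_a:.5f} 1/cm")
    ```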

  5. McStas

    DEFF Research Database (Denmark)

    Willendrup, Peter Kjær; Farhi, Emmanuel; Bergbäck Knudsen, Erik

    2014-01-01

    The McStas neutron ray-tracing simulation package is a collaboration between Risø DTU, ILL, University of Copenhagen and the PSI. During its lifetime, McStas has evolved to become the world-leading software in the area of neutron scattering simulations for instrument design, optimisation, and virtual experiments. McStas is being actively used for the design-update of the European Spallation Source (ESS) in Lund. This paper includes an introduction to the McStas package and recent and ongoing simulation projects. Further, new features in releases McStas 1.12c and 2.0 are discussed.

  6. COMS eye plaque brachytherapy dosimetry simulations for 103Pd, 125I, and 131Cs

    International Nuclear Information System (INIS)

    Melhus, Christopher S.; Rivard, Mark J.

    2008-01-01

    Monte Carlo (MC) simulations were performed to estimate brachytherapy dose distributions for Collaborative Ocular Melanoma Study (COMS) eye plaques. Brachytherapy seed models 200, 6711, and CS-1 Rev2, carrying the 103Pd, 125I, and 131Cs radionuclides, respectively, were modeled and benchmarked against previously published values. Calculated dose rate constants (MC Λ) were 0.684, 0.924, and 1.052 cGy h⁻¹ U⁻¹ (±2.6%, k=1 uncertainty) for models 200, 6711, and CS-1 Rev2, respectively. The seeds were distributed into 10, 12, 14, 16, 18, 20, and 22 mm diameter COMS eye plaques. Simulations were performed in both heterogeneous and homogeneous environments, where the latter were in-water and the former included the Silastic seed carrier insert and gold-alloy plaque. MC-based homogeneous central-axis dose distributions agreed within 2%±1% (±1 s.d.) with hand-calculated values. For heterogeneous simulations, notable photon attenuation was observed, with dose reductions at 5 mm of 19%, 11%, and 9% for 103Pd, 125I, and 131Cs, respectively. A depth-dependent correction factor was derived to correct homogeneous central-axis dose distributions for plaque component heterogeneities, which were found to be significant at short radial distances
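
    The depth-dependent correction factor amounts to a simple multiplicative table lookup on the homogeneous dose. The sketch below is hedged: the 5 mm value for 125I (0.89, i.e. the 11% reduction above) comes from the abstract, while every other number is a hypothetical placeholder.

    ```python
    import numpy as np

    # Hypothetical depth-dependent heterogeneity correction factors CF(r):
    # ratio of the heterogeneous (gold plaque + Silastic insert) dose to the
    # homogeneous in-water dose on the plaque central axis. Only the 5 mm
    # entry (0.89 for 125I) is taken from the abstract; the rest are placeholders.
    r_mm = np.array([1.0, 2.0, 5.0, 10.0])     # depth from inner sclera (mm)
    cf = np.array([0.75, 0.80, 0.89, 0.93])    # illustrative 125I values

    def corrected_dose(r, d_hom):
        """Apply CF(r) to a homogeneous central-axis dose d_hom (Gy)."""
        return np.interp(r, r_mm, cf) * d_hom

    print(f"5 mm: 85.0 Gy homogeneous -> {corrected_dose(5.0, 85.0):.1f} Gy")
    ```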

  7. Determining the degree of powder homogeneity using PC-based program

    Directory of Open Access Journals (Sweden)

    Đuragić Olivera M.

    2010-01-01

    Full Text Available The mixing of powders and the quality control of the obtained mixtures are critical operations involved in the processing of granular materials in the chemical, metallurgical, food and pharmaceutical industries. Studies on mixing efficiency and the time needed to achieve homogeneity in powder mash production are of significant importance. Depending on the characteristics of the materials, a number of methods have been used for homogeneity tests. Very often, the degree of mixing has been determined by analyzing images of particle arrays in the sample using microscopy, photography and/or video tools. In this paper, a new PC-based method for determining the number of particles in powder homogeneity tests has been developed. Microtracers®, red iron particles, were used as an external tracer added before mixing. Iron particles in the samples of the mixtures were separated by a rotary magnet and spread onto a filter paper. The filter paper was sprayed with a 50% solution of ethanol for color development and the particles were counted, where the number of spots represents the concentration of the added tracer. The number of spots was counted manually, as well as by the developed PC program. The program, which analyzes scanned filter papers with spots, is based on digital image analysis: red spots are converted through a few filters into black and white, and counted. Results obtained by manual and PC counting were compared, and a high correlation was established between the two counting methods.
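
    The automated counting step can be sketched in a few lines: threshold the scanned image for red pixels, then count connected components. The code below is a schematic Python analogue of such a counter (using scipy), not the actual PC program described; the thresholds are assumptions that would need calibration.

    ```python
    import numpy as np
    from scipy import ndimage

    def count_red_spots(rgb, red_margin=40, min_pixels=4):
        """Count red spots in an RGB image array of shape (h, w, 3).
        A pixel is 'red' when its R channel exceeds both G and B by
        red_margin; blobs smaller than min_pixels are ignored as noise."""
        r = rgb[..., 0].astype(int)
        g = rgb[..., 1].astype(int)
        b = rgb[..., 2].astype(int)
        mask = (r - g > red_margin) & (r - b > red_margin)
        labels, n = ndimage.label(mask)                     # connected components
        sizes = ndimage.sum(mask, labels, range(1, n + 1))  # blob sizes in pixels
        return int((sizes >= min_pixels).sum())

    # Tiny synthetic "scan": grey background with two red spots.
    img = np.full((50, 50, 3), 200, dtype=np.uint8)
    img[10:14, 10:14] = (230, 60, 60)
    img[30:35, 25:30] = (220, 40, 40)
    print(count_red_spots(img))   # -> 2
    ```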

  8. Systematizing Web Search through a Meta-Cognitive, Systems-Based, Information Structuring Model (McSIS)

    Science.gov (United States)

    Abuhamdieh, Ayman H.; Harder, Joseph T.

    2015-01-01

    This paper proposes a meta-cognitive, systems-based, information structuring model (McSIS) to systematize online information search behavior, based on a literature review of information-seeking models. The General Systems Theory's (GST) propositions serve as its framework. Factors influencing information-seekers, such as the individual learning…

  9. Homogenization of Mammalian Cells.

    Science.gov (United States)

    de Araújo, Mariana E G; Lamberti, Giorgia; Huber, Lukas A

    2015-11-02

    Homogenization is the name given to the methodological steps necessary for releasing organelles and other cellular constituents as a free suspension of intact individual components. Most homogenization procedures used for mammalian cells (e.g., cavitation pump and Dounce homogenizer) rely on mechanical force to break the plasma membrane and may be supplemented with osmotic or temperature alterations to facilitate membrane disruption. In this protocol, we describe a syringe-based homogenization method that does not require specialized equipment, is easy to handle, and gives reproducible results. The method may be adapted for cells that require hypotonic shock before homogenization. We routinely use it as part of our workflow to isolate endocytic organelles from mammalian cells. © 2015 Cold Spring Harbor Laboratory Press.

  10. Effective inactivation of Saccharomyces cerevisiae in minimally processed Makgeolli using low-pressure homogenization-based pasteurization.

    Science.gov (United States)

    Bak, Jin Seop

    2015-01-01

    In order to address the limitations associated with the inefficient pasteurization platform used to make Makgeolli, such as the presence of turbid colloidal dispersions in suspension, commercially available Makgeolli was minimally processed using a low-pressure homogenization-based pasteurization (LHBP) process. This continuous process demonstrates that promptly reducing the exposure time of large molecules or insoluble particles to excessive heat can dramatically improve internal quality and decrease irreversible damage. Specifically, colloidal stability increased concomitantly with optimal homogenization (65.0% of maximum, with particles below 25 μm) following two repetitions at 25.0 MPa. However, biochemical parameters such as microbial population, acidity, and the presence of fermentable sugars rarely affected Makgeolli quality. Remarkably, there was a 4.5-log reduction in the number of Saccharomyces cerevisiae target cells at 53.5°C for 70 sec in optimally homogenized Makgeolli, higher than the 37.7% measured in traditionally pasteurized Makgeolli. In contrast to the analytical similarity among homogenized Makgeollis, our objective quality evaluation demonstrated significant differences between pasteurized (or unpasteurized) Makgeolli and LHBP-treated Makgeolli. Keywords: low-pressure homogenization-based pasteurization, Makgeolli, minimal processing-preservation, Saccharomyces cerevisiae, suspension stability.

  11. Heard Island and McDonald Islands Acoustic Plumes: Split-beam Echo sounder and Deep Tow Camera Observations of Gas Seeps on the Central Kerguelen Plateau

    Science.gov (United States)

    Watson, S. J.; Spain, E. A.; Coffin, M. F.; Whittaker, J. M.; Fox, J. M.; Bowie, A. R.

    2016-12-01

    Heard and McDonald islands (HIMI) are two active volcanic edifices on the Central Kerguelen Plateau. Scientists aboard the Heard Earth-Ocean-Biosphere Interactions voyage in early 2016 explored how this volcanic activity manifests itself near HIMI. Using Simrad EK60 split-beam echo sounder and deep tow camera data from RV Investigator, we recorded the distribution of seafloor emissions, providing the first direct evidence of seabed discharge around HIMI and mapping >244 acoustic plume signals. Northeast of Heard, three distinct plume clusters are associated with bubbles (towed camera), and the largest directly overlies a sub-seafloor opaque zone (sub-bottom profiler), with >140 zones observed within 6.5 km. Large temperature anomalies did not characterize any of the acoustic plumes where temperature data were recorded; we therefore suggest that these plumes are cold methane seeps. Acoustic properties (mean volume backscattering and target strength) and morphology (height, width, depth to surface) of plumes around McDonald resembled those northeast of Heard, also suggesting gas bubbles. We observed no bubbles in the very limited towed camera data around McDonald; however, visibility was poor. The acoustic response of the plumes at different frequencies (120 kHz vs. 18 kHz), a technique used to classify water-column scatterers, differed between the two islands, suggesting dissimilar target size (bubble radii) distributions. The environmental context and temporal characteristics of the plumes also differed: Heard plumes were concentrated on flat, sediment-rich plains, whereas around McDonald plumes emanated from sea knolls and mounds with hard volcanic seafloor. The Heard plumes were consistent temporally, while the McDonald plumes varied temporally, possibly related to tides or subsurface processes. Our data and analyses suggest that the HIMI acoustic plumes were likely caused by gas bubbles; however, the bubbles may originate from two or more distinct processes.

  12. Homogenization Kinetics of a Nickel-based Superalloy Produced by Powder Bed Fusion Laser Sintering.

    Science.gov (United States)

    Zhang, Fan; Levine, Lyle E; Allen, Andrew J; Campbell, Carelyn E; Lass, Eric A; Cheruvathur, Sudha; Stoudt, Mark R; Williams, Maureen E; Idell, Yaakov

    2017-04-01

    Additively manufactured (AM) metal components often exhibit fine dendritic microstructures and elemental segregation due to the initial rapid solidification and subsequent melting and cooling during the build process, which without homogenization would adversely affect materials performance. In this letter, we report in situ observation of the homogenization kinetics of an AM nickel-based superalloy using synchrotron small angle X-ray scattering. The identified kinetic time scale is in good agreement with thermodynamic diffusion simulation predictions using microstructural dimensions acquired by ex situ scanning electron microscopy. These findings could serve as a recipe for predicting, observing, and validating homogenization treatments in AM materials.

  13. Hydrologic Process Parameterization of Electrical Resistivity Imaging of Solute Plumes Using POD McMC

    Science.gov (United States)

    Awatey, M. T.; Irving, J.; Oware, E. K.

    2016-12-01

    Markov chain Monte Carlo (McMC) inversion frameworks are becoming increasingly popular in geophysics due to their ability to recover multiple equally plausible geologic features that honor the limited noisy measurements. Standard McMC methods, however, become computationally intractable with increasing dimensionality of the problem, for example, when working with spatially distributed geophysical parameter fields. We present a McMC approach based on a sparse proper orthogonal decomposition (POD) model parameterization that implicitly incorporates the physics of the underlying process. First, we generate training images (TIs) via Monte Carlo simulations of the target process constrained to a conceptual model. We then apply POD to construct basis vectors from the TIs. A small number of basis vectors can represent most of the variability in the TIs, leading to dimensionality reduction. A projection of the starting model into the reduced basis space generates the starting POD coefficients. At each iteration, only coefficients within a specified sampling window are resimulated assuming a Gaussian prior. The sampling window grows at a specified rate as the iterations progress, starting from the coefficients corresponding to the highest-ranked basis vectors and moving toward those of the least informative basis vectors. We found this gradual growth of the sampling window to be more stable than resampling all the coefficients from the first iteration. We demonstrate the performance of the algorithm with both synthetic and lab-scale electrical resistivity imaging of saline tracer experiments, employing the same set of basis vectors for all inversions. We consider two scenarios of unimodal and bimodal plumes. The unimodal plume is consistent with the hypothesis underlying the generation of the TIs, whereas bimodality in plume morphology was not theorized. We show that uncertainty quantification using McMC can proceed in the reduced dimensionality space while accounting for the
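
    The POD parameterization itself reduces to an SVD of the mean-removed training images. The sketch below shows that step and the projection of a model into (and back out of) the reduced space, using synthetic Gaussian-blob "plumes" as stand-ins for the process-based TIs; it does not reproduce the authors' McMC sampler.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    nx, ny, n_ti = 32, 32, 200

    # Synthetic training images: Gaussian "plumes" with random center/width,
    # standing in for Monte Carlo simulations of the target process.
    xx, yy = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
    tis = np.empty((nx * ny, n_ti))
    for j in range(n_ti):
        cx, cy = rng.uniform(8, 24, size=2)
        s = rng.uniform(2.0, 5.0)
        tis[:, j] = np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * s * s)).ravel()

    mean = tis.mean(axis=1, keepdims=True)
    U, S, _ = np.linalg.svd(tis - mean, full_matrices=False)

    # Keep the basis vectors explaining ~99% of the variance.
    energy = np.cumsum(S**2) / np.sum(S**2)
    k = int(np.searchsorted(energy, 0.99)) + 1
    basis = U[:, :k]
    print(f"{k} of {n_ti} basis vectors retained")

    # Project a model into the reduced space and reconstruct it; McMC would
    # perturb the coefficients c (highest-ranked first) rather than the pixels.
    m = tis[:, [0]]
    c = basis.T @ (m - mean)
    m_rec = mean + basis @ c
    err = np.linalg.norm(m - m_rec) / np.linalg.norm(m)
    print(f"relative reconstruction error: {err:.3f}")
    ```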

  14. central t

    Directory of Open Access Journals (Sweden)

    Manuel R. Piña Monarrez

    2007-01-01

    Full Text Available Since Ridge Regression (RR) is a biased estimation that starts from the Least Squares (MC) regression solution, it is vital to establish the conditions under which the central Student's t distribution used for hypothesis testing in MC regression is also applicable to RR. The proof of this important result is presented in this article.

  15. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

    Full Text Available As a popular simulation of photon propagation in turbid media, the main problem of the Monte Carlo (MC) method is its cumbersome computation. In this work, a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to simplify multiple scattering steps into a single-step process through random table querying, thus greatly reducing the computational complexity of the conventional MC algorithm and expediting the computation. The TBRS simulation is a fast alternative to the conventional MC simulation of photon propagation. It retains the merits of flexibility and accuracy of the conventional MC method and adapts well to complex geometric media and various source shapes. Both MC simulations were conducted in a homogeneous medium in our work. We also present a reconstruction approach to estimate the position of a fluorescent source, based on trial-and-error, as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
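
    The table-querying idea can be demonstrated on a single scattering quantity: precompute the cumulative distribution of the deflection angle once, then replace repeated per-step sampling work with a binary search in the table. The sketch assumes a Henyey-Greenstein phase function, a common choice in tissue-optics MC, purely for illustration; it is not the TBRS implementation itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    g = 0.9  # Henyey-Greenstein anisotropy factor (illustrative tissue value)

    def hg_pdf(mu, g):
        """Henyey-Greenstein phase function over mu = cos(theta)."""
        return 0.5 * (1 - g * g) / (1 + g * g - 2 * g * mu) ** 1.5

    # Precompute the lookup table once: CDF of mu on a fine grid.
    mu = np.linspace(-1.0, 1.0, 2001)
    pdf = hg_pdf(mu, g)
    cdf = np.cumsum(pdf)
    cdf = (cdf - cdf[0]) / (cdf[-1] - cdf[0])   # normalize to [0, 1]

    def sample_mu_table(n):
        """Single-step random table query instead of per-step analytic work."""
        u = rng.random(n)
        idx = np.searchsorted(cdf, u)
        return mu[idx]

    samples = sample_mu_table(100_000)
    # Sanity check against the analytic mean of HG scattering, <mu> = g
    print(f"sample mean cos(theta): {samples.mean():.3f} (expected ~{g})")
    ```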

  16. Robust PRNG based on homogeneously distributed chaotic dynamics

    International Nuclear Information System (INIS)

    Garasym, Oleg; Taralova, Ina; Lozi, René

    2016-01-01

    This paper is devoted to the design of a new chaotic Pseudo Random Number Generator (CPRNG). Exploring several network topologies of coupled 1-D chaotic maps, we focus first on two-dimensional networks. Two topologically coupled maps are studied: TTLrc (non-alternate) and TTLsc (alternate). The primary idea of the novel maps is based on an original coupling of the tent and logistic maps, chosen to achieve excellent random properties and homogeneous (uniform) density in the phase plane, thus guaranteeing maximum security when used for chaos-based cryptography. To this aim, two new nonlinear CPRNGs, MTTL2sc and NTTL2, are proposed. The maps successfully passed numerous statistical, graphical and numerical tests, thanks to the proposed ring coupling and injection mechanisms. (paper)
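
    A schematic of the underlying idea, though not the exact TTLrc/TTLsc maps of the paper, is easy to write down: two states are iterated, one through a tent map and one through a logistic map, with ring coupling injecting each state into the other's input, and bits are harvested from a fast-varying digit of the state. The coupling constant and bit-extraction rule below are illustrative assumptions.

    ```python
    # Schematic 2-D coupled tent-logistic map PRNG; the coupling and constants
    # are illustrative, not the TTL_rc / TTL_sc maps defined in the paper.

    def tent(x):
        return 2.0 * x if x < 0.5 else 2.0 * (1.0 - x)

    def logistic(x):
        return 4.0 * x * (1.0 - x)

    def cprng_bits(x, y, n, eps=0.3):
        """Ring coupling: each map is driven partly by the other's state."""
        bits = []
        for _ in range(n):
            x, y = (tent((1 - eps) * x + eps * y),
                    logistic((1 - eps) * y + eps * x))
            # harvest one bit from a fast-varying digit of the state
            bits.append(int(x * 2**16) & 1)
        return bits

    b = cprng_bits(0.123456, 0.654321, 10_000)
    print(f"fraction of ones: {sum(b) / len(b):.3f}")  # ~0.5 for a balanced stream
    ```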

  17. McDonaldization and Job Insecurity

    Directory of Open Access Journals (Sweden)

    Emeka W. Dumbili

    2013-06-01

    Full Text Available The article examines how and why the McDonaldization of the banking system in Nigeria engenders job insecurity. This is imperative because it provides an explicit revelation of the root causes of job insecurity in the sector, which other scholars have totally omitted. No Nigerian scholar has applied the thesis in relation to job insecurity, which is the major problem in Nigeria's banking industry. The article, based on the analysis of secondary data and observations, therefore draws on the McDonaldization thesis to examine the upsurge of rationalization in the sector since the consolidation exercise began in 2005. The article argues that the sector's rising rationalization and the ensuing efficiency, calculability, predictability, and control are necessary. However, these have inevitably engendered job insecurity and its adverse consequences. Based on critical analyses of the available evidence, the article concludes that the best option is to begin resisting those McDonaldization processes, especially ones that replace humans with nonhuman technology or turn customers into unpaid workers.

  18. Homogeneity of Inorganic Glasses

    DEFF Research Database (Denmark)

    Jensen, Martin; Zhang, L.; Keding, Ralf

    2011-01-01

    Homogeneity of glasses is a key factor determining their physical and chemical properties and overall quality. However, quantification of the homogeneity of a variety of glasses is still a challenge for glass scientists and technologists. Here, we show a simple approach by which the homogeneity … of different glass products can be quantified and ranked. This approach is based on determination of both the optical intensity and the dimension of the striations in glasses. These two characteristic values are obtained using the image processing method established recently. The logarithmic ratio between

  19. [Gene-gene interaction on central obesity in school-aged children in China].

    Science.gov (United States)

    Fu, L W; Zhang, M X; Wu, L J; Gao, L W; Mi, J

    2017-07-10

    Objective: To investigate the possible effects of six obesity-associated SNPs on central obesity, and to examine whether there is an interaction among the six SNPs in the etiology of central obesity in school-aged children in China. Methods: A total of 3 502 school-aged children included in the Beijing Child and Adolescent Metabolic Syndrome (BCAMS) study were selected, and based on the age- and sex-specific waist circumference (WC) standards of the BCAMS study, 1 196 central obesity cases and 2 306 controls were identified. Genomic DNA was extracted from peripheral blood white cells using the salt fractionation method. A total of 6 single nucleotide polymorphisms (FTO rs9939609, MC4R rs17782313, BDNF rs6265, PCSK1 rs6235, SH2B1 rs4788102, and CSK rs1378942) were genotyped by TaqMan allelic discrimination assays with the GeneAmp 7900 sequence detection system (Applied Biosystems, Foster City, CA, USA). A logistic regression model was used to investigate the association between the 6 SNPs and central obesity. Gene-gene interactions among the 6 polymorphic loci were analyzed using the Generalized Multifactor Dimensionality Reduction (GMDR) method, and a logistic regression model was then constructed to confirm the best combination of loci identified by GMDR. Results: After adjusting for gender, age, Tanner stage, physical activity and family history of obesity, the FTO rs9939609-A, MC4R rs17782313-C and BDNF rs6265-G alleles were associated with central obesity under an additive genetic model (OR=1.24, 95%CI: 1.06-1.45, P=0.008; OR=1.26, 95%CI: 1.11-1.43, P=2.98×10(-4); OR=1.18, 95%CI: 1.06-1.32, P=0.003). GMDR analysis showed a significant gene-gene interaction between MC4R rs17782313 and BDNF rs6265 (P=0.001). The best two-locus combination showed a cross-validation consistency of 10/10 and a testing accuracy of 0.539. This interaction showed the maximum consistency and minimum prediction error among all gene-gene interaction models evaluated. Moreover, the

  20. Bilipschitz embedding of homogeneous fractals

    OpenAIRE

    Lü, Fan; Lou, Man-Li; Wen, Zhi-Ying; Xi, Li-Feng

    2014-01-01

    In this paper, we introduce a class of fractals named homogeneous sets based on some measure versions of homogeneity, uniform perfectness and doubling. This fractal class includes all Ahlfors-David regular sets, but most of them are irregular in the sense that they may have different Hausdorff dimensions and packing dimensions. Using Moran sets as main tool, we study the dimensions, bilipschitz embedding and quasi-Lipschitz equivalence of homogeneous fractals.

  1. Long term spatial and temporal rainfall trends and homogeneity analysis in Wainganga basin, Central India

    Directory of Open Access Journals (Sweden)

    Arun Kumar Taxak

    2014-08-01

    Full Text Available Gridded rainfall data of 0.5×0.5° resolution (CRU TS 3.21) were analysed to study long-term spatial and temporal trends on annual and seasonal scales in the Wainganga river basin, located in Central India, during 1901–2012. After testing for the presence of autocorrelation, the Mann–Kendall (or Modified Mann–Kendall) test was applied to non-autocorrelated (or autocorrelated) series to detect trends in the rainfall data. Theil and Sen's slope estimator test was used to find the magnitude of change over the time period. For detecting the most probable change year, the Pettitt–Mann–Whitney test was applied. The rainfall series was then divided into two partial duration series to find changes in trends before and after the change year. ArcGIS was used to explore spatial patterns of the trends over the entire basin. Though most of the grid points show a decreasing trend in annual rainfall, only seven grids have a significant decreasing trend during 1901–2012. On the basis of seasonal trend analysis, a non-significant increasing trend is observed only in the post-monsoon season, while seven grid points show a significant decreasing trend in monsoon rainfall and non-significant trends in pre-monsoon and winter rainfall over the last 112 years. Over the study period, an overall 8.45% decrease in annual rainfall is estimated. The most probable year of change was found to be 1948 in annual and monsoonal rainfall. There is an increasing rainfall trend in the basin during the period 1901–1948, which reverses during the period 1949–2012, resulting in a decreasing rainfall trend in the basin. Homogeneous trends in annual and seasonal rainfall over the grid points are exhibited in the basin by van Belle and Hughes' homogeneity trend test.
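
    The two trend statistics named above are easy to make concrete. The sketch below implements the Mann–Kendall Z test (no-ties variance) and the Theil–Sen median slope on a simulated 112-year rainfall series; the data and the simple non-autocorrelated variant are assumptions for illustration, not the CRU TS grid.

```python
# Sketch: Mann-Kendall trend test and Theil-Sen slope for an annual rainfall
# series, the two statistics used in the study (simulated data, not CRU TS).
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0        # variance assuming no ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return z, 2 * (1 - norm.cdf(abs(z)))            # Z statistic, two-sided p

def sens_slope(x):
    n = len(x)
    return np.median([(x[j] - x[i]) / (j - i)
                      for i in range(n - 1) for j in range(i + 1, n)])

rain = 1200 - 0.8 * np.arange(112) + np.random.default_rng(1).normal(0, 60, 112)
z, p = mann_kendall(rain)
print(f"Z = {z:.2f}, p = {p:.3f}, slope = {sens_slope(rain):.2f} mm/yr")
```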

  2. Identification of Homogeneous Stations for Quality Monitoring Network of Mashhad Aquifer Based on Nitrate Pollution

    Directory of Open Access Journals (Sweden)

    Moslem Akbarzadeh

    2017-01-01

    Full Text Available Introduction: For water resources monitoring, evaluation of groundwater quality is obtained via detailed analysis of pollution data. The most fundamental analyses are the exact delineation of dangerous zones and the identification of homogeneous stations in terms of pollution. For quality evaluation, monitoring can be improved by identifying wells that are homogeneous in terms of pollution. A clustering method is therefore essential for handling the large amounts of quality data involved in aquifer monitoring and quality evaluation, including identification of homogeneous stations of the monitoring network and their clustering based on pollution. In this study, with the purpose of evaluating the quality of the Mashhad aquifer, clustering based on Euclidean distance and entropy criteria has been studied. Cluster analysis is the task of grouping a set of objects in such a way that objects in the same group (called a cluster) are more similar, in some sense, to each other than to those in other groups (clusters). SNI, a combined entropy measure for clustering, is calculated by dividing the mutual information of two values (pollution index values) by their joint entropy. These measures are applied as similarity distance criteria for clustering the monitoring stations. Materials and Methods: First, nitrate data (as the pollution index) and electrical conductivity (EC) (as a covariate) were collected from 287 wells for the statistical period 2002 to 2011. Having identified the outlying data, estimated non-observed points by the spatial-temporal kriging method and then standardized the data, the clustering process was carried out. The similarity distances between wells were calculated through a clustering process based on Euclidean distance and entropy (SNI) criteria. This difference is explained by characteristics such as the location of wells (longitude and latitude) and the pollution index (nitrate). Having obtained the similarity distance of each well to the others, the hierarchical clustering
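
    A minimal sketch of the Euclidean-distance branch of such a workflow, using SciPy's hierarchical clustering on synthetic standardized nitrate series; the well values, the average-linkage choice and the two-cluster cut are illustrative assumptions, not the Mashhad data.

```python
# Sketch: hierarchical clustering of monitoring wells on a pollution index,
# with Euclidean distance as the similarity criterion (synthetic nitrate data).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
# rows = wells; columns = standardized nitrate values over ten years
nitrate = np.vstack([rng.normal(0, 1, (150, 10)),      # background wells
                     rng.normal(2, 1, (137, 10))])     # polluted wells
Z = linkage(nitrate, method="average", metric="euclidean")
labels = fcluster(Z, t=2, criterion="maxclust")        # two homogeneous groups
print(np.bincount(labels)[1:])                         # wells per cluster
```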

  3. Reflector homogenization

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, R.; Ragusa, J.; Santandrea, S. [Commissariat a l'Energie Atomique, Direction de l'Energie Nucleaire, Service d'Etudes de Reacteurs et de Modelisation Avancee, CEA de Saclay, DM2S/SERMA 91 191 Gif-sur-Yvette cedex (France)]. e-mail: richard.sanchez@cea.fr

    2004-07-01

    The problem of the determination of a homogeneous reflector that preserves a set of prescribed albedo is considered. Duality is used for a direct estimation of the derivatives needed in the iterative calculation of the optimal homogeneous cross sections. The calculation is based on the preservation of collapsed multigroup albedo obtained from detailed reference calculations and depends on the low-order operator used for core calculations. In this work we analyze diffusion and transport as low-order operators and argue that the P{sub 0} transfers are the best choice for the unknown cross sections to be adjusted. Numerical results illustrate the new approach for SP{sub N} core calculations. (Author)

  5. Human reliability-based MC and A models for detecting insider theft

    International Nuclear Information System (INIS)

    Duran, Felicia Angelica; Wyss, Gregory Dane

    2010-01-01

    Material control and accounting (MC and A) safeguards operations that track and account for critical assets at nuclear facilities provide a key protection approach for defeating insider adversaries. These activities, however, have been difficult to characterize in ways that are compatible with the probabilistic path analysis methods that are used to systematically evaluate the effectiveness of a site's physical protection (security) system (PPS). MC and A activities have many similar characteristics to operator procedures performed in a nuclear power plant (NPP) to check for anomalous conditions. This work applies human reliability analysis (HRA) methods and models for human performance of NPP operations to develop detection probabilities for MC and A activities. This has enabled the development of an extended probabilistic path analysis methodology in which MC and A protections can be combined with traditional sensor data in the calculation of PPS effectiveness. The extended path analysis methodology provides an integrated evaluation of a safeguards and security system that addresses its effectiveness for attacks by both outside and inside adversaries.

  6. Design of homogeneous trench-assisted multi-core fibers based on analytical model

    DEFF Research Database (Denmark)

    Ye, Feihong; Tu, Jiajing; Saitoh, Kunimasa

    2016-01-01

    We present a design method of homogeneous trench-assisted multicore fibers (TA-MCFs) based on an analytical model utilizing an analytical expression for the mode coupling coefficient between two adjacent cores. The analytical model can also be used for crosstalk (XT) properties analysis, such as ...

  7. One Hundred Years of Psychiatry at Johns Hopkins: A Story of Meyer to McHugh.

    Science.gov (United States)

    DePaulo, J Raymond

    2017-04-01

    This article describes a history of clinical methods and constructs that guide Psychiatry at Johns Hopkins Phipps Clinic today. The contributions of Adolf Meyer and Paul McHugh are central and closely connected. Both emphasize the clinical examination as the central practice of psychiatry as a specialty within medicine. Meyer's comprehensive examination of the patient became the centerpiece of his approach and was the standard for psychiatrists in the English-speaking world. McHugh, with Phillip Slavney, developed a pluralistic and practical framework for interpreting that history and examination. Both argued against the uncritical use of the modern disease construct. McHugh argues that the disease construct, although fundamental, is but one of four useful "perspectives of psychiatry" and is, thus, an insufficient basis for psychiatric practice. The perspectives could be used as an organizing framework by all physicians who seek a practical and truly personalized approach to the care of patients.

  8. Potential effects of the next 100 billion hamburgers sold by McDonald's.

    Science.gov (United States)

    Spencer, Elsa H; Frank, Erica; McIntosh, Nichole F

    2005-05-01

    McDonald's has sold >100 billion beef-based hamburgers worldwide with a potentially considerable health impact. This paper explores whether there would be any advantages if the next 100 billion burgers were instead plant-based burgers. Nutrient compositions of the beef hamburger patty and the McVeggie burger patty were obtained from the McDonald's website; sales data were obtained from McDonald's customer service. Consuming 100 billion McDonald's beef burgers versus the same company's McVeggie burgers would provide, approximately, on average, an additional 550 million pounds of saturated fat and 1.2 billion total pounds of fat, as well as 1 billion fewer pounds of fiber, 660 million fewer pounds of protein, and no difference in calories. These data suggest that McDonald's new McVeggie burger represents a less harmful fast-food choice than the beef burger.

  9. A numerical homogenization method for heterogeneous, anisotropic elastic media based on multiscale theory

    KAUST Repository

    Gao, Kai

    2015-06-05

    The development of reliable methods for upscaling fine-scale models of elastic media has long been an important topic for rock physics and applied seismology. Several effective medium theories have been developed to provide elastic parameters for materials such as finely layered media or randomly oriented or aligned fractures. In such cases, the analytic solutions for upscaled properties can be used for accurate prediction of wave propagation. However, such theories cannot be applied directly to homogenize elastic media with more complex, arbitrary spatial heterogeneity. Therefore, we have proposed a numerical homogenization algorithm based on multiscale finite-element methods for simulating elastic wave propagation in heterogeneous, anisotropic elastic media. Specifically, our method used multiscale basis functions obtained from a local linear elasticity problem with appropriately defined boundary conditions. Homogenized, effective medium parameters were then computed using these basis functions, and the approach applied a numerical discretization that was similar to the rotated staggered-grid finite-difference scheme. Comparisons of the results from our method and from conventional, analytical approaches for finely layered media showed that the homogenization reliably estimated elastic parameters for this simple geometry. Additional tests examined anisotropic models with arbitrary spatial heterogeneity in which the average size of the heterogeneities ranged from several centimeters to several meters, and the ratio between the dominant wavelength and the average size of the arbitrary heterogeneities ranged from 10 to 100. Comparisons to finite-difference simulations proved that the numerical homogenization was equally accurate for these complex cases.
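
    For the finely layered benchmark case mentioned above, the classical analytic reference is the Backus (1962) average, against which a numerical homogenization can be checked. Below is a minimal sketch for isotropic layers; the two-layer moduli are arbitrary illustrative values, not from the paper.

```python
# Sketch: Backus (1962) averaging -- the analytic effective medium for finely
# layered isotropic elasticity. Inputs are per-layer Lame parameters and
# thicknesses; output is the equivalent VTI stiffness set.
import numpy as np

def backus_average(lam, mu, h):
    """Return VTI stiffnesses (C11, C33, C13, C44, C66) of a layered stack."""
    w = np.asarray(h, float) / np.sum(h)            # thickness-fraction weights
    avg = lambda q: np.sum(w * q)                   # thickness-weighted mean
    lam, mu = np.asarray(lam, float), np.asarray(mu, float)
    p = lam + 2 * mu                                # P-wave modulus per layer
    C33 = 1.0 / avg(1.0 / p)
    C13 = avg(lam / p) * C33
    C11 = avg(4 * mu * (lam + mu) / p) + avg(lam / p) ** 2 * C33
    C44 = 1.0 / avg(1.0 / mu)
    C66 = avg(mu)
    return C11, C33, C13, C44, C66

# two alternating layers (moduli in GPa), equal thickness
print(backus_average(lam=[10, 5], mu=[20, 8], h=[1, 1]))
```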

  10. Globalization Theory: Lessons from the Exportation of McDonaldization and the New Means of Consumption

    Energy Technology Data Exchange (ETDEWEB)

    Ritzer, George (Maryland, Univ Of - College Pa); Malone, Elizabeth L.(BATTELLE (PACIFIC NW LAB)); Ritzer, George

    2001-07-30

    McDonaldization and the exportation of the new means of consumption tend to support the view that in at least some sectors the world is growing more homogeneous than heterogeneous. Against those globalization theorists who tend to focus on the importance of the local and therefore on heterogeneity, the study of McDonaldization and the new means of consumption emphasizes transnational issues and uniformity throughout the world. Fast-food restaurants do adapt to local markets, but the basic procedures of operation and marketing remain the same across a wide range of international settings. This is true even of indigenous versions. The uniformity is exported by transnational corporations, with nation-states less and less able to control or restrict such exports.

  11. Globalization Theory: Lessons from the Exportation of McDonaldization and the New Means of Consumption

    Energy Technology Data Exchange (ETDEWEB)

    Ritzer, George; Malone, Elizabeth L.

    2000-07-31

    McDonaldization and the exportation of the new means of consumption tend to support the view that in at least some sectors the world is growing more homogeneous than heterogeneous. Against those globalization theorists who tend to focus on the importance of the local and therefore on heterogeneity, the study of McDonaldization and the new means of consumption emphasizes transnational issues and uniformity throughout the world. Fast-food restaurants do adapt to local markets, but the basic procedures of operation and marketing remain the same across a wide range of international settings. This is true even of indigenous versions. The uniformity is exported by transnational corporations, with nation-states less and less able to control or restrict such exports.

  12. Long-term species loss and homogenization of moth communities in Central Europe.

    Science.gov (United States)

    Valtonen, Anu; Hirka, Anikó; Szőcs, Levente; Ayres, Matthew P; Roininen, Heikki; Csóka, György

    2017-07-01

    As global biodiversity continues to decline steeply, it is becoming increasingly important to understand diversity patterns at local and regional scales. Changes in land use and climate, nitrogen deposition and invasive species are the most important threats to global biodiversity. Because land use changes tend to benefit a few species but impede many, the expected outcome is generally decreasing population sizes, decreasing species richness at local and regional scales, and increasing similarity of species compositions across sites (biotic homogenization). Homogenization can be also driven by invasive species or effects of soil eutrophication propagating to higher trophic levels. In contrast, in the absence of increasing aridity, climate warming is predicted to generally increase abundances and species richness of poikilotherms at local and regional scales. We tested these predictions with data from one of the few existing monitoring programmes on biodiversity in the world dating to the 1960s, where the abundance of 878 species of macro-moths have been measured daily at seven sites across Hungary. Our analyses revealed a dramatic rate of regional species loss and homogenization of community compositions across sites. Species with restricted distribution range, specialized diet or dry grassland habitat were more likely than others to disappear from the community. In global context, the contrasting effects of climate change and land use changes could explain why the predicted enriching effects from climate warming are not always realized. © 2017 The Authors. Journal of Animal Ecology © 2017 British Ecological Society.

  13. Homogenization-based interval analysis for structural-acoustic problem involving periodical composites and multi-scale uncertain-but-bounded parameters.

    Science.gov (United States)

    Chen, Ning; Yu, Dejie; Xia, Baizhan; Liu, Jian; Ma, Zhengdong

    2017-04-01

    This paper presents a homogenization-based interval analysis method for the prediction of coupled structural-acoustic systems involving periodical composites and multi-scale uncertain-but-bounded parameters. In the structural-acoustic system, the macro plate structure is assumed to be composed of a periodically uniform microstructure. The equivalent macro material properties of the microstructure are computed using the homogenization method. By integrating the first-order Taylor expansion interval analysis method with the homogenization-based finite element method, a homogenization-based interval finite element method (HIFEM) is developed to solve a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters. The corresponding formulations of the HIFEM are deduced. A subinterval technique is also introduced into the HIFEM for higher accuracy. Numerical examples of a hexahedral box and an automobile passenger compartment are given to demonstrate the efficiency of the presented method for a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters.
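
    The first-order Taylor interval step at the heart of such a method can be sketched generically: evaluate the response at the interval midpoints and bound it by the sum of |sensitivity| × radius over the uncertain parameters. The toy response function below merely stands in for the coupled structural-acoustic finite element solve and is purely an assumption.

```python
# Sketch: first-order Taylor interval propagation of uncertain-but-bounded
# parameters through a response function (generic stand-in for the FE system).
import numpy as np

def taylor_interval(f, p_mid, p_rad, eps=1e-6):
    """Enclose f(p) for p in [p_mid - p_rad, p_mid + p_rad] (first order)."""
    y0 = f(p_mid)
    rad = 0.0
    for i in range(len(p_mid)):
        dp = np.zeros_like(p_mid); dp[i] = eps
        dfdp = (f(p_mid + dp) - f(p_mid - dp)) / (2 * eps)  # central difference
        rad += abs(dfdp) * p_rad[i]                         # worst-case sum
    return y0 - rad, y0 + rad

# toy "response": pressure ~ E / (rho * t**2) with bounded E, rho, thickness t
f = lambda p: p[0] / (p[1] * p[2] ** 2)
lo, hi = taylor_interval(f, np.array([70e9, 2700.0, 0.002]),
                            np.array([3e9, 50.0, 1e-4]))
print(lo, hi)
```

    The subinterval technique mentioned in the abstract amounts to splitting each parameter interval into pieces, applying the same step per piece, and taking the union of the enclosures, which tightens the first-order bound.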

  14. Homogeneous spectral spanning of terahertz semiconductor lasers with radio frequency modulation.

    Science.gov (United States)

    Wan, W J; Li, H; Zhou, T; Cao, J C

    2017-03-08

    Homogeneous broadband and electrically pumped semiconductor radiation sources emitting in the terahertz regime are highly desirable for various applications, including spectroscopy, chemical sensing, and gas identification. In the frequency range between 1 and 5 THz, unipolar quantum cascade lasers employing electron inter-subband transitions in multiple-quantum-well structures are the most powerful semiconductor light sources. However, these devices are normally characterized by either a narrow emission spectrum due to the narrow gain bandwidth of the inter-subband optical transitions or an inhomogeneous broad terahertz spectrum from lasers with heterogeneous stacks of active regions. Here, we report the demonstration of homogeneous spectral spanning of long-cavity terahertz semiconductor quantum cascade lasers based on a bound-to-continuum and resonant phonon design under radio frequency modulation. At a single drive current, the terahertz spectrum under radio frequency modulation continuously spans 330 GHz (~8% of the central frequency), which is the record for single plasmon waveguide terahertz lasers with a bound-to-continuum design. The homogeneous broadband terahertz sources can be used for spectroscopic applications, i.e., GaAs etalon transmission measurement and ammonia gas identification.

  15. An infrared small target detection method based on multiscale local homogeneity measure

    Science.gov (United States)

    Nie, Jinyan; Qu, Shaocheng; Wei, Yantao; Zhang, Liming; Deng, Lizhen

    2018-05-01

    Infrared (IR) small target detection plays an important role in the field of image detection owing to its intrinsic characteristics. This paper presents a multiscale local homogeneity measure (MLHM) for infrared small target detection, which can enhance the performance of IR small target detection systems. Firstly, the intra-patch homogeneity of the target itself and the inter-patch heterogeneity between the target and the local background regions are integrated to enhance the significance of small targets. Secondly, a multiscale measure based on local regions is proposed to obtain the most appropriate response. Finally, an adaptive threshold method is applied to small target segmentation. Experimental results on three different scenarios indicate that the MLHM has good performance under the interference of strong noise.
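
    A rough re-implementation of the idea (not the authors' exact measure): score each pixel at several window scales by the contrast between a small central patch and its local background, keep the strongest response across scales, then segment with an adaptive mean-plus-k-sigma threshold.

```python
# Sketch of an MLHM-style detector: multiscale center-vs-background contrast
# followed by adaptive thresholding (illustrative, synthetic frame).
import numpy as np
from scipy.ndimage import uniform_filter

def multiscale_local_homogeneity(img, scales=(3, 5, 7, 9), k=4.0):
    img = img.astype(float)
    response = np.zeros_like(img)
    for s in scales:
        center = uniform_filter(img, size=s)          # intra-patch mean
        background = uniform_filter(img, size=3 * s)  # local background mean
        response = np.maximum(response, center - background)
    thr = response.mean() + k * response.std()        # adaptive threshold
    return response, response > thr

frame = np.random.default_rng(3).normal(0, 1, (128, 128))
frame[60:63, 60:63] += 8.0                            # implant a dim 3x3 target
_, mask = multiscale_local_homogeneity(frame)
print(np.argwhere(mask)[:3])                          # detected pixel locations
```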

  16. Novel stability criteria for fuzzy Hopfield neural networks based on an improved homogeneous matrix polynomials technique

    International Nuclear Information System (INIS)

    Feng Yi-Fu; Zhang Qing-Ling; Feng De-Zhi

    2012-01-01

    The global stability problem of Takagi-Sugeno (T-S) fuzzy Hopfield neural networks (FHNNs) with time delays is investigated. Novel LMI-based stability criteria are obtained by using Lyapunov functional theory to guarantee the asymptotic stability of the FHNNs with less conservatism. Firstly, using both Finsler's lemma and an improved homogeneous matrix polynomial technique, and applying an affine parameter-dependent Lyapunov-Krasovskii functional, we obtain the convergent LMI-based stability criteria. Algebraic properties of the fuzzy membership functions in the unit simplex are considered in the process of stability analysis via the homogeneous matrix polynomials technique. Secondly, to further reduce the conservatism, a new technique for introducing right-hand-side slack variables is also proposed in terms of LMIs, suited to the homogeneous matrix polynomials setting. Finally, two illustrative examples are given to show the efficiency of the proposed approaches

  17. Design of SC solenoid with high homogeneity

    International Nuclear Information System (INIS)

    Yang Xiaoliang; Liu Zhong; Luo Min; Luo Guangyao; Kang Qiang; Tan Jie; Wu Wei

    2014-01-01

    A novel kind of SC (superconducting) solenoid coil is designed to satisfy the homogeneity requirement of the magnetic field. In this paper, we first calculate the current density distribution of the solenoid coil section through the linear programming method. Then a traditional solenoid and a nonrectangular-section solenoid are designed to produce a central field up to 7 T with the greatest possible homogeneity. After comparing the two designed solenoid coils in terms of magnetic field quality, fabrication cost and other aspects, the new design with a nonrectangular solenoid coil section can be realized by improving the techniques of framework fabrication and winding. Finally, the outlook and error analysis of this kind of SC magnet coil are also discussed briefly. (authors)
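
    The first design step, computing a coil-section current distribution by linear programming, can be sketched with a strongly simplified on-axis model: minimize total ampere-turns subject to the axial field staying inside a homogeneity band. The geometry, target field and tolerance below are invented for illustration and are far from a 7 T design.

```python
# Sketch: coil current distribution by linear programming -- minimize total
# ampere-turns with the on-axis field constrained to a homogeneity band.
import numpy as np
from scipy.optimize import linprog

mu0 = 4e-7 * np.pi
a = 0.10                                  # loop radius (m), single-layer coil
zi = np.linspace(-0.15, 0.15, 31)         # candidate loop positions (m)
zk = np.linspace(-0.02, 0.02, 9)          # field sample points in the bore

# B_k = sum_i G[k, i] * I_i  (on-axis field of a circular current loop)
G = mu0 * a**2 / (2 * (a**2 + (zk[:, None] - zi[None, :])**2) ** 1.5)

B0, tol = 1e-3, 1e-5                      # 1 mT target, +/-1% homogeneity band
A_ub = np.vstack([G, -G])                 # B <= B0+tol  and  -B <= -(B0-tol)
b_ub = np.concatenate([np.full(len(zk), B0 + tol),
                       np.full(len(zk), -(B0 - tol))])
res = linprog(c=np.ones(len(zi)), A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * len(zi))
print(res.status, res.x.round(1))         # 0 = optimal; ampere-turns per loop
```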

  18. Auxiliary feedwater system risk-based inspection guide for the McGuire nuclear power plant

    International Nuclear Information System (INIS)

    Bumgardner, J.D.; Lloyd, R.C.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1994-05-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. McGuire was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the McGuire plant

  19. A Proposed Stochastic Finite Difference Approach Based on Homogenous Chaos Expansion

    Directory of Open Access Journals (Sweden)

    O. H. Galal

    2013-01-01

    Full Text Available This paper proposes a stochastic finite difference approach based on homogenous chaos expansion (SFDHC). The approach can handle time-dependent nonlinear as well as linear systems with deterministic or stochastic initial and boundary conditions. In this approach, the included stochastic parameters are modeled as second-order stochastic processes and are expanded using the Karhunen-Loève expansion, while the response function is approximated using homogenous chaos expansion. Galerkin projection is used to convert the original stochastic partial differential equation (PDE) into a set of coupled deterministic partial differential equations, which are then solved using the finite difference method. Two well-known equations were used to validate the efficiency of the proposed method: the linear diffusion equation with a stochastic parameter, and the nonlinear Burgers' equation with a stochastic parameter and stochastic initial and boundary conditions. In both examples, the probability distribution function of the response conformed closely to the results of Monte Carlo simulation, at an optimized computational cost.

  20. Investigations into homogenization of electromagnetic metamaterials

    DEFF Research Database (Denmark)

    Clausen, Niels Christian Jerichau

    This dissertation encompasses homogenization methods, with a special interest in their applications to metamaterial homogenization. The first method studied is the Floquet-Bloch method, which is based on the assumption of a material being infinitely periodic. Its field can then be expanded in term...

  1. Non-homogeneous dynamic Bayesian networks for continuous data

    NARCIS (Netherlands)

    Grzegorczyk, Marco; Husmeier, Dirk

    Classical dynamic Bayesian networks (DBNs) are based on the homogeneous Markov assumption and cannot deal with non-homogeneous temporal processes. Various approaches to relax the homogeneity assumption have recently been proposed. The present paper presents a combination of a Bayesian network with

  2. Local Thermodynamic Equilibrium in Laser-Induced Breakdown Spectroscopy: Beyond the McWhirter criterion

    International Nuclear Information System (INIS)

    Cristoforetti, G.; De Giacomo, A.; Dell'Aglio, M.; Legnaioli, S.; Tognoni, E.; Palleschi, V.; Omenetto, N.

    2010-01-01

    In the Laser-Induced Breakdown Spectroscopy (LIBS) technique, the existence of Local Thermodynamic Equilibrium (LTE) is the essential requisite for meaningful application of theoretical Boltzmann-Maxwell and Saha-Eggert expressions that relate fundamental plasma parameters and concentration of analyte species. The most popular criterion reported in the literature dealing with plasma diagnostics, and usually invoked as a proof of the existence of LTE in the plasma, is the McWhirter criterion [R.W.P. McWhirter, in: Eds. R.H. Huddlestone, S.L. Leonard, Plasma Diagnostic Techniques, Academic Press, New York, 1965, pp. 201-264]. However, as pointed out in several papers, this criterion is known to be a necessary but not a sufficient condition to insure LTE. The considerations reported here are meant to briefly review the theoretical analysis underlying the concept of thermodynamic equilibrium and the derivation of the McWhirter criterion, and to critically discuss its application to a transient and non-homogeneous plasma, like that created by a laser pulse on solid targets. Specific examples are given of theoretical expressions involving relaxation times and diffusion coefficients, as well as a discussion of different experimental approaches involving space and time-resolved measurements that could be used to complement a positive result of the calculation of the minimum electron number density required for LTE using the McWhirter formula. It is argued that these approaches will allow a more complete assessment of the existence of LTE and therefore permit a better quantitative result. It is suggested that the mere use of the McWhirter criterion to assess the existence of LTE in laser-induced plasmas should be discontinued.
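
    For reference, the criterion under discussion is usually quoted as a lower bound on the electron number density needed for LTE (a standard result, reproduced here for orientation): T is the plasma temperature in kelvin and ΔE is the largest energy gap, in eV, between adjacent levels of the radiating species.

```latex
% McWhirter lower bound on electron density for LTE (T in K, \Delta E in eV):
\[
  n_e \;\ge\; 1.6 \times 10^{12}\; T^{1/2}\, (\Delta E)^{3} \quad \mathrm{cm}^{-3}
\]
```

    The article's point is that satisfying this bound is necessary but not sufficient: it says nothing about the transient and spatially inhomogeneous character of laser-induced plasmas.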

  3. α-Skew π-McCoy Rings

    Directory of Open Access Journals (Sweden)

    Areej M. Abduldaim

    2013-01-01

    Full Text Available As a generalization of α-skew McCoy rings, we introduce the concept of α-skew π-McCoy rings, and we study the relationships with another two new generalizations, α-skew π1-McCoy rings and α-skew π2-McCoy rings, observing the relations with α-skew McCoy rings, π-McCoy rings, α-skew Armendariz rings, π-regular rings, and other kinds of rings. Also, we investigate conditions such that α-skew π1-McCoy rings imply α-skew π-McCoy rings and α-skew π2-McCoy rings. We show that in the case where R is a nonreduced ring, if R is 2-primal, then R is an α-skew π-McCoy ring. And, let R be a weak (α,δ)-compatible ring; if R is an α-skew π1-McCoy ring, then R is α-skew π2-McCoy.

  4. Homogenization-based topology optimization for high-resolution manufacturable micro-structures

    DEFF Research Database (Denmark)

    Groen, Jeroen Peter; Sigmund, Ole

    2018-01-01

    This paper presents a projection method to obtain high-resolution, manufacturable structures from efficient and coarse-scale, homogenization-based topology optimization results. The presented approach bridges coarse and fine scale, such that the complex periodic micro-structures can be represented...... by a smooth and continuous lattice on the fine mesh. A heuristic methodology allows control of the projected topology, such that a minimum length-scale on both solid and void features is ensured in the final result. Numerical examples show excellent behavior of the method, where performances of the projected...

  5. MC 68020 μp architecture

    International Nuclear Information System (INIS)

    Casals, O.; Dejuan, E.; Labarta, J.

    1988-01-01

    The MC68020 is a 32-bit microprocessor, object-code compatible with the earlier MC68000 and MC68010. In this paper we describe its architecture and two coprocessors: the MC68851 paged memory management unit and the MC68882 floating-point coprocessor. Among its most important characteristics are addressing-mode extensions for enhanced support of high-level languages, an on-chip instruction cache, and full support of virtual memory. (Author)

  6. Detection of Benzoic Acid by an Amperometric Inhibitor Biosensor Based on Mushroom Tissue Homogenate

    Directory of Open Access Journals (Sweden)

    Mustafa Kemal Sezgintürk

    2005-01-01

    Full Text Available An amperometric benzoic acid-sensing inhibitor biosensor was prepared by immobilizing mushroom (Agaricus bisporus) tissue homogenate on a Clark-type oxygen electrode. The effects on the biosensor of the quantity of mushroom tissue homogenate, the quantity of gelatin and the percentage of the crosslinking agent glutaraldehyde were studied. The optimum concentration of phenol used as substrate was 200 μM. The bioanalytical properties of the proposed biosensor, such as the dependence of the biosensor response on pH and temperature, were investigated. The biosensor responded linearly to benzoic acid in a concentration range of 25–100 μM. The standard deviation (s.d.) was ±0.49 μM for 7 successive determinations at a concentration of 75 μM. The inhibitor biosensor based on mushroom tissue homogenate was applied to the determination of benzoic acid in fizzy lemonade, some fruits and groundwater samples. Results were compared to those obtained using the AOAC method, showing good agreement.

  7. Reductionist Challenges to Explanatory Pluralism : Comment on McCauley

    NARCIS (Netherlands)

    Eronen, Markus I.

    2009-01-01

    In this comment, I first point out some problems in McCauley's defense of the traditional conception of general analytical levels. Then I present certain reductionist arguments against explanatory pluralism that are not based on the New Wave model of intertheoretic reduction, against which McCauley

  8. The McKenzie method compared with manipulation when used adjunctive to information and advice in low back pain patients presenting with centralization or peripheralization. A randomized controlled trial

    DEFF Research Database (Denmark)

    Petersen, Tom; Larsen, Kristian; Nordsteen, Jan

    2011-01-01

    Methods: A total of 350 patients suffering from low back pain with a duration of more than 6 weeks, who presented with centralization or peripheralization of symptoms with or without signs of nerve root involvement, were enrolled in the trial. The main outcome was the number of patients with treatment success defined...... a structured exercise programme tailored to the individual patient as well as manual therapy for the treatment of persistent low back pain. There is presently insufficient evidence to recommend the use of specific decision methods tailoring specific therapies to clinical subgroups of patients in primary care...... for more than six weeks presenting with centralization or peripheralization of symptoms, we found the McKenzie method to be slightly more effective than manipulation when used adjunctive to information and advice.

  9. McClellan Nuclear Radiation Center (MNRC) TRIGA reactor: Four years of operations

    International Nuclear Information System (INIS)

    Heidel, C.C.; Richards, W.J.

    1994-01-01

    McClellan Air Force Base, at Sacramento, California, is headquarters for the Sacramento Air Force Logistics Center (SM-ALC). McClellan Air Force Base provides extensive inspection and maintenance capabilities for the F-111, F-15, and other military aircraft. Criticality of the MNRC TRIGA reactor was obtained on January 20, 1990 with 63 standard TRIGA fuel elements, three fuel-followed control rods and one air-followed control rod. Presently there are 93 fuel elements in the reactor core. The reactor can be operated at 1 MW steady-state power, can produce pulses of up to three dollars' worth of reactivity addition, and can be square-waved up to 1 MW. The reactor core contains a circular grid plate and a graphite reflector assembly surrounding the core. Four tangential beam ports installed in the reflector assembly provide a thermal neutron flux to four radiography bays. The reactor tank is twenty-four (24) feet deep, seven and one-half (7.5) feet in diameter, and has a protrusion in the upper portion of the reactor tank. This protrusion is scheduled for use as a thermal neutron collimator in the future. Besides the neutron radiography capabilities, the reactor contains a pneumatic rabbit system, a central thimble, an in-core irradiation facility, and three additional cutouts that provide locations for additional irradiation facilities. The central thimble can be removed along with the B-ring locations of the upper portion of the grid plate to provide an additional and larger in-core irradiation facility. A new upper grid plate has been manufactured to expand one triangular cutout so that larger experiments can be inserted directly into the reactor core. Some operational problems experienced during the first four years of operations are the timeout of the CSC and DAC watchdogs, deterioration of the heat exchanger gaskets, and loss of thermocouples in the instrumented fuel elements. (author)

  10. Homogeneous dispersion of gadolinium oxide nanoparticles into a non-aqueous-based polymer by two surface treatments

    Energy Technology Data Exchange (ETDEWEB)

    Samuel, Jorice, E-mail: jorice.samuel@gmail.com [AREVA T and D UK Ltd, AREVA T and D Research and Technology Centre (United Kingdom); Raccurt, Olivier [NanoChemistry and Nanosafety Laboratory (DRT/LITEN/DTNM/LCSN), CEA Grenoble, Department of NanoMaterials (France); Mancini, Cedric; Dujardin, Christophe; Amans, David; Ledoux, Gilles [Universite de Lyon, Laboratoire de Physico Chimie des Materiaux Luminescents (LPCML) (France); Poncelet, Olivier [NanoChemistry and Nanosafety Laboratory (DRT/LITEN/DTNM/LCSN), CEA Grenoble, Department of NanoMaterials (France); Tillement, Olivier [Universite de Lyon, Laboratoire de Physico Chimie des Materiaux Luminescents (LPCML) (France)

    2011-06-15

    Gadolinium oxide nanoparticles are increasingly used, notably for their interesting fluorescence properties. Herein they are incorporated into a non-aqueous-based polymer, poly(methyl methacrylate). Their dispersion within the polymer matrix is the key to improving the composite properties. As-received gadolinium oxide nanopowders cannot be homogeneously dispersed in such a polymer matrix. Two surface treatments are therefore detailed and compared to achieve good stability of the nanoparticles in a non-aqueous solvent such as 2-butanone. Once the liquid suspensions have been stabilized, they are used to prepare nanocomposites with homogeneous particle dispersion. The two approaches proposed are a hybrid approach, based on the growth of a silica shell around the gadolinium oxide nanoparticles followed by a suitable silane functionalization, and a non-hybrid approach based on the use of surfactants. The surface treatments and formulations involved in both methods are detailed, adjusted and compared. Thanks to optical methods, and in particular the use of a home-made confocal microscope, the dispersion homogeneity within the polymer can be assessed. Both methods provide promising and conclusive results.

  11. Homogeneous lithium electrodeposition with pyrrolidinium-based ionic liquid electrolytes.

    Science.gov (United States)

    Grande, Lorenzo; von Zamory, Jan; Koch, Stephan L; Kalhoff, Julian; Paillard, Elie; Passerini, Stefano

    2015-03-18

    In this study, we report on the electroplating and stripping of lithium in two ionic liquid (IL) based electrolytes, namely N-butyl-N-methylpyrrolidinium bis(fluorosulfonyl)imide (Pyr14FSI) and N-butyl-N-methylpyrrolidinium bis(trifluoromethanesulfonyl)imide (Pyr14TFSI), and mixtures thereof, both on nickel and lithium electrodes. An improved method to evaluate the Li cycling efficiency confirmed that homogeneous electroplating (and stripping) of Li is possible with TFSI-based ILs. Moreover, the presence of native surface features on lithium, directly observable via scanning electron microscope imaging, was used to demonstrate the enhanced solid electrolyte interphase (SEI)-forming ability, that is, fast cathodic reactivity, of this class of electrolytes and the suppressed dendrite growth. Finally, the induced inhomogeneous deposition enabled us to witness the SEI cracking and revealed previously unreported bundled Li fibers below the pre-existing SEI and nonrod-shaped protuberances resulting from Li extrusion.

  12. A microcontroller MC68HC908GP32 based intelligent scaler

    International Nuclear Information System (INIS)

    Liu Huiying

    2001-01-01

    A microcontroller MC68HC908GP32 based intelligent scaler is presented. By replacing the traditional IC modules with a microcontroller, the new type of scaler provides new functions, such as counting rate measurement, control signal output, LCD display and PC control, in addition to the traditional functions of a normal scaler. This intelligent scaler achieves comprehensive technical innovation over the traditional nuclear electronic instrument with regard to design methodology, structure and functions. In this way, the overall technical performance of the new scaler, such as counting rate, accuracy, volume, cost and operation, has been improved markedly, with bright prospects for application and dissemination

  13. Impact of Different Spreading Codes Using FEC on DWT Based MC-CDMA System

    OpenAIRE

    Masum, Saleh; Kabir, M. Hasnat; Islam, Md. Matiqul; Shams, Rifat Ara; Ullah, Shaikh Enayet

    2012-01-01

    The effect of different spreading codes in a DWT-based MC-CDMA wireless communication system is investigated. In this paper, we present the Bit Error Rate (BER) performance of different spreading codes (Walsh-Hadamard code, orthogonal Gold code and Golay complementary sequences) using Forward Error Correction (FEC) in the proposed system. The data are analyzed and compared among the different spreading codes in both coded and uncoded cases. It is found via computer simulation that the performance...
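
    Two of the code families compared are easy to generate and check. The sketch below builds length-8 Walsh-Hadamard codes with SciPy and verifies their orthogonality, then shows the defining sidelobe-cancellation property of a length-2 Golay complementary pair; it illustrates the codes only, not the DWT MC-CDMA chain or the FEC.

```python
# Sketch: Walsh-Hadamard rows are mutually orthogonal; a Golay complementary
# pair has autocorrelation sidelobes that cancel when summed.
import numpy as np
from scipy.linalg import hadamard

H = hadamard(8)                           # 8 Walsh-Hadamard codes of length 8
print((H @ H.T == 8 * np.eye(8)).all())   # True: perfect orthogonality

a, b = np.array([1, 1]), np.array([1, -1])          # length-2 Golay pair
acf = lambda x: np.correlate(x, x, mode="full")     # aperiodic autocorrelation
print(acf(a) + acf(b))                    # sidelobes sum to zero: [0 4 0]
```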

  14. Dynamics-based centrality for directed networks.

    Science.gov (United States)

    Masuda, Naoki; Kori, Hiroshi

    2010-11-01

    Determining the relative importance of nodes in directed networks is important in, for example, ranking websites, publications, and sports teams, and for understanding signal flows in systems biology. A prevailing centrality measure in this respect is the PageRank. In this work, we focus on another class of centrality derived from the Laplacian of the network. We extend the Laplacian-based centrality, which has mainly been applied to strongly connected networks, to the case of general directed networks such that we can quantitatively compare arbitrary nodes. Toward this end, we adopt the idea used in the PageRank to introduce global connectivity between all the pairs of nodes with a certain strength. Numerical simulations are carried out on some networks. We also offer interpretations of the Laplacian-based centrality for general directed networks in terms of various dynamical and structural properties of networks. Importantly, the Laplacian-based centrality defined as the stationary density of the continuous-time random walk with random jumps is shown to be equivalent to the absorption probability of the random walk with sinks at each node but without random jumps. Similarly, the proposed centrality represents the importance of nodes in dynamics on the original network supplied with sinks but not with random jumps.
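
    A sketch of the kind of centrality described: the stationary density of a continuous-time random walk on a directed graph, with a small uniform jump rate added in the PageRank spirit so the chain is well defined on any digraph. The jump value and normalization are illustrative assumptions, not the paper's exact definition.

```python
# Sketch: stationary density of a continuous-time random walk with uniform
# random jumps on a directed graph (Laplacian-based centrality in spirit).
import numpy as np

def laplacian_centrality(W, q=0.05):
    n = W.shape[0]
    R = W + q / n                          # add jump rate q/n between all pairs
    np.fill_diagonal(R, 0.0)
    L = np.diag(R.sum(axis=1)) - R         # directed Laplacian (out-rates)
    # stationary density p solves p L = 0 with sum(p) = 1
    A = np.vstack([L.T, np.ones(n)])
    b = np.zeros(n + 1); b[-1] = 1.0
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

W = np.array([[0, 1, 1, 0],               # tiny directed example; node 3 is
              [0, 0, 1, 0],               # a sink reachable from node 2
              [1, 0, 0, 1],
              [0, 0, 0, 0]], float)
print(laplacian_centrality(W).round(3))
```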

  15. Improved algorithms and advanced features of the CAD to MC conversion tool McCad

    International Nuclear Information System (INIS)

    Lu, L.; Fischer, U.; Pereslavtsev, P.

    2014-01-01

    Highlights: • The latest improvements of the McCad conversion approach, including decomposition and void filling algorithms, are presented. • An advanced interface for materials editing and assignment has been developed and added to the McCad GUI. • These improvements have been tested and successfully applied to DEMO and ITER NBI (Neutral Beam Injector) applications. • The performance of the CAD model conversion process is shown to be significantly improved. Abstract: McCad is a geometry conversion tool developed at KIT to enable the automatic bi-directional conversion of CAD models into the Monte Carlo (MC) geometries utilized for neutronics calculations (CAD to MC) and, reversed (MC to CAD), for visualization purposes. The paper presents the latest improvements of the conversion algorithms, including improved decomposition, void filling and an advanced interface for materials editing and assignment. The new implementations and features were tested on fusion neutronics applications to the DEMO and ITER NBI (Neutral Beam Injector) models. The results demonstrate greater stability and enhanced efficiency of the McCad conversion process

  16. Human reliability-based MC and A methods for evaluating the effectiveness of protecting nuclear material - 59379

    International Nuclear Information System (INIS)

    Duran, Felicia A.; Wyss, Gregory D.

    2012-01-01

    Material control and accountability (MC and A) operations that track and account for critical assets at nuclear facilities provide a key protection approach for defeating insider adversaries. MC and A activities, from monitoring to inventory measurements, provide critical information about target materials and define security elements that are useful against insider threats. However, these activities have been difficult to characterize in ways that are compatible with the path analysis methods that are used to systematically evaluate the effectiveness of a site's protection system. The path analysis methodology focuses on a systematic, quantitative evaluation of the physical protection component of the system for potential external threats, and often calculates the probability that the physical protection system (PPS) is effective (PE) in defeating an adversary who uses that attack pathway. In previous work, Dawson and Hester observed that many MC and A activities can be considered a type of sensor system with alarm and assessment capabilities that provide recurring opportunities for 'detecting' the status of critical items. This work has extended that characterization of MC and A activities as probabilistic sensors that are interwoven within each protection layer of the PPS. In addition, MC and A activities have similar characteristics to operator tasks performed in a nuclear power plant (NPP) in that the reliability of these activities depends significantly on human performance. Many of the procedures involve human performance in checking for anomalous conditions. Further characterization of MC and A activities as operational procedures that check the status of critical assets provides a basis for applying human reliability analysis (HRA) models and methods to determine probabilities of detection for MC and A protection elements. This paper will discuss the application of HRA methods used in nuclear power plant probabilistic risk assessments to define detection

  17. Homogeneous non-competitive bioaffinity assay based on fluorescence resonance energy transfer

    International Nuclear Information System (INIS)

    Kokko, Tiina; Kokko, Leena; Soukka, Tero; Loevgren, Timo

    2007-01-01

    A homogeneous non-competitive assay principle for the measurement of small analytes, based on quenching of fluorescence, is described. Fluorescence resonance energy transfer (FRET) occurs between the donor, an intrinsically fluorescent europium(III)-chelate conjugated to streptavidin, and the acceptor, a quencher dye conjugated to a biotin derivative, when the biotin-quencher is bound to Eu-streptavidin. Fluorescence can be measured only from those streptavidins that are bound to biotin of the sample, while the fluorescence of the streptavidins that are not occupied by biotin is quenched by quencher-biotin conjugates. The quenching efficiencies of the non-fluorescent quencher dyes were over 95% and one dye molecule was able to quench the fluorescence of more than one europium(III)-chelate. This, however, together with the tetravalent nature of streptavidin, limited the measurable range of the assay to 0.2-2 nmol L^-1. In this study we demonstrated that FRET could be used to design a non-competitive homogeneous assay for a small analyte, resulting in performance equal to that of a competitive heterogeneous assay
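
    The distance dependence underlying such quenching-based assays is the standard Förster relation; as a point of reference (standard theory, not taken from this record), the transfer efficiency E for a donor-acceptor pair at distance r with Förster radius R_0 is:

```latex
% Forster transfer efficiency vs. donor-acceptor distance
\[
  E \;=\; \frac{1}{1 + (r/R_0)^{6}}
\]
```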

  18. Space-Time Coded MC-CDMA: Blind Channel Estimation, Identifiability, and Receiver Design

    Directory of Open Access Journals (Sweden)

    Li Hongbin

    2002-01-01

    Full Text Available Integrating the strengths of multicarrier (MC modulation and code division multiple access (CDMA, MC-CDMA systems are of great interest for future broadband transmissions. This paper considers the problem of channel identification and signal combining/detection schemes for MC-CDMA systems equipped with multiple transmit antennas and space-time (ST coding. In particular, a subspace based blind channel identification algorithm is presented. Identifiability conditions are examined and specified which guarantee unique and perfect (up to a scalar channel estimation when knowledge of the noise subspace is available. Several popular single-user based signal combining schemes, namely the maximum ratio combining (MRC and the equal gain combining (EGC, which are often utilized in conventional single-transmit-antenna based MC-CDMA systems, are extended to the current ST-coded MC-CDMA (STC-MC-CDMA system to perform joint combining and decoding. In addition, a linear multiuser minimum mean-squared error (MMSE detection scheme is also presented, which is shown to outperform the MRC and EGC at some increased computational complexity. Numerical examples are presented to evaluate and compare the proposed channel identification and signal detection/combining techniques.
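
    The two single-user combiners named above are compact to state. The toy sketch below despreads one user over N subcarriers and compares MRC (conjugate-channel weights) with EGC (phase-only weights) on a simulated flat Rayleigh channel; no ST coding or multiuser MMSE detection is included, and all parameters are illustrative.

```python
# Sketch: MRC vs. EGC combining for one MC-CDMA user over N subcarriers
# (flat Rayleigh fading per subcarrier, perfect channel knowledge assumed).
import numpy as np

rng = np.random.default_rng(4)
N = 16                                            # subcarriers = code length
code = rng.choice([-1, 1], N)                     # one user's spreading code
h = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)
bit = 1.0                                         # transmitted BPSK symbol
r = h * code * bit + 0.1 * (rng.normal(size=N) + 1j * rng.normal(size=N))

mrc = np.sum(np.conj(h) * r * code).real          # weight by conj(channel)
egc = np.sum(np.exp(-1j * np.angle(h)) * r * code).real  # phase-only weights
print(np.sign(mrc), np.sign(egc))                 # both should recover +1
```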

  19. Mechanical Homogenization Increases Bacterial Homogeneity in Sputum

    Science.gov (United States)

    Stokell, Joshua R.; Khan, Ammad

    2014-01-01

    Sputum obtained from patients with cystic fibrosis (CF) is highly viscous and often heterogeneous in bacterial distribution. Adding dithiothreitol (DTT) is the standard method for liquefaction prior to processing sputum for molecular detection assays. To determine if DTT treatment homogenizes the bacterial distribution within sputum, we measured the difference in mean total bacterial abundance and abundance of Burkholderia multivorans between aliquots of DTT-treated sputum samples with and without a mechanical homogenization (MH) step using a high-speed dispersing element. Additionally, we measured the effect of MH on bacterial abundance. We found a significant difference between the mean bacterial abundances in aliquots that were subjected to only DTT treatment and those of the aliquots which included an MH step (all bacteria, P = 0.04; B. multivorans, P = 0.05). There was no significant effect of MH on bacterial abundance in sputum. Although our results are from a single CF patient, they indicate that mechanical homogenization increases the homogeneity of bacteria in sputum. PMID:24759710

  20. Radiotracer investigation of cement raw meal homogenizers. Pt. 2

    International Nuclear Information System (INIS)

    Baranyai, L.

    1983-01-01

    Based on radioisotopic tracer technique a method has been worked out to study the homogenization and segregation processes of cement-industrial raw meal homogenizers. On-site measurements were carried out by this method in some Hungarian cement works to determine the optimal homogenization parameters of operating homogenizers. The motion and distribution of different raw meal fractions traced with 198 Au radioisotope was studied in homogenization processes proceeding with different parameters. In the first part of the publication, the change of charge homogeneity over time, measured as the resultant of mixing and separating processes, was discussed. In the second part the parameters and types of homogenizers influencing the efficiency of homogenization have been detailed. (orig.) [de

  2. 'And end history. And go to the stars': Terence McKenna and 2012

    NARCIS (Netherlands)

    Hanegraaff, W.J.; Cusack, C.M.; Hartney, C.

    2010-01-01

    Terence McKenna (1946-2000) was a central figure in the underground New Age culture mostly referred to as ‘psychedelic shamanism’. In a book published together with his brother Dennis (The Invisible Landscape, 1975) he developed a grand macrohistorical theory called the ‘Eschaton Timewave’, which

  3. Regionalization Study of Satellite based Hydrological Model (SHM) in Hydrologically Homogeneous River Basins of India

    Science.gov (United States)

    Kumari, Babita; Paul, Pranesh Kumar; Singh, Rajendra; Mishra, Ashok; Gupta, Praveen Kumar; Singh, Raghvendra P.

    2017-04-01

    A new semi-distributed conceptual hydrological model, namely the Satellite based Hydrological Model (SHM), has been developed under the 'PRACRITI-2' program of the Space Application Centre (SAC), Ahmedabad for sustainable water resources management of India using data from Indian Remote Sensing satellites. Entire India is divided into 5 km × 5 km grid cells and properties at the center of each cell are assumed to represent the property of the cell. SHM contains five modules, namely surface water, forest, snow, groundwater and routing. Two empirical equations (SCS-CN and Hargreaves) and the water balance method have been used in the surface water module; the forest module is based on calculations of water balance and subsurface dynamics. The 2-D Boussinesq equation is used for groundwater modelling, solved using implicit finite differences. The routing module follows a distributed routing approach, which requires flow path and network, with travel time estimation as the key point. The aim of this study is to evaluate the performance of SHM using a regionalization technique, which also checks the usefulness of a model in data-scarce conditions or for ungauged basins. However, homogeneity analysis is a pre-requisite to regionalization. A similarity index (Φ) and hierarchical agglomerative cluster analysis are adopted to test the homogeneity in terms of the physical attributes of three basins, namely Brahmani (39,033 km^2), Baitarani (10,982 km^2) and Kangsabati (9,660 km^2), with respect to the Subarnarekha (29,196 km^2) basin. The results of both homogeneity analyses show that the Brahmani basin is the most homogeneous with respect to the Subarnarekha river basin in terms of physical characteristics (land use land cover classes, soil type and elevation). The calibration and validation of model parameters of the Brahmani basin is in progress; these are to be transferred into the SHM set-up of the Subarnarekha basin and the results compared with those of the calibrated and validated

  4. Metal ion-mediated agonism and agonist enhancement in melanocortin MC1 and MC4 receptors

    DEFF Research Database (Denmark)

    Holst, Birgitte; Elling, Christian E; Schwartz, Thue W

    2002-01-01

    An endogenous metal-ion site in the melanocortin MC1 and MC4 receptors was characterized mainly in transiently transfected COS-7 cells. ZnCl(2) alone stimulated signaling through the Gs pathway with a potency of 11 and 13 microM and an efficacy of 50 and 20% of that of alpha-melanocortin stimulating hormone (alpha-MSH) in the MC1 and MC4 receptors, respectively. In the presence of peptide agonist, Zn(II) acted as an enhancer on both receptors, because it shifted the dose-response curves to the left: most pronounced was a 6-fold increase in alpha-MSH potency on the MC1 receptor. The effect...... affinities and profiles were similar for a number of the 2,2'-bipyridine and 1,10-phenanthroline analogs in complex with Zn(II) in the MC1 and MC4 receptors. However, the potencies and efficacies of the metal-ion complexes were very different in the two receptors, and close to full agonism was obtained

  5. Graphene-based chemiluminescence resonance energy transfer for homogeneous immunoassay.

    Science.gov (United States)

    Lee, Joon Seok; Joung, Hyou-Arm; Kim, Min-Gon; Park, Chan Beum

    2012-04-24

    We report on chemiluminescence resonance energy transfer (CRET) between graphene nanosheets and chemiluminescent donors. In contrast to fluorescence resonance energy transfer, CRET occurs via nonradiative dipole-dipole transfer of energy from a chemiluminescent donor to a suitable acceptor molecule without an external excitation source. We designed a graphene-based CRET platform for homogeneous immunoassay of C-reactive protein (CRP), a key marker for human inflammation and cardiovascular diseases, using a luminol/hydrogen peroxide chemiluminescence (CL) reaction catalyzed by horseradish peroxidase. According to our results, anti-CRP antibody conjugated to graphene nanosheets enabled the capture of CRP at concentrations above 1.6 ng mL^-1. In the CRET platform, graphene played a key role as an energy acceptor, which was more efficient than graphene oxide, while luminol served as a donor to graphene, triggering the CRET phenomenon between luminol and graphene. The graphene-based CRET platform was successfully applied to the detection of CRP in human serum samples in the range observed during acute inflammatory stress.

  6. The theoretical foundation of nursing as a safeguard against McDonaldization [Sygeplejefagets teorigrundlag som værn mod McDonaldisering]

    DEFF Research Database (Denmark)

    Norlyk, Annelise; Haahr, Anita; Dreyer, Pia

    2017-01-01

    ...and that values from evidence-based medicine are being lost in the transformation into this practice, which potentially leads to a McDonaldization of nursing practice reflected as «one best way». To prevent a practice of «McNursing», we argue for reviving ethics-of-care perspectives in today's evidence practice...

  7. Application of the McDonald MRI criteria in multiple sclerosis.

    Science.gov (United States)

    Chan, Ling Ling; Sitoh, Yih Yian; Chong, June; See, Siew Ju; Umapathi, Thirugnanam N; Lim, Shih Hui; Ong, Benjamin

    2007-08-01

    The aim of this study was to assess the sensitivity of the McDonald magnetic resonance imaging (MRI) criteria for the diagnosis of multiple sclerosis (MS) in a group of Asian patients diagnosed with clinically definite MS, based on lesion characterisation on MRI scans. Forty-nine patients from 3 major neurological institutions were classified as having Asian- or Western-type MS based on clinical assessment. Each MRI scan was reviewed by 2 neuroradiologists for the presence and characteristics of brain and spinal lesions. The McDonald MRI criteria were then applied and their sensitivity evaluated. Nine patients were excluded, leaving 34 females and 6 males who were predominantly Chinese (90%), with a mean age of 36.2 years. The MRI brain and spinal findings were detailed and tabulated. Statistically significant differences (P ...) were found between our Asian- and Western-type MS patients when the McDonald MRI criteria were applied. The diagnostic yield of the McDonald MRI criteria increased by 20% when we substituted a cord lesion for a brain lesion, and applied the substitution for enhancing cord lesions as well. The diagnosis is more likely to be made using the McDonald MRI criteria based on brain findings in a patient who presents clinically with Western-type MS. The provision for substitution of "one brain for a spinal lesion" is helpful in Asian-type MS, where there is a preponderance of spinal lesion load. Our findings suggest that minor modifications in the interpretation of the McDonald MRI criteria have a significant impact on the diagnosis in patients presenting clinically as Asian-type MS, with potential bearing on their subsequent management.

  8. Improving queuing service at McDonald's

    Science.gov (United States)

    Koh, Hock Lye; Teh, Su Yean; Wong, Chin Keat; Lim, Hooi Kie; Migin, Melissa W.

    2014-07-01

    Fast food restaurants are popular among price-sensitive youths and working adults who value the conducive environment and convenient services. McDonald's chains of restaurants promote their sales during lunch hours by offering package meals which are perceived to be inexpensive. These promotional lunch meals attract good response, resulting in occasional long queues and inconvenient waiting times. A study is conducted to monitor the distribution of waiting time, queue length, and customer arrival and departure patterns at a McDonald's restaurant located in Kuala Lumpur. A customer survey is conducted to gauge customers' satisfaction regarding waiting time and queue length. An Android app named Que is developed to perform onsite queuing analysis and report key performance indices. The queuing theory in Que is based on the concept of the Poisson distribution. In this paper, Que is utilized to perform queuing analysis at this McDonald's restaurant with the aim of improving customer service, with particular reference to reducing queuing time and shortening queue length. Some results will be presented.
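
    For context, the sketch below computes standard M/M/1 queue indices under Poisson arrivals, the modelling assumption named above; the arrival and service rates are invented placeholders, not measurements from the Que study.

        # Minimal M/M/1 metrics: Poisson arrivals (rate lam), exponential service (rate mu).
        def mm1_metrics(lam: float, mu: float) -> dict:
            assert lam < mu, "queue is unstable unless lam < mu"
            rho = lam / mu                   # server utilization
            lq = rho ** 2 / (1.0 - rho)      # mean number waiting in queue
            wq = lq / lam                    # mean waiting time, by Little's law
            return {"utilization": rho, "mean_queue_length": lq, "mean_wait": wq}

        # Hypothetical lunch-hour figures: 2.0 arrivals/min, 2.5 customers served/min.
        print(mm1_metrics(lam=2.0, mu=2.5))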

  9. 77 FR 27214 - Applications for New Awards; Ronald E. McNair Postbaccalaureate Achievement Program

    Science.gov (United States)

    2012-05-09

    ..., individuals with disabilities, and women, who are provided with access to rigorous and engaging coursework in.... This would enable McNair projects to better measure the success of students selected for participation..., Taxpayer Identification Number, and Central Contractor Registry: To do business with the Department of...

  10. Elastic full waveform inversion based on the homogenization method: theoretical framework and 2-D numerical illustrations

    Science.gov (United States)

    Capdeville, Yann; Métivier, Ludovic

    2018-05-01

    Seismic imaging is an efficient tool to investigate the Earth's interior. Many of the imaging techniques currently used, including so-called full waveform inversion (FWI), are based on limited-frequency-band data. Such data are not sensitive to the true earth model, but to a smooth version of it. This smooth version can be related to the true model by the homogenization technique. Homogenization for wave propagation in deterministic media with no scale separation, such as geological media, has recently been developed. With such an asymptotic theory, it is possible to compute an effective medium valid for a given frequency band such that effective waveforms and true waveforms are the same up to a controlled error. In this work we make the link between limited-frequency-band inversion, mainly FWI, and homogenization. We establish the relation between a true model and an FWI result model. This relation is important for a proper interpretation of FWI images. We numerically illustrate, in the 2-D case, that an FWI result is at best the homogenized version of the true model. Moreover, it appears that the homogenized FWI model is quite independent of the FWI parametrization, as long as it has enough degrees of freedom. In particular, inverting for the full elastic tensor is, in each of our tests, always a good choice. We show how homogenization can help us understand FWI behaviour and improve its robustness and convergence by efficiently constraining the solution space of the inverse problem.
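
    As a concrete, highly simplified instance of computing an effective medium from a finely layered model (a classical 1-D Backus-style average for vertically travelling SH waves, used here only to illustrate the idea; the layer values are invented, and the authors' non-periodic 2-D homogenization is far more general):

        import numpy as np

        # Hypothetical fine layers: thickness (m), shear modulus (GPa), density (g/cm^3).
        h   = np.array([10.0,  5.0, 20.0, 15.0])
        mu  = np.array([30.0, 10.0, 45.0, 25.0])
        rho = np.array([ 2.5,  2.2,  2.7,  2.4])

        w = h / h.sum()                      # thickness weights
        mu_eff  = 1.0 / np.sum(w / mu)       # modulus averages harmonically across layering
        rho_eff = np.sum(w * rho)            # density averages arithmetically
        vs_eff  = np.sqrt(mu_eff / rho_eff)  # GPa over g/cm^3 conveniently yields km/s

        print(f"effective mu = {mu_eff:.2f} GPa, effective Vs = {vs_eff:.2f} km/s")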

  11. Homogeneous electrochemical aptamer-based ATP assay with signal amplification by exonuclease III assisted target recycling.

    Science.gov (United States)

    Liu, Shufeng; Wang, Ying; Zhang, Chengxin; Lin, Ying; Li, Feng

    2013-03-21

    A novel and homogeneous electrochemical aptamer-based adenosine triphosphate (ATP) assay was demonstrated with signal amplification by exonuclease III-assisted target recycling. A superior detection limit of 1 nM toward ATP with an excellent selectivity could be achieved.

  12. Contamination in sediments, bivalves and sponges of McMurdo Sound, Antarctica

    Energy Technology Data Exchange (ETDEWEB)

    Negri, Andrew [Australian Institute of Marine Science, PMB 3, Townsville, Qld (Australia)]. E-mail: a.negri@aims.gov.au; Burns, Kathryn [Australian Institute of Marine Science, PMB 3, Townsville, Qld (Australia); Boyle, Steve [Australian Institute of Marine Science, PMB 3, Townsville, Qld (Australia); Brinkman, Diane [Australian Institute of Marine Science, PMB 3, Townsville, Qld (Australia); Webster, Nicole [Australian Institute of Marine Science, PMB 3, Townsville, Qld (Australia); Biological Sciences, University of Canterbury, Christchurch (New Zealand)

    2006-10-15

    This study examined the concentrations of total hydrocarbons (THC), polychlorinated biphenyls (PCBs), polyaromatic hydrocarbons (PAHs), and trace metals (Cu, Zn, Cd, Pb, Hg and As) in marine sediments off Scott Base (NZ) and compared them with sediments near the highly polluted McMurdo Station (US) as well as less impacted sites including Turtle Rock and Cape Evans. The Antarctic mollusc Laternula elliptica and three common sponge species were also analysed for trace metals. The mean THC concentration in sediments from Scott Base was 3-fold higher than at the pristine site, Turtle Rock, but 10-fold lower than in samples from McMurdo Station. McMurdo Station sediments also contained the highest concentrations of PAHs, PCBs and the trace metals Cu, Zn, Pb, Cd and Hg. Copper was significantly higher in bivalves from McMurdo Station than at other sites. Trace metal concentrations in sponges were generally consistent within sites, but no spatial patterns were apparent. - Analyses of Antarctic marine sediments, bivalves and sponges revealed strong PAH, PCB and trace metal gradients in McMurdo Sound.

  13. Contamination in sediments, bivalves and sponges of McMurdo Sound, Antarctica

    International Nuclear Information System (INIS)

    Negri, Andrew; Burns, Kathryn; Boyle, Steve; Brinkman, Diane; Webster, Nicole

    2006-01-01

    This study examined the concentrations of total hydrocarbons (THC), polychlorinated biphenyls (PCBs), polyaromatic hydrocarbons (PAHs), and trace metals (Cu, Zn, Cd, Pb, Hg and As) in marine sediments off Scott Base (NZ) and compared them with sediments near the highly polluted McMurdo Station (US) as well as less impacted sites including Turtle Rock and Cape Evans. The Antarctic mollusc Laternula elliptica and three common sponge species were also analysed for trace metals. The mean THC concentration in sediments from Scott Base was 3-fold higher than at the pristine site, Turtle Rock, but 10-fold lower than in samples from McMurdo Station. McMurdo Station sediments also contained the highest concentrations of PAHs, PCBs and the trace metals Cu, Zn, Pb, Cd and Hg. Copper was significantly higher in bivalves from McMurdo Station than at other sites. Trace metal concentrations in sponges were generally consistent within sites, but no spatial patterns were apparent. - Analyses of Antarctic marine sediments, bivalves and sponges revealed strong PAH, PCB and trace metal gradients in McMurdo Sound

  14. Electric Vehicle Performance at McMurdo Station (Antarctica) and Comparison with McMurdo Station Conventional Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Sears, T.; Lammert, M.; Colby, K.; Walter, R.

    2014-09-01

    This report examines the performance of two electric vehicles (EVs) at McMurdo, Antarctica (McMurdo). The study examined the performance of two e-ride Industries EVs initially delivered to McMurdo on February 16, 2011, and compared their performance and fuel use with that of conventional vehicles that have a duty cycle similar to that of the EVs used at McMurdo.

  15. Homogeneity Analysis of a MEMS-based PZT Thick Film Vibration Energy Harvester Manufacturing Process

    DEFF Research Database (Denmark)

    Lei, Anders; Xu, Ruichao; Borregaard, Louise M.

    2012-01-01

    This paper presents a homogeneity analysis of a high-yield wafer-scale fabrication of MEMS-based unimorph silicon/PZT thick-film vibration energy harvesters aimed at vibration sources with peak vibrations around 300 Hz. A wafer with a yield of 91% (41/45 devices) has been...

  16. Alex McQueen : power

    Index Scriptorium Estoniae

    1998-01-01

    On A. McQueen's activities outside fashion. American Express commissioned a credit card design from him. From the summer of 1998, assistant editor of the magazine 'Dazed & Confused'. A. McQueen has agreed to be the artistic director of a video for Björk (Iceland).

  17. The new Central American seismic hazard zonation: Mutual consensus based on an up-to-date seismotectonic framework

    Science.gov (United States)

    Alvarado, Guillermo E.; Benito, Belén; Staller, Alejandra; Climent, Álvaro; Camacho, Eduardo; Rojas, Wilfredo; Marroquín, Griselda; Molina, Enrique; Talavera, J. Emilio; Martínez-Cuevas, Sandra; Lindholm, Conrad

    2017-11-01

    Central America is one of the most seismically active zones in the world, owing to the interaction of five tectonic plates (North America, Caribbean, Cocos, Nazca and South America) and its internal deformation, which together generate almost one destructive earthquake (5.4 ≤ Mw ≤ 8.1) every year. A new seismic zonation for Central America is proposed based on the seismotectonic framework, the geological context (tectonic and geological maps), geophysical and geodetic evidence (gravimetric and magnetometric maps, GPS observations), and previous works. As the main source of data, a curated earthquake catalog covering the period from 1522 to 2015 was compiled and homogenized to the moment magnitude scale (Mw). After careful analysis of all the integrated geological and seismological information, the seismogenic zones were established as seismic areas defined by similar patterns of faulting, seismicity, and rupture mechanism. The tectonic environment required considering seismic zones under two seismological regimes: a) crustal faulting (including local faults, major fracture zones at plate boundaries, and thrust faults of deformed belts) and b) subduction, taking into account the change in subduction angle along the trench and the type and location of rupture. Seismicity in the subduction zone is divided into interplate and intraplate (inslab) seismicity. The regional seismic zonation proposed for the whole of Central America includes local seismic zonations and avoids discontinuities at national boundaries, thanks to a consensus among the 7 countries based on the cooperative work of specialists on Central American seismotectonics and related topics.

  18. Systematic homogenization and self-consistent flux and pin power reconstruction for nodal diffusion methods. 1: Diffusion equation-based theory

    International Nuclear Information System (INIS)

    Zhang, H.; Rizwan-uddin; Dorning, J.J.

    1995-01-01

    A diffusion equation-based systematic homogenization theory and a self-consistent dehomogenization theory for fuel assemblies have been developed for use with coarse-mesh nodal diffusion calculations of light water reactors. The theoretical development is based on a multiple-scales asymptotic expansion carried out through second order in a small parameter, the ratio of the average diffusion length to the reactor characteristic dimension. By starting from the neutron diffusion equation for a three-dimensional heterogeneous medium and introducing two spatial scales, the development systematically yields an assembly-homogenized global diffusion equation with self-consistent expressions for the assembly-homogenized diffusion tensor elements and cross sections and assembly-surface-flux discontinuity factors. The reactor eigenvalue 1/k_eff is shown to be obtained to second order in the small parameter, and the heterogeneous diffusion theory flux is shown to be obtained to leading order in that parameter. The latter of these two results provides a natural procedure for the reconstruction of the local fluxes and the determination of pin powers, even though homogenized assemblies are used in the global nodal diffusion calculation
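
    For orientation, a textbook-style statement of the multiple-scales ansatz referred to above reads (a generic sketch, not the paper's exact notation):

        \[
        \varepsilon = \frac{\ell}{L}, \qquad
        \mathbf{y} = \frac{\mathbf{x}}{\varepsilon}, \qquad
        \phi = \phi_0(\mathbf{x},\mathbf{y}) + \varepsilon\,\phi_1(\mathbf{x},\mathbf{y})
             + \varepsilon^2\,\phi_2(\mathbf{x},\mathbf{y}) + \cdots,
        \]
        \[
        \nabla \;\longrightarrow\; \nabla_{\mathbf{x}} + \tfrac{1}{\varepsilon}\,\nabla_{\mathbf{y}},
        \qquad
        \frac{1}{k_{\mathrm{eff}}} = \lambda_0 + \varepsilon\,\lambda_1 + \varepsilon^2\,\lambda_2 + \cdots,
        \]

    where ℓ is the average diffusion length and L the reactor characteristic dimension; collecting powers of ε yields the fast-scale (assembly-level) problems whose solutions define the homogenized parameters and discontinuity factors.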

  19. Assessing the use of food coloring as an appropriate visual guide for homogenously mixed capsule powders in extemporaneous compounding.

    Science.gov (United States)

    Hoffmann, Brittany; Carlson, Christie; Rao, Deepa A

    2014-01-01

    The purpose of this work was to assess the use of food colors as a visual aid in determining homogeneous mixing during the extemporaneous preparation of capsules. Six different batches of progesterone slow-release 200-mg capsules were prepared by different mixing methods until visually determined to be homogeneous, based on the distribution of yellow food coloring in the preparation, by the Central Iowa Compounding Pharmacy, Des Moines, Iowa. UV-Vis spectrophotometry was used to extract and quantify the yellow food coloring content in each of these batches, which was compared to an in-house, small-batch geometric dilution preparation of progesterone slow-release 200-mg capsules. Of the 6 batches tested, only one, which followed the principles of additive dilution and an appropriate mixing time, was both visually and quantitatively homogeneous in the detection of yellow food coloring. The use of food coloring alone is not a valid quality-assurance tool for determining homogeneous mixing. Principles of geometric and/or additive dilution and appropriate mixing times, along with the food color, can serve as a quality-assurance tool.

  20. Isotopic homogeneity of iron in the early solar nebula.

    Science.gov (United States)

    Zhu, X K; Guo, Y; O'Nions, R K; Young, E D; Ash, R D

    2001-07-19

    The chemical and isotopic homogeneity of the early solar nebula, and the processes producing fractionation during its evolution, are central issues of cosmochemistry. Studies of the relative abundance variations of three or more isotopes of an element can in principle determine if the initial reservoir of material was a homogeneous mixture or if it contained several distinct sources of precursor material. For example, widespread anomalies observed in the oxygen isotopes of meteorites have been interpreted as resulting from the mixing of a solid phase that was enriched in 16O with a gas phase in which 16O was depleted, or as an isotopic 'memory' of Galactic evolution. In either case, these anomalies are regarded as strong evidence that the early solar nebula was not initially homogeneous. Here we present measurements of the relative abundances of three iron isotopes in meteoritic and terrestrial samples. We show that significant variations of iron isotopes exist in both terrestrial and extraterrestrial materials. But when plotted in a three-isotope diagram, all of the data for these Solar System materials fall on a single mass-fractionation line, showing that homogenization of iron isotopes occurred in the solar nebula before both planetesimal accretion and chondrule formation.

  1. McUniversities Revisited: A Comparison of University and McDonald's Casual Employee Experiences in Australia

    Science.gov (United States)

    Nadolny, Andrew; Ryan, Suzanne

    2015-01-01

    The McDonaldization of higher education refers to the transformation of universities from knowledge generators to rational service organizations or "McUniversities". This is reflected in the growing dependence on a casualized academic workforce. The article explores the extent to which the McDonaldization thesis applies to universities…

  2. Synthesis and characterization of homogeneous interstitial solutions of nitrogen and carbon in iron-based lattices

    DEFF Research Database (Denmark)

    Brink, Bastian Klüge

    ...work in synthesis and characterization of interstitial solutions of nitrogen and carbon in iron-based lattices. In order to avoid the influences of gradients in composition and residual stresses, which are typically found in treated surface layers, homogeneous samples are needed. These were prepared from...

  3. MC2-3: Multigroup Cross Section Generation Code for Fast Reactor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, C. H. [Argonne National Lab. (ANL), Argonne, IL (United States); Yang, W. S. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-11-08

    The MC2-3 code is a Multigroup Cross section generation Code for fast reactor analysis, developed by improving the resonance self-shielding and spectrum calculation methods of MC2-2 and integrating the one-dimensional cell calculation capabilities of SDX. The code solves the consistent P1 multigroup transport equation using basic neutron data from ENDF/B data files to determine the fundamental mode spectra for use in generating multigroup neutron cross sections. A homogeneous medium or a heterogeneous slab or cylindrical unit cell problem is solved at the ultrafine (~2,000) or hyperfine (~400,000) group level. In the resolved resonance range, pointwise cross sections are reconstructed with Doppler broadening at specified isotopic temperatures. The pointwise cross sections are used directly in the hyperfine group calculation, whereas for the ultrafine group calculation, self-shielded cross sections are prepared by numerical integration of the pointwise cross sections based upon the narrow resonance approximation. For both the hyperfine and ultrafine group calculations, unresolved resonances are self-shielded using the analytic resonance integral method. The ultrafine group calculation can also be performed for two-dimensional whole-core problems to generate region-dependent broad-group cross sections. Multigroup cross sections are written in the ISOTXS format for a user-specified group structure. The code is executable on UNIX, Linux, and PC Windows systems, and its library includes all isotopes of the ENDF/B-VII.0 data.
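
    As a toy illustration of the flux-weighted condensation such a code performs (a schematic sketch with an invented pointwise cross section and weighting spectrum, not MC2-3's actual algorithm):

        import numpy as np

        # Invented pointwise data on a fine energy grid (eV): a 1/v-like cross
        # section (barns) weighted by a slowing-down (1/E) spectrum.
        E = np.logspace(-2, 7, 20001)
        dE = np.diff(E)
        Em = 0.5 * (E[:-1] + E[1:])          # midpoint energies
        sigma = 5.0 + 50.0 / np.sqrt(Em)
        phi = 1.0 / Em

        # Broad-group boundaries (eV).
        bounds = np.array([1e-2, 1.0, 1e3, 1e5, 1e7])

        # Flux-weighted condensation: sigma_g = int(sigma*phi dE) / int(phi dE).
        for lo, hi in zip(bounds[:-1], bounds[1:]):
            m = (Em >= lo) & (Em < hi)
            sg = np.sum(sigma[m] * phi[m] * dE[m]) / np.sum(phi[m] * dE[m])
            print(f"group [{lo:.0e}, {hi:.0e}) eV: sigma_g = {sg:.2f} barns")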

  4. A game-theoretic formulation of the homogeneous self-reconfiguration problem

    KAUST Repository

    Pickem, Daniel; Egerstedt, Magnus; Shamma, Jeff S.

    2015-01-01

    In this paper we formulate the homogeneous two- and three-dimensional self-reconfiguration problem over discrete grids as a constrained potential game. We develop a game-theoretic learning algorithm based on the Metropolis-Hastings algorithm that solves the self-reconfiguration problem in a globally optimal fashion. Both a centralized and a fully decentralized algorithm are presented and we show that the only stochastically stable state is the potential function maximizer, i.e. the desired target configuration. These algorithms compute transition probabilities in such a way that even though each agent acts in a self-interested way, the overall collective goal of self-reconfiguration is achieved. Simulation results confirm the feasibility of our approach and show convergence to desired target configurations.

  5. A game-theoretic formulation of the homogeneous self-reconfiguration problem

    KAUST Repository

    Pickem, Daniel

    2015-12-15

    In this paper we formulate the homogeneous two- and three-dimensional self-reconfiguration problem over discrete grids as a constrained potential game. We develop a game-theoretic learning algorithm based on the Metropolis-Hastings algorithm that solves the self-reconfiguration problem in a globally optimal fashion. Both a centralized and a fully decentralized algorithm are presented and we show that the only stochastically stable state is the potential function maximizer, i.e. the desired target configuration. These algorithms compute transition probabilities in such a way that even though each agent acts in a self-interested way, the overall collective goal of self-reconfiguration is achieved. Simulation results confirm the feasibility of our approach and show convergence to desired target configurations.
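
    To make the learning rule concrete, the sketch below shows a generic Metropolis-Hastings acceptance step driven by a potential function; the toy potential and move set are invented stand-ins for the paper's self-reconfiguration game, not its actual configuration space.

        import math, random

        def metropolis_step(state, propose, potential, T=0.1):
            """One Metropolis-Hastings step maximizing `potential` (symmetric proposals)."""
            candidate = propose(state)
            delta = potential(candidate) - potential(state)
            # Accept uphill moves always; downhill moves with Boltzmann probability.
            if delta >= 0 or random.random() < math.exp(delta / T):
                return candidate
            return state

        # Toy 1-D example: an agent on a line pulled toward a target position.
        target = 7
        potential = lambda s: -abs(s - target)
        propose = lambda s: s + random.choice([-1, 1])

        s = 0
        for _ in range(200):
            s = metropolis_step(s, propose, potential)
        print("final state:", s)   # concentrates near the potential maximizer as T -> 0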

  6. Venturi Wet Gas Flow Modeling Based on Homogeneous and Separated Flow Theory

    Directory of Open Access Journals (Sweden)

    Xu Ying

    2008-10-01

    When Venturi meters are used in wet gas, the measured differential pressure is higher than it would be for the gas phase flowing alone. This phenomenon is called over-reading. Eight well-known over-reading correlations have been studied by many researchers under low- and high-pressure conditions; the conclusion is that the separated flow model and the homogeneous flow model perform well under both high and low pressures. In this study, a new metering method is presented based on homogeneous and separated flow theory; the acceleration pressure drop and the friction pressure drop of the Venturi under two-phase flow conditions are considered in the new correlation, and its validity is verified through experiment. For low pressure, a new test program was implemented in Tianjin University's low-pressure wet gas loop. For high pressure, the National Engineering Laboratory's reports available on the web were used, so the coefficients of the newly proposed correlation are fitted with all independent data under both high and low pressures. Finally, the applicability and errors of the new correlation are analyzed.
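
    For orientation, the sketch below evaluates a generic Chisholm-type over-reading factor of the form phi = sqrt(1 + C*X + X^2); the exponent n = 0.25 and the flow condition are assumptions for illustration, not the paper's new correlation.

        import math

        def over_reading_chisholm(mdot_gas, mdot_liq, rho_gas, rho_liq, n=0.25):
            """Generic Chisholm-type over-reading factor for a wet-gas Venturi.

            X is the Lockhart-Martinelli parameter; n = 0.25 is a common choice
            (assumed here), with Venturi-specific correlations adjusting C further.
            """
            X = (mdot_liq / mdot_gas) * math.sqrt(rho_gas / rho_liq)
            C = (rho_liq / rho_gas) ** n + (rho_gas / rho_liq) ** n
            return math.sqrt(1.0 + C * X + X * X)

        # Hypothetical wet-gas condition: 1.0 kg/s gas, 0.05 kg/s liquid.
        phi = over_reading_chisholm(1.0, 0.05, rho_gas=30.0, rho_liq=800.0)
        print(f"over-reading factor ~ {phi:.3f}")  # corrected gas rate = indicated / phi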

  7. NGRI LAM-MC-ICPMS National Facility: reproducibility of Sr, Nd and Hf isotopic measurements

    International Nuclear Information System (INIS)

    Bhaskar Rao, Y.J.; Vijaya Gopal, B.; Babu, E.V.S.S.K.; Sukumaran, N.P.; Sreenivas, B.; Vijaya Kumar, T.; Krishna, K.V.S.S.; Tomson, J.K.

    2009-01-01

    A laboratory facility was established at the NGRI, primarily to support research in isotope geochemistry and geochronology. Central to this facility are a Multiple Collector-Inductively Coupled Plasma Mass Spectrometer (MC-ICPMS: Nu Plasma HR, Nu Instruments, UK), a 213 nm Nd-YAG UV Laser Ablation Microprobe (LAM: UP-213, New Wave Research, USA), and a clean chemistry laboratory for dissolution and chromatographic extraction of a range of elements. This article presents a summary of the accuracy and precision of MC-ICPMS Sr, Nd and Hf isotopic measurements (solution mode) on the standard reference materials SRM-987, JNdi-1 and JMC-475, respectively, measured between October 2007 and August 2009

  8. Solar Imagery - Composites - Synoptic Maps - McIntosh

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — In 1964 (solar cycle 20) Patrick McIntosh began creating hand-drawn synoptic maps of solar activity, based on Hydrogen alpha (H?) imaging measurements. These...

  9. Non-homogeneous harmonic analysis: 16 years of development

    Science.gov (United States)

    Volberg, A. L.; Èiderman, V. Ya

    2013-12-01

    This survey contains results and methods in the theory of singular integrals, a theory which has been developing dramatically in the last 15-20 years. The central (although not the only) topic of the paper is the connection between the analytic properties of integrals and operators with Calderón-Zygmund kernels and the geometric properties of the measures. The history is traced of the classical Painlevé problem of describing removable singularities of bounded analytic functions, which has provided a strong incentive for the development of this branch of harmonic analysis. The progress of recent decades has largely been based on the creation of an apparatus for dealing with non-homogeneous measures, and much attention is devoted to this apparatus here. Several open questions are stated, first and foremost in the multidimensional case, where the method of curvature of a measure is not available. Bibliography: 128 titles.

  10. EIGENVECTOR-BASED CENTRALITY MEASURES FOR TEMPORAL NETWORKS*

    Science.gov (United States)

    TAYLOR, DANE; MYERS, SEAN A.; CLAUSET, AARON; PORTER, MASON A.; MUCHA, PETER J.

    2017-01-01

    Numerous centrality measures have been developed to quantify the importances of nodes in time-independent networks, and many of them can be expressed as the leading eigenvector of some matrix. With the increasing availability of network data that changes in time, it is important to extend such eigenvector-based centrality measures to time-dependent networks. In this paper, we introduce a principled generalization of network centrality measures that is valid for any eigenvector-based centrality. We consider a temporal network with N nodes as a sequence of T layers that describe the network during different time windows, and we couple centrality matrices for the layers into a supra-centrality matrix of size NT × NT whose dominant eigenvector gives the centrality of each node i at each time t. We refer to this eigenvector and its components as a joint centrality, as it reflects the importances of both the node i and the time layer t. We also introduce the concepts of marginal and conditional centralities, which facilitate the study of centrality trajectories over time. We find that the strength of coupling between layers is important for determining multiscale properties of centrality, such as localization phenomena and the time scale of centrality changes. In the strong-coupling regime, we derive expressions for time-averaged centralities, which are given by the zeroth-order terms of a singular perturbation expansion. We also study first-order terms to obtain first-order-mover scores, which concisely describe the magnitude of nodes’ centrality changes over time. As examples, we apply our method to three empirical temporal networks: the United States Ph.D. exchange in mathematics, costarring relationships among top-billed actors during the Golden Age of Hollywood, and citations of decisions from the United States Supreme Court. PMID:29046619
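
    A minimal sketch of the supra-centrality construction described above, here for plain eigenvector centrality with uniform nearest-layer identity coupling (toy adjacency data; an illustration of the idea rather than the authors' exact formulation):

        import numpy as np

        def supra_centrality(layers, omega=1.0):
            """Joint centrality from T layer matrices (N x N) coupled across time.

            Builds the NT x NT supra-matrix with the layer matrices on the block
            diagonal and omega*I coupling adjacent time layers, then returns the
            dominant eigenvector reshaped to (T, N): node i's centrality at time t.
            """
            T, N = len(layers), layers[0].shape[0]
            S = np.zeros((T * N, T * N))
            for t, A in enumerate(layers):
                S[t*N:(t+1)*N, t*N:(t+1)*N] = A
                if t + 1 < T:  # couple layer t to layer t+1 (both directions)
                    S[t*N:(t+1)*N, (t+1)*N:(t+2)*N] = omega * np.eye(N)
                    S[(t+1)*N:(t+2)*N, t*N:(t+1)*N] = omega * np.eye(N)
            vals, vecs = np.linalg.eigh(S)      # S symmetric for undirected layers
            v = np.abs(vecs[:, -1])             # dominant eigenvector
            return v.reshape(T, N)

        # Toy example: 3 nodes, 2 time layers; larger omega smooths centrality in time.
        A1 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
        A2 = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)
        print(supra_centrality([A1, A2], omega=0.5))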

  11. 75 FR 27286 - McKelvie Geographic Area Range Allotment Management Planning on the Samuel R. McKelvie National...

    Science.gov (United States)

    2010-05-14

    DEPARTMENT OF AGRICULTURE, Forest Service. McKelvie Geographic Area Range Allotment Management Planning on the Samuel R. McKelvie National Forest, Bessey Ranger District in Nebraska. AGENCY: Forest... ...range allotment management planning on the McKelvie Geographic Area, Samuel R. McKelvie National Forest...

  12. Homogenization versus homogenization-free method to measure muscle glycogen fractions.

    Science.gov (United States)

    Mojibi, N; Rasouli, M

    2016-12-01

    Glycogen is extracted from animal tissues with or without homogenization using cold perchloric acid. Three methods were compared for the determination of glycogen in rat muscle at different physiological states. Two groups of five rats were kept at rest or subjected to 45 minutes of muscular activity. The glycogen fractions were extracted and measured using the three methods. The data from the homogenization method show that total glycogen decreased following 45 min of physical activity and that the change occurred entirely in acid-soluble glycogen (ASG), while acid-insoluble glycogen (AIG) did not change significantly. Similar results were obtained using the total-glycogen-fractionation method. The findings of the homogenization-free method indicate that the acid-insoluble fraction (AIG) was the main portion of muscle glycogen and that the majority of changes occurred in the AIG fraction. The results of the homogenization method are identical to those of total glycogen fractionation, but differ from those of the homogenization-free protocol. The ASG fraction is the major portion of muscle glycogen and is the more metabolically active form.

  13. MC Sensor—A Novel Method for Measurement of Muscle Tension

    Directory of Open Access Journals (Sweden)

    Sašo Tomažič

    2011-09-01

    This paper presents a new muscle contraction (MC) sensor. The MC sensor is based on a novel principle whereby muscle tension is measured during muscle contractions. During the measurement, the sensor is fixed on the skin surface above the muscle, while the sensor tip applies pressure and causes an indentation of the skin, the intermediate layer directly above the muscle, and the muscle itself. The force on the sensor tip is then measured; this force is roughly proportional to the tension of the muscle. The measurement is non-invasive and selective. Selectivity of the MC measurement refers to the specific muscle or part of the muscle being measured and is limited by the size of the sensor tip. The sensor is relatively small and light, so measurements can be performed while the subject performs different activities. Test measurements with the MC sensor on the biceps brachii muscle under isometric conditions (elbow angle 90°) showed a high individual linear correlation between isometric force and MC signal amplitude (0.97 ≤ r ≤ 1). The measurements also revealed a strong correlation between the MC and electromyogram (EMG) signals, as well as good dynamic behaviour of the MC sensor. We believe that this MC sensor, when fully tested, will be a useful device for muscle mechanics diagnostics and will be complementary to existing methods.

  14. Performance Analysis of Wavelet Based MC-CDMA System with Implementation of Various Antenna Diversity Schemes

    OpenAIRE

    Islam, Md. Matiqul; Kabir, M. Hasnat; Ullah, Sk. Enayet

    2012-01-01

    The impact of using a wavelet-based technique on the performance of an MC-CDMA wireless communication system has been investigated. The system under study incorporates Walsh-Hadamard codes to discriminate the message signal for each individual user. A computer program written in MATLAB was developed, and the simulation study covers various antenna diversity schemes and fading (Rayleigh and Rician) channels. Computer simulation results demonstrate that the p...

  15. Computation of the locus crossing point location of MC circuit

    International Nuclear Information System (INIS)

    Liu Hai-Jun; Li Zhi-Wei; Bu Kai; Sun Zhao-Lin; Nie Hong-Shan

    2014-01-01

    In this paper, the crossing-point property of the i–v hysteresis curve in a memristor–capacitor (MC) circuit is analyzed. First, the crossing-point property of the i–v hysteresis curve for an ideal passive memristor is studied. Based on this analysis, an analytical derivation of the crossing-point location of the MC circuit is given. The example of an MC circuit with a linear memristance-versus-charge state map is then demonstrated to discuss the drift of the crossing-point location caused by the driving frequency and the capacitance value.

  16. A virtual-accelerator-based verification of a Monte Carlo dose calculation algorithm for electron beam treatment planning in homogeneous phantoms

    International Nuclear Information System (INIS)

    Wieslander, Elinore; Knoeoes, Tommy

    2006-01-01

    By introducing Monte Carlo (MC) techniques to the verification procedure of dose calculation algorithms in treatment planning systems (TPSs), problems associated with conventional measurements can be avoided and properties that are considered unmeasurable can be studied. The aim of the study is to implement a virtual accelerator, based on MC simulations, to evaluate the performance of a dose calculation algorithm for electron beams in a commercial TPS. The TPS algorithm is MC based and the virtual accelerator is used to study the accuracy of the algorithm in water phantoms. The basic test of the implementation of the virtual accelerator is successful for 6 and 12 MeV (γ < 1.0, 0.02 Gy/2 mm). For 18 MeV, there are problems in the profile data for some of the applicators, where the TPS underestimates the dose. For fields equipped with patient-specific inserts, the agreement is generally good. The exception is 6 MeV where there are slightly larger deviations. The concept of the virtual accelerator is shown to be feasible and has the potential to be a powerful tool for vendors and users

  17. Michel Trottier-McDonald

    Indian Academy of Sciences (India)

    Articles written in Pramana – Journal of Physics: Volume 79, Issue 5, November 2012, pp. 1337-1340, Poster Presentations. Tau reconstruction, energy calibration and identification at ATLAS, by Michel Trottier-McDonald on behalf of the ATLAS ...

  18. Application of Discriminant Analysis in Determining McCafe Product Purchase Decisions (Case Study: McDonald's Jimbaran, Bali)

    Directory of Open Access Journals (Sweden)

    TRISNA RAMADHAN

    2018-02-01

    McDonald's is one of the most rapidly growing fast food companies. McDonald's continues to innovate to satisfy customers, and it introduced the concept of a cafe under the name McCafe. Because of competition with other fast food restaurants, McDonald's needs to improve the quality of the McCafe products favored by customers. This research was therefore conducted to identify the indicators that best describe customer characteristics. The research used discriminant analysis, which was applied to classify customers into groups of loyal and non-loyal customers. The indicators that distinguished customers' decisions to buy McCafe Jimbaran products were affordable prices and locations that are easily accessible to customers. The resulting discriminant function had an accuracy of 91.67 percent in classifying the customers.
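
    For illustration, the sketch below fits a linear discriminant classifier separating loyal from non-loyal customers using two indicator variables; the feature names and scores are invented placeholders, not the study's survey data.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # Hypothetical survey scores: [price_affordability, location_accessibility]
        X = np.array([[4, 5], [5, 4], [4, 4], [2, 2], [1, 3], [2, 1]], dtype=float)
        y = np.array([1, 1, 1, 0, 0, 0])   # 1 = loyal, 0 = non-loyal

        lda = LinearDiscriminantAnalysis()
        lda.fit(X, y)

        # In-sample classification accuracy (analogous to the study's hit ratio).
        print(f"classification accuracy: {lda.score(X, y):.2%}")
        print("discriminant coefficients:", lda.coef_[0])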

  19. Central Rotations of Milky Way Globular Clusters

    Science.gov (United States)

    Fabricius, Maximilian H.; Noyola, Eva; Rukdee, Surangkhana; Saglia, Roberto P.; Bender, Ralf; Hopp, Ulrich; Thomas, Jens; Opitsch, Michael; Williams, Michael J.

    2014-06-01

    Most Milky Way globular clusters (GCs) exhibit measurable flattening, even if on a very low level. Both cluster rotation and tidal fields are thought to cause this flattening. Nevertheless, rotation has only been confirmed in a handful of GCs, based mostly on individual radial velocities at large radii. We are conducting a survey of the central kinematics of Galactic GCs using the new Integral Field Unit instrument VIRUS-W. We detect rotation in all 11 GCs that we have observed so far, rendering it likely that a large majority of the Milky Way GCs rotate. We use published catalogs of GCs to derive central ellipticities and position angles. We show that in all cases where the central ellipticity permits an accurate measurement of the position angle, those angles are in excellent agreement with the kinematic position angles that we derive from the VIRUS-W velocity fields. We find an unexpected tight correlation between central rotation and outer ellipticity, indicating that rotation drives flattening for the objects in our sample. We also find a tight correlation between central rotation and published values for the central velocity dispersion, most likely due to rotation impacting the old dispersion measurements. This Letter includes data taken at The McDonald Observatory of The University of Texas at Austin.

  20. McGuire snubber elimination program

    International Nuclear Information System (INIS)

    Cloud, R.L.; Leung, J.S.M.; Taylor, W.H.; Morgan, R.L. Jr.

    1993-01-01

    An engineering program has been initiated at McGuire Nuclear Stations 1 and 2 to eliminate all existing snubbers. The elimination is achieved by replacing existing snubbers with limit stop pipe supports. The program establishes plant-wide modification procedures for one-to-one substitution under the 10 CFR 50.59 requirement. Nuclear Regulatory Commission (NRC) acceptance is based on the results of both comparison analyses and the hardware implementation of sample piping systems at McGuire nuclear stations. Experimental results obtained on shake table testing and from the NRC sponsored HDR research program are also used to formulate the technical basis and design procedures for plant-wide implementation of the snubber replacement effort. The overall program plan is for nearly 3,000 snubbers to be replaced in phases consistent with the plant scheduled outages. Duke Power estimates the program, when completed, will maintain ALARA, improve reliability, and reduce plant operating costs

  1. Experimental and numerical investigation of hetero-/homogeneous combustion-based HCCI of methane–air mixtures in free-piston micro-engines

    International Nuclear Information System (INIS)

    Chen, Junjie; Liu, Baofang; Gao, Xuhui; Xu, Deguang

    2016-01-01

    Highlights: • Single-shot experiments and a transient model of the micro-engine are presented. • Coupled combustion can significantly improve in-cylinder temperatures. • Coupled combustion can reduce mass losses and compression ratios. • Heterogeneous reactions cause earlier ignition. • Heat losses result in higher mass losses. - Abstract: The hetero-/homogeneous combustion-based HCCI (homogeneous charge compression ignition) of fuel-lean methane–air mixtures over alumina-supported platinum catalysts was investigated experimentally and numerically in free-piston micro-engines without ignition sources. Single-shot experiments were carried out in the purely homogeneous and the coupled hetero-/homogeneous combustion modes, involving temperature measurements, capture of visible combustion image sequences, exhaust gas analysis, and physicochemical characterization of the catalysts. Simulations were performed with a two-dimensional transient model that includes detailed hetero-/homogeneous chemistry and transport, leakage, and free-piston motion, to gain physical insight and to explore the hetero-/homogeneous combustion characteristics. The micro-engine performance in terms of combustion efficiency, mass loss, energy density, and free-piston dynamics was investigated. The results reveal that both purely homogeneous and coupled hetero-/homogeneous combustion of methane–air mixtures are possible in a narrow cylinder with a diameter of 3 mm and a height of approximately 0.3 mm. The coupled hetero-/homogeneous mode can not only significantly improve the combustion efficiency, in-cylinder temperature and pressure, output power and energy density, but also reduce the mass loss because of its lower compression ratio and the shorter time spent around TDC (top dead center) and during the expansion stroke, indicating that this coupled mode is a promising combustion scheme for micro-engines. Heat losses result in higher mass losses. Heterogeneous reactions cause earlier ignition

  2. Assessment of dose homogeneity in conformal interstitial breast brachytherapy with special respect to ICRU recommendations

    Directory of Open Access Journals (Sweden)

    Tibor Major

    2011-09-01

    Purpose: To present the results of a dose homogeneity analysis for breast cancer patients treated with image-based conformal interstitial brachytherapy, and to investigate the usefulness of the ICRU recommendations. Material and methods: Treatment plans of forty-nine patients who underwent partial breast irradiation with interstitial brachytherapy were analyzed. Quantitative parameters were used to characterize dose homogeneity: the dose nonuniformity ratio (DNR), dose homogeneity index (DHI), uniformity index (UI) and quality index (QI) were calculated. Furthermore, parameters recommended by ICRU Report 58, such as the minimum target dose (MTD), mean central dose (MCD), high dose volume, low dose volume and the spread between local minimum doses, were determined. Correlations between the calculated homogeneity parameters and the usefulness of the ICRU parameters in image-based brachytherapy were investigated. Results: Catheters with a mean number of 15 (range: 6-25) were implanted in a median of 4 (range: 3-6) planes. The volume of the PTV ranged from 15.5 cm3 to 176 cm3. The mean DNR was 0.32, the DHI 0.66, the UI 1.49 and the QI 1.94. Relative to the prescribed dose, the MTD was 69% and the MCD 135%. The mean high dose volume was 8.1 cm3 (10%), while the low dose volume was 63.8 cm3 (96%). The spread between minimum doses in the central plane ranged from -14% to +20%. Good correlation was found between the DNR and the DHI (R2 = 0.7874), and the DNR also correlated well with the UI (R2 = 0.7615). No correlation was found between the ICRU parameters and any other volumetric parameters. Conclusions: To characterize dose uniformity in high-dose-rate breast implants, DVH-related homogeneity parameters representing the full 3D dose distribution are mandatory. In many respects the current recommendations of ICRU Report 58 are already outdated, and it is well-timed to set up new recommendations that are more suitable for image-guided conformal interstitial brachytherapy.
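
    A minimal sketch of how DVH-based indices of this kind can be computed from a dose grid, using the common definitions DHI = (V100 - V150)/V100 and DNR = V150/V100 (standard in the brachytherapy literature, but assumed here rather than taken from the paper; the dose grid is invented):

        import numpy as np

        def homogeneity_indices(dose, prescribed):
            """DHI and DNR from a dose array sampled inside the PTV.

            V100 / V150: voxel counts receiving >= 100% / 150% of the prescribed dose.
            """
            v100 = np.count_nonzero(dose >= 1.0 * prescribed)
            v150 = np.count_nonzero(dose >= 1.5 * prescribed)
            return (v100 - v150) / v100, v150 / v100

        # Invented dose grid (Gy) around a 4 Gy prescription.
        rng = np.random.default_rng(0)
        dose = rng.gamma(shape=8.0, scale=0.6, size=(40, 40, 40))
        dhi, dnr = homogeneity_indices(dose, prescribed=4.0)
        print(f"DHI = {dhi:.2f}, DNR = {dnr:.2f}")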

  3. Validation of the intrinsic spatial efficiency method for non-cylindrical homogeneous sources using MC simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz-Ramírez, Pablo, E-mail: rapeitor@ug.uchile.cl; Ruiz, Andrés [Departamento de Física, Facultad de Ciencias, Universidad de Chile (Chile)

    2016-07-07

    Monte Carlo simulation of gamma spectroscopy systems is common practice these days; the most popular codes for this purpose are MCNP and Geant4. The intrinsic spatial efficiency method is a general and absolute method to determine the absolute efficiency of a spectroscopy system for any extended source, but it had previously been demonstrated experimentally only for cylindrical sources. Because preparing sources of arbitrary shape is difficult, the simplest way to test the method is to simulate the spectroscopy system and the source. In this work we present a validation of the intrinsic spatial efficiency method for sources with different geometries and for photons with an energy of 661.65 keV. Matrix effects (self-attenuation) are not considered in the simulation, so these results are only preliminary. The MC simulation is carried out using the FLUKA code, and the absolute efficiency of the detector is determined using two methods: the statistical count of the full energy peak (FEP) area (the traditional method) and the intrinsic spatial efficiency method. The results show complete agreement between the absolute efficiencies determined by the two methods, with a relative bias of less than 1% in all cases.

  4. Perception of the McDonald's brand on the Czech market

    OpenAIRE

    Harčár, Tomáš

    2015-01-01

    The bachelor thesis explores the perception of the McDonald's brand on the Czech market. The objective of the thesis is to test, via a questionnaire, the assumption that customers find the food sold in McDonald's restaurants unhealthy and of poor quality. The thesis contains a theoretical part presenting basic definitions related to brand building and brand image. These definitions are then used to explain the advertising strategy of the McDonald's company. The concluding chapter evalua...

  5. Bayesian inversion of surface-wave data for radial and azimuthal shear-wave anisotropy, with applications to central Mongolia and west-central Italy

    Science.gov (United States)

    Ravenna, Matteo; Lebedev, Sergei

    2018-04-01

    Seismic anisotropy provides important information on the deformation history of the Earth's interior. Rayleigh and Love surface waves are sensitive to and can be used to determine both radial and azimuthal shear-wave anisotropy at depth, but parameter trade-offs give rise to substantial model non-uniqueness. Here, we explore the trade-offs between isotropic and anisotropic structure parameters and present a suite of methods for the inversion of surface-wave phase-velocity curves for radial and azimuthal anisotropy. One Markov chain Monte Carlo (McMC) implementation inverts Rayleigh and Love dispersion curves for a radially anisotropic shear velocity profile of the crust and upper mantle. Another McMC implementation inverts Rayleigh phase velocities and their azimuthal anisotropy for profiles of vertically polarized shear velocity and its depth-dependent azimuthal anisotropy. The azimuthal anisotropy inversion is fully non-linear, with the forward problem solved numerically at different azimuths for every model realization, which ensures that any linearization biases are avoided. The computations are performed in parallel in order to reduce the computing time. The often challenging issue of data noise estimation is addressed by means of a hierarchical Bayesian approach, with the variance of the noise treated as an unknown during the radial anisotropy inversion. In addition to the McMC inversions, we also present faster, non-linear gradient-search inversions for the same anisotropic structure. The results of the two approaches are mutually consistent; the advantage of the McMC inversions is that they provide a measure of uncertainty of the models. Applying the method to broad-band data from the Baikal-central Mongolia region, we determine radial anisotropy from the crust down to transition-zone depths. Robust negative anisotropy (Vsh < Vsv) in the asthenosphere, at 100-300 km depths, presents strong new evidence for a vertical component of asthenospheric...

  6. Microstructural and Microhardness Evolution from Homogenization and Hot Isostatic Pressing on Selective Laser Melted Inconel 718: Structure, Texture, and Phases

    Directory of Open Access Journals (Sweden)

    Raiyan Seede

    2018-05-01

    In this work, the microstructure, texture, phases, and microhardness of 45°-printed (with respect to the build direction), homogenized, and hot isostatically pressed (HIPed) cylindrical IN718 specimens are investigated. Phase morphology, grain size, microhardness, and crystallographic texture at the bottom of each specimen differ from those at the top due to changes in cooling rate. High cooling rates during the printing process generated a columnar grain structure parallel to the build direction in the as-printed condition, with a texture transition from (001) orientation at the bottom of the specimen to (111) orientation towards the top, based on EBSD analysis. A mixed columnar and equiaxed grain structure associated with about a 15% reduction in texture is achieved after the homogenization treatment. HIP treatment caused significant grain coarsening and engendered equiaxed grains with an average diameter of 154.8 µm. These treatments promoted the growth of δ-phase (Ni3Nb) and brittle MC-type (Ti,Nb)C carbides at grain boundaries. Laves phase (Fe2Nb) was also observed in the as-printed and homogenized specimens. Ostwald ripening of the (Ti,Nb)C carbides caused excessive grain growth at the bottom of the HIPed IN718 specimens, while smaller grains were observed at their top. Microhardness in the as-fabricated specimens was 236.9 HV and increased in the homogenized specimens by 19.3% to 282.6 HV due to a more even distribution of secondary precipitates and the nucleation of smaller grains. A 36.1% reduction in microhardness to 180.5 HV was found in the HIPed condition due to γ″ phase dissolution and differences in grain morphology.

  7. 7 CFR 58.920 - Homogenization.

    Science.gov (United States)

    2010-01-01

    7 CFR 58.920 (Agriculture... Procedures) § 58.920 Homogenization. Where applicable, concentrated products shall be homogenized for the... homogenization and the pressure at which homogenization is accomplished will be that which accomplishes the most...

  8. Asymptotic Expansion Homogenization for Multiscale Nuclear Fuel Analysis

    International Nuclear Information System (INIS)

    2015-01-01

    Engineering scale nuclear fuel performance simulations can benefit by utilizing high-fidelity models running at a lower length scale. Lower length-scale models provide a detailed view of the material behavior that is used to determine the average material response at the macroscale. These lower length-scale calculations may provide insight into material behavior where experimental data is sparse or nonexistent. This multiscale approach is especially useful in the nuclear field, since irradiation experiments are difficult and expensive to conduct. The lower length-scale models complement the experiments by influencing the types of experiments required and by reducing the total number of experiments needed. This multiscale modeling approach is a central motivation in the development of the BISON-MARMOT fuel performance codes at Idaho National Laboratory. These codes seek to provide more accurate and predictive solutions for nuclear fuel behavior. One critical aspect of multiscale modeling is the ability to extract the relevant information from the lower length-scale simulations. One approach, the asymptotic expansion homogenization (AEH) technique, has proven to be an effective method for determining homogenized material parameters. The AEH technique prescribes a system of equations to solve at the microscale that are used to compute homogenized material constants for use at the engineering scale. In this work, we employ AEH to explore the effect of evolving microstructural thermal conductivity and elastic constants on nuclear fuel performance. We show that the AEH approach fits cleanly into the BISON and MARMOT codes and provides a natural, multidimensional homogenization capability.
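
    As a generic statement of what AEH delivers for a diffusion-type problem (textbook form, assuming a periodic microstructure with unit cell Y; a sketch, not the specific BISON-MARMOT implementation), the homogenized conductivity tensor follows from cell problems:

        \[
        \nabla_{\mathbf{y}} \cdot \bigl( k(\mathbf{y})\,(\mathbf{e}_j + \nabla_{\mathbf{y}} \chi^{j}) \bigr) = 0
        \quad \text{in } Y, \qquad \chi^{j} \ Y\text{-periodic},
        \]
        \[
        k^{\mathrm{eff}}_{ij} = \frac{1}{|Y|} \int_{Y} k(\mathbf{y})
        \left( \delta_{ij} + \frac{\partial \chi^{j}}{\partial y_i} \right) d\mathbf{y},
        \]

    where the corrector functions χ^j encode the microstructural response to a unit macroscopic gradient along each coordinate direction e_j.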

  9. Evaluation of uranium anomalies in the McCaslin Syncline, northeastern Wisconsin

    International Nuclear Information System (INIS)

    Blackburn, W.H.; Mathews, G.W.

    1982-01-01

    On the basis of this investigation, the McCaslin area does not demonstrate sufficient recognition criteria to be considered favorable for the occurrence of uranium deposits analogous to quartz-pebble conglomerate deposits or unconformity-related deposits. Neither model is applicable because: (1) more often than not, the conglomeratic lenses of the McCaslin Quartzite are polymictic; (2) pyrite and chlorite are essentially lacking; (3) pervasive chloritic and hematitic alteration is not present in the McCaslin district; (4) the maximum estimated age of the McCaslin is 1.9 b.y.; (5) the notable absence of anomalous uranium values in the basal McCaslin Quartzite, especially where known faulting intersects the McCaslin, suggests that a process for concentrating uranium has not been effective in the area; (6) the Waupee Volcanics are not known to contain any extensive graphitic or chloritic schists, nor do they contain more than normal amounts of uranium; and (7) uranium anomalies in stream-water, stream-sediment, and aerial surveys may be explained by uranium derived from the relatively uraniferous Hager Rhyolite or other granitic intrusives related to the Wolf River batholith. The McCaslin area warrants no further investigation; it will not contribute any significant new potential to the NURE resource base. 3 figures, 3 tables

  10. Chemical stratigraphy of Grande Ronde Basalt, Pasco Basin, south-central Washington

    International Nuclear Information System (INIS)

    Long, P.E.; Ledgerwood, R.K.; Myers, C.W.; Reidel, S.P.; Landon, R.D.; Hooper, P.R.

    1980-02-01

    Grande Ronde Basalt in the Pasco Basin, south-central Washington, can be subdivided into three chemical types and two chemical subtypes based on x-ray fluorescence major-element analysis of samples from seven deep core holes and three surface sections. These chemical types are: (1) the high-Mg Grande Ronde chemical type; (2) the low-Mg Grande Ronde chemical type; (3) the low-K (very high-Mg) Grande Ronde chemical type; and (4) the Umtanum Grande Ronde chemical subtype. A possible fifth subdivision is the McCoy Canyon Grande Ronde chemical subtype. The Umtanum and McCoy Canyon subtypes are both single flows, belonging to the low-Mg and high-Mg chemical types, respectively. These subdivisions are all distinguished on a plot of MgO versus TiO2 and/or MgO versus P2O5, but other major and minor elements, as well as trace elements, also reflect consistent chemical differences between the chemical types. Identification of these chemical types in the Pasco Basin subsurface shows that the high-Mg and low-Mg chemical types are ubiquitous, but the low-K chemical type is limited to the central, southern, and eastern parts of the basin. The Umtanum chemical subtype is present throughout the Pasco Basin subsurface, although it thins in the northeastern part of the basin and is apparently absent from surface exposures 40 kilometers (25 miles) north of the basin. The McCoy Canyon chemical subtype is also present throughout the basin

  11. Matrix-dependent multigrid-homogenization for diffusion problems

    Energy Technology Data Exchange (ETDEWEB)

    Knapek, S. [Institut fuer Informatik tu Muenchen (Germany)

    1996-12-31

    We present a method to approximately determine the effective diffusion coefficient on the coarse-scale level of problems with strongly varying or discontinuous diffusion coefficients. It is based on techniques also used in multigrid, like Dendy's matrix-dependent prolongations and the construction of coarse-grid operators by means of the Galerkin approximation. In numerical experiments, we compare our multigrid-homogenization method with homogenization, renormalization and averaging approaches.
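
    A toy sketch of the Galerkin coarse-grid construction mentioned above, for a 1-D diffusion stencil with a discontinuous coefficient; linear interpolation is used here for simplicity, where the authors' method would substitute a matrix-dependent prolongation.

        import numpy as np

        def fine_operator(k):
            """1-D diffusion matrix for -d/dx(k du/dx) on a unit grid, Dirichlet BCs.
            k holds the n+1 interface coefficients for n interior unknowns."""
            n = len(k) - 1
            A = np.zeros((n, n))
            for i in range(n):
                A[i, i] = k[i] + k[i + 1]
                if i > 0:
                    A[i, i - 1] = -k[i]
                if i + 1 < n:
                    A[i, i + 1] = -k[i + 1]
            return A

        def linear_prolongation(nc, nf):
            """Linear interpolation from nc coarse unknowns to nf = 2*nc + 1 fine unknowns."""
            P = np.zeros((nf, nc))
            for j in range(nc):
                P[2*j, j], P[2*j + 1, j], P[2*j + 2, j] = 0.5, 1.0, 0.5
            return P

        k = np.where(np.arange(8) < 4, 1.0, 100.0)   # discontinuous coefficient, 7 unknowns
        A = fine_operator(k)
        P = linear_prolongation(3, 7)
        Ac = P.T @ A @ P                             # Galerkin coarse operator R A P, R = P^T
        print(np.round(Ac, 2))                       # encodes the effective coarse-scale diffusion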

  12. The Effects of Spatial Diversity and Imperfect Channel Estimation on Wideband MC-DS-CDMA and MC-CDMA

    Science.gov (United States)

    2009-10-01

    In our previous work, we compared the theoretical bit error rates of multi-carrier direct sequence code division multiple access (MC-DS-CDMA) and... consider only those cases where MC-CDMA has higher frequency diversity than MC-DS-CDMA. Since increases in diversity yield diminishing gains, we conclude

  13. Topology optimization based design of unilateral NMR for generating a remote homogeneous field.

    Science.gov (United States)

    Wang, Qi; Gao, Renjing; Liu, Shutian

    2017-06-01

    This paper presents a topology optimization based design method for unilateral nuclear magnetic resonance (NMR), with which a remote homogeneous field can be obtained. The topology optimization is actualized by seeking the optimal layout of ferromagnetic materials within a given design domain. The design objective is to generate a sensitive magnetic field with optimal homogeneity and maximal field strength within a required region of interest (ROI). The sensitivity of the objective function with respect to the design variables is derived, and the method for solving the optimization problem is presented. A design example illustrates the utility of the design method, specifically the ability to improve the quality of the magnetic field over the required ROI by determining the optimal structural topology for the ferromagnetic poles. In both simulations and experiments, the sensitive region of the magnetic field is about 2 times larger than that of the reference design, validating the feasibility of the design method. Copyright © 2017. Published by Elsevier Inc.

  14. Qualification test of few group constants generated from an MC method by the two-step neutronics analysis system McCARD/MASTER

    International Nuclear Information System (INIS)

    Park, Ho Jin; Shim, Hyung Jin; Joo, Han Gyu; Kim, Chang Hyo

    2011-01-01

    The purpose of this paper is to examine the qualification of few group constants estimated by the Seoul National University Monte Carlo particle transport analysis code McCARD in terms of core neutronics analyses and thus to validate the McCARD method as a few group constant generator. The two-step core neutronics analyses are conducted for a mini and a realistic PWR by the McCARD/MASTER code system, in which McCARD is used as an MC group constant generation code and MASTER as a diffusion core analysis code. The two-step calculations for the effective multiplication factors and assembly power distributions of the two PWR cores by McCARD/MASTER are compared with the reference McCARD calculations. By showing excellent agreement between McCARD/MASTER and the reference MC core neutronics analyses for the two PWRs, it is concluded that the MC method implemented in McCARD can generate few group constants which are well qualified for high-accuracy two-step core neutronics calculations. (author)
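
    For readers unfamiliar with the two-step scheme, the core of any MC-based group-constant generator is spectrum-weighted condensation, Σ_G = Σ_g(σ_g φ_g) / Σ_g(φ_g). A minimal sketch with invented numbers (illustrative only, not McCARD output):

```python
import numpy as np

# collapse fine-group cross sections to broad groups with MC flux weights
sigma_fine = np.array([1.2, 1.5, 2.0, 8.0])  # fine-group sigma_t [1/cm] (made up)
phi_fine   = np.array([0.4, 0.3, 0.2, 0.1])  # MC-tallied group fluxes (made up)
broad      = [slice(0, 2), slice(2, 4)]      # fine-to-broad group mapping

sigma_broad = [(sigma_fine[g] * phi_fine[g]).sum() / phi_fine[g].sum()
               for g in broad]
print(sigma_broad)  # spectrum-weighted constants fed to the diffusion code
```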

  15. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    Science.gov (United States)

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study investigated first the homogenization kinetics of the different systems to focus then on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was equally sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for development and production of pharmaceutical suspensions in the future. Copyright © 2011 Wiley-Liss, Inc.

  16. OpenMC In Situ Source Convergence Detection

    Energy Technology Data Exchange (ETDEWEB)

    Aldrich, Garrett Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Univ. of California, Davis, CA (United States); Dutta, Soumya [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); The Ohio State Univ., Columbus, OH (United States); Woodring, Jonathan Lee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-07

    We designed and implemented an in situ version of particle source convergence for the OpenMC particle transport simulator. OpenMC is a Monte Carlo-based particle simulator for neutron criticality calculations. For the transport simulation to be accurate, source particles must converge on a spatial distribution. Typically, the simulation is iterated for a user-settable, fixed number of steps, and it is assumed that convergence is achieved. We instead implement a method to detect convergence, using the stochastic oscillator to identify convergence of the source particles based on their accumulated Shannon entropy. Using our in situ convergence detection, we are able to detect when the proper source distribution has been reached and begin tallying results for the full simulation. Our method ensures that tallying is not started too early, as happens when a user sets overly optimistic parameters, or too late, as happens with overly conservative ones.
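
    A rough sketch of the entropy bookkeeping involved; the convergence test below is a simplified stand-in for the paper's stochastic-oscillator criterion, and all names in the commented driver are hypothetical:

```python
import numpy as np

def shannon_entropy(sites, edges):
    """Shannon entropy of fission-source sites binned on a spatial mesh."""
    counts, _ = np.histogramdd(sites, bins=edges)
    p = counts.ravel() / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def looks_converged(entropies, window=20, tol=0.01):
    """Simplified stand-in: flag convergence once the entropy's recent
    range is small relative to its recent mean."""
    if len(entropies) < window:
        return False
    h = np.asarray(entropies[-window:])
    return (h.max() - h.min()) <= tol * abs(h.mean())

# inside the batch loop (hypothetical driver):
# entropies.append(shannon_entropy(source_sites, mesh_edges))
# if looks_converged(entropies):
#     begin_active_tallies()
```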

  17. McCullough to Liberty fiber optics project

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-05-01

    The US Department of Energy, Western Area Power Administration (Western) proposes to replace an existing overhead static wire with a shield wire that contains optical fibers (OPGW) on transmission lines from McCullough Substation, south of Las Vegas, Nevada, to Liberty Substation near Phoenix, Arizona. The replacement will occur on the McCullough-Davis, Davis-Parker No. 2, and Parker-Liberty No. 1 230-kV transmission lines. Western is responsible for the operation and maintenance of the lines. Western prepared an Environmental Assessment (EA) entitled "McCullough to Liberty Fiber Optics Project" (DOE/EA-1202). The EA contains the analysis of the proposed construction, operation, and maintenance of the OPGW. Based on the analysis in the EA, Western finds that the proposed action is not a major Federal action significantly affecting the quality of the human environment, within the meaning of the National Environmental Policy Act (NEPA) of 1969. The preparation of an environmental impact statement (EIS) is not required, and therefore, Western is issuing this Finding of No Significant Impact (FONSI).

  18. McCullough to Liberty fiber optics project

    International Nuclear Information System (INIS)

    1997-05-01

    The US Department of Energy, Western Area Power Administration (Western) proposes to replace an existing overhead static wire with a shield wire that contains optical fibers (OPGW) on transmission lines from McCullough Substation, south of Las Vegas, Nevada, to Liberty Substation near Phoenix, Arizona. The replacement will occur on the McCullough-Davis, Davis-Parker No. 2, and Parker-Liberty No. 1 230-kV transmission lines. Western is responsible for the operation and maintenance of the lines. Western prepared an Environmental Assessment (EA) entitled ''McCullough to Liberty Fiber Optics Project'' (DOE/EA-1202). The EA contains the analysis of the proposed construction, operation, and maintenance of the OPGW. Based on the analysis in the EA, Western finds that the proposed action is not a major Federal action significantly affecting the quality of the human environment, within the meaning of the National Environmental Policy Act (NEPA) of 1969. The preparation of an environmental impact statement (EIS) is not required, and therefore, Western is issuing this Finding of No Significant Impact (FONSI).

  19. Non-homogeneous harmonic analysis: 16 years of development

    International Nuclear Information System (INIS)

    Volberg, A L; Èiderman, V Ya

    2013-01-01

    This survey contains results and methods in the theory of singular integrals, a theory which has been developing dramatically in the last 15-20 years. The central (although not the only) topic of the paper is the connection between the analytic properties of integrals and operators with Calderón-Zygmund kernels and the geometric properties of the measures. The history is traced of the classical Painlevé problem of describing removable singularities of bounded analytic functions, which has provided a strong incentive for the development of this branch of harmonic analysis. The progress of recent decades has largely been based on the creation of an apparatus for dealing with non-homogeneous measures, and much attention is devoted to this apparatus here. Several open questions are stated, first and foremost in the multidimensional case, where the method of curvature of a measure is not available. Bibliography: 128 titles
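
    For orientation, the "curvature of a measure" referred to here is the Menger curvature; in its standard form (quoted from the general literature, not from this survey) it reads

```latex
c^{2}(\mu) \;=\; \iiint \frac{d\mu(x)\, d\mu(y)\, d\mu(z)}{R(x,y,z)^{2}},
```

    where R(x,y,z) is the circumradius of the triangle with vertices x, y, z. This quantity has no natural analogue for most higher-dimensional kernels, which is the obstruction to the multidimensional case mentioned above.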

  20. Regional circulation around Heard and McDonald Islands and through the Fawn Trough, central Kerguelen Plateau

    Science.gov (United States)

    van Wijk, Esmee M.; Rintoul, Stephen R.; Ronai, Belinda M.; Williams, Guy D.

    2010-05-01

    The fine-scale circulation around the Heard and McDonald Islands and through the Fawn Trough, Kerguelen Plateau, is described using data from three high-resolution CTD sections, Argo floats and satellite maps of chlorophyll a, sea surface temperature (SST) and absolute sea surface height (SSH). We confirm that the Polar Front (PF) is split into two branches over the Kerguelen Plateau, with the northern branch (NPF) crossing the north-eastern limits of our survey carrying 25 Sv to the southeast. The southern branch (SPF) was associated with a strong eastward-flowing jet carrying 12 Sv of baroclinic transport through the deepest part of Fawn Trough (relative to the bottom). As the section was terminated midway through the trough, this estimate is very likely a lower bound for the total transport. We demonstrate that the SPF contributes to the Fawn Trough Current identified by previous studies. After exiting the Fawn Trough, the SPF crossed Chun Spur and continued as a strong north-westward flowing jet along the eastern flank of the Kerguelen Plateau before turning offshore between 50°S and 51.5°S. Measured bottom water temperatures suggest a deep water connection between the northern and southern parts of the eastern Kerguelen Plateau, indicating that the deep western boundary current continues at least as far north as 50.5°S. Analysis of satellite-altimetry-derived SSH streamlines demonstrates a southward shift of both the northern and southern branches of the Polar Front from 1994 to 2004. In the direct vicinity of the Heard and McDonald islands, cool waters of southern origin flow along the Heard Island slope and through the Eastern Trough, bringing cold Winter Water (WW) onto the plateau. Complex topography funnels flow through canyons, deepens the mixed layer and increases productivity, resulting in this area being the preferred foraging region for a number of satellite-tracked land-based predators.

  1. Life-cycle-Based (LCB) online acquisition framework for supporting Mass Customisation (MC) in practice.

    OpenAIRE

    Tang, S. J.; Tjahjono, Benny; Kay, John M.

    2006-01-01

    Mass Customisation (MC) has been perceived in many articles as a strategy of choice for any company. However, Mass Customisation (MC) can be easily discussed at a strategic level; but it is rather more complicated to undertake it organisationally and operationally. The aim of this paper is to explore an effective framework that can support the development of Mass Customisation approaches. Two main contributions are addressed in this paper. One is to prove the insufficiency o...

  2. The McDonaldization of Higher Education.

    Science.gov (United States)

    Hayes, Dennis, Ed.; Wynyard, Robin, Ed.

    The essays in this collection discuss the future of the university in the context of the "McDonaldization" of society and of academia. The idea of McDonaldization, a term coined by G. Ritzer (1998), provides a tool for looking at the university and its inevitable changes. The chapters are: (1) "Enchanting McUniversity: Toward a…

  3. McDonaldizing Spirituality: Mindfulness, Education, and Consumerism

    Science.gov (United States)

    Hyland, Terry

    2017-01-01

    The exponential growth of mindfulness-based interventions (MBIs) in recent years has resulted in a marketisation and commodification of practice--popularly labeled "McMindfulness"--which divorces mindfulness from its spiritual and ethical origins in Buddhist traditions. Such commodification is criticized by utilising ideas and insights…

  4. McDonaldization and Job Insecurity

    OpenAIRE

    Emeka W. Dumbili

    2013-01-01

    The article examines how and why the McDonaldization of the banking system in Nigeria engenders job insecurity. This is imperative because it provides an explicit revelation of the root causes of job insecurity in the sector that other scholars have totally omitted. No Nigerian scholar has applied the thesis in relation to job insecurity, which is the major problem in Nigeria's banking industry. The article based on the an...

  5. Fault Diagnosis of Supervision and Homogenization Distance Based on Local Linear Embedding Algorithm

    Directory of Open Access Journals (Sweden)

    Guangbin Wang

    2015-01-01

    Full Text Available In view of the problems of uneven distribution of real-world fault samples and the sensitivity of the locally linear embedding (LLE) algorithm's dimension reduction to the choice of neighboring points, an improved local linear embedding algorithm with homogenization distance (HLLE) is developed. The method makes the overall distribution of sample points tend toward homogeneity and reduces the influence of neighboring points by using the homogenization distance instead of the traditional Euclidean distance. This helps in choosing effective neighboring points for constructing the weight matrix used in dimension reduction. Because the fault recognition performance improvement of HLLE is limited and unstable, the paper further proposes a new local linear embedding algorithm with supervision and homogenization distance (SHLLE) by adding a supervised learning mechanism. On the basis of homogenization distance, supervised learning adds the category information of sample points, so that sample points of the same category are gathered together and sample points of different categories are scattered. This effectively improves the performance of fault diagnosis while maintaining stability. A comparison of the methods mentioned above was made by simulation experiments on rotor-system fault diagnosis, and the results show that the SHLLE algorithm has superior fault recognition performance.
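
    The abstract does not give the homogenization distance explicitly, so the sketch below substitutes a rank-normalized distance as an illustrative stand-in for the neighbor-selection step; everything here is an assumption, not the paper's formula:

```python
import numpy as np

def homogenized_neighbors(X, n_neighbors=8):
    """Select LLE neighbors with a rank-normalized distance. Rank
    normalization flattens uneven sample density, mimicking the role
    the paper assigns to its homogenization distance (assumed here)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    ranks = d.argsort(axis=1).argsort(axis=1)   # per-point distance ranks
    hd = 0.5 * (ranks + ranks.T)                # symmetrized rank distance
    np.fill_diagonal(hd, np.inf)
    return hd.argsort(axis=1)[:, :n_neighbors]  # k nearest per point

# the indices feed a standard LLE weight/embedding step; a supervised
# variant (as in SHLLE) would additionally shrink distances within a
# fault class and inflate them between classes before selecting neighbors.
```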

  6. Homogenization models for thin rigid structured surfaces and films.

    Science.gov (United States)

    Marigo, Jean-Jacques; Maurel, Agnès

    2016-07-01

    A homogenization method for thin microstructured surfaces and films is presented. In both cases, sound hard materials are considered, associated with Neumann boundary conditions and the wave equation in the time domain is examined. For a structured surface, a boundary condition is obtained on an equivalent flat wall, which links the acoustic velocity to its normal and tangential derivatives (of the Myers type). For a structured film, jump conditions are obtained for the acoustic pressure and the normal velocity across an equivalent interface (of the Ventcels type). This interface homogenization is based on a matched asymptotic expansion technique, and differs slightly from the classical homogenization, which is known to fail for small structuration thicknesses. In order to get insight into what causes this failure, a two-step homogenization is proposed, mixing classical homogenization and matched asymptotic expansion. Results of the two homogenizations are analyzed in light of the associated elementary problems, which correspond to problems of fluid mechanics, namely, potential flows around rigid obstacles.

  7. Falling from the Past. Geographies of exceptionalism in two novels by Jay McInerney

    Directory of Open Access Journals (Sweden)

    Fiorenzo Iuliano

    2011-09-01

    Full Text Available Celebrating the glamorous 1980s, Jay McInerney has described the fall of the ambitions and delusions of yuppies in New York City. The vibrant atmosphere of his debut novel (Bright Lights, Big City, 1984) comes to a sudden end in Brightness Falls (1987), where the 1987 stock market crash is prophesied and narrated in its consequences on the lives of a young, brilliant couple, Corrine and Russell Calloway. Almost twenty years later, in The Good Life (2006), McInerney takes up Russell's and Corrine's stories again, now in the aftermath of September 11. This article focuses on the symbolic economy of the US territory. The 1980s, as they have been represented in Brightness Falls, witnessed the boisterous celebration of New York City and its centrality in the imaginary geography of the USA. When New York apparently starts crumbling under the terrorist attacks, the protagonists of The Good Life ideally (and sometimes physically) go back to their native places. In particular, one of the novel's central characters, Luke McGavock, who starts an affair with Corrine while they are both volunteering at Ground Zero, returns to his native Tennessee, where he is confronted with the memory of the Civil War. From this moment on, the novel starts tracing an implicit and highly thought-provoking parallel between the defeated nineteenth-century South and the synecdochic New York City at the turn of the twenty-first century, whose crash has engendered the dramatic need for the US to face the burden of its own history.

  8. McKenzie River Subbasin Assessment, Technical Report 2000.

    Energy Technology Data Exchange (ETDEWEB)

    Alsea Geospatial, Inc.

    2000-02-01

    This document details the findings of the McKenzie River Subbasin Assessment team. The goal of the subbasin assessment is to provide an ecological assessment of the McKenzie River Floodplain, identification of conservation and restoration opportunities, and discussion of the influence of some upstream actions and processes. This Technical Report can be viewed in conjunction with the McKenzie River Subbasin Summary or as a stand-alone document. The purpose of the technical report is to detail the methodology and findings of the consulting team that the observations and recommendations in the summary document are based on. This part, Part I, provides an introduction to the subbasin and a general overview. Part II details the specific findings of the science team. Part III provides an explanation and examples of how to use the data that has been developed through this assessment to aid in prioritizing restoration activities. Part III also includes the literature cited and appendices.

  9. Criteria for Determination of MC and A System Effectiveness

    International Nuclear Information System (INIS)

    Johnson, Geneva; Long, DeAnn; Albright, Ross; Wright, John

    2008-01-01

    The Nevada Test Site (NTS) is a test bed for implementation of the Safeguards First Principles Initiative (SFPI), a risk-based approach to Material Control and Accountability (MC and A) requirements. The Comprehensive Assessment of Safeguards Strategies (COMPASS) model is used to determine the effectiveness of safeguards systems under SFPI. Under this model, MC and A is divided into nine primary elements, and each element is divided into sub-elements. Each sub-element is then assigned two values, effectiveness and contribution, that are used to calculate the rating. Effectiveness is a measure of sub-element implementation and how well it meets requirements. Contribution is a relative measure of importance and functions as a weighting factor. The COMPASS model provides the methodology for calculating element and sub-element ratings, but not the actual criteria; each site must develop its own. For the rating to be meaningful, the effectiveness criteria must be objective and based on explicit, measurable criteria. Contribution weights must reflect importance within the MC and A program. This paper details the NTS approach to system effectiveness and contribution values, and covers the following: the basis for the ratings, an explanation of the contribution weights, and the objective, performance-based effectiveness criteria. Finally, the evaluation process is described.
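
    The arithmetic implied by this description is a contribution-weighted average of sub-element effectiveness scores; a minimal sketch with invented numbers and scales (the actual COMPASS formula is site-specific and not given here):

```python
# (effectiveness 0-1, contribution weight) for one MC&A element's sub-elements
sub_elements = [
    (0.90, 3.0),   # e.g., material access controls
    (0.75, 2.0),   # e.g., tamper-indicating devices
    (0.60, 1.0),   # e.g., records review
]

rating = (sum(e * w for e, w in sub_elements)
          / sum(w for _, w in sub_elements))   # contribution-weighted mean
print(f"element rating: {rating:.2f}")
```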

  10. Environment-based pin-power reconstruction method for homogeneous core calculations

    International Nuclear Information System (INIS)

    Leroyer, H.; Brosselard, C.; Girardi, E.

    2012-01-01

    Core calculation schemes are usually based on a classical two-step approach associated with assembly and core calculations. During the first step, infinite-lattice assembly calculations relying on a fundamental mode approach are used to generate cross-section libraries for PWR core calculations. This fundamental mode hypothesis may be questioned when dealing with loading patterns involving several types of assemblies (UOX, MOX), burnable poisons, control rods and burn-up gradients. This paper proposes a calculation method able to take into account the heterogeneous environment of the assemblies when using homogeneous core calculations and an appropriate pin-power reconstruction. This methodology is applied to MOX assemblies, computed within an environment of UOX assemblies. The new environment-based pin-power reconstruction is then used on various clusters of 3x3 assemblies showing burn-up gradients and UOX/MOX interfaces, and compared to reference calculations performed with APOLLO-2. The results show that UOX/MOX interfaces are much better calculated with the environment-based calculation scheme when compared to the usual pin-power reconstruction method. The power peak is always better located and calculated with the environment-based pin-power reconstruction method on every cluster configuration studied. This study shows that taking the environment into account in transport calculations can significantly improve the pin-power reconstruction insofar as it is consistent with the core loading pattern. (authors)
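
    As background, pin-power reconstruction methods of this family factor the heterogeneous pin power into the smooth intra-assembly flux from the homogeneous core solution and a heterogeneous form function from the lattice calculation, schematically

```latex
P_{\text{pin}}(x,y) \;\approx\; \Phi_{\text{hom}}(x,y)\, f_{\text{het}}(x,y),
```

    the environment-based method above improving on the infinite-lattice assumptions behind this factorization.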

  11. McArdle disease with rhabdomyolysis induced by rosuvastatin: case report

    Directory of Open Access Journals (Sweden)

    Paulo José Lorenzoni

    2007-09-01

    Full Text Available Rosuvastatin-induced rhabdomyolysis in McArdle disease (MD) has not been reported to date. A 35-year-old man had exercise intolerance, muscular fatigue and cramps during physical activity since infancy. He presented a severe rhabdomyolysis episode with seizure and coma after use of rosuvastatin. The investigation showed increased serum creatine kinase levels, and the forearm ischemic exercise test did not increase venous lactate. The muscle biopsy showed subsarcolemmal and central accumulation of glycogen and absence of the myophosphorylase enzyme. Statin-induced myopathy is discussed and the danger of its use in MD is emphasized.

  12. Functionality and homogeneity.

    NARCIS (Netherlands)

    2011-01-01

    Functionality and homogeneity are two of the five Sustainable Safety principles. The functionality principle aims for roads to have but one exclusive function and distinguishes between traffic function (flow) and access function (residence). The homogeneity principle aims at differences in mass,

  13. BaBar MC production on the Canadian grid using a web services approach

    Science.gov (United States)

    Agarwal, A.; Armstrong, P.; Desmarais, R.; Gable, I.; Popov, S.; Ramage, S.; Schaffer, S.; Sobie, C.; Sobie, R.; Sulivan, T.; Vanderster, D.; Mateescu, G.; Podaima, W.; Charbonneau, A.; Impey, R.; Viswanathan, M.; Quesnel, D.

    2008-07-01

    The present paper highlights the approach used to design and implement a web services based BaBar Monte Carlo (MC) production grid using Globus Toolkit version 4. The grid integrates the resources of two clusters at the University of Victoria, using the ClassAd mechanism provided by the Condor-G metascheduler. Each cluster uses the Portable Batch System (PBS) as its local resource management system (LRMS). Resource brokering is provided by the Condor matchmaking process, whereby the job and resource attributes are expressed as ClassAds. The important features of the grid are automatic registering of resource ClassAds to the central registry, ClassAds extraction from the registry to the metascheduler for matchmaking, and the incorporation of input/output file staging. Web-based monitoring is employed to track the status of grid resources and the jobs for an efficient operation of the grid. The performance of this new grid for BaBar jobs, and the existing Canadian computational grid (GridX1) based on Globus Toolkit version 2 is found to be consistent.
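
    A toy Python stand-in for the ClassAd matchmaking step described above (not real ClassAd syntax; the attribute names and values are invented):

```python
# jobs and resources advertise attributes; the matchmaker pairs a job
# with the qualifying resource that maximizes the job's Rank expression
job = {
    "Requirements": lambda r: r["Arch"] == "X86_64" and r["FreeCPUs"] > 0,
    "Rank":         lambda r: r["FreeCPUs"],    # prefer the idlest cluster
}
resources = [
    {"Name": "cluster-a", "Arch": "X86_64", "FreeCPUs": 12},
    {"Name": "cluster-b", "Arch": "X86_64", "FreeCPUs": 48},
]

matches = [r for r in resources if job["Requirements"](r)]
best = max(matches, key=job["Rank"])
print("job dispatched to", best["Name"])
```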

  14. BaBar MC production on the Canadian grid using a web services approach

    International Nuclear Information System (INIS)

    Agarwal, A; Armstrong, P; Desmarais, R; Gable, I; Popov, S; Ramage, S; Schaffer, S; Sobie, C; Sobie, R; Sulivan, T; Vanderster, D; Mateescu, G; Podaima, W; Charbonneau, A; Impey, R; Viswanathan, M; Quesnel, D

    2008-01-01

    The present paper highlights the approach used to design and implement a web services based BaBar Monte Carlo (MC) production grid using Globus Toolkit version 4. The grid integrates the resources of two clusters at the University of Victoria, using the ClassAd mechanism provided by the Condor-G metascheduler. Each cluster uses the Portable Batch System (PBS) as its local resource management system (LRMS). Resource brokering is provided by the Condor matchmaking process, whereby the job and resource attributes are expressed as ClassAds. The important features of the grid are automatic registering of resource ClassAds to the central registry, ClassAds extraction from the registry to the metascheduler for matchmaking, and the incorporation of input/output file staging. Web-based monitoring is employed to track the status of grid resources and the jobs for an efficient operation of the grid. The performance of this new grid for BaBar jobs, and the existing Canadian computational grid (GridX1) based on Globus Toolkit version 2 is found to be consistent

  15. Looking for New Polycrystalline MC-Reinforced Cobalt-Based Superalloys Candidate to Applications at 1200°C

    OpenAIRE

    Patrice Berthod

    2017-01-01

    For applications for which temperatures higher than 1150°C can be encountered the currently best superalloys, the γ/γ′ single crystals, cannot be used under stress because of the disappearance of their reinforcing γ′ precipitates at such temperatures which are higher than their solvus. Cobalt-based alloys strengthened by refractory and highly stable carbides may represent an alternative solution. In this work the interest was focused on MC carbides of several types. Alloys were elaborated wit...

  16. Evaluation of MC and A detection time

    International Nuclear Information System (INIS)

    Smith, B.W.; Thomas, N.M.

    1984-07-01

    The US Nuclear Regulatory Commission has proposed reform of the material control and accounting (MC and A) requirements for facilities authorized to possess and use formula quantities of strategic special nuclear material (SSNM). The purpose of the reform is to strengthen MC and A capabilities by requiring more timely detection of possible SSNM losses and by providing for more rapid and conclusive resolution of discrepancies. This study was conducted to identify the advantages and disadvantages of detection time intervals ranging from one day to two weeks. Material loss tests based on existing process monitoring data are used to compare the detection sensitivity, alarm frequency, resolution capability and effort to collect and process data for the stipulated range of detection times. 15 references, 4 figures, 12 tables

  17. Homogenization of resonant chiral metamaterials

    DEFF Research Database (Denmark)

    Andryieuski, Andrei; Menzel, C.; Rockstuhl, Carsten

    2010-01-01

    Homogenization of metamaterials is a crucial issue as it allows their optical response to be described in terms of effective wave parameters such as, e.g., propagation constants. In this paper we consider the possible homogenization of chiral metamaterials. We show that for meta-atoms of a certain size ... an analytical criterion for performing the homogenization and a tool to predict the homogenization limit. We show that strong coupling between meta-atoms of chiral metamaterials may prevent their homogenization altogether.

  18. A novel grain cluster-based homogenization scheme

    International Nuclear Information System (INIS)

    Tjahjanto, D D; Eisenlohr, P; Roters, F

    2010-01-01

    An efficient homogenization scheme, termed the relaxed grain cluster (RGC), for elasto-plastic deformations of polycrystals is presented. The scheme is based on a generalization of the grain cluster concept. A volume element consisting of eight (= 2 × 2 × 2) hexahedral grains is considered. The kinematics of the RGC scheme is formulated within a finite deformation framework, where the relaxation of the local deformation gradient of each individual grain is connected to the overall deformation gradient by the so-called interface relaxation vectors. The set of relaxation vectors is determined by the minimization of the constitutive energy (or work) density of the overall cluster. An additional energy density associated with the mismatch at the grain boundaries due to relaxations is incorporated as a penalty term into the energy minimization formulation. Effectively, this penalty term represents the kinematical condition of deformation compatibility at the grain boundaries. Simulations have been performed for a dual-phase grain cluster loaded in uniaxial tension. The results of the simulations are presented and discussed in terms of the effective stress–strain response and the overall deformation anisotropy as functions of the penalty energy parameters. In addition, the prediction of the RGC scheme is compared with predictions using other averaging schemes, as well as to the result of direct finite element (FE) simulation. The comparison indicates that the present RGC scheme is able to approximate FE simulation results of relatively fine discretization at about three orders of magnitude lower computational cost
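
    A drastically simplified scalar analogue of the idea (my construction; the paper works with full deformation gradients and interface relaxation vectors): two grains share a prescribed overall strain, and a single relaxation variable shifts strain between them, chosen to minimize stored energy plus a mismatch penalty. With zero penalty the scheme tends toward the uniform-stress (Reuss) limit; a large penalty enforces the uniform-strain (Taylor) limit.

```python
import numpy as np
from scipy.optimize import minimize

E = np.array([100.0, 300.0])   # grain stiffnesses (toy linear elasticity)
eps_bar = 0.01                 # prescribed overall strain
mu = 50.0                      # interface mismatch penalty parameter

def cluster_energy(x):
    eps = eps_bar + np.array([x[0], -x[0]])   # average strain preserved
    elastic = 0.5 * np.mean(E * eps**2)       # stored energy density
    penalty = 0.5 * mu * x[0]**2              # grain-boundary mismatch term
    return elastic + penalty

res = minimize(cluster_energy, x0=[0.0])
print("relaxation:", res.x[0])
print("grain strains:", eps_bar + res.x[0], eps_bar - res.x[0])
```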

  19. McStas and Mantid integration

    DEFF Research Database (Denmark)

    Nielsen, T. R.; Markvardsen, A. J.; Willendrup, Peter Kjær

    2015-01-01

    McStas and Mantid are two well-established software frameworks within the neutron scattering community. McStas has been primarily used for simulating the neutron transport mechanisms in instruments, while Mantid has been primarily used for data reduction. We report here the status of our work don...

  20. Homogeneity of Gd-based garnet transparent ceramic scintillators for gamma spectroscopy

    Science.gov (United States)

    Seeley, Z. M.; Cherepy, N. J.; Payne, S. A.

    2013-09-01

    Transparent polycrystalline ceramic scintillators based on the composition Gd1.49Y1.49Ce0.02Ga2.2Al2.8O12 are being developed for gamma spectroscopy detectors. Scintillator light yield and energy resolution depend on the details of various processing steps, including powder calcination, green body formation, and sintering atmosphere. We have found that gallium sublimation during vacuum sintering creates compositional gradients in the ceramic and can degrade the energy resolution. While sintering in oxygen produces ceramics with uniform composition and little afterglow, light yields are reduced, compared to vacuum sintering. By controlling the atmosphere during the various process steps, we were able to minimize the gallium sublimation, resulting in a more homogeneous composition and improved gamma spectroscopy performance.

  1. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
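
    Schematically, ecological diffusion and, as I recall the result of this line of work (treat the averaging formula as an assumption rather than a quotation), its homogenized coefficient look like

```latex
\frac{\partial u}{\partial t} \;=\; \nabla^{2}\!\bigl(\mu(\mathbf{x})\,u\bigr),
\qquad
\bar{\mu} \;=\; \Bigl(\frac{1}{|\Omega|}\int_{\Omega}\frac{d\mathbf{x}}{\mu(\mathbf{x})}\Bigr)^{-1},
```

    i.e. the large-scale coefficient is a harmonic-type average of the small-scale motility μ, so patches of slow-motility habitat dominate the large-scale spread rate.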

  2. Non-linear waves in heterogeneous elastic rods via homogenization

    KAUST Repository

    Quezada de Luna, Manuel

    2012-03-01

    We consider the propagation of a planar loop on a heterogeneous elastic rod with a periodic microstructure consisting of two alternating homogeneous regions with different material properties. The analysis is carried out using a second-order homogenization theory based on a multiple scale asymptotic expansion. © 2011 Elsevier Ltd. All rights reserved.
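
    The standard multiple-scale ansatz underlying such second-order theories expands the field in the small ratio ε of microstructure period to wavelength,

```latex
u^{\varepsilon}(x,t) \;\sim\; u_{0}(x,y,t) + \varepsilon\, u_{1}(x,y,t) + \varepsilon^{2} u_{2}(x,y,t),
\qquad y = x/\varepsilon,
```

    with y the fast periodic variable; retaining the O(ε²) corrector is what introduces the dispersive corrections that distinguish second-order homogenization from the leading-order effective medium.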

  3. Integrating evidence into practice: use of McKenzie-based treatment for mechanical low back pain

    Directory of Open Access Journals (Sweden)

    Clarke S

    2011-11-01

    Full Text Available Angela Dunsford, Saravana Kumar, Sarah Clarke (International Centre for Allied Health Evidence; School of Health Sciences, University of South Australia, Adelaide, South Australia, Australia). Abstract: Low back pain (LBP) is a major health issue with significant socioeconomic implications in most Western countries. Many forms of treatment have been proposed and investigated in the past, with exercise being a commonly prescribed intervention. Within allied health, in particular physiotherapy, there has been a growing movement that recognizes the role of the McKenzie method in treating LBP. Within the McKenzie framework, directional preference (DP) exercises are one such intervention, with preliminary data demonstrating their effectiveness in the management of LBP. In this paper, we aim to integrate the evidence from current research, identified using a systematic review, and utilize a practical real-life case scenario to outline how evidence from the literature can be implemented in clinical practice. The findings from the systematic review indicate that DP exercises may have positive effects in the management of LBP. While the body of evidence to support this is limited (only four studies) and therefore modest at best, it does provide some emerging evidence to support the use of DP exercises in clinical practice. Despite this, gaps also persist in the literature on DP exercises, relating to the exercise parameters and the compliance rates. Recognizing this dichotomy (modest evidence in some areas and evidence gaps in others), which is likely to confront health practitioners, we use a practical approach with a real-life clinical scenario to outline how the evidence from the systematic review can be implemented in clinical practice. This approach builds on the philosophy of evidence-based practice of integrating research evidence with clinical expertise and patient values. Keywords: low back pain, McKenzie method, directional

  4. The MC4 receptor and control of appetite

    NARCIS (Netherlands)

    Adan, R. A. H.; Tiesjema, B.; Hillebrand, J. J. G.; La Fleur, S. E.; Kas, M. J. H.; de Krom, M.

    2006-01-01

    Mutations in the human melanocortin (MC)4 receptor have been associated with obesity, which underscores the relevance of this receptor as a drug target to treat obesity. Infusion of MC4R agonists decreases food intake, whereas inhibition of MC receptor activity by infusion of an MC receptor

  5. A simple MC-based algorithm for evaluating reliability of stochastic-flow network with unreliable nodes

    International Nuclear Information System (INIS)

    Yeh, W.-C.

    2004-01-01

    An MP (minimal path) or minimal cutset (MC) is a path or cut set such that if any edge is removed, the remaining set is no longer a path or cut set. An intuitive method is proposed to evaluate the reliability in terms of MCs in a stochastic-flow network subject to both edge and node failures, under the condition that all of the MCs are given in advance. This is an extension of the best-known algorithms for solving the d-MC problem (a d-MC is a special MC formatted as a system-state vector, where d is the lower bound point of the system capacity level) from stochastic-flow networks without unreliable nodes to networks with unreliable nodes, achieved by introducing some simple concepts. These concepts, first developed in the literature, are used in the proposed algorithm to reduce the number of d-MC candidates. The method is more efficient than the best known existing algorithms regardless of whether the network has unreliable nodes. Two examples illustrate how the reliability is determined using the proposed algorithm in networks with and without unreliable nodes. The computational complexity of the proposed algorithm is analyzed and compared with the existing methods.
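
    For concreteness, a d-MC candidate for a given MC fixes every non-cut edge at its maximum capacity and distributes exactly d units over the cut edges; a minimal enumeration sketch of the textbook definition, with invented capacities (this is not the paper's node-failure extension):

```python
from itertools import product

def d_mc_candidates(mc_edges, max_cap, d):
    """Yield d-MC candidates for one minimal cutset: capacity vectors that
    send exactly d units across the cut, all other edges at max capacity."""
    ranges = [range(max_cap[e] + 1) for e in mc_edges]
    for combo in product(*ranges):
        if sum(combo) == d:
            x = dict(max_cap)              # non-cut edges at full capacity
            x.update(zip(mc_edges, combo))
            yield x

caps = {"e1": 2, "e2": 3, "e3": 1, "e4": 2}
for cand in d_mc_candidates(["e1", "e2"], caps, d=3):
    print(cand)   # each candidate must still be verified to be a true d-MC
```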

  6. Barbara McClintock, Jumping Genes, and Transposition

    Science.gov (United States)

    Resource listing: McClintock Honored; Woman of Science; Educational Material; Resources with Additional Information. Barbara McClintock's remarkable life spanned the history of genetics in the twentieth century. [The remainder of this record is web-page residue from the linked resources, including commemorative stamp material and the "Woman of Science" feature on McClintock and jumping genes.]

  7. A homogeneous fluorometric assay platform based on novel synthetic proteins

    International Nuclear Information System (INIS)

    Vardar-Schara, Goenuel; Krab, Ivo M.; Yi, Guohua; Su, Wei Wen

    2007-01-01

    Novel synthetic recombinant sensor proteins have been created to detect analytes in solution, in a rapid single-step 'mix and read' noncompetitive homogeneous assay process, based on modulating the Foerster resonance energy transfer (FRET) property of the sensor proteins upon binding to their targets. The sensor proteins comprise a protein scaffold that incorporates a specific target-capturing element, sandwiched by genetic fusion between two molecules that form a FRET pair. The utility of the sensor proteins was demonstrated via three examples, for detecting an anti-biotin Fab antibody, a His-tagged recombinant protein, and an anti-FLAG peptide antibody, respectively, all done directly in solution. The diversity of sensor-target interactions that we have demonstrated in this study points to a potentially universal applicability of the biosensing concept. The possibilities for integrating a variety of target-capturing elements with a common sensor scaffold predict a broad range of practical applications
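
    The distance sensitivity exploited here is the standard Förster relation between transfer efficiency E and donor-acceptor separation r,

```latex
E \;=\; \frac{1}{1 + \left(r/R_{0}\right)^{6}},
```

    where R₀ is the Förster radius of the dye pair; a binding-induced conformational change shifts r, which modulates E steeply near r ≈ R₀.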

  8. Geophysical expression of caldera related volcanism, structures and mineralization in the McDermitt volcanic field

    Science.gov (United States)

    Rytuba, J. J.; Blakely, R. J.; Moring, B.; Miller, R.

    2013-12-01

    The High Rock, Lake Owyhee, and McDermitt volcanic fields, consisting of regionally extensive ash flow tuffs and associated calderas, developed in NW Nevada and SE Oregon following eruption of the ca. 16.7 Ma Steens flood basalt. The first ash flow, the Tuff of Oregon Canyon, erupted from the McDermitt volcanic field at 16.5 Ma. It is chemically zoned from peralkaline rhyolite to dacite with trace element ratios that distinguish it from other ash flow tuffs. The source caldera, based on tuff distribution, thickness, and size of lithic fragments, is in the area in which the McDermitt caldera (16.3 Ma) subsequently formed. Gravity and magnetic anomalies are associated with some but not all of the calderas. The White Horse caldera (15.6 Ma), the youngest caldera in the McDermitt volcanic field, has the best geophysical expression, with both aeromagnetic and gravity lows coinciding with the caldera. Detailed aeromagnetic and gravity surveys of the McDermitt caldera, combined with geology and radiometric surveys, provide insight into the complexities of caldera collapse, resurgence, post-collapse volcanism, and hydrothermal mineralization. The McDermitt caldera is among the most mineralized calderas in the world, whereas other calderas in these three mid-Miocene volcanic fields do not contain important hydrothermal ore deposits, despite having similar age and chemistry. The McDermitt caldera is host to Hg, U, and Li deposits and potentially significant resources of Ga, Sb, and REE. The geophysical data indicate that post-caldera collapse intrusions were important in formation of the hydrothermal systems. An aeromagnetic low along the E caldera margin reflects an intrusion at a depth of 2 km associated with the near-surface McDermitt-hot-spring-type Hg-Sb deposit, and the deeper level, high-sulfidation Ga-REE occurrence. The Li deposits on the W side of the caldera are associated with a series of low amplitude, small diameter aeromagnetic anomalies that form a continuous

  9. The McDonaldization of Nigerian Universities

    Directory of Open Access Journals (Sweden)

    Emeka W. Dumbili

    2014-04-01

    Full Text Available This article examines the extent to which the deregulation of Nigerian higher education (HE has facilitated the McDonaldization of the universities. University education in Nigeria commenced in 1948 with the establishment of the University College, Ibadan. After independence in 1960, subsequent governments expanded the number of universities, a policy based on a lack of quality manpower in leadership positions created by the exit of British officials and the need to grant access to an increasing number of prospective students. In the 1970s, the number of universities increased accompanied by a decline in infrastructure, funding, and working conditions. This resulted in several strikes and an exodus of academics to other countries. Instead of tackling the problems, the federal government shifted responsibilities by approving private ownership of universities in 1999 and by establishing the National Open University of Nigeria (NOUN in 2001. Against this backdrop, this article critically analyzes how some of these reforms facilitated the McDonaldization of Nigerian universities. The article reveals how this has resulted in an overloading of responsibilities on the faculty, erosion of academic autonomy, a prioritization of quantity over quality of publications, and an assumption of “customer” status by students. The article uses evidence from McDonaldized HE in Western countries to discuss the implications of these developments and suggests some remedial measures.

  10. ExMC Technology Watch

    Science.gov (United States)

    Krihak, M.; Barr, Y.; Watkins, S.; Fung, P.; McGrath, T.; Baumann, D.

    2012-01-01

    The Technology Watch (Tech Watch) project is a NASA endeavor conducted under the Human Research Program's (HRP) Exploration Medical Capability (ExMC) element, and focusing on ExMC technology gaps. The project involves several NASA centers, including the Johnson Space Center (JSC), Glenn Research Center (GRC), Ames Research Center (ARC), and the Langley Research Center (LaRC). The objective of Tech Watch is to identify emerging, high-impact technologies that augment current NASA HRP technology development efforts. Identifying such technologies accelerates the development of medical care and research capabilities for the mitigation of potential health issues encountered during human space exploration missions. The aim of this process is to leverage technologies developed by academia, industry and other government agencies and to identify the effective utilization of NASA resources to maximize the HRP return on investment. The establishment of collaborations with these entities is beneficial to technology development, assessment and/or insertion and further NASA's goal to provide a safe and healthy environment for human exploration. In 2011, the major focus areas for Tech Watch included information dissemination, education outreach and public accessibility to technology gaps and gap reports. The dissemination of information was accomplished through site visits to research laboratories and/or companies, and participation at select conferences where Tech Watch objectives and technology gaps were presented. Presentation of such material provided researchers with insights on NASA ExMC needs for space exploration and an opportunity to discuss potential areas of common interest. The second focus area, education outreach, was accomplished via two mechanisms. First, several senior student projects, each related to an ExMC technology gap, were sponsored by the various NASA centers. These projects presented ExMC related technology problems firsthand to collegiate laboratories

  11. Illusions of modernity: the fetish of the McDonald's brand in Brazil

    Directory of Open Access Journals (Sweden)

    Isleide Arruda Fontenelle

    2006-08-01

    Full Text Available This article presents and discusses the current relationship between image and entertainment, based on research on the construction of the McDonald's brand image around the world and on modern marketing techniques. Aiming to understand why we have become image consumers, the study draws on the history of McDonald's itself to recover the economic, social, cultural, and political events that have turned us into a society in which "being in the image is the same as existing". Although tragic in its underlying sense, this loss of form is compensated by the images of fun and happiness that brands convey. Finally, the global reach of that promise is questioned through a digression on Brazil: how does the McDonald's brand supply us with images for a certain constitution of identity, and its name for a feeling of permanence? How can one speak of "identification" with a brand that, apparently, has no historical and cultural relationship with Brazil?

  12. Subcarrier Group Assignment for MC-CDMA Wireless Networks

    Directory of Open Access Journals (Sweden)

    Le-Ngoc Tho

    2007-01-01

    Full Text Available Two interference-based subcarrier group assignment strategies in dynamic resource allocation are proposed for MC-CDMA wireless systems to achieve high throughput in a multicell environment. Least interfered group assignment (LIGA selects for each session the subcarrier group on which the user receives the minimum interference, while best channel ratio group assignment (BCRGA chooses the subcarrier group with the largest channel response-to-interference ratio. Both analytical framework and simulation model are developed for evaluation of throughput distribution of the proposed schemes. An iterative approach is devised to handle the complex interdependency between multicell interference profiles in the throughput analysis. Illustrative results show significant throughput improvement offered by the interference-based assignment schemes for MC-CDMA multicell wireless systems. In particular, under low loading conditions, LIGA renders the best performance. However, as the load increases BCRGA tends to offer superior performance.
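
    The two rules reduce to a one-line selection each; a minimal sketch with invented per-group measurements (illustrative only, not the paper's system model):

```python
import numpy as np

rng = np.random.default_rng(0)
interference = rng.uniform(0.1, 1.0, size=8)  # measured per subcarrier group
channel_gain = rng.uniform(0.1, 1.0, size=8)  # channel response per group

liga  = int(np.argmin(interference))                 # least interfered group
bcrga = int(np.argmax(channel_gain / interference))  # best ratio group
print("LIGA picks group", liga, "| BCRGA picks group", bcrga)
```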

  13. Subcarrier Group Assignment for MC-CDMA Wireless Networks

    Directory of Open Access Journals (Sweden)

    Tho Le-Ngoc

    2007-12-01

    Full Text Available Two interference-based subcarrier group assignment strategies in dynamic resource allocation are proposed for MC-CDMA wireless systems to achieve high throughput in a multicell environment. Least interfered group assignment (LIGA selects for each session the subcarrier group on which the user receives the minimum interference, while best channel ratio group assignment (BCRGA chooses the subcarrier group with the largest channel response-to-interference ratio. Both analytical framework and simulation model are developed for evaluation of throughput distribution of the proposed schemes. An iterative approach is devised to handle the complex interdependency between multicell interference profiles in the throughput analysis. Illustrative results show significant throughput improvement offered by the interference-based assignment schemes for MC-CDMA multicell wireless systems. In particular, under low loading conditions, LIGA renders the best performance. However, as the load increases BCRGA tends to offer superior performance.

  14. A new approach to the d-MC problem

    International Nuclear Information System (INIS)

    Yeh, W.-C.

    2002-01-01

    Many real-world systems are multi-state systems composed of multi-state components, in which the reliability can be computed in terms of the lower bound points of level d, called d-mincuts (d-MCs). Such systems (electric power, transportation, etc.) may be regarded as flow networks whose arcs have independent, discrete, limited and multi-valued random capacities. In this paper, all MCs are assumed to be known in advance, and we focus on how to verify each d-MC candidate before using the d-MCs to calculate the network reliability. The proposed algorithm is more efficient than existing algorithms. The algorithm runs in O(pσmn) time, a significant improvement over the previous O(pσm²) time bounds based on max-flow/min-cut, where p and σ are the number of MCs and d-MC candidates, respectively. It is simple, intuitive and uses no complex data structures. An example is given to show how all d-MC candidates are found and verified by the proposed algorithm. Then the reliability of this example is computed.

  15. Colloidal gold-McAb probe-based rapid immunoassay strip for simultaneous detection of fumonisins in maize.

    Science.gov (United States)

    Yao, Jingjing; Sun, Yaning; Li, Qingmei; Wang, Fangyu; Teng, Man; Yang, Yanyan; Deng, Ruiguang; Hu, Xiaofei

    2017-05-01

    Fumonisins are a kind of toxic and carcinogenic mycotoxin. A rapid immunochromatographic test strip has been developed for simultaneous detection of fumonisins B1, B2 and B3 (FB1, FB2 and FB3) in maize, based on a colloidal gold-labelled monoclonal antibody (McAb) probe against FB1. The anti-FB1 McAb (2E11-H3) was produced through immunisation and cell fusion, and identified as having high affinity, specificity and sensitivity. The cross-reaction ratios with fumonisins B2 and B3 were 385% and 72.4%, respectively, with none for other analogues. The colloidal gold-labelled anti-FB1 McAb probe was successfully prepared and used to establish the immunochromatographic strip. The test strip showed high sensitivity and specificity: the IC50 for FB1 was 58.08 ng/mL and the LOD was 11.24 ng/mL, calculated from the standard curve. Moreover, the test strip exhibited high cross-reactivity with FB2 and FB3, and could be applied to the simultaneous detection of FBs (FB1:FB2:FB3 = 12:4:1) in maize samples with high accuracy and precision. The average recoveries of FBs in maize ranged from 90.42% to 95.29%, and CVs were 1.25-3.77%. The results of the test strip for FB samples showed good correlation with high-performance liquid chromatography analysis. The immunochromatographic test strip could be employed for rapid simultaneous on-site detection of FB1, FB2 and FB3 in maize samples. © 2016 Society of Chemical Industry.

  16. High-precision isotopic characterization of USGS reference materials by TIMS and MC-ICP-MS

    Science.gov (United States)

    Weis, Dominique; Kieffer, Bruno; Maerschalk, Claude; Barling, Jane; de Jong, Jeroen; Williams, Gwen A.; Hanano, Diane; Pretorius, Wilma; Mattielli, Nadine; Scoates, James S.; Goolaerts, Arnaud; Friedman, Richard M.; Mahoney, J. Brian

    2006-08-01

    The Pacific Centre for Isotopic and Geochemical Research (PCIGR) at the University of British Columbia has undertaken a systematic analysis of the isotopic (Sr, Nd, and Pb) compositions and concentrations of a broad compositional range of U.S. Geological Survey (USGS) reference materials, including basalt (BCR-1, 2; BHVO-1, 2), andesite (AGV-1, 2), rhyolite (RGM-1, 2), syenite (STM-1, 2), granodiorite (GSP-2), and granite (G-2, 3). USGS rock reference materials are geochemically well characterized, but there is neither a systematic methodology nor a database for radiogenic isotopic compositions, even for the widely used BCR-1. This investigation represents the first comprehensive, systematic analysis of the isotopic composition and concentration of USGS reference materials and provides an important database for the isotopic community. In addition, the range of equipment at the PCIGR, including a Nu Instruments Plasma MC-ICP-MS, a Thermo Finnigan Triton TIMS, and a Thermo Finnigan Element2 HR-ICP-MS, permits an assessment and comparison of the precision and accuracy of isotopic analyses determined by both the TIMS and MC-ICP-MS methods (e.g., Nd isotopic compositions). For each of the reference materials, 5 to 10 complete replicate analyses provide coherent isotopic results, all with external precision below 30 ppm (2 SD) for Sr and Nd isotopic compositions (27 and 24 ppm for TIMS and MC-ICP-MS, respectively). Our results also show that the first- and second-generation USGS reference materials have homogeneous Sr and Nd isotopic compositions. Nd isotopic compositions by MC-ICP-MS and TIMS agree to within 15 ppm for all reference materials. Interlaboratory MC-ICP-MS comparisons show excellent agreement for Pb isotopic compositions; however, the reproducibility is not as good as for Sr and Nd. A careful, sequential leaching experiment of three first- and second-generation reference materials (BCR, BHVO, AGV) indicates that the heterogeneity in Pb isotopic compositions

  17. MR-based field-of-view extension in MR/PET: B0 homogenization using gradient enhancement (HUGE).

    Science.gov (United States)

    Blumhagen, Jan O; Ladebeck, Ralf; Fenchel, Matthias; Scheffler, Klaus

    2013-10-01

    In whole-body MR/PET, the human attenuation correction can be based on the MR data. However, an MR-based field-of-view (FoV) is limited due to physical restrictions such as B0 inhomogeneities and gradient nonlinearities. Therefore, for large patients, the MR image and the attenuation map might be truncated and the attenuation correction might be biased. The aim of this work is to explore extending the MR FoV through B0 homogenization using gradient enhancement in which an optimal readout gradient field is determined to locally compensate B0 inhomogeneities and gradient nonlinearities. A spin-echo-based sequence was developed that computes an optimal gradient for certain regions of interest, for example, the patient's arms. A significant distortion reduction was achieved outside the normal MR-based FoV. This FoV extension was achieved without any hardware modifications. In-plane distortions in a transaxially extended FoV of up to 600 mm were analyzed in phantom studies. In vivo measurements of the patient's arms lying outside the normal specified FoV were compared with and without the use of B0 homogenization using gradient enhancement. In summary, we designed a sequence that provides data for reducing the image distortions due to B0 inhomogeneities and gradient nonlinearities and used the data to extend the MR FoV. Copyright © 2011 Wiley Periodicals, Inc.
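
    The geometric distortion being corrected follows the standard readout-encoding relation: a field offset ΔB₀ displaces a voxel along the readout direction by

```latex
\Delta x(\mathbf{r}) \;=\; \frac{\Delta B_{0}(\mathbf{r})}{G_{x}},
```

    so choosing an optimal, locally enhanced readout gradient G_x directly shrinks the displacement, which is the lever HUGE uses at the edge of the FoV.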

  18. Towards a methanol economy based on homogeneous catalysis: methanol to H2 and CO2 to methanol

    DEFF Research Database (Denmark)

    Alberico, E.; Nielsen, Martin

    2015-01-01

    The possibility to implement both the exhaustive dehydrogenation of aqueous methanol to hydrogen and CO2 and the reverse reaction, the hydrogenation of CO2 to methanol and water, may pave the way to a methanol-based economy as part of a promising renewable energy system. Recently, homogeneous...

  19. MC++: A parallel, portable, Monte Carlo neutron transport code in C++

    International Nuclear Information System (INIS)

    Lee, S.R.; Cummings, J.C.; Nolen, S.D.

    1997-01-01

    MC++ is an implicit multi-group Monte Carlo neutron transport code written in C++ and based on the Parallel Object-Oriented Methods and Applications (POOMA) class library. MC++ runs in parallel on and is portable to a wide variety of platforms, including MPPs, SMPs, and clusters of UNIX workstations. MC++ is being developed to provide transport capabilities to the Accelerated Strategic Computing Initiative (ASCI). It is also intended to form the basis of the first transport physics framework (TPF), which is a C++ class library containing appropriate abstractions, objects, and methods for the particle transport problem. The transport problem is briefly described, as well as the current status and algorithms in MC++ for solving the transport equation. The alpha version of the POOMA class library is also discussed, along with the implementation of the transport solution algorithms using POOMA. Finally, a simple test problem is defined and performance and physics results from this problem are discussed on a variety of platforms

  20. Homogeneous group, research, institution

    Directory of Open Access Journals (Sweden)

    Francesca Natascia Vasta

    2014-09-01

    Full Text Available The work outlines the complex connection among empirical research, therapeutic programs, and the host institution, and considers the current state of research in Italy. The Italian research field is analyzed and critical gaps are outlined: a lack of results regarding both the therapeutic processes and the effectiveness of group-analytic treatment of eating disorders. The work investigates a homogeneous eating-disorders group, run in an eating-disorders outpatient service. First, we present the methodological steps the research is based on, including the strong connection between theory and clinical tools. Second, the clinical tools are described and the results commented on. Finally, our results suggest the necessity of validating some more specific hypotheses: verifying the relationship between clinical improvement (reduction of the sense of exclusion and of painful emotions) and specific group therapeutic processes, and verifying the relationship between depressive feelings, relapses, and the transition through a more differentiated group field. Keywords: Homogeneous group; Eating disorders; Institutional field; Therapeutic outcome

  1. Dynamic Resource Management in MC-CDMA Based Cellular Wireless Networks

    Directory of Open Access Journals (Sweden)

    Bala Jeevitha Vani

    2009-10-01

    Full Text Available Most multimedia and Internet services today are asymmetric in nature and require high data-rate support. Allocating equal bandwidth to the uplink and downlink is not a prudent solution, as most of the time the user requirement is greater in either the uplink or the downlink. A Multi Carrier Code Division Multiple Access (MC-CDMA) system with time division duplex (TDD) mode can easily meet this requirement by dynamically declaring the traffic direction in the TDD slot and adaptively allocating the subchannels. In this paper, we propose an adaptive slot and subcarrier allocation algorithm that can be implemented independently in each cell of a mobile communication network. Our analytical model generalizes the two-cell concept to represent a multi-cell model. Based on the two-cell concept, four cases of interference pattern have been considered and simulated separately in the presence of Additive White Gaussian Noise (AWGN) and a Rayleigh channel. The simulation results suggest that a Signal to Noise Ratio (SNR) of approximately 9 dB is required to maintain a Bit Error Rate (BER) below 10^-3. We also analyze the average delay incurred by the proposed algorithm in allocating resources.
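
    A Monte Carlo BER-versus-SNR sweep of the kind reported above can be set up in a few lines. The sketch below simulates plain BPSK over flat Rayleigh fading with AWGN; it is a generic illustration of the simulation style, not the authors' MC-CDMA/TDD system, so the resulting BER values will differ from the 9 dB figure quoted in the abstract.

```python
# Generic Monte Carlo BER sweep: BPSK over flat Rayleigh fading plus AWGN.
# Illustrative only; not the authors' MC-CDMA simulator.
import numpy as np

rng = np.random.default_rng(0)
n_bits = 200_000

for snr_db in (3, 6, 9, 12):
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n_bits)
    symbols = 2 * bits - 1                               # {0,1} -> {-1,+1}
    h = np.abs(rng.normal(size=n_bits)
               + 1j * rng.normal(size=n_bits)) / np.sqrt(2)   # Rayleigh gain
    noise = rng.normal(size=n_bits) / np.sqrt(2 * snr)        # AWGN
    rx = h * symbols + noise
    ber = np.mean((rx > 0).astype(int) != bits)          # h > 0, so sign test suffices
    print(f"SNR {snr_db:2d} dB -> BER {ber:.4f}")
```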

  2. Multi-Sensor Detection with Particle Swarm Optimization for Time-Frequency Coded Cooperative WSNs Based on MC-CDMA for Underground Coal Mines

    Directory of Open Access Journals (Sweden)

    Jingjing Xu

    2015-08-01

    Full Text Available In this paper, a wireless sensor network (WSN) technology adapted to underground channel conditions is developed, which has important theoretical and practical value for safety monitoring in underground coal mines. Because the space, time, and frequency resources of an underground tunnel are open, it is proposed to build wireless sensor nodes on multicarrier code division multiple access (MC-CDMA) to make full use of these resources. To improve the wireless transmission performance of source sensor nodes, it is also proposed to utilize cooperative sensors with good channel conditions to the sink node to assist source sensors with poor channel conditions. Moreover, the total power of a source sensor and its cooperative sensors is allocated on the basis of their channel conditions to increase the energy efficiency of the WSN. To solve the problem of multiple access interference (MAI), which arises when multiple source sensors transmit monitoring information simultaneously, a multi-sensor detection (MSD) algorithm with particle swarm optimization (PSO), namely D-PSO, is proposed for the time-frequency coded cooperative MC-CDMA WSN. Simulation results show that the average bit error rate (BER) performance of the proposed WSN in an underground coal mine is improved significantly by using wireless sensor nodes based on MC-CDMA and by adopting time-frequency coded cooperative transmission and the D-PSO algorithm.

  3. Multi-Sensor Detection with Particle Swarm Optimization for Time-Frequency Coded Cooperative WSNs Based on MC-CDMA for Underground Coal Mines.

    Science.gov (United States)

    Xu, Jingjing; Yang, Wei; Zhang, Linyuan; Han, Ruisong; Shao, Xiaotao

    2015-08-27

    In this paper, a wireless sensor network (WSN) technology adapted to underground channel conditions is developed, which has important theoretical and practical value for safety monitoring in underground coal mines. Because the space, time, and frequency resources of an underground tunnel are open, it is proposed to build wireless sensor nodes on multicarrier code division multiple access (MC-CDMA) to make full use of these resources. To improve the wireless transmission performance of source sensor nodes, it is also proposed to utilize cooperative sensors with good channel conditions to the sink node to assist source sensors with poor channel conditions. Moreover, the total power of a source sensor and its cooperative sensors is allocated on the basis of their channel conditions to increase the energy efficiency of the WSN. To solve the problem of multiple access interference (MAI), which arises when multiple source sensors transmit monitoring information simultaneously, a multi-sensor detection (MSD) algorithm with particle swarm optimization (PSO), namely D-PSO, is proposed for the time-frequency coded cooperative MC-CDMA WSN. Simulation results show that the average bit error rate (BER) performance of the proposed WSN in an underground coal mine is improved significantly by using wireless sensor nodes based on MC-CDMA and by adopting time-frequency coded cooperative transmission and the D-PSO algorithm.
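
    The D-PSO detector searches a discrete space of candidate bit vectors; the mechanics of particle swarm optimization itself are easy to sketch. The following minimal continuous PSO minimizing a test function stands in for that search (illustrative only, with conventional inertia and acceleration constants; it is not the D-PSO algorithm).

```python
# Minimal particle swarm optimization: personal and global bests pull each
# particle's velocity; here it minimizes a sphere function as a stand-in.
import numpy as np

def pso(objective, dim, n_particles=30, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))           # positions
    v = np.zeros_like(x)                                 # velocities
    pbest = x.copy()
    pbest_f = np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        f = np.apply_along_axis(objective, 1, x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

best_x, best_f = pso(lambda z: np.sum(z ** 2), dim=4)
print(best_x, best_f)
```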

  4. Homogeneous crystal nucleation in polymers.

    Science.gov (United States)

    Schick, C; Androsch, R; Schmelzer, J W P

    2017-11-15

    The pathway of crystal nucleation significantly influences the structure and properties of semi-crystalline polymers. Crystal nucleation of the polymer melt is normally heterogeneous at low supercooling and homogeneous at high supercooling. Homogeneous nucleation in bulk polymers has so far been hardly accessible experimentally, and was even doubted to occur at all. This topical review summarizes experimental findings on homogeneous crystal nucleation in polymers. Recently developed fast scanning calorimetry, with cooling and heating rates up to 10^6 K s^-1, allows for detailed investigations of nucleation near and even below the glass transition temperature, including analysis of nuclei stability. As for other materials, the maximum homogeneous nucleation rate for polymers is located close to the glass transition temperature. In the experiments discussed here, it is shown that polymer nucleation is homogeneous at such temperatures. Homogeneous nucleation in polymers is discussed in the framework of classical nucleation theory. The majority of our observations are consistent with the theory. The discrepancies may guide further research, particularly experiments to advance theoretical development. Progress in the understanding of homogeneous nucleation is much needed, since most modelling approaches dealing with polymer crystallization exclusively consider homogeneous nucleation. This is also the basis for advancing theoretical approaches to the much more complex phenomena governing heterogeneous nucleation.

  5. Investigation to biodiesel production by the two-step homogeneous base-catalyzed transesterification.

    Science.gov (United States)

    Ye, Jianchu; Tu, Song; Sha, Yong

    2010-10-01

    For two-step transesterification biodiesel production from sunflower oil, based on the kinetics model of the homogeneous base-catalyzed transesterification and the liquid-liquid phase equilibrium of the transesterification product, the total methanol/oil mole ratio, the total reaction time, and the split ratios of methanol and reaction time between the two reactors are determined quantitatively. In consideration of the transesterification intermediate product, both the traditional distillation separation process and an improved separation process for the two-step reaction product are investigated in detail by means of rigorous process simulation. In comparison with the traditional distillation process, the improved separation process has a distinct advantage in energy duty and equipment requirements owing to the replacement of the costly methanol-biodiesel distillation column. Copyright 2010 Elsevier Ltd. All rights reserved.

  6. McDonaldization, Islamic teachings, and funerary practices in Kuwait.

    Science.gov (United States)

    Iqbal, Zafar

    2011-01-01

    Drawing on George Ritzer's sociological concept of McDonaldization, this article explores the transformation of burial practices in Kuwait. It is argued that traditional, religious, and private ways of dealing with death have been modernized on the fast-food model of McDonald's. This article examines Islamic teachings on burial and how that model has been applied to traditional Muslim funerary services, including cemetery management, grave excavation, funeral prayers, burial, and condolences, to make them more efficient and, in turn, more profitable. Based on personal observations and random interviews, the study finds that the state bureaucracy in Kuwait has made burial rituals more efficient, standardized, calculable, and controlled. Several associated irrationalities are also considered. Findings suggest that some individuals may not be happy with these changes, but there is no popular resistance to the McDonaldization of burial practices, probably due to the authoritarian and welfare nature of the State of Kuwait.

  7. Homogenization of High-Contrast Brinkman Flows

    KAUST Repository

    Brown, Donald L.

    2015-04-16

    Modeling porous flow in complex media is a challenging problem. Not only is the problem inherently multiscale but, due to high contrast in permeability values, flow velocities may differ greatly throughout the medium. To avoid complicated interface conditions, the Brinkman model is often used for such flows [O. Iliev, R. Lazarov, and J. Willems, Multiscale Model. Simul., 9 (2011), pp. 1350--1372]. Instead of permeability variations and contrast being contained in the geometric media structure, this information is contained in a highly varying and high-contrast coefficient. In this work, we present two main contributions. First, we develop a novel homogenization procedure for the high-contrast Brinkman equations by constructing correctors and carefully estimating the residuals. Understanding the relationship between scales and contrast values is critical to obtaining useful estimates. Therefore, standard convergence-based homogenization techniques [G. A. Chechkin, A. L. Piatniski, and A. S. Shamev, Homogenization: Methods and Applications, Transl. Math. Monogr. 234, American Mathematical Society, Providence, RI, 2007, G. Allaire, SIAM J. Math. Anal., 23 (1992), pp. 1482--1518], although a powerful tool, are not applicable here. Our second point is that the Brinkman equations, in certain scaling regimes, are invariant under homogenization. Unlike in the case of Stokes-to-Darcy homogenization [D. Brown, P. Popov, and Y. Efendiev, GEM Int. J. Geomath., 2 (2011), pp. 281--305, E. Marusic-Paloka and A. Mikelic, Boll. Un. Mat. Ital. A (7), 10 (1996), pp. 661--671], the results presented here under certain velocity regimes yield a Brinkman-to-Brinkman upscaling that allows using a single software platform to compute on both microscales and macroscales. In this paper, we discuss the homogenized Brinkman equations. We derive auxiliary cell problems to build correctors and calculate effective coefficients for certain velocity regimes. Due to the boundary effects, we construct

  8. Testing homogeneity in Weibull-regression models.

    Science.gov (United States)

    Bolfarine, Heleno; Valença, Dione M

    2005-10-01

    In survival studies with families or geographical units, it may be of interest to test whether such groups are homogeneous for given explanatory variables. In this paper we consider score-type tests for group homogeneity based on a mixing model in which the group effect is modelled as a random variable. As opposed to hazard-based frailty models, this model presents survival times that, conditioned on the random effect, have an accelerated failure time representation. The test statistic requires only estimation of the conventional regression model without the random effect and does not require specifying the distribution of the random effect. The tests are derived for a Weibull regression model, and in the uncensored situation a closed form is obtained for the test statistic. A simulation study is used to compare the power of the tests. The proposed tests are applied to real data sets with censored data.

  9. Control rod homogenization in heterogeneous sodium-cooled fast reactors

    International Nuclear Information System (INIS)

    Andersson, Mikael

    2016-01-01

    The sodium-cooled fast reactor is one of the candidates for a sustainable nuclear reactor system. In particular, the French ASTRID project employs an axially heterogeneous design, proposed in the so-called CFV (low sodium void effect) core, to enhance the inherent safety features of the reactor. This thesis focuses on the accurate modeling of the control rods through the homogenization method. The control rods in a sodium-cooled fast reactor are used for reactivity compensation during the cycle, for power shaping, and to shut down the reactor. In previous control rod homogenization procedures, only a radial description of the geometry was implemented, so the axially heterogeneous features of the CFV core could not be taken into account. This thesis investigates the different axial variations the control rod experiences in a CFV core, to determine the impact these axial environments have on control rod modeling. The methodology used in this work is based on a previous homogenization procedure, the so-called equivalence procedure. The procedure was newly implemented in the PARIS code system in order to be able to use 3D geometries and thereby take axial effects into account. The thesis is divided into three parts. The first part investigates the impact of different neutron spectra on the homogeneous control-rod cross sections. The second part investigates the cases where the traditional radial control-rod homogenization procedure is no longer applicable in the CFV core, which was found to be within 5-10 cm of any material interface. In the third part, based on the results of the second part, a 3D model of the control rod is used to calculate homogenized control-rod cross sections. In a full core model, a study is made of the impact these axial effects have on control-rod-related core parameters, such as the control rod worth, the capture rates in the control rod, and the power in the adjacent fuel assemblies. All results were compared to a Monte

  10. Asymptotic stability of spectral-based PDF modeling for homogeneous turbulent flows

    Science.gov (United States)

    Campos, Alejandro; Duraisamy, Karthik; Iaccarino, Gianluca

    2015-11-01

    Engineering models of turbulence, based on one-point statistics, neglect spectral information inherent in a turbulence field. It is well known, however, that the evolution of turbulence is dictated by a complex interplay between the spectral modes of velocity. For example, for homogeneous turbulence, the pressure-rate-of-strain depends on the integrated energy spectrum weighted by components of the wave vectors. The Interacting Particle Representation Model (IPRM) (Kassinos & Reynolds, 1996) and the Velocity/Wave-Vector PDF model (Van Slooten & Pope, 1997) emulate spectral information in an attempt to improve the modeling of turbulence. We investigate the evolution and asymptotic stability of the IPRM using three different approaches. The first approach considers the Lagrangian evolution of individual realizations (idealized as particles) of the stochastic process defined by the IPRM. The second solves Lagrangian evolution equations for clusters of realizations conditional on a given wave vector. The third evolves the solution of the Eulerian conditional PDF corresponding to the aforementioned clusters. This last method avoids issues related to discrete particle noise and slow convergence associated with Lagrangian particle-based simulations.

  11. Workshop for development of formal MC and A plans

    International Nuclear Information System (INIS)

    Erkkila, B.H.; Hatcher, C.R.; Scott, S.C.; Thomas, K.E.

    1998-01-01

    Upgrades to both physical protection and material controls and accountability (MC and A) are progressing at many nuclear facilities in the Russian Federation. In general, Russian facilities are well prepared to address issues related to physical protection. The infrastructure to plan and implement physical protection upgrades is already in place in Russia. The infrastructure to integrate new and existing MC and A capabilities is not as well developed. The authors' experience has shown that working with Russian facility management and technical personnel to draft an MC and A plan provides a way of moving MC and A upgrades forward. Los Alamos has developed a workshop for Russian nuclear facilities to facilitate the preparation of their facility MC and A plans. The workshops have been successful in bringing together facility management, safeguards specialists, and operations personnel to initiate the process of drafting these MC and A plans. The MC and A plans provide the technical basis for scheduling future MC and A upgrades at the facilities. Although facility MC and A plans are site specific, the workshop can be tailored to guide the development of an MC and A plan for any Russian nuclear site

  12. Feasibility Study of Aseptic Homogenization: Affecting Homogenization Steps on Quality of Sterilized Coconut Milk

    Directory of Open Access Journals (Sweden)

    Phungamngoen Chanthima

    2016-01-01

    Full Text Available Coconut milk is one of the most important protein-rich food sources available today. Separation of the emulsion into an aqueous phase and a cream phase commonly occurs, and this leads to an unacceptable physical defect in both fresh and processed coconut milk. Since homogenization steps are known to affect the stability of coconut milk, this work aimed to study the effect of homogenization steps on coconut milk quality. The samples were subjected to high-speed homogenization in the range of 5000-15000 rpm under sterilization temperatures of 120-140 °C for 15 min. The results showed that emulsion stability increased with increasing homogenization speed: smaller fat particles were generated that dispersed easily in the continuous phase, leading to high stability. On the other hand, when a high sterilization temperature was applied, the stability of the coconut milk decreased, fat globules grew, the L value decreased, and the b value increased. Homogenization after heating led to higher stability than homogenization before heating, owing to the reduced particle size of the coconut milk after aggregation during the sterilization process. The results imply that homogenization after the sterilization process may play an important role in the quality of sterilized coconut milk.

  13. Moderate voluntary exercise attenuates the metabolic syndrome in melanocortin-4 receptor-deficient rats showing central dopaminergic dysregulation

    Directory of Open Access Journals (Sweden)

    Silvana Obici

    2015-10-01

    Conclusions: Central dopamine dysregulation during VWR reinforces the link between MC4R function and molecular and behavioral responding to rewards. The data also suggest that exercise can be a successful lifestyle intervention in MC4R-haploinsufficient individuals despite reduced positive reinforcement during exercise training.

  14. Generation and stabilization of whey-based monodisperse nanoemulsions using ultra-high pressure homogenization and small amphipathic co-emulsifier combinations

    Science.gov (United States)

    Ultra-high-pressure homogenization (UHPH) was used to generate monodisperse stable peanut oil nanoemulsions within a desired nanosize range, using co-emulsifier combinations of whey protein concentrate (WPC), sodium dodecyl sulfate, Triton X-100 (X100), and zwitterionic sulfobetaine-base...

  15. 7 CFR 58.636 - Homogenization.

    Science.gov (United States)

    2010-01-01

    7 CFR 58.636 (2010): Agriculture Regulations of the Department of Agriculture (Continued), AGRICULTURAL MARKETING SERVICE (Standards...), Procedures, § 58.636 Homogenization. Homogenization of the pasteurized mix shall be accomplished to...

  16. McDonald’s Corporation - 2015 (MCD)

    OpenAIRE

    Alen Badal

    2017-01-01

    McDonald’s Corporation, 2015 is aiming to enlighten the “Experience of the Future” for consumers, with a special focus on the ‘younger’ generation. Beginning in 2015 and moving forward, McDonald’s has operationalized the functions of its strategy to better serve consumers with such offerings as trial-testing a build-your-burger strategy with the order being served at the table, known as the “Create Your Taste” program. The restaurant chain has introduced the all-day breakfast menu and ‘McPic...

  17. Development of a Practical Hydrogen Storage System Based on Liquid Organic Hydrogen Carriers and a Homogeneous Catalyst

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, Craig [Hawaii Hydrogen Carriers, LLC, Honolulu, HI (United States); Brayton, Daniel [Hawaii Hydrogen Carriers, LLC, Honolulu, HI (United States); Jorgensen, Scott W. [General Motors, LLC, Warren, MI (United States). Research and Development Center. Chemical and Material Systems Lab.; Hou, Peter [General Motors, LLC, Warren, MI (United States). Research and Development Center. Chemical and Material Systems Lab.

    2017-03-24

    The objectives of this project were: 1) optimize a hydrogen storage medium based on a liquid organic hydrogen carrier (LOC) and a homogeneous pincer catalyst (carried out at Hawaii Hydrogen Carriers, LLC) and 2) develop a space-, mass-, and energy-efficient tank and reactor system to house the medium and release hydrogen from it (carried out at the General Motors Research Center).

  18. Deployment-based lifetime optimization model for homogeneous Wireless Sensor Network under retransmission.

    Science.gov (United States)

    Li, Ruiying; Liu, Xiaoxi; Xie, Wei; Huang, Ning

    2014-12-10

    Sensor-deployment-based lifetime optimization is one of the most effective methods used to prolong the lifetime of a Wireless Sensor Network (WSN) by reducing the distance-sensitive energy consumption. In this paper, data retransmission, a major consumption factor that is usually neglected in previous work, is considered. For a homogeneous WSN monitoring a circular target area with a centered base station, a sensor deployment model based on regular hexagonal grids is analyzed. To maximize the WSN lifetime, optimization models for both uniform and non-uniform deployment schemes are proposed by constraining coverage, connectivity, and transmission success rate. Based on the data-transmission analysis of a data-gathering cycle, the WSN lifetime in the model can be obtained by quantifying the energy consumption at each sensor location. The results of case studies show that it is meaningful to consider data retransmission in lifetime optimization. In particular, our investigations indicate that, with the same lifetime requirement, the number of sensors needed in a non-uniform topology is much smaller than in a uniform one. Finally, compared with a random scheme, simulation results further verify the advantage of our deployment model.
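
    The role retransmission plays in such a model is easy to see in code. The sketch below uses a standard first-order radio energy model and a geometric retry count; the constants and the success-probability treatment are assumptions of ours, not the paper's exact formulation.

```python
# Expected per-packet transmit energy with retransmissions: each attempt
# costs E_tx(k, d) = e_elec*k + e_amp*k*d^2 for a k-bit packet sent over
# distance d, and a link that succeeds with probability p needs 1/p
# transmissions on average. Constants are typical textbook values.
E_ELEC = 50e-9    # J/bit, electronics energy
E_AMP = 100e-12   # J/bit/m^2, amplifier energy

def expected_tx_energy(k_bits, d_m, p_success):
    e_once = E_ELEC * k_bits + E_AMP * k_bits * d_m ** 2
    return e_once / p_success        # mean number of attempts is 1/p

# A node 80 m from its next hop, 1 kbit packets, 90% link success rate:
print(f"{expected_tx_energy(1000, 80, 0.9):.2e} J per packet")
```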

  19. Adolescent Purchasing Behavior at McDonald's and Subway.

    Science.gov (United States)

    Lesser, Lenard I; Kayekjian, Karen C; Velasquez, Paz; Tseng, Chi-Hong; Brook, Robert H; Cohen, Deborah A

    2013-10-01

    To assess whether adolescents purchasing food at a restaurant marketed as "healthy" (Subway) purchase fewer calories than at a competing chain (McDonald's), we studied 97 adolescents who purchased a meal at both restaurants on different days, using each participant as his or her own control. We compared the difference in calories purchased by adolescents at McDonald's and Subway in a diverse area of Los Angeles, CA. Adolescents purchased an average of 1,038 calories (standard error of the mean [SEM]: 41) at McDonald's and 955 calories (SEM 39) at Subway. The difference of 83 calories (95% confidence interval [CI]: -20 to 186) was not statistically significant (p = .11). At McDonald's, participants purchased significantly more calories from drinks (151 vs. 61) and fewer cups of vegetables (.15 vs. .57) than at Subway. Although Subway meals had more vegetables, meals from both restaurants are likely to contribute to overeating. Copyright © 2013 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  20. Lattice Boltzmann model for three-dimensional decaying homogeneous isotropic turbulence

    International Nuclear Information System (INIS)

    Xu Hui; Tao Wenquan; Zhang Yan

    2009-01-01

    We implement a lattice Boltzmann method (LBM) for decaying homogeneous isotropic turbulence based on an analogous Galerkin filter and focus on the fundamental statistical isotropy property. This regularized method is constructed on an orthogonal Hermite polynomial space. For decaying homogeneous isotropic turbulence, the regularized method reproduces the isotropy property very well. Numerical studies demonstrate that the novel regularized LBM is a promising approximation of turbulent fluid flows, which paves the way for coupling various turbulence models with the LBM

  1. Benchmarking monthly homogenization algorithms

    Science.gov (United States)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data
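
    The way the benchmark inserts break-type inhomogeneities can be sketched directly from the description above: breakpoints arrive as a Poisson process and each adds a normally distributed step to the rest of the series. Parameter values below are assumptions for illustration, not the HOME benchmark settings.

```python
# Insert random break-type inhomogeneities into a simulated monthly series:
# a Poisson number of breakpoints, each shifting the series from its
# position onward by a normally distributed amount.
import numpy as np

rng = np.random.default_rng(42)
n_months = 1200                                   # 100 years of monthly data
homogeneous = rng.normal(0.0, 1.0, n_months)      # idealized anomaly series

n_breaks = rng.poisson(5)                         # Poisson break count
positions = np.sort(rng.integers(1, n_months, n_breaks))
sizes = rng.normal(0.0, 0.8, n_breaks)            # normal breakpoint sizes

inhomogeneous = homogeneous.copy()
for pos, size in zip(positions, sizes):
    inhomogeneous[pos:] += size                   # step change from pos on

print(n_breaks, positions)
```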

  2. Value distribution of meromorphic solutions of homogeneous and non-homogeneous complex linear differential-difference equations

    Directory of Open Access Journals (Sweden)

    Luo Li-Qin

    2016-01-01

    Full Text Available In this paper, we investigate the value distribution of meromorphic solutions of homogeneous and non-homogeneous complex linear differential-difference equations, and obtain results on the relations between the order of the solutions and the convergence exponents of the zeros, poles, a-points, and small-function value points of the solutions, which show that the relations in the case of non-homogeneous equations are sharper than those in the case of homogeneous equations.

  3. Extending the Kawai-Kerman-McVoy Statistical Theory of Nuclear Reactions to Intermediate Structure via Doorways

    International Nuclear Information System (INIS)

    Arbanas, Goran; Bertulani, C.A.; Dean, D.J.; Kerman, A.K.; Roche, K.J.

    2011-01-01

    Kawai, Kerman, and McVoy have shown that a statistical treatment of many open channels that are coupled by direct reactions leads to modifications of the Hauser-Feshbach expression for the energy-averaged cross section (Ann. of Phys. 75 (1973) 156). The energy-averaging interval for this cross section is on the order of the width of single-particle resonances, ~1 MeV, revealing only a gross structure in the cross section. When the energy-averaging interval is decreased to the width of a doorway state, ~0.1 MeV, a so-called intermediate structure may be observed in cross sections. We extend the Kawai-Kerman-McVoy theory into the intermediate structure by leveraging the theory of doorway states developed by Feshbach, Kerman, and Lemmer (Ann. of Phys. 42 (1967) 230). As a byproduct of the extension, an alternative derivation of the central result of the Kawai-Kerman-McVoy theory is suggested. We quantify the effect of the approximations used in the derivation by performing numerical computations for a large set of compound nuclear states.

  4. Modeling the homogenization kinetics of as-cast U-10wt% Mo alloys

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zhijie, E-mail: zhijie.xu@pnnl.gov [Computational Mathematics Group, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Joshi, Vineet [Energy Processes & Materials Division, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Hu, Shenyang [Reactor Materials & Mechanical Design, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Paxton, Dean [Nuclear Engineering and Analysis Group, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Lavender, Curt [Energy Processes & Materials Division, Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Burkes, Douglas [Nuclear Engineering and Analysis Group, Pacific Northwest National Laboratory, Richland, WA 99352 (United States)

    2016-04-01

    Low-enriched U-22at% Mo (U–10Mo) alloy has been considered as an alternative material to replace highly enriched fuels in research reactors. For U–10Mo to work effectively and replace the existing fuel material, a thorough understanding of the microstructure development from as-cast to the final formed structure is required. The as-cast microstructure is typically inhomogeneous, containing molybdenum-rich and -lean regions, which may affect the processing and possibly the in-reactor performance. This as-cast structure must be homogenized by thermal treatment to produce a uniform Mo distribution. The development of a modeling capability will improve the understanding of the effect of initial microstructures on the Mo homogenization kinetics. In the current work, we investigated the effect of the as-cast microstructure on the homogenization kinetics. The kinetics of the homogenization was modeled with a rigorous algorithm that relates line-scan data of Mo concentration to the gray scale in energy dispersive spectroscopy images, which was used to generate a reconstructed Mo concentration map. The map was then used as realistic microstructure input for physics-based homogenization models, in which the entire homogenization kinetics can be simulated and validated against the available experimental data at different homogenization times and temperatures.
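
    The essence of such homogenization kinetics can be illustrated with a one-dimensional toy: Fickian interdiffusion smooths an initially banded Mo profile of the kind reconstructed from an EDS line scan. The diffusivity, geometry, and uniformity metric below are our own assumptions, not the paper's model.

```python
# Toy 1-D homogenization: explicit finite differences for dc/dt = D d2c/dx2
# with zero-flux ends, applied to a banded Mo concentration profile.
import numpy as np

def homogenize(c, D, dx, dt, steps):
    c = c.copy()
    for _ in range(steps):
        lap = np.empty_like(c)
        lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx ** 2
        lap[0] = 2 * (c[1] - c[0]) / dx ** 2      # mirror (zero-flux) ends
        lap[-1] = 2 * (c[-2] - c[-1]) / dx ** 2
        c += D * dt * lap
    return c

x = np.linspace(0, 100e-6, 200)                   # 100 micron line scan
c0 = 10 + 3 * np.sin(2 * np.pi * x / 25e-6)       # Mo-rich/-lean bands (wt%)
c1 = homogenize(c0, D=1e-14, dx=x[1] - x[0], dt=0.5, steps=20_000)
print(f"spread before: {np.ptp(c0):.2f} wt%, after: {np.ptp(c1):.3f} wt%")
```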

  5. Effects of Kinesio Taping versus McConnell Taping for Patellofemoral Pain Syndrome: A Systematic Review and Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Wen-Dien Chang

    2015-01-01

    Full Text Available Objectives. To conduct a systematic review comparing the effects of Kinesio taping with McConnell taping as methods of conservative management of patients with patellofemoral pain syndrome (PFPS). Methods. The MEDLINE, PUBMED, EMBASE, AMED, and Cochrane Central Register of Controlled Trials electronic databases were searched through July 2014. Controlled studies evaluating the effects of Kinesio or McConnell taping in PFPS patients were retrieved. Results. Ninety-one articles were selected from the articles retrieved from the databases, and 11 articles were included in the analysis. The methods, evaluations, and results of the articles were collected, and the outcomes of patellar taping were analyzed. Kinesio taping can reduce pain and increase the muscular flexibility of PFPS patients, and McConnell taping also had an effect on pain relief and patellar alignment. Meta-analysis showed a small effect on pain reduction and motor function improvement and a moderate effect on muscle activity change among PFPS patients using Kinesio taping. Conclusions. The Kinesio taping technique used for muscles can relieve pain but cannot change patellar alignment, unlike McConnell taping. Both patellar tapings are used differently for PFPS patients and substantially improve muscle activity, motor function, and quality of life.

  6. Structure, Texture and Phases in 3D Printed IN718 Alloy Subjected to Homogenization and HIP Treatments

    Directory of Open Access Journals (Sweden)

    Ahmad Mostafa

    2017-05-01

    Full Text Available 3D printing results in anisotropy in the microstructure and mechanical properties. The focus of this study is to investigate the structure, texture, and phase evolution of the as-printed and heat-treated IN718 superalloy. Cylindrical specimens, printed by a powder-bed additive manufacturing technique, were subjected to two post-treatments: homogenization (1100 °C, 1 h, furnace cooling) and hot isostatic pressing (HIP; 1160 °C, 100 MPa, 4 h, furnace cooling). The selective laser melting (SLM) printed microstructure exhibited a columnar architecture, parallel to the building direction, due to the heat flow in the negative z-direction, whereas a unique structural morphology was observed in the x-y plane due to the different cooling rates resulting from laser beam overlapping. Post-processing treatments reorganized the columnar structure, which had a strong {002} texture, into fine columnar and/or equiaxed grains of random orientations. An equiaxed structure with an average grain size of about 150 µm was achieved after the homogenization and HIP treatments. Both δ-phase and MC-type brittle carbides, having rough morphologies, formed at the grain boundaries. The δ-phase formed due to γ″-phase dissolution in the γ matrix, while MC-type carbide nuclei grew by diffusion of solute atoms. The presence of a (Nb0.78Ti0.22)C carbide phase with an fcc structure having a lattice parameter a = 4.43 Å was revealed using energy dispersive spectrometer (EDS) and X-ray diffractometer (XRD) analysis. The solidification behavior of the IN718 alloy is described to elucidate the evolution of the different phases during selective laser melting and post-processing heat treatments of IN718.

  7. Gadolinium-doped ceria nanopowders synthesized by urea-based homogeneous co-precipitation (UBHP)

    Energy Technology Data Exchange (ETDEWEB)

    Accardo, G., E-mail: d16605@kist.re.kr [Fuel Cell Research Center, Korea Institute of Science and Technology, Hwarangno 14-gil, Seongbuk-gu, Seoul 136-791 (Korea, Republic of); Spiridigliozzi, L. [Department of Civil and Mechanical Engineering, INSTM Research Unit, University of Cassino and Southern Lazio, Via G. Di Biasio 43, 03043 Cassino, FR (Italy); Cioffi, R.; Ferone, C. [Department of Engineering, INSTM Research Unit, University Parthenope of Naples, Centro Direzionale, Is. C4, 80143 Napoli (Italy); Di Bartolomeo, E. [Department of Chemical Science and Technology, University of Rome “Tor Vergata”, Viale della Ricerca Scientifica, 00133 Rome (Italy); Yoon, Sung Pil [Fuel Cell Research Center, Korea Institute of Science and Technology, Hwarangno 14-gil, Seongbuk-gu, Seoul 136-791 (Korea, Republic of); Dell’Agli, G. [Department of Civil and Mechanical Engineering, INSTM Research Unit, University of Cassino and Southern Lazio, Via G. Di Biasio 43, 03043 Cassino, FR (Italy)

    2017-02-01

    Gadolinium (10%)-doped ceria was successfully synthesized using a urea-based homogeneous co-precipitation method (UBHP). A single fluorite phase was obtained after a low-temperature (400 °C) calcination treatment. The resulting powders showed grains of nanometric size with some agglomeration and overall good sinterability. Pellets were sintered at 1300 and 1500 °C for 3 h. The ionic conductivity was measured by electrochemical impedance spectroscopy, and a correlation between electrical properties and microstructure was revealed. The promising conductivity values showed that the synthesized powders are suitable for intermediate-temperature solid oxide fuel cell (IT-SOFC) applications. - Highlights: • Urea-based homogeneous co-precipitation is applied to synthesize nanocrystalline GDC. • Dense GDC samples at different sintering temperatures were characterized. • SEM and TEM revealed a well-defined microstructure and controlled composition. • The correlation between electrochemical properties (by EIS) and microstructure is discussed. • The UBHP method can be used to prepare high-performance GDC electrolytes.

  8. McXtrace

    DEFF Research Database (Denmark)

    Bergbäck Knudsen, Erik; Prodi, Andrea; Baltser, Jana

    2013-01-01

    to the standard X-ray simulation software SHADOW. McXtrace is open source, licensed under the General Public License, and does not require the user to have access to any proprietary software for its operation. The structure of the software is described in detail, and various examples are given to showcase...

  9. Edge-Based Image Compression with Homogeneous Diffusion

    Science.gov (United States)

    Mainberger, Markus; Weickert, Joachim

    It is well-known that edges contain semantically important image information. In this paper we present a lossy compression method for cartoon-like images that exploits information at image edges. These edges are extracted with the Marr-Hildreth operator followed by hysteresis thresholding. Their locations are stored in a lossless way using JBIG. Moreover, we encode the grey or colour values at both sides of each edge by applying quantisation, subsampling and PAQ coding. In the decoding step, information outside these encoded data is recovered by solving the Laplace equation, i.e. we inpaint with the steady state of a homogeneous diffusion process. Our experiments show that the suggested method outperforms the widely-used JPEG standard and can even beat the advanced JPEG2000 standard for cartoon-like images.
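
    The decoding step lends itself to a compact sketch: with grey values kept only at the stored edge locations, the missing pixels are recovered as the steady state of homogeneous diffusion, i.e. a discrete Laplace solve. The toy below uses plain Jacobi iterations on a synthetic image; it illustrates the inpainting idea only, not the full codec.

```python
# Homogeneous-diffusion inpainting: iterate each unknown pixel toward the
# average of its four neighbours while holding the encoded pixels fixed.
import numpy as np

def laplace_inpaint(img, known, iters=2000):
    u = np.where(known, img, img[known].mean())   # crude initialization
    for _ in range(iters):
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                      np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u = np.where(known, img, avg)             # re-impose known pixels
    return u

img = np.zeros((32, 32)); img[:, 16:] = 255       # toy two-region image
known = np.zeros_like(img, dtype=bool)
known[:, [0, 15, 16, -1]] = True                  # "edge" pixels kept by codec
print(np.abs(laplace_inpaint(img, known) - img).mean())   # ~0 error
```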

  10. Target dose conversion modeling from pencil beam (PB) to Monte Carlo (MC) for lung SBRT

    International Nuclear Information System (INIS)

    Zheng, Dandan; Zhu, Xiaofeng; Zhang, Qinghui; Liang, Xiaoying; Zhen, Weining; Lin, Chi; Verma, Vivek; Wang, Shuo; Wahl, Andrew; Lei, Yu; Zhou, Sumin; Zhang, Chi

    2016-01-01

    A challenge preventing routine clinical implementation of Monte Carlo (MC)-based lung SBRT is the difficulty of reinterpreting historical outcome data calculated with inaccurate dose algorithms, because the target dose was found to decrease to varying degrees when recalculated with MC. The large variability was previously found to be affected by factors such as tumour size, location, and lung density, usually through sub-group comparisons. We hereby conducted a pilot study to systematically and quantitatively analyze these patient factors and explore accurate target dose conversion models, so that large-scale historical outcome data can be correlated with more accurate MC dose without recalculation. Twenty-one patients who underwent SBRT for early-stage lung cancer were replanned with 6 MV 360° dynamic conformal arcs using pencil-beam (PB) and recalculated with MC. The percent D95 difference (PB-MC) was calculated for the PTV and GTV. Using single linear regression, this difference was correlated with the following quantitative patient indices: maximum tumour diameter (MaxD); PTV and GTV volumes; minimum distance from tumour to soft tissue (dmin); and mean density and standard deviation of the PTV, GTV, PTV margin, lung, and 2 mm, 15 mm, 50 mm shells outside the PTV. Multiple linear regression and an artificial neural network (ANN) were employed to model multiple factors and improve dose conversion accuracy. Single linear regression with PTV D95 deficiency identified the strongest correlation on mean-density (location) indices, weaker on lung density, and the weakest on size indices, with the following R^2 values in decreasing order: shell2mm (0.71), PTV (0.68), PTV margin (0.65), shell15mm (0.62), shell50mm (0.49), lung (0.40), dmin (0.22), GTV (0.19), MaxD (0.17), PTV volume (0.15), and GTV volume (0.08). A multiple linear regression model yielded a significance factor of 3.0E-7 using two independent features: mean density of shell2mm (P = 1.6E-7) and PTV volume
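
    The regression step itself is straightforward to reproduce. The sketch below fits a least-squares model of the percent D95 difference on two of the indices named above (mean density of the 2 mm shell and PTV volume); the numeric values are hypothetical placeholders to make the example run, not study data.

```python
# Multiple linear regression of the PB-to-MC percent D95 difference on two
# patient indices, via ordinary least squares with an intercept column.
import numpy as np

# columns: [shell2mm mean density (g/cc), PTV volume (cc)] -- hypothetical
X = np.array([[0.35, 12.0], [0.55, 25.0], [0.75, 8.0],
              [0.60, 40.0], [0.45, 18.0], [0.80, 30.0]])
y = np.array([9.5, 6.2, 3.1, 5.0, 7.8, 2.4])      # % D95 difference (PB - MC)

A = np.column_stack([np.ones(len(X)), X])         # add intercept term
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"coefficients: {coef.round(3)}, R^2 = {r2:.2f}")
```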

  11. Evaluation of the Eclipse eMC algorithm for bolus electron conformal therapy using a standard verification dataset.

    Science.gov (United States)

    Carver, Robert L; Sprunger, Conrad P; Hogstrom, Kenneth R; Popple, Richard A; Antolak, John A

    2016-05-08

    The purpose of this study was to evaluate the accuracy and calculation speed of electron dose distributions calculated by the Eclipse electron Monte Carlo (eMC) algorithm for use with bolus electron conformal therapy (ECT). The recent commercial availability of bolus ECT technology requires further validation of the eMC dose calculation algorithm. eMC-calculated electron dose distributions for bolus ECT were compared to previously measured TLD-dose points throughout patient-based cylindrical phantoms (retromolar trigone and nose), whose axial cross sections were based on the mid-PTV (planning treatment volume) CT anatomy. The phantoms consisted of SR4 muscle substitute, SR4 bone substitute, and air. The treatment plans were imported into the Eclipse treatment planning system, and electron dose distributions were calculated at a 1% statistical uncertainty setting on processors (Intel Xeon E5-2690, 2.9 GHz) on a framework agent server (FAS). In comparison, the eMC was significantly more accurate than the pencil beam algorithm (PBA). The eMC has accuracy comparable to the pencil beam redefinition algorithm (PBRA) used for bolus ECT planning and has acceptably low dose calculation times. The eMC accuracy decreased when smoothing was used in high-gradient dose regions. The eMC accuracy was consistent with that previously reported for the eMC electron dose algorithm and shows that the algorithm is suitable for clinical implementation of bolus ECT.

  12. Construction of Optimal-Path Maps for Homogeneous-Cost-Region Path-Planning Problems

    Science.gov (United States)

    1989-09-01

    of Artificial Intelligence ... 24. Kirkpatrick, S., Gelatt Jr., C. D., and Vecchi, M. P., "Optimization by Simulated Annealing", Science, Vol. ... studied in depth by researchers in such fields as artificial intelligence, robotics, and computational geometry. Most methods require homogeneous ... the results of the research. II. RELEVANT RESEARCH. A. APPLICABLE CONCEPTS FROM ARTIFICIAL INTELLIGENCE. 1. Search Methods. One of the central

  13. McMYB10 regulates coloration via activating McF3'H and later structural genes in ever-red leaf crabapple.

    Science.gov (United States)

    Tian, Ji; Peng, Zhen; Zhang, Jie; Song, Tingting; Wan, Huihua; Zhang, Meiling; Yao, Yuncong

    2015-09-01

    The ever-red leaf trait, which is important for breeding ornamental and high-anthocyanin plants, rarely appears in Malus families, and little is known about the regulation of anthocyanin biosynthesis in red leaves. In our study, HPLC analysis showed that the anthocyanin concentration in ever-red leaves, especially cyanidin, was significantly higher than that in evergreen leaves. The transcript level of McMYB10 was significantly correlated with anthocyanin synthesis in the 'Royalty' and evergreen-leaf 'Flame' cultivars during leaf development. We also found that the ever-red cultivar 'Royalty' contained the known R6:McMYB10 sequence previously reported in apple fruit, whereas the evergreen cultivar 'Flame' did not. This distinction in the promoter region may be the main reason for the higher expression level of McMYB10 in the red-foliage crabapple cultivar. Furthermore, McMYB10 promoted anthocyanin biosynthesis in crabapple leaves and callus at low temperatures and during long-day treatments. Both heterologous expression in tobacco (Nicotiana tabacum) and the Arabidopsis pap1 mutant, and homologous expression in crabapple and apple, suggested that McMYB10 can promote anthocyanin synthesis and enhance anthocyanin accumulation in plants. Interestingly, electrophoretic mobility shift assays, coupled with yeast one-hybrid analysis, revealed that McMYB10 positively regulates McF3'H by directly binding to AACCTAAC and TATCCAACC motifs in its promoter. In summary, our results demonstrate that McMYB10 plays an important role in ever-red leaf coloration by positively regulating McF3'H in crabapple. Our work therefore provides new perspectives for ornamental fruit tree breeding. © 2015 Society for Experimental Biology, Association of Applied Biologists and John Wiley & Sons Ltd.

  14. McSUB V2.0, an upgraded version of the Monte Carlo library McSUB with inclusion of weight factors

    International Nuclear Information System (INIS)

    Hoek, M.

    1991-02-01

    The Monte Carlo library McSUB, which was described in an earlier report, has been upgraded to McSUB V2.0. McSUB V2.0 can be used to simulate neutron transport in a medium that is a mixture of hydrogen and carbon or a mixture of deuterium and carbon. The implemented neutron energy interval is 0.1 - 20 MeV, and the library can be used to simulate elastic and inelastic scattering. Inelastic scattering with carbon takes into account the four lowest excited states of the carbon nucleus. McSUB V2.0 is downward compatible with McSUB except for the layout of the parameter file, which now contains more variables. The major upgrade has been the inclusion of routines using weight factors, which speeds up the old version considerably. McSUB V2.0 also makes a biasing technique possible: it is now possible to, e.g., force a neutron to scatter from a selected nucleus and then bias the scattering direction. (au)

  15. The SPH homogeneization method

    International Nuclear Information System (INIS)

    Kavenoky, Alain

    1978-01-01

    The homogenization of a uniform lattice is a rather well understood topic, while difficult problems arise if the lattice becomes irregular. The SPH homogenization method is an attempt to generate homogenized cross sections for an irregular lattice. Section 1 summarizes the treatment of an isolated cylindrical cell with an entering surface current (in one-velocity theory); Section 2 is devoted to the extension of the SPH method to assembly problems. Finally, Section 3 presents the generalization to general multigroup problems. Numerical results are obtained for a PWR rod bundle assembly in Section 4

  16. Association Between MC-2 Peptide and Hepatic Perfusion and Liver Injury Following Resuscitated Hemorrhagic Shock.

    Science.gov (United States)

    Matheson, Paul J; Fernandez-Botran, Rafael; Smith, Jason W; Matheson, Samuel A; Downard, Cynthia D; McClain, Craig J; Garrison, Richard N

    2016-03-01

    Hemorrhagic shock (HS) due to trauma remains a major cause of morbidity and mortality in the United States, despite continuing progress in advanced life support and treatment. Trauma is the third most common cause of death worldwide and is the leading cause of death in the 1- to 44-year-old age group. Hemorrhagic shock often progresses to multiple organ failure despite conventional resuscitation (CR) that restores central hemodynamics. The objective was to examine whether MC-2 would bind glycosaminoglycans to decrease the proinflammatory cytokines' influence in the liver, minimize organ edema, prevent liver injury, and improve hepatic perfusion. MC-2, a synthetic octapeptide derived from the heparin-binding domain of murine interferon gamma (IFN-γ), binds glycosaminoglycans to modulate serum and interstitial cytokine levels and activity. A controlled laboratory study randomized 32 male Sprague-Dawley rats to 4 groups of 8 each: sham, sham+MC-2 (50 mg/kg), HS/CR, or HS/CR+MC-2 (HS = 40% of baseline mean arterial pressure for 60 minutes; CR = return of shed blood and 2 volumes of saline). The study began in March 2013. Effective hepatic blood flow (EHBF) by galactose clearance, wet-dry weights, cytokines, histopathology, complete metabolic panel, and complete blood cell count were performed at 4 hours after CR. MC-2 partially reversed the HS/CR-induced hepatic hypoperfusion at 3 and 4 hours postresuscitation compared with HS/CR alone. Effective hepatic blood flow decreased during the HS period from a mean (SD) of 7.4 (0.3) mL/min/100 g and 7.5 (0.5) mL/min/100 g at baseline to 3.7 (0.4) mL/min/100 g and 5.9 (0.5) mL/min/100 g for the HS/CR and HS/CR+MC-2 groups, respectively. Effective hepatic blood flow remained constant in the sham groups throughout the experimental protocol. Organ edema was increased in the ileum and liver in the HS/CR vs sham group, and MC-2 decreased edema in the ileum vs the HS/CR group. MC-2 in HS also decreased levels of alanine aminotransferase

  17. Needs assessment of school and community physical activity opportunities in rural West Virginia: the McDowell CHOICES planning effort.

    Science.gov (United States)

    Kristjansson, Alfgeir L; Elliott, Eloise; Bulger, Sean; Jones, Emily; Taliaferro, Andrea R; Neal, William

    2015-04-03

    McDowell CHOICES (Coordinated Health Opportunities Involving Communities, Environments, and Schools) Project is a county-wide endeavor aimed at increasing opportunities for physical activity (PA) in McDowell County, West Virginia (WV). A comprehensive needs assessment laid the foundation of the project. During the 6-month needs assessment, multiple sources of data were collected: two Town Hall Meetings (n = 80); a student online PA interest survey (n = 465); a PA and nutrition survey among 5th graders (10-11 years) and 8th graders (13-14 years) with questions adapted from the CDC's Youth Risk Behavior Surveillance Survey (n = 442, response rate = 82.2%); six semi-structured school and community focus groups (n = 44); school site visits (n = 11); and BMI screening (n = 550, response rate = 69.7%). One third of children in McDowell County meet the national PA minimum of 60 minutes daily. At least 40% of 5th and 8th graders engage in electronic screen activity for 3 hours or more every day. The prevalence of obesity in 5th graders is higher in McDowell County than in the rest of WV (~55% vs. 47%, respectively). SWOT analyses of focus group data suggest an overall interest in PA but also highlight a need for an increase in structured PA opportunities. Focus group data also suggested that a central communication (e.g., internet-based) platform would be beneficial to advertise and boost participation in both current and future programs. Schools were commonly mentioned as potential facilities for public PA participation throughout the county, with regard to both access and convenience. School site visits suggest that schools need more equipment and resources for before-, during-, and after-school programs. An overwhelming majority of participants in the McDowell CHOICES needs assessment were interested in participating in more PA programs throughout the county and in improving opportunities for the provision of such programs. Public schools were widely recognized as the hub

  18. Uniformity testing: assessment of a centralized web-based uniformity analysis system.

    Science.gov (United States)

    Klempa, Meaghan C

    2011-06-01

    Uniformity testing is performed daily to ensure adequate camera performance before clinical use. The aim of this study is to assess the reliability of Beth Israel Deaconess Medical Center's locally built, centralized, Web-based uniformity analysis system by examining the differences between manufacturer and Web-based National Electrical Manufacturers Association integral uniformity calculations measured in the useful field of view (FOV) and the central FOV. Manufacturer and Web-based integral uniformity calculations measured in the useful FOV and the central FOV were recorded over a 30-d period for 4 cameras from 3 different manufacturers. These data were then statistically analyzed. The differences between the uniformity calculations were computed, in addition to the means and the SDs of these differences for each head of each camera. There was a correlation between the manufacturer and Web-based integral uniformity calculations in the useful FOV and the central FOV over the 30-d period. The average differences between the manufacturer and Web-based useful FOV calculations ranged from -0.30 to 0.099, with SD ranging from 0.092 to 0.32. For the central FOV calculations, the average differences ranged from -0.163 to 0.055, with SD ranging from 0.074 to 0.24. Most of the uniformity calculations computed by this centralized Web-based uniformity analysis system are comparable to the manufacturers' calculations, suggesting that this system is reasonably reliable and effective. This finding is important because centralized Web-based uniformity analysis systems are advantageous in that they test camera performance in the same manner regardless of the manufacturer.
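
    The figure of merit behind such comparisons is the NEMA integral uniformity, IU = 100 × (max − min)/(max + min) over the pixels of the field of view. The sketch below computes it for a simulated flood field, taking the central FOV as the central 75% of the useful FOV and omitting NEMA's required smoothing and edge-masking steps for brevity.

```python
# NEMA-style integral uniformity for UFOV and CFOV of a flood-field image.
import numpy as np

def integral_uniformity(counts):
    lo, hi = counts.min(), counts.max()
    return 100.0 * (hi - lo) / (hi + lo)

rng = np.random.default_rng(1)
ufov = rng.poisson(10_000, size=(64, 64)).astype(float)   # simulated flood
m = 8                                     # trim 12.5% per side -> ~75% CFOV
cfov = ufov[m:-m, m:-m]
print(f"IU(UFOV) = {integral_uniformity(ufov):.2f}%, "
      f"IU(CFOV) = {integral_uniformity(cfov):.2f}%")
```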

  19. Methods study of homogeneity and stability test from cerium oxide CRM candidate

    International Nuclear Information System (INIS)

    Samin; Susanna TS

    2016-01-01

    The methods for the homogeneity and stability testing of a cerium oxide CRM candidate have been studied based on ISO 13528 and KAN DP.01.34. The purpose of this study was to select a robust homogeneity and stability test method for producing the cerium oxide CRM. Ten sub-samples of cerium oxide were prepared and randomly selected; the analytes represent two compounds, namely CeO_2 and La_2O_3. In each of the 10 sub-samples, the CeO_2 and La_2O_3 contents were analyzed in duplicate with the same analytical method, by the same analyst, and in the same laboratory. The results were evaluated statistically based on ISO 13528 and KAN DP.01.34. According to ISO 13528, the cerium oxide sample is homogeneous if Ss ≤ 0.3σ and stable if |Xr - Yr| ≤ 0.3σ. In this study, the homogeneity test for CeO_2 gave Ss = 2.073 x 10^-4, smaller than 0.3σ (0.5476), and the stability test gave |Xr - Yr| = 0.225, which is < 0.3σ. For La_2O_3, the homogeneity test gave Ss = 1.649 x 10^-4, smaller than 0.3σ (0.4865), and the stability test gave |Xr - Yr| = 0.2185, which is < 0.3σ. Compared with the KAN method, the cerium oxide sample is likewise homogeneous, since F_calc < F_table, and stable, because |Xi - Xhm| < 0.3 x n IQR. The evaluation shows that the homogeneity and stability test data for the CeO_2 CRM candidate processed with the statistical methods of ISO 13528 are not significantly different from those obtained with KAN DP.01.34; both meet the requirements of homogeneity and stability. Thus the homogeneity and stability test method based on ISO 13528 can be used to produce the cerium oxide CRM. (author)
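
    The ISO 13528 homogeneity criterion quoted above (Ss ≤ 0.3σ) can be computed directly from the duplicate measurements. A sketch in Python of the standard duplicate-based estimate of the between-sample standard deviation Ss (the numbers are illustrative, not the paper's raw data):

        import numpy as np

        def iso13528_homogeneity(dup, sigma):
            """Between-sample SD Ss from duplicate measurements (ISO 13528
            Annex B style). dup: (g, 2) array, one row per sub-sample;
            sigma: standard deviation for the assessment."""
            means = dup.mean(axis=1)                  # sample averages
            ranges = np.abs(dup[:, 0] - dup[:, 1])    # within-sample ranges
            sx2 = means.var(ddof=1)                   # variance of averages
            sw2 = np.sum(ranges**2) / (2 * len(dup))  # within-sample variance
            ss = np.sqrt(max(sx2 - sw2 / 2.0, 0.0))
            return ss, ss <= 0.3 * sigma

        dup = np.array([[49.98, 50.02], [50.01, 49.99], [50.00, 50.03],
                        [49.97, 50.00], [50.02, 50.01]])   # invented data
        print(iso13528_homogeneity(dup, sigma=0.5476 / 0.3))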

  20. Hybrid diffusion–transport spatial homogenization method

    International Nuclear Information System (INIS)

    Kooreman, Gabriel; Rahnema, Farzad

    2014-01-01

    Highlights: • A new hybrid diffusion–transport homogenization method. • An extension of the consistent spatial homogenization (CSH) transport method. • Auxiliary cross section makes homogenized diffusion consistent with heterogeneous diffusion. • An on-the-fly re-homogenization in transport. • The method is faster than fine-mesh transport by 6–8 times. - Abstract: A new hybrid diffusion–transport homogenization method has been developed by extending the consistent spatial homogenization (CSH) transport method to include diffusion theory. As in the CSH method, an “auxiliary cross section” term is introduced into the source term, making the resulting homogenized diffusion equation consistent with its heterogeneous counterpart. The method then utilizes an on-the-fly re-homogenization in transport theory at the assembly level in order to correct for core environment effects on the homogenized cross sections and the auxiliary cross section. The method has been derived in general geometry and tested in a 1-D boiling water reactor (BWR) core benchmark problem for both controlled and uncontrolled configurations. The method has been shown to converge to the reference solution with less than 1.7% average flux error in less than one third the computational time as the CSH method – 6 to 8 times faster than fine-mesh transport

  1. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    Science.gov (United States)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin’s Space Science and Engineering Center (SSEC) has been at the forefront in developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, four-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility that is needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  2. Homogenization of Stokes and Navier-Stokes equations

    International Nuclear Information System (INIS)

    Allaire, G.

    1990-04-01

    This thesis is devoted to the homogenization of Stokes and Navier-Stokes equations with a Dirichlet boundary condition in a domain containing many tiny obstacles. Typically, those obstacles are distributed at the nodes of a periodic lattice with the same small period in each axis direction, and their size is always asymptotically smaller than the lattice's step. With the help of the energy method, and thanks to a suitable extension of the pressure, we prove the convergence of the homogenization process when the lattice's step tends to zero (and thus the number of obstacles tends to infinity). For a so-called critical size of the obstacles, the homogenized problem turns out to be a Brinkman's law (i.e., the Stokes or Navier-Stokes equation plus a linear zero-order term for the velocity in the momentum equation). For obstacles of a size smaller than the critical one, the limit problem reduces to the initial Stokes or Navier-Stokes equations, while for larger sizes the homogenized problem becomes a Darcy's law. Furthermore, those results have been extended to the case of obstacles included in a hyperplane, and we establish a simple model of fluid flow through grids, which is based on a special form of Brinkman's law.
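
    For the critical obstacle size, the Brinkman-type limit mentioned above has a standard form, sketched here in LaTeX for the Stokes case (M denotes the positive-definite zero-order matrix determined by the obstacles' capacity):

        % Brinkman's law: momentum equation with a zero-order velocity term.
        \[
          -\mu\,\Delta u + M\,u + \nabla p = f, \qquad \nabla\cdot u = 0 .
        \]
        % Schematically: obstacles below the critical size give M -> 0
        % (Stokes/Navier-Stokes recovered); above it the M u term dominates
        % and the limit degenerates to Darcy's law, u = M^{-1}(f - \nabla p).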

  3. Single-core magnetic markers in rotating magnetic field based homogeneous bioassays and the law of mass action

    Energy Technology Data Exchange (ETDEWEB)

    Dieckhoff, Jan, E-mail: j.dieckhoff@tu-bs.de [Institut fuer Elektrische Messtechnik und Grundlagen der Elektrotechnik, TU Braunschweig, Braunschweig (Germany); Schrittwieser, Stefan; Schotter, Joerg [Molecular Diagnostics, AIT Austrian Institute of Technology, Vienna (Austria); Remmer, Hilke; Schilling, Meinhard; Ludwig, Frank [Institut fuer Elektrische Messtechnik und Grundlagen der Elektrotechnik, TU Braunschweig, Braunschweig (Germany)

    2015-04-15

    In this work, we report on the effect of the magnetic nanoparticle (MNP) concentration on the quantitative detection of proteins in solution with a rotating magnetic field (RMF) based homogeneous bioassay. Here, the phase lag between 30 nm iron oxide single-core particles and the RMF is analyzed with a fluxgate-based measurement system. As a test analyte, anti-human IgG is applied, which binds to the protein G-functionalized MNP shell and causes a change of the phase lag. The measured phase lag changes for a fixed MNP concentration and varying analyte concentrations are modeled with logistic functions. A change of the MNP concentration results in a nonlinear shift of the logistic function with the analyte concentration. This effect results from the law of mass action. Furthermore, the bioassay results are used to determine the association constant of the binding reaction. - Highlights: • A rotating magnetic field based homogeneous bioassay concept was presented. • Here, single-core iron oxide nanoparticles are applied as markers. • The impact of the particle concentration on the bioassay results is investigated. • The relation between particle concentration and bioassay sensitivity is nonlinear. • This finding can be reasonably explained by the law of mass action.
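
    The logistic modeling of the phase lag versus analyte concentration can be illustrated with a four-parameter logistic (4PL) fit; the function form, concentrations and readings below are an illustrative sketch, not the paper's data or code:

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic4(c, bottom, top, ec50, slope):
            """Four-parameter logistic dose-response curve."""
            return bottom + (top - bottom) / (1.0 + (c / ec50) ** slope)

        conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])   # nM
        phase = np.array([4.9, 5.1, 6.0, 8.2, 11.5, 13.0, 13.4])   # degrees

        popt, _ = curve_fit(logistic4, conc, phase, p0=[5.0, 13.5, 5.0, 1.0])
        print("bottom, top, EC50, slope =", popt)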

  4. Role of structural barriers for carotenoid bioaccessibility upon high pressure homogenization.

    Science.gov (United States)

    Palmero, Paola; Panozzo, Agnese; Colle, Ines; Chigwedere, Claire; Hendrickx, Marc; Van Loey, Ann

    2016-05-15

    A specific approach to investigate the effect of high pressure homogenization on the carotenoid bioaccessibility in tomato-based products was developed. Six different tomato-based model systems were reconstituted in order to target the specific role of the natural structural barriers (chromoplast substructure/cell wall) and of the phases (soluble/insoluble) in determining the carotenoid bioaccessibility and viscosity changes upon high pressure homogenization. Results indicated that in the absence of natural structural barriers (carotenoid enriched oil), the soluble and insoluble phases determined the carotenoid bioaccessibility upon processing whereas, in their presence, these barriers governed the bioaccessibility. Furthermore, it was shown that the increment of the viscosity upon high pressure homogenization is determined by the presence of the insoluble phase; however, this result was related to the initial ratio of the soluble:insoluble phases in the system. In addition, no relationship between the changes in viscosity and carotenoid bioaccessibility upon high pressure homogenization was found.

  5. Association between MC4R rs17782313 polymorphism and overeating behaviors.

    Science.gov (United States)

    Yilmaz, Z; Davis, C; Loxton, N J; Kaplan, A S; Levitan, R D; Carter, J C; Kennedy, J L

    2015-01-01

    Melanocortins have a crucial role in appetite and weight regulation. Although the melanocortin 4 receptor (MC4R) gene has been repeatedly linked to obesity and antipsychotic-induced weight gain, the mechanism behind how it leads to this effect is still undetermined. The goal of this study was to conduct an in-depth and sophisticated analysis of MC4R polymorphisms, body mass index (BMI), eating behavior and depressed mood. We genotyped 328 individuals of European ancestry on the following MC4R markers based on the relevant literature on obesity and antipsychotic-induced weight gain: rs571312, rs17782313, rs489693, rs11872992, and rs8087522. Height and weight were measured, and information on depressed mood and overeating behaviors was obtained during the in-person assessment. BMI was associated with the rs17782313 C allele; however, this finding did not survive correction for multiple testing (P = 0.018). Although rs17782313 was significantly associated with depressed mood and overeating behaviors, tests of indirect effects indicated that emotional eating and food cravings, rather than depressed mood, uniquely accounted for the effect of this marker on BMI (n = 152). To our knowledge, this is the first study to investigate the link between MC4R rs17782313, mood and overeating behavior, as well as to demonstrate possible mechanisms behind MC4R's influence on body weight. If replicated in a larger sample, these results may have important clinical implications, including potential for the use of MC4R agonists in the treatment of obesity and disordered eating.

  6. Fractal systems of central places based on intermittency of space-filling

    International Nuclear Information System (INIS)

    Chen Yanguang

    2011-01-01

    Highlights: • The idea of intermittency is introduced into the central place model. • The revised central place model suggests incomplete space filling. • New central place fractals are presented for urban analysis. • The average nearest distance is proposed to estimate the fractal dimension. • The concept of distance-based space is replaced by that of dimension-based space. - Abstract: The central place models are fundamentally important in theoretical geography and city planning theory. The texture and structure of central place networks have been demonstrated to be self-similar in both theoretical and empirical studies. However, the underlying rationale of central place fractals in the real world has not yet been revealed. This paper is devoted to illustrating the mechanisms by which fractal patterns can be generated from central place systems. The structural dimension of the traditional central place models is d = 2, indicating no intermittency in the spatial distribution of human settlements. This dimension value is inconsistent with empirical observations. Substituting incomplete space filling for complete space filling, we can obtain central place models with fractional dimension D < d = 2, indicative of spatial intermittency. Thus the conventional central place models are converted into fractal central place models. If we further integrate chance factors into the improved central place fractals, the theory will be able to explain the real patterns of urban places very well. As empirical analyses, US cities and towns are employed to verify the fractal-based models of central places.
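
    A fractional dimension D < 2 of the kind discussed above can be estimated from a settlement point pattern. The sketch below uses generic box counting rather than the paper's average-nearest-distance estimator:

        import numpy as np

        def box_counting_dimension(points, sizes):
            """Box-counting dimension of a 2D point set scaled to the unit
            square: slope of log N(s) versus log(1/s)."""
            counts = []
            for s in sizes:
                occupied = np.unique(np.floor(points / s), axis=0)
                counts.append(len(occupied))
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)),
                                  np.log(counts), 1)
            return slope

        rng = np.random.default_rng(0)
        pts = rng.random((5000, 2))     # complete filling -> D close to 2
        print(box_counting_dimension(pts, sizes=[1/4, 1/8, 1/16, 1/32]))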

  7. Numerical computing of elastic homogenized coefficients for periodic fibrous tissue

    Directory of Open Access Journals (Sweden)

    Roman S.

    2009-06-01

    The homogenization theory in linear elasticity is applied to a periodic array of cylindrical inclusions in a rectangular pattern extending to infinity in the inclusions' axial direction, such that the deformation of tissue along this direction is negligible. In the plane of deformation, the homogenization scheme is based on the average strain energy, whereas in the third direction it is based on the average normal stress along this direction. Namely, these average quantities have to be the same on a Repeating Unit Cell (RUC) of the heterogeneous and homogenized media when using a special form of boundary conditions formed by a periodic part and an affine part of the displacement. There exist infinitely many RUCs generating the considered array. The computing procedure is tested with different choices of RUC to verify that the results of the homogenization process are independent of the kind of RUC employed. Then, the dependence of the homogenized coefficients on the microstructure can be studied. For instance, a special anisotropy and the role of the inclusion volume are investigated. In the second part of this work, mechanical traction tests are simulated. We consider two kinds of loading: applying a density of force or imposing a displacement. We test five samples of the periodic array containing one, four, sixteen, sixty-four and one hundred RUCs. The evolution of mean stresses, strains and energy with the number of inclusions is studied. The evolutions depend on the kind of loading, but their limits do not; the limits can be predicted by simulating a traction test of the homogenized medium.

  8. MC++ and a transport physics framework

    International Nuclear Information System (INIS)

    Lee, S.R.; Cummings, J.C.; Nolen, S.D.; Keen, N.D.

    1997-01-01

    The Department of Energy has launched the Accelerated Strategic Computing Initiative (ASCI) to address a pressing need for more comprehensive computer simulation capabilities in the area of nuclear weapons safety and reliability. In light of the decision by the US Government to abandon underground nuclear testing, the Science-Based Stockpile Stewardship (SBSS) program is focused on using computer modeling to assure the continued safety and effectiveness of the nuclear stockpile. The authors believe that the utilization of object-oriented design and programming techniques can help in this regard. Object-oriented programming (OOP) has become a popular model in the general software community for several reasons. MC++ is a specific ASCI-relevant application project which demonstrates the effectiveness of the object-oriented approach. It is a Monte Carlo neutron transport code written in C++. It is designed to be simple yet flexible, with the ability to quickly introduce new numerical algorithms or representations of the physics into the code. MC++ is easily ported to various types of Unix workstations and parallel computers such as the three new ASCI platforms, largely because it makes extensive use of classes from the Parallel Object-Oriented Methods and Applications (POOMA) C++ class library. The MC++ code has been successfully benchmarked using some simple physics test problems, has been shown to provide comparable serial performance and a parallel efficiency superior to that of a well-known Monte Carlo neutronics package written in Fortran, and was the first ASCI-relevant application to run in parallel on all three ASCI computing platforms

  9. Altered regional homogeneity of spontaneous brain activity in idiopathic trigeminal neuralgia.

    Science.gov (United States)

    Wang, Yanping; Zhang, Xiaoling; Guan, Qiaobing; Wan, Lihong; Yi, Yahui; Liu, Chun-Feng

    2015-01-01

    The pathophysiology of idiopathic trigeminal neuralgia (ITN) has conventionally been explained by the neurovascular compression theory. Recent structural brain imaging evidence has suggested an additional central component to ITN pathophysiology. However, far less attention has been given to investigating the basis of abnormal resting-state brain activity in these patients. The objective of this study was to investigate local brain activity in patients with ITN and its correlation with clinical variables of pain. Resting-state functional magnetic resonance imaging data from 17 patients with ITN and 19 age- and sex-matched healthy controls were analyzed using regional homogeneity (ReHo) analysis, a data-driven approach used to measure the regional synchronization of spontaneous brain activity. Patients with ITN had decreased ReHo in the left amygdala, right parahippocampal gyrus, and left cerebellum and increased ReHo in the right inferior temporal gyrus, right thalamus, right inferior parietal lobule, and left postcentral gyrus (corrected). Furthermore, the increase in ReHo in the left precentral gyrus was positively correlated with the visual analog scale score (r=0.54; P=0.002). Our study found abnormal functional homogeneity of intrinsic brain activity in several regions in ITN, suggesting maladaptation to the process of daily pain attacks and a central role for the pathophysiology of ITN.
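
    ReHo is conventionally computed as Kendall's coefficient of concordance (KCC) over the time series of a voxel and its neighbors. A minimal sketch in Python (ignoring tie corrections; the data are illustrative):

        import numpy as np
        from scipy.stats import rankdata

        def kendalls_w(ts):
            """Kendall's W for ReHo. ts: (K, n) array of K neighboring
            voxel time series of length n."""
            K, n = ts.shape
            ranks = np.apply_along_axis(rankdata, 1, ts)  # rank over time
            R = ranks.sum(axis=0)                         # rank sums per time point
            return 12.0 * np.sum((R - R.mean()) ** 2) / (K**2 * (n**3 - n))

        rng = np.random.default_rng(1)
        cluster = rng.standard_normal((27, 100))  # a voxel plus 26 neighbors
        print(kendalls_w(cluster))  # near 0 for unsynchronized noise; 1 = perfect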

  10. The McClean Lake uranium project

    International Nuclear Information System (INIS)

    Blaise, J.R.

    2001-01-01

    The McClean Lake Uranium Project, located in the northern part of Saskatchewan, consists of five uranium deposits (Jeb, Sue A, Sue B, Sue C and McClean) scattered in three different locations on the mineral lease. On 16 March 1995, COGEMA Resources Inc and its partners, Denison Mines Ltd and OURD (Canada) Co Ltd, made the formal decision to develop the McClean Lake Project. Construction of the mine and mill started during summer 1995 and should be finished by mid 1997. Mining of the first deposit, Jeb, started in 1996, and ore is currently being mined. Yellowcake production is scheduled to start this fall. (author)

  11. Implantation of MC2 computer code

    International Nuclear Information System (INIS)

    Seehusen, J.; Nair, R.P.K.; Becceneri, J.C.

    1981-01-01

    The implantation of the MC2 computer code in the CDC system is presented. The MC2 computer code calculates multigroup cross sections for typical compositions of fast reactors. The multigroup constants are calculated using solutions of the P1 or B1 approximations for a given buckling value as the weighting function. (M.C.K.)

  12. Age McCanni büroo = Offices of Age McCann

    Index Scriptorium Estoniae

    2010-01-01

    On the interior design of the Age McCann office at Rotermanni 8 in Tallinn. The interior design is by interior architect Kerli Valk (Kukuhaus OÜ) and architect Tomomi Hayashi (HG Arhitektuur OÜ); a list of their most important works is included.

  13. Electro-magnetostatic homogenization of bianisotropic metamaterials

    OpenAIRE

    Fietz, Chris

    2012-01-01

    We apply the method of asymptotic homogenization to metamaterials with microscopically bianisotropic inclusions to calculate a full set of constitutive parameters in the long wavelength limit. Two different implementations of electromagnetic asymptotic homogenization are presented. We test the homogenization procedure on two different metamaterial examples. Finally, the analytical solution for long wavelength homogenization of a one dimensional metamaterial with microscopically bi-isotropic i...

  14. McGET: A rapid image-based method to determine the morphological characteristics of gravels on the Gobi desert surface

    Science.gov (United States)

    Mu, Yue; Wang, Feng; Zheng, Bangyou; Guo, Wei; Feng, Yiming

    2018-03-01

    The relationship between the morphological characteristics (e.g. gravel size, coverage, angularity and orientation) and local geomorphic features (e.g. slope gradient and aspect) of deserts has been used to explore the evolution process of the Gobi desert. Conventional quantification methods are time-consuming and inefficient, and can even prove impossible for determining the characteristics of large numbers of gravels. We propose a rapid image-based method to obtain the morphological characteristics of gravels on the Gobi desert surface, called the "morphological characteristics gained effectively technique" (McGET). The image of the Gobi desert surface was classified into gravel clusters and background by a machine-learning "classification and regression tree" (CART) algorithm. Then gravel clusters were segmented into individual gravel clasts by separating objects in images using a "watershed segmentation" algorithm. Third, gravel coverage, diameter, aspect ratio and orientation were calculated based on the basic principles of 2D computer graphics. We validated this method with two independent datasets in which the gravel morphological characteristics were obtained from 2728 gravels measured in the field and 7422 gravels measured by manual digitization. Finally, we applied McGET to derive the spatial variation of gravel morphology on the Gobi desert along an alluvial-proluvial fan located in Hami, Xinjiang, China. The validation results show that the mean gravel diameter measured in the field agreed well with that calculated by McGET for large gravels (R2 = 0.89, P < 0.001). Compared to manual digitization, the McGET accuracies for gravel coverage, gravel diameter and aspect ratio were 97%, 83% and 96%, respectively. The orientation distributions calculated were consistent across the two different methods. More importantly, McGET significantly shortens the time cost of obtaining gravel morphological characteristics in the field and laboratory. The spatial variation results
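
    The segment-then-measure pipeline has a natural sketch in Python with scikit-image; here the CART classification step is replaced by an assumed binary gravel mask, and all names and window sizes are illustrative:

        import numpy as np
        from scipy import ndimage as ndi
        from skimage import feature, measure, segmentation

        def gravel_morphology(gravel_mask, mm_per_px):
            """Split touching gravels by watershed and measure each clast.
            gravel_mask: boolean image, True on pixels classified as gravel."""
            distance = ndi.distance_transform_edt(gravel_mask)
            labeled, _ = ndi.label(gravel_mask)
            # One marker per clast: local maxima of the distance map.
            coords = feature.peak_local_max(distance, labels=labeled,
                                            footprint=np.ones((15, 15)))
            peaks = np.zeros(distance.shape, dtype=bool)
            peaks[tuple(coords.T)] = True
            markers, _ = ndi.label(peaks)
            clasts = segmentation.watershed(-distance, markers, mask=gravel_mask)

            stats = []
            for r in measure.regionprops(clasts):
                stats.append({
                    "diameter_mm": r.equivalent_diameter * mm_per_px,
                    "aspect_ratio": r.major_axis_length
                                    / max(r.minor_axis_length, 1e-9),
                    "orientation_rad": r.orientation,
                })
            return gravel_mask.mean(), stats  # coverage fraction, per-clast stats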

  15. MC and A system design workshop

    International Nuclear Information System (INIS)

    Schneider, R.A.; Harms, N.L.

    1984-01-01

    The workshop had as its goal the development of a Material Control and Accounting (MC and A) system for a low-enriched uranium fuel fabrication plant. The factors to be considered for each of the ten key elements of the safeguards (MC and A) system are presented in the text for the session.

  16. Applications of a systematic homogenization theory for nodal diffusion methods

    International Nuclear Information System (INIS)

    Zhang, Hong-bin; Dorning, J.J.

    1992-01-01

    The authors recently have developed a self-consistent and systematic lattice cell and fuel bundle homogenization theory, based on a multiple-spatial-scales asymptotic expansion of the transport equation in the ratio of the mean free path to the reactor characteristic dimension, for use with nodal diffusion methods. The mathematical development leads naturally to self-consistent analytical expressions for homogenized diffusion coefficients and cross sections and flux discontinuity factors to be used in nodal diffusion calculations. The expressions for the homogenized nuclear parameters that follow from the systematic homogenization theory (SHT) are different from those for the traditional flux- and volume-weighted (FVW) parameters. The calculations summarized here show that the systematic homogenization theory developed recently for nodal diffusion methods yields accurate values for k_eff and assembly powers even when compared with the results of a fine-mesh transport calculation. Thus, it provides a practical alternative to equivalence theory and GET (Ref. 3) and to simplified equivalence theory, which requires auxiliary fine-mesh calculations for assemblies embedded in a typical environment to determine the discontinuity factors and the equivalent diffusion coefficient for a homogenized assembly
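
    For contrast, the traditional flux-and-volume-weighted (FVW) homogenized cross section for group g over a cell of volume V is the standard ratio below; the SHT expressions differ from this by correction terms arising from the multiple-scales expansion:

        \[
          \hat{\Sigma}_g \;=\;
          \frac{\int_{V} \Sigma_g(\mathbf{r})\, \phi_g(\mathbf{r})\, \mathrm{d}V}
               {\int_{V} \phi_g(\mathbf{r})\, \mathrm{d}V}
        \]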

  17. McClean Lake. Site Guide

    International Nuclear Information System (INIS)

    2016-09-01

    Located over 700 kilometers northeast of Saskatoon, Areva's McClean Lake site is comprised of several uranium mines and one of the most technologically advanced uranium mills in the world - the only mill designed to process high-grade uranium ore without dilution. Areva has operated several open-pit uranium mines at the McClean Lake site, and is evaluating future mines at and near the site. The McClean Lake mill has recently undergone a multimillion-dollar upgrade and expansion, which has doubled its annual production capacity of uranium concentrate to 24 million pounds. It is the only facility in the world capable of processing high-grade uranium ore without diluting it. The mill processes the ore from the Cigar Lake mine, the world's second largest and highest-grade uranium mine. The McClean Lake site operates 365 days a year on a week-in/week-out rotation schedule for workers, over 50% of whom reside in northern Saskatchewan communities. Tailings are waste products resulting from milling uranium ore. This waste is made up of leach residue solids, waste solutions and chemical precipitates that are carefully engineered for long-term disposal. The tailings management facility (TMF) serves as the repository for all resulting tailings. This facility allows proper waste management, which minimizes potential adverse environmental effects. Mining projections indicate that the McClean Lake mill will produce tailings in excess of the existing capacity of the TMF. After evaluating a number of options, Areva has decided to pursue an expansion of this facility. Areva is developing the Surface Access Borehole Resource Extraction (SABRE) mining method, which uses a high-pressure water jet placed at the bottom of the drill hole to extract ore. Areva has conducted a series of tests with this method and is evaluating its potential for future mining operations. McClean Lake maintains its certification in ISO 14001 standards for environmental management and OHSAS 18001 standards for occupational health and safety.

  18. Pensamento comunicacional canadense: as contribuições de Innis e McLuhan

    Directory of Open Access Journals (Sweden)

    Luiz C. Martino

    2009-09-01

    This article presents Canadian communication thought. It argues for Harold Innis as its pioneer, understanding that his work, more than a simple theory, carries a research program (in the sense of Imre Lakatos). It defines the hard core of the Innisian program as: (a) the action of technique on communication processes, and (b) the centrality of the media of communication for understanding social organization. These theses are at the base of a project of incomparable epistemological value, since they would constitute a possible foundation for communication knowledge itself. Keywords: Canadian communication thought; communication theory; Harold Innis; Marshall McLuhan; Toronto School of Communication.

  19. Assessment of ethylene dibromide, dibromochloropropane, other volatile organic compounds, radium isotopes, radon, and inorganic compounds in groundwater and spring water from the Crouch Branch and McQueen Branch aquifers near McBee, South Carolina, 2010-2012

    Science.gov (United States)

    Landmeyer, James E.; Campbell, Bruce G.

    2014-01-01

    Public-supply wells near the rural town of McBee, in southwestern Chesterfield County, South Carolina, have provided potable water to more than 35,000 residents throughout Chesterfield County since the early 1990s. Groundwater samples collected between 2002 and 2008 in the McBee area by South Carolina Department of Health and Environmental Control (DHEC) officials indicated that groundwater from two public-supply wells was characterized by the anthropogenic compounds ethylene dibromide (EDB) and dibromochloropropane (DBCP) at concentrations that exceeded their respective maximum contaminant levels (MCLs) established by the U.S. Environmental Protection Agency’s (EPA) National Primary Drinking Water Regulations (NPDWR). Groundwater samples from all public-supply wells in the McBee area were characterized by the naturally occurring isotopes of radium-226 and radium-228 at concentrations that approached, and in one well exceeded, the MCL for the combined isotopes. The local water utility installed granulated activated carbon filtration units at the two EDB- and DBCP-contaminated wells and has, since 2011, shut down these two wells. Groundwater pumped by the remaining public-supply wells is currently (2014) centrally treated at a water-filtration plant.

  20. Array-based assay detects genome-wide 5-mC and 5-hmC in the brains of humans, non-human primates, and mice.

    Science.gov (United States)

    Chopra, Pankaj; Papale, Ligia A; White, Andrew T J; Hatch, Andrea; Brown, Ryan M; Garthwaite, Mark A; Roseboom, Patrick H; Golos, Thaddeus G; Warren, Stephen T; Alisch, Reid S

    2014-02-13

    Methylation on the fifth position of cytosine (5-mC) is an essential epigenetic mark that is linked to both normal neurodevelopment and neurological diseases. The recent identification of another modified form of cytosine, 5-hydroxymethylcytosine (5-hmC), in both stem cells and post-mitotic neurons raises new questions as to the role of this base in mediating epigenetic effects. Genomic studies of these marks using model systems are limited, particularly with array-based tools, because the standard method of detecting DNA methylation cannot distinguish between 5-mC and 5-hmC and most methods have been developed to survey only the human genome. We show that non-human data generated using the optimization of a widely used human DNA methylation array, designed only to detect 5-mC, reproducibly distinguishes tissue types within and between chimpanzee, rhesus, and mouse, with correlations near the human DNA level (R² > 0.99). Genome-wide methylation analysis using this approach reveals 6,102 differentially methylated loci between rhesus placental and fetal tissues, with pathway analysis showing significant overrepresentation of developmental processes. Restricting the analysis to oncogenes and tumor suppressor genes finds 76 differentially methylated loci, suggesting that rhesus placental tissue carries a cancer epigenetic signature. Similarly, adapting the assay to detect 5-hmC finds highly reproducible 5-hmC levels within human, rhesus, and mouse brain tissue that are species-specific, with a hierarchical abundance among the three species (human > rhesus > mouse). Annotation of 5-hmC with respect to gene structure reveals a significant prevalence in the 3'UTR and an association with chromatin-related ontological terms, suggesting an epigenetic feedback loop mechanism for 5-hmC. Together, these data show that this array-based methylation assay is generalizable to all mammals for the detection of both 5-mC and 5-hmC, greatly improving the utility of mammalian model systems

  1. Alfven-ion-cyclotron instability in the central cell of TMX

    International Nuclear Information System (INIS)

    Watson, D.C.; Baldwin, D.E.

    1977-01-01

    The central cell of TMX may require hot-ion injection. The resulting velocity-space anisotropy together with the length of the central homogeneous region raise the possibility of convective AIC instability. In this report we demonstrate that the Rosenbluth criterion of less than a thousand-fold amplitude amplification per pass can be satisfied by ion distributions which nevertheless have sufficient anisotropy to be confined within the central cell

  2. Reliability of the Danish version of the McGill ingestive skills assessment for observation-based measures during meals

    DEFF Research Database (Denmark)

    Hansen, Tina; Lambert, Heather C; Faber, Jens

    2012-01-01

    To establish measurement equivalence in terms of reliability of the Danish version of the Canadian McGill ingestive skills assessment (MISA) for use by occupational therapists.

  3. TWO FERROMAGNETIC SPHERES IN HOMOGENEOUS MAGNETIC FIELD

    Directory of Open Access Journals (Sweden)

    Yury A. Krasnitsky

    2018-01-01

    The problem of two spherical conductors has been studied in considerable detail using bispherical coordinates and has numerous applications in electrostatics. Here, the boundary-value problem of two ferromagnetic spheres embedded in a homogeneous, infinite medium, in which a homogeneous magnetic field exists in the absence of the spheres, is considered. The solution of Laplace's equation in the bispherical coordinate system allows us to find the potential and field distribution in all space, including the region between the spheres. The boundary conditions used are the continuity of the potential and of the normal component of the induction flux on the sphere surfaces. It is supposed that the spheres are identical and that the magnetic permeability of their material satisfies μ >> μ0. The problem of an electromagnetic plane wave falling on the system of two spheres, which possesses electrically small sizes, can be considered as quasistationary. The scalar potentials received as a result of the solution of Laplace's equation are represented by series containing Legendre polynomials. The concept of the effective permeability of the two-sphere system is introduced; it is equal to the gain in the magnitude of the magnetic induction flux through a certain cross-section of the system arising due to its magnetic properties. The necessary relations for the effective permeability referred to the central cross-section of the system are obtained. In particular, the results can be used in the analysis of the gap of a ferroxcube core, which influences the magnetic antenna properties.

  4. Homogeneous anisotropic solutions of topologically massive gravity with a cosmological constant and their homogeneous deformations

    International Nuclear Information System (INIS)

    Moutsopoulos, George

    2013-01-01

    We solve the equations of topologically massive gravity (TMG) with a potentially non-vanishing cosmological constant for homogeneous metrics without isotropy. We only reproduce known solutions. We also discuss their homogeneous deformations, possibly with isotropy. We show that de Sitter space and hyperbolic space cannot be infinitesimally homogeneously deformed in TMG. We clarify some of their Segre–Petrov types and discuss the warped de Sitter spacetime. (paper)

  5. Homogeneous anisotropic solutions of topologically massive gravity with a cosmological constant and their homogeneous deformations

    Science.gov (United States)

    Moutsopoulos, George

    2013-06-01

    We solve the equations of topologically massive gravity (TMG) with a potentially non-vanishing cosmological constant for homogeneous metrics without isotropy. We only reproduce known solutions. We also discuss their homogeneous deformations, possibly with isotropy. We show that de Sitter space and hyperbolic space cannot be infinitesimally homogeneously deformed in TMG. We clarify some of their Segre-Petrov types and discuss the warped de Sitter spacetime.

  6. Benchmarking homogenization algorithms for monthly data

    Science.gov (United States)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratiannil, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.; Willett, K.

    2013-09-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies. The algorithms were validated against a realistic benchmark dataset. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including i) the centered root mean square error relative to the true homogeneous values at various averaging scales, ii) the error in linear trend estimates and iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.
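
    The first of the performance metrics has a compact definition; a sketch in Python of the centered root mean square error between a homogenized series and its true homogeneous counterpart:

        import numpy as np

        def centered_rmse(homogenized, truth):
            """Centered RMSE: compare anomalies, so a constant offset
            between the two series does not count as error."""
            a = homogenized - homogenized.mean()
            b = truth - truth.mean()
            return float(np.sqrt(np.mean((a - b) ** 2)))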

  7. Heterogeneity to Homogeneity: Synthesis, Base Pairing, and Ligation Studies of 4',3'-XyluloNA/RNA and TNA/RNA Chimeric Sequences

    Science.gov (United States)

    Bhowmik, S.; Stoop, M.; Krishnamurthy, R.

    2017-07-01

    Based on the reality of "prebiotic clutter," we herein present an alternate model for the pre-RNA to RNA transition, which starts not with a homogeneous-backbone system, but rather with mixtures of heterogeneous-backbone chimeric "pre-RNA/RNA."

  8. Fourier-Accelerated Nodal Solvers (FANS) for homogenization problems

    Science.gov (United States)

    Leuschner, Matthias; Fritzen, Felix

    2017-11-01

    Fourier-based homogenization schemes are useful to analyze heterogeneous microstructures represented by 2D or 3D image data. These iterative schemes involve discrete periodic convolutions with global ansatz functions (mostly fundamental solutions). The convolutions are efficiently computed using the fast Fourier transform. FANS operates on nodal variables on regular grids and converges to finite element solutions. Compared to established Fourier-based methods, the number of convolutions is reduced by FANS. Additionally, fast iterations are possible by assembling the stiffness matrix. Due to the related memory requirement, the method is best suited for medium-sized problems. A comparative study involving established Fourier-based homogenization schemes is conducted for a thermal benchmark problem with a closed-form solution. Detailed technical and algorithmic descriptions are given for all methods considered in the comparison. Furthermore, many numerical examples focusing on convergence properties for both thermal and mechanical problems, including also plasticity, are presented.
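
    For orientation, the kind of established Fourier-based iteration that FANS is compared against can be sketched for a 2D thermal problem. This is the classical fixed-point (Moulinec-Suquet-type) scheme, not the FANS algorithm itself, and all parameters are illustrative:

        import numpy as np

        def fft_thermal_homogenization(k, E=(1.0, 0.0), tol=1e-8, maxit=500):
            """Fixed-point Fourier scheme for periodic thermal conduction.
            k: (N, N) conductivity image; E: prescribed mean temperature
            gradient. Returns the homogenized flux <k grad T>."""
            N = k.shape[0]
            E = np.asarray(E, dtype=float)
            k0 = 0.5 * (k.min() + k.max())            # reference medium
            xi = np.fft.fftfreq(N)
            XI = np.stack(np.meshgrid(xi, xi, indexing="ij"))  # (2, N, N)
            xi2 = (XI**2).sum(axis=0)
            xi2[0, 0] = 1.0                # avoid 0/0 (numerator is 0 there)

            g = np.tile(E[:, None, None], (1, N, N))  # gradient field
            for _ in range(maxit):
                tau = (k - k0) * g                    # polarization
                tau_h = np.fft.fft2(tau)
                # Green operator of the reference medium, in Fourier space
                proj = (XI * tau_h).sum(axis=0) / (k0 * xi2)
                g_new = np.real(np.fft.ifft2(-XI * proj)) + E[:, None, None]
                if np.max(np.abs(g_new - g)) < tol:
                    g = g_new
                    break
                g = g_new
            return (k * g).mean(axis=(1, 2))

        rng = np.random.default_rng(2)
        k_img = np.where(rng.random((64, 64)) < 0.3, 10.0, 1.0)  # two phases
        print(fft_thermal_homogenization(k_img))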

  9. Intracortical stiffness of mid-diaphysis femur bovine bone: lacunar-canalicular based homogenization numerical solutions and microhardness measurements.

    Science.gov (United States)

    Hage, Ilige S; Hamade, Ramsey F

    2017-09-01

    Microscale lacunar-canalicular (L-C) porosity is a major contributor to intracortical bone stiffness variability. In this work, such variability is investigated experimentally using microhardness indentation tests and numerically using a homogenization scheme. Cross-sectional rings of cortical bone are cut from the middle tubular part of bovine femur long bone at mid-diaphysis. A series of light microscopy images is taken along a line emanating from the cross-section center, starting from the ring's interior (endosteum) surface toward the ring's exterior (periosteum) surface. For each image in the line, computer vision analysis of porosity is conducted employing an image segmentation methodology based on pulse coupled neural networks (PCNN) recently developed by the authors. Determined are the size and shape of each of the lacunar-canalicular (L-C) cortical micro constituents: lacunae, canaliculi, and Haversian canals. Consequently, it was possible to segment and quantify the geometrical attributes of all individual segmented pores, leading to accurate determination of derived geometrical measures such as the L-C cortical pores' total porosity (pore volume fraction), (elliptical) aspect ratio, orientation, location, and number of pores in secondary and primary osteons. Porosity was found to be unevenly (but linearly) distributed along the interior and exterior regions of the intracortical bone. The segmented L-C porosity data is passed to a numerical microscale-based homogenization scheme, also recently developed by the authors, that analyses a composite made up of a lamella matrix punctuated by multi-inclusions and returns corresponding values for longitudinal and transverse Young's modulus (matrix stiffness) for these micro-sized spatial locations. Hence, intracortical stiffness variability is numerically quantified using a combination of a computer vision program and a numerical homogenization code. These numerically found stiffness values of the homogenization

  10. Genetics Home Reference: McLeod neuroacanthocytosis syndrome

    Science.gov (United States)

    ... Castiglioni C, Oechsner M, Goebel HH, Heppner FL, Jung HH. McLeod myopathy revisited: more neurogenic and less ... 130(Pt 12):3285-96. Citation on PubMed Jung HH, Danek A, Frey BM. McLeod syndrome: a ...

  11. Association between MC4R rs17782313 Polymorphism and Overeating Behaviours

    Science.gov (United States)

    Yilmaz, Zeynep; Davis, Caroline; Loxton, Natalie J.; Kaplan, Allan S.; Levitan, Robert D.; Carter, Jacqueline C.; Kennedy, James L.

    2014-01-01

    Background/Objectives Melanocortins play a crucial role in appetite and weight regulation. Although the melanocortin 4 receptor (MC4R) gene has been repeatedly linked to obesity and antipsychotic-induced weight gain, the mechanism behind how it leads to this effect is still undetermined. The goal of this study was to conduct an in-depth and sophisticated analysis of MC4R polymorphisms, body mass index (BMI), eating behaviour, and depressed mood. Subjects/Methods We genotyped 328 individuals of European ancestry on the following MC4R markers based on the relevant literature on obesity and antipsychotic-induced weight gain: rs571312, rs17782313, rs489693, rs11872992, and rs8087522. Height and weight were measured, and information on depressed mood and overeating behaviours was obtained during the in-person assessment. Results BMI was associated with the rs17782313 C allele; however, this finding did not survive correction for multiple testing (p=0.018). Although rs17782313 was significantly associated with depressed mood and overeating behaviours, tests of indirect effects indicated that emotional eating and food cravings, rather than depressed mood, uniquely accounted for the effect of this marker on BMI (n=152). Conclusions To our knowledge, this is the first study to investigate the link between MC4R rs17782313, mood and overeating behaviour, as well as to demonstrate possible mechanisms behind MC4R’s influence on body weight. If replicated in a larger sample, these results may have important clinical implications, including potential for the use of MC4R agonists in the treatment of obesity and disordered eating. PMID:24827639

  12. Comparison between McMaster and Mini-FLOTAC methods for the enumeration of Eimeria maxima oocysts in poultry excreta.

    Science.gov (United States)

    Bortoluzzi, C; Paras, K L; Applegate, T J; Verocai, G G

    2018-04-30

    Monitoring Eimeria shedding has become more important due to the recent restrictions on the use of antibiotics within the poultry industry. Therefore, there is a need for the implementation of more precise and accurate quantitative diagnostic techniques. The objective of this study was to compare the precision and accuracy between the Mini-FLOTAC and the McMaster techniques for the quantitative diagnosis of Eimeria maxima oocysts in poultry. Twelve pools of excreta samples of broiler chickens experimentally infected with E. maxima were analyzed for the comparison between the Mini-FLOTAC and McMaster techniques, using the detection limits (dl) of 23 and 25, respectively. Additionally, six excreta samples were used to compare the precision of different dl (5, 10, 23, and 46) using the Mini-FLOTAC technique. For precision comparisons, five technical replicates of each sample (five replicate slides on one excreta slurry) were read to calculate the mean oocysts per gram of excreta (OPG) count, standard deviation (SD), coefficient of variation (CV), and precision of both aforementioned comparisons. To compare accuracy between the methods (McMaster, and Mini-FLOTAC dl 5 and 23), excreta from uninfected chickens was spiked with 100, 500, 1,000, 5,000, or 10,000 OPG; additional samples remained unspiked (negative control). For each spiking level, three samples were read in triplicate, totaling nine reads per spiking level per technique. Data were transformed using log10 to obtain normality and homogeneity of variances. A significant correlation (R = 0.74; p = 0.006) was observed between the mean OPG of the McMaster dl 25 and the Mini-FLOTAC dl 23. Mean OPG, CV, SD, and precision were not statistically different between the McMaster dl 25 and Mini-FLOTAC dl 23. Despite the absence of statistical difference (p > 0.05), Mini-FLOTAC dl 5 showed a numerically lower SD and CV than Mini-FLOTAC dl 23. The Pearson correlation coefficient revealed significant and positive
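
    The precision bookkeeping behind these comparisons is simple: each slide count is multiplied by the technique's detection limit to give OPG, and replicate slides yield the SD and CV. A sketch in Python with invented counts:

        import numpy as np

        def opg_stats(counts, detection_limit):
            """OPG summary for replicate slide reads of one excreta slurry."""
            opg = np.asarray(counts, dtype=float) * detection_limit
            mean, sd = opg.mean(), opg.std(ddof=1)
            return mean, sd, 100.0 * sd / mean   # mean, SD, CV (%)

        print(opg_stats([41, 38, 44, 40, 39], detection_limit=23))  # Mini-FLOTAC
        print(opg_stats([37, 42, 36, 40, 41], detection_limit=25))  # McMaster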

  13. Homogeneous nucleation limit on the bulk formation of metallic glasses

    International Nuclear Information System (INIS)

    Drehman, A.J.

    1983-01-01

    Glassy Pd_82Si_18 spheres of up to 1 mm diameter were formed in a drop tube filled with He gas. The largest spheres were successfully cooled to a glass using a cooling rate of less than 800 K/sec. Even at this low cooling rate, crystallization (complete or partial) was the result of heterogeneous nucleation at a high temperature, relative to the temperature at which copious homogeneous nucleation would commence. Bulk undercooling experiments demonstrated that this alloy could be cooled to 385 K below its eutectic melting temperature (1083 K) without the occurrence of crystallization. If heterogeneous nucleation can be avoided, it is estimated that a cooling rate of at most 100 K/sec would be required to form this alloy in the glassy state. Ingots of glassy Pd_40Ni_40P_20 were formed from the liquid by cooling at a rate of only 1 K/sec. It was found that glassy samples of this alloy could be heated well above the glass transition temperature without the occurrence of rapid devitrification. This is due, in part, to the low density of pre-existing nuclei but, more importantly, to the low homogeneous nucleation rate and the slow crystal growth kinetics. Based on the observed devitrification kinetics, the steady-state homogeneous nucleation rate is approximately 1 nucleus/cm^3 per sec at 590 K (the temperature at which the homogeneous nucleation rate is estimated to be a maximum). Two iron-nickel based glass-forming alloys (Fe_40Ni_40P_14B_6 and Fe_40Ni_40B_20) were not successfully formed into glassy spheres; however, microstructural examination indicates that crystallization was not the result of copious homogeneous nucleation. In contrast, glass-forming iron based alloys (Fe_80B_20 and Fe_79.3B_16.4Si_4.0C_0.3) exhibit copious homogeneous nucleation when cooled at approximately the same rate

  14. Monte Carlo dose calculation improvements for low energy electron beams using eMC

    International Nuclear Information System (INIS)

    Fix, Michael K; Frei, Daniel; Volken, Werner; Born, Ernst J; Manser, Peter; Neuenschwander, Hans

    2010-01-01

    The electron Monte Carlo (eMC) dose calculation algorithm in Eclipse (Varian Medical Systems) is based on the macro MC method and is able to predict dose distributions for high energy electron beams with high accuracy. However, there are limitations for low energy electron beams. This work aims to improve the accuracy of the dose calculation using eMC for 4 and 6 MeV electron beams of Varian linear accelerators. Improvements implemented into the eMC include (1) improved determination of the initial electron energy spectrum by increased resolution of mono-energetic depth dose curves used during beam configuration; (2) inclusion of all the scrapers of the applicator in the beam model; (3) reduction of the maximum size of the sphere to be selected within the macro MC transport when the energy of the incident electron is below certain thresholds. The impact of these changes in eMC is investigated by comparing calculated dose distributions for 4 and 6 MeV electron beams at source to surface distance (SSD) of 100 and 110 cm with applicators ranging from 6 x 6 to 25 x 25 cm² of a Varian Clinac 2300C/D with the corresponding measurements. Dose differences between calculated and measured absolute depth dose curves are reduced from 6% to less than 1.5% for both energies and all applicators considered at SSD of 100 cm. Using the original eMC implementation, absolute dose profiles at depths of 1 cm, d_max and R50 in water lead to dose differences of up to 8% for applicators larger than 15 x 15 cm² at SSD 100 cm. Those differences are now reduced to less than 2% for all dose profiles investigated when the improved version of eMC is used. At SSD of 110 cm the dose difference for the original eMC version is even more pronounced and can be larger than 10%. Those differences are reduced to within 2% or 2 mm with the improved version of eMC. In this work several enhancements were made in the eMC algorithm leading to significant improvements in the accuracy of the dose calculation

  15. Monte Carlo dose calculation improvements for low energy electron beams using eMC.

    Science.gov (United States)

    Fix, Michael K; Frei, Daniel; Volken, Werner; Neuenschwander, Hans; Born, Ernst J; Manser, Peter

    2010-08-21

    The electron Monte Carlo (eMC) dose calculation algorithm in Eclipse (Varian Medical Systems) is based on the macro MC method and is able to predict dose distributions for high energy electron beams with high accuracy. However, there are limitations for low energy electron beams. This work aims to improve the accuracy of the dose calculation using eMC for 4 and 6 MeV electron beams of Varian linear accelerators. Improvements implemented into the eMC include (1) improved determination of the initial electron energy spectrum by increased resolution of mono-energetic depth dose curves used during beam configuration; (2) inclusion of all the scrapers of the applicator in the beam model; (3) reduction of the maximum size of the sphere to be selected within the macro MC transport when the energy of the incident electron is below certain thresholds. The impact of these changes in eMC is investigated by comparing calculated dose distributions for 4 and 6 MeV electron beams at source to surface distance (SSD) of 100 and 110 cm with applicators ranging from 6 x 6 to 25 x 25 cm(2) of a Varian Clinac 2300C/D with the corresponding measurements. Dose differences between calculated and measured absolute depth dose curves are reduced from 6% to less than 1.5% for both energies and all applicators considered at SSD of 100 cm. Using the original eMC implementation, absolute dose profiles at depths of 1 cm, d(max) and R50 in water lead to dose differences of up to 8% for applicators larger than 15 x 15 cm(2) at SSD 100 cm. Those differences are now reduced to less than 2% for all dose profiles investigated when the improved version of eMC is used. At SSD of 110 cm the dose difference for the original eMC version is even more pronounced and can be larger than 10%. Those differences are reduced to within 2% or 2 mm with the improved version of eMC. In this work several enhancements were made in the eMC algorithm leading to significant improvements in the accuracy of the dose

  16. Effect of Interleaved FEC Code on Wavelet Based MC-CDMA System with Alamouti STBC in Different Modulation Schemes

    OpenAIRE

    Shams, Rifat Ara; Kabir, M. Hasnat; Ullah, Sheikh Enayet

    2012-01-01

    In this paper, the impact of Forward Error Correction (FEC) code namely Trellis code with interleaver on the performance of wavelet based MC-CDMA wireless communication system with the implementation of Alamouti antenna diversity scheme has been investigated in terms of Bit Error Rate (BER) as a function of Signal-to-Noise Ratio (SNR) per bit. Simulation of the system under proposed study has been done in M-ary modulation schemes (MPSK, MQAM and DPSK) over AWGN and Rayleigh fading channel inc...

  17. McKenzie River Subbasin Assessment, Summary Report 2000.

    Energy Technology Data Exchange (ETDEWEB)

    Alsea Geospatial, Inc.

    2000-02-01

    This document summarizes the findings of the McKenzie River Subbasin Assessment: Technical Report. The subbasin assessment tells a story about the McKenzie River watershed. What is the McKenzie's ecological history, how is the McKenzie doing today, and where is the McKenzie watershed headed ecologically? Knowledge is a good foundation for action. The more we know, the better prepared we are to make decisions about the future. These decisions involve both protecting good remaining habitat and repairing some of the parts that are broken in the McKenzie River watershed. The subbasin assessment is the foundation for conservation strategy and actions. It provides a detailed ecological assessment of the lower McKenzie River and floodplain, identifies conservation and restoration opportunities, and discusses the influence of some upstream actions and processes on the study area. The assessment identifies restoration opportunities at the reach level. In this study, a reach is a river segment from 0.7 to 2.7 miles long and is defined by changes in land forms, land use, stream junctions, and/or cultural features. The assessment also provides flexible tools for setting priorities and planning projects. The goal of this summary is to clearly and concisely extract the key issues, findings, and recommendations from the full-length Technical Report. The high priority recommended action items highlight areas that the McKenzie Watershed Council can significantly influence, and that will likely yield the greatest ecological benefit. People are encouraged to read the full Technical Report if they are interested in the detailed methods, findings, and references used in this study.

  18. Homogenization via the strong-permittivity-fluctuation theory with nonzero depolarization volume

    Science.gov (United States)

    Mackay, Tom G.

    2004-08-01

    The depolarization dyadic provides the scattering response of a single inclusion particle embedded within a homogeneous background medium. These dyadics play a central role in formalisms used to estimate the effective constitutive parameters of homogenized composite mediums (HCMs). Conventionally, the inclusion particle is taken to be vanishingly small; this allows the pointwise singularity of the dyadic Green function associated with the background medium to be employed as the depolarization dyadic. A more accurate approach is pursued in this communication by taking into account the nonzero spatial extent of inclusion particles. Depolarization dyadics corresponding to inclusion particles of nonzero volume are incorporated within the strong-permittivity-fluctuation theory (SPFT). The linear dimensions of inclusion particles are assumed to be small relative to the electromagnetic wavelength(s) and the SPFT correlation length. The influence of the size of inclusion particles upon SPFT estimates of the HCM constitutive parameters is investigated for anisotropic dielectric HCMs. In particular, the interplay between correlation length and inclusion size is explored.
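
    For orientation, the classical vanishing-inclusion limit that the SPFT refines can be stated compactly: for a small sphere the depolarization dyadic reduces to L = (1/3)I, and the Maxwell Garnett estimate for inclusions of permittivity ε_i at volume fraction f in a background ε_b reads (a standard textbook result, not the SPFT expression itself):

    \mathbf{L} = \tfrac{1}{3}\mathbf{I}, \qquad
    \epsilon_{\mathrm{MG}} = \epsilon_b + 3 f \epsilon_b\,
      \frac{\epsilon_i - \epsilon_b}{\epsilon_i + 2\epsilon_b - f\,(\epsilon_i - \epsilon_b)} .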

  19. Homogenization scheme for acoustic metamaterials

    KAUST Repository

    Yang, Min

    2014-02-26

    We present a homogenization scheme for acoustic metamaterials that is based on reproducing the lowest orders of scattering amplitudes from a finite volume of metamaterials. This approach is noted to differ significantly from that of coherent potential approximation, which is based on adjusting the effective-medium parameters to minimize scattering in the long-wavelength limit. With the aid of the metamaterials' eigenstates, the effective parameters, such as mass density and elastic modulus, can be obtained by matching the surface responses of a metamaterial's structural unit cell with a piece of homogenized material. From Green's theorem applied to the exterior domain problem, matching the surface responses is noted to be the same as reproducing the scattering amplitudes. We verify our scheme by applying it to three different examples: a layered lattice, a two-dimensional hexagonal lattice, and a decorated-membrane system. It is shown that the predicted characteristics and wave fields agree almost exactly with numerical simulations and experiments, and that the scheme's validity is constrained by the number of dominant surface multipoles instead of the usual long-wavelength assumption. In particular, the validity extends to the full band in one dimension and to regimes near the boundaries of the Brillouin zone in two dimensions.

  20. Henry P. McKean Jr. selecta

    CERN Document Server

    Moerbeke, Pierre; Moll, Victor

    2015-01-01

    This volume presents a selection of papers by Henry P. McKean, which illustrate the various areas in mathematics in which he has made seminal contributions. Topics covered include probability theory, integrable systems, geometry and financial mathematics. Each paper represents a contribution by Prof. McKean, either alone or together with other researchers, that has had a profound influence in the respective area.

  1. Homogenization of resonant chiral metamaterials

    OpenAIRE

    Andryieuski, Andrei; Menzel, Christoph; Rockstuhl, Carsten; Malureanu, Radu; Lederer, Falk; Lavrinenko, Andrei

    2010-01-01

    Homogenization of metamaterials is a crucial issue as it allows one to describe their optical response in terms of effective wave parameters, e.g. propagation constants. In this paper we consider the possible homogenization of chiral metamaterials. We show that for meta-atoms of a certain size a critical density exists above which increasing coupling between neighboring meta-atoms prevents a reasonable homogenization. On the contrary, a dilution in excess will induce features reminiscent of pho...

  2. Quantum Dot-Based Luminescent Oxygen Channeling Assay for Potential Application in Homogeneous Bioassays.

    Science.gov (United States)

    Zhuang, Si-Hui; Guo, Xin-Xin; Wu, Ying-Song; Chen, Zhen-Hua; Chen, Yao; Ren, Zhi-Qi; Liu, Tian-Cai

    2016-01-01

    The unique photoproperties of quantum dots are promising for potential application in bioassays. In the present study, quantum dots were applied to a luminescent oxygen channeling assay. The reaction system developed in this study was based on the interaction of biotin with streptavidin. Carboxyl-modified polystyrene microspheres doped with quantum dots were biotinylated and used as acceptors. Photosensitizer-doped carboxyl-modified polystyrene microspheres were conjugated with streptavidin and used as donors. The results indicated that the singlet oxygen released from the donor beads diffused into the acceptor beads. The acceptor beads were then excited via thioxene and subsequently fluoresced. To avoid generating false positives, a high concentration (0.01 mg/mL) of quantum dots is required for application in homogeneous immunoassays. Compared to a conventional luminescent oxygen channeling assay, this quantum-dot-based technique requires less time, and would be easier to automate and miniaturize because it requires no washing to remove excess labels.

  3. The OpenMC Monte Carlo particle transport code

    International Nuclear Information System (INIS)

    Romano, Paul K.; Forget, Benoit

    2013-01-01

    Highlights: ► An open source Monte Carlo particle transport code, OpenMC, has been developed. ► Solid geometry and continuous-energy physics allow high-fidelity simulations. ► Development has focused on high performance and modern I/O techniques. ► OpenMC is capable of scaling up to hundreds of thousands of processors. ► Results on a variety of benchmark problems agree with MCNP5. -- Abstract: A new Monte Carlo code called OpenMC is currently under development at the Massachusetts Institute of Technology as a tool for simulation on high-performance computing platforms. Given that many legacy codes do not scale well on existing and future parallel computer architectures, OpenMC has been developed from scratch with a focus on high performance scalable algorithms as well as modern software design practices. The present work describes the methods used in the OpenMC code and demonstrates the performance and accuracy of the code on a variety of problems.
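
    As a flavor of how such a code is driven, the sketch below builds and runs a bare-sphere eigenvalue problem. It assumes a recent version of the OpenMC Python API, which postdates the 2013 paper, and the material and geometry numbers are arbitrary.

    import openmc

    fuel = openmc.Material(name="fuel")
    fuel.add_nuclide("U235", 1.0)
    fuel.set_density("g/cm3", 10.0)

    sphere = openmc.Sphere(r=10.0, boundary_type="vacuum")
    cell = openmc.Cell(fill=fuel, region=-sphere)

    settings = openmc.Settings()
    settings.batches, settings.inactive, settings.particles = 20, 5, 1000

    model = openmc.Model(geometry=openmc.Geometry([cell]),
                         materials=openmc.Materials([fuel]),
                         settings=settings)
    model.run()  # writes k-effective and tallies to a statepoint file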

  4. Cosmic Ray Hit Detection with Homogenous Structures

    Science.gov (United States)

    Smirnov, O. M.

    Cosmic ray (CR) hits can affect a significant number of pixels both on long-exposure ground-based CCD observations and on Space Telescope frames. Thus, methods of identifying the damaged pixels are an important part of data preprocessing for practically any application. The paper presents an implementation of a CR hit detection algorithm based on a homogenous structure (also called a cellular automaton), a concept originating in artificial intelligence and discrete mathematics. Each pixel of the image is represented by a small automaton, which interacts with its neighbors and assumes a distinct state if it "decides" that a CR hit is present. On test data, the algorithm has shown a high detection rate (~0.7) and a low false alarm rate per frame. A homogenous structure is extremely trainable, which can be very important for processing large batches of data obtained under similar conditions. Training and optimizing issues are discussed, as well as other possible applications of this concept to image processing.
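
    The interaction rule lends itself to a compact sketch. The toy detector below is an illustration under stated assumptions, not Smirnov's implementation: pixels that stand far above the median of their eight neighbours flag themselves, and flagged cells then recruit moderately bright neighbours over a few iterations.

    import numpy as np

    def neighbor_median(img):
        # Median of the 8 neighbours of every pixel (reflective padding).
        p = np.pad(img, 1, mode="reflect")
        h, w = img.shape
        shifts = [p[i:i + h, j:j + w]
                  for i in range(3) for j in range(3) if (i, j) != (1, 1)]
        return np.median(np.stack(shifts), axis=0)

    def cr_hit_mask(img, k=5.0, iterations=3):
        med = neighbor_median(img)
        resid = img - med
        sigma = 1.4826 * np.median(np.abs(resid))   # robust (MAD-based) scale
        state = resid > k * sigma                   # initial automaton states
        h, w = img.shape
        for _ in range(iterations):
            # Cells interact: a pixel joins the flagged state when it is
            # moderately bright and touches an already flagged cell.
            p = np.pad(state, 1, mode="constant")
            nbr = sum(p[i:i + h, j:j + w].astype(int)
                      for i in range(3) for j in range(3)) - state.astype(int)
            state = state | ((nbr >= 1) & (resid > 0.5 * k * sigma))
        return state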

  5. Homogenized group cross sections by Monte Carlo

    International Nuclear Information System (INIS)

    Van Der Marck, S. C.; Kuijper, J. C.; Oppe, J.

    2006-01-01

    Homogenized group cross sections play a large role in making reactor calculations efficient. Because of this significance, many codes exist that can calculate these cross sections based on certain assumptions. However, for application to the High Flux Reactor (HFR) in Petten, the Netherlands, the limitations of such codes imply that the core calculations would become less accurate when using homogenized group cross sections (HGCS). Therefore we developed a method to calculate HGCS based on a Monte Carlo program, for which we chose MCNP. The implementation involves an addition to MCNP and a set of small executables to perform suitable averaging after the MCNP run(s) have completed. Here we briefly describe the details of the method, and we report on two tests we performed to show the accuracy of the method and its implementation. By now, this method is routinely used in preparation of the cycle-to-cycle core calculations for the HFR. (authors)
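
    The averaging step after the MC run(s) amounts to flux-weighted collapsing. A generic sketch of that step (not the authors' MCNP add-on), for a pointwise cross section and flux tallied on a common energy grid:

    import numpy as np

    def collapse_to_groups(energy, sigma, flux, edges):
        # Broad-group constant: sigma_g = int(sigma * phi) / int(phi)
        # over each group g; assumes several grid points per group.
        sigma_g = np.empty(len(edges) - 1)
        for g in range(len(sigma_g)):
            m = (energy >= edges[g]) & (energy < edges[g + 1])
            sigma_g[g] = (np.trapz(sigma[m] * flux[m], energy[m])
                          / np.trapz(flux[m], energy[m]))
        return sigma_g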

  6. NTS MC and A History

    International Nuclear Information System (INIS)

    Mary Alice Price; Kim Young

    2008-01-01

    Within the past three and a half years, the Nevada Test Site (NTS) has progressed from a Category IV to a Category I nuclear material facility. In accordance with direction from the U.S. Department of Energy (DOE) Secretary and National Nuclear Security Administration (NNSA) Administrator, NTS received shipments of large quantities of special nuclear material from Los Alamos National Laboratory (LANL) and other sites in the DOE complex. December 2004 marked the first presence of Category I material at the NTS since 1992, with the exception of two weeks of sub-critical underground testing in 2001. The Material Control and Accountability (MC and A) program was originally a joint effort by LANL, Lawrence Livermore National Laboratory, and Bechtel Nevada, but in March 2006 the NNSA Nevada Site Office gave the NTS Management and Operations contractor sole responsibility. This paper will discuss the process and steps taken to transition the NTS MC and A program from multiple organizations to a single entity and from a Category IV to a Category I program. This transition flourished as MC and A progressed from the 2004 Office of Assessment (OA) rating of 'Significant Weakness' to the 2007 OA assessment rating of 'Effective Performance'. The paper will provide timelines, funding and staffing issues, OA assessment findings and corrective actions, and future expectations. The process has been challenging, but MC and A's innovative responses to the challenges have been very successful.

  7. A Malus crabapple chalcone synthase gene, McCHS, regulates red petal color and flavonoid biosynthesis.

    Directory of Open Access Journals (Sweden)

    Deqiang Tai

    Full Text Available Chalcone synthase is a key and often rate-limiting enzyme in the biosynthesis of the anthocyanin pigments that accumulate in plant organs such as flowers and fruits, but the relationship between CHS expression and the petal coloration level in different cultivars is still unclear. In this study, three typical crabapple cultivars were chosen based on their different petal colors and coloration patterns. The two extreme-color cultivars, 'Royalty' and 'Flame', have dark red and white petals respectively, while the intermediate cultivar 'Radiant' has pink petals. We measured flavonoid accumulation and the expression levels of McCHS during the petal expansion process in the different cultivars. The results showed that McCHS has a distinct expression pattern in each tested cultivar and is responsible for the red coloration and color variation in crabapple petals, especially for the color fading process in 'Radiant'. Furthermore, tobacco plants constitutively expressing McCHS displayed higher anthocyanin accumulation and a deeper red petal color compared with untransformed control lines. Moreover, the expression levels of several anthocyanin biosynthetic genes were higher in the transgenic McCHS-overexpressing tobacco lines than in the control plants. A close relationship was observed between the expression of McCHS and that of the transcription factors McMYB4 and McMYB5 during petal development in the different crabapple cultivars, suggesting that the expression of McCHS is regulated by these transcription factors. We conclude that the endogenous McCHS gene is a critical factor in the regulation of anthocyanin biosynthesis during petal coloration in Malus crabapple.

  8. Recent developments in the ROCS/MC code for retrieving local power information in coarse-mesh reactor analysis

    International Nuclear Information System (INIS)

    Grill, S.F.; Jonsson, A.; Crump, M.W.

    1983-01-01

    The inclusion of 3-D effects in PWR analysis is necessary for accurate predictions of reactivity, power distributions, and reactivity coefficients. The ROCS/MC code system has been developed by Combustion Engineering to provide 3-D coarse mesh analysis (ROCS) with the capability to retrieve local information on flux, power and burnup (MC). A review of the finite difference representation of the MC diffusion equation, along with recent improvements to the ROCS/MC system, is presented. These improvements include the implementation of fine mesh radial boundary conditions and internal calculation of coarse mesh boundary conditions, generalization of the embedded calculation to account for the local neighboring environment, and the automation of ROCS/MC links to C-E's code system for in-core power distribution monitoring and core-follow analysis. The results of the ROCS/MC verification program are described and show good agreement with C-E's ROCS/PDQ based methodologies.
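
    The kind of finite-difference diffusion eigenvalue problem reviewed above can be illustrated in a few lines. The one-group, one-dimensional power iteration below is a textbook sketch, not the ROCS/MC discretization.

    import numpy as np

    def k_eff_1g(D, siga, nusigf, L=100.0, n=200, tol=1e-8):
        # Loss operator with zero-flux boundaries: -D phi'' + siga * phi.
        h = L / (n + 1)
        A = (np.diag(np.full(n, 2 * D / h**2 + siga))
             + np.diag(np.full(n - 1, -D / h**2), 1)
             + np.diag(np.full(n - 1, -D / h**2), -1))
        phi, k = np.ones(n), 1.0
        for _ in range(1000):                          # power iteration
            phi_new = np.linalg.solve(A, nusigf * phi / k)
            k_new = k * phi_new.sum() / phi.sum()
            phi_new /= phi_new.max()
            if abs(k_new - k) < tol:
                break
            phi, k = phi_new, k_new
        # Analytic check: k = nusigf / (siga + D * (pi / L)**2)
        return k_new, phi_new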

  9. McKenzie Classification of Extremity Lesions - An audit of primary care patients in 3 clinics

    DEFF Research Database (Denmark)

    Melbye, Martin

    2007-01-01

    Syndrome classification based on mechanical testing guides clinical decision making in conservative musculoskeletal care. The aim of this audit was to investigate how many patients presenting with problems in the extremities could be classified into the mechanical syndromes described by Robin McKenzie... ranged from 4.5 to 6 years. The mechanical classification was determined by the therapists and was recorded on the first three visits. Mechanical classification was based on strict operational definitions. Assessment sheets were collected from each therapist to determine their adherence... to the operational definitions. 135 consecutive patients were included over an 18-month period and 28 patients were excluded. Of the 107 patients with extremity joint problems, 73% were classified into one of McKenzie's mechanical syndromes by therapists trained in the McKenzie method. 34% of patients were...

  10. Homogenization of neutronic diffusion models

    International Nuclear Information System (INIS)

    Capdebosq, Y.

    1999-09-01

    In order to study and simulate nuclear reactor cores, one needs access to the neutron distribution in the core. In practice, the description of this density of neutrons is given by a system of diffusion equations, coupled by non-differential exchange terms. The strong heterogeneity of the medium constitutes a major obstacle to the numerical computation of this model at reasonable cost. Homogenization thus appears compulsory. Heuristic methods have been developed from the outset by nuclear physicists, under a periodicity assumption on the coefficients. They consist of performing a fine computation on a single periodicity cell, solving the system on the whole domain with homogeneous coefficients, and reconstructing the neutron density by multiplying the solutions of the two computations. The objectives of this work are to provide a mathematically rigorous basis for this factorization method, to obtain the exact formulas for the homogenized coefficients, and to make a start on geometries where two periodic media are placed side by side. The first result of this thesis concerns eigenvalue problem models, which are used to characterize the state of criticality of the reactor, under a symmetry assumption on the coefficients. The convergence of the homogenization process is proved, and formulas for the homogenized coefficients are given. We then show that without symmetry assumptions, a drift phenomenon appears. It is characterized by means of a real Bloch wave method, which gives the homogenized limit in the general case. These results for the critical problem are then adapted to the evolution model. Finally, the homogenization of the critical problem in the case of two side-by-side periodic media is studied on a one-dimensional one-equation model. (authors)
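
    In symbols, the factorization prescription described above reads (a schematic for the one-group case, using generic notation):

    \phi_\varepsilon(x) \;\approx\; \psi\!\left(\frac{x}{\varepsilon}\right) u(x),

    where ψ solves the eigenvalue problem on a single periodicity cell, u solves the homogenized problem with constant coefficients on the whole domain, and ε is the ratio of the cell size to the core size.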

  11. Homogenization of compacted blends of Ni and Mo powders

    International Nuclear Information System (INIS)

    Lanam, R.D.; Yeh, F.C.H.; Rovsek, J.E.; Smith, D.W.; Heckel, R.W.

    1975-01-01

    The homogenization behavior of compacted blends of Ni and Mo powders was studied primarily as a function of temperature, mean compact composition, and Mo powder particle size. All compact compositions were in the Ni-rich terminal solid-solution range; temperatures were between 950 and 1200 °C (in the region of the phase diagram where only the Mo-Ni intermediate phase forms); average Mo particle sizes ranged from 8.4 μm to 48 μm. Homogenization was characterized in terms of the rate of decrease of the amounts of the Mo-rich terminal solid-solution phase and the Mo-Ni intermediate phase. The experimental results were compared to predictions based upon the three-phase, concentric-sphere homogenization model. In general, agreement between experimental data and model predictions was fairly good for high-temperature treatments and for compact compositions which were not close to the solubility limit of Mo in Ni. Departures from the model are discussed in terms of surface diffusion contributions to homogenization and non-uniform mixing effects. (U.S.)

  12. Manual del McVCO 1999

    Science.gov (United States)

    McChesney, P.J.

    1999-01-01

    The McVCO is a microcontroller-based frequency generator that replaces the voltage-controlled oscillator (VCO) used in analog telemetry of seismic data. It accepts low-power signals from a seismometer and produces a frequency-modulated subcarrier signal suitable for telephone or radio links to a remote data collection site. The subcarrier frequency and the gain can be selected by means of a switch. It can optionally operate with two channels for high- and low-gain observation. The McVCO was designed with the aim of improving the analog telemetry of signals within the Pacific Northwest Seismograph Network (PNSN). Its development was supported by the Geophysics Program of the University of Washington and by the Volcano Hazards and Earthquake Hazards programs of the United States Geological Survey (USGS). Hundreds of instruments have been built and installed. Besides its use by the PNSN, the McVCO is used by the Alaska Volcano Observatory to monitor Aleutian volcanoes and by the USGS Volcano Disaster Assistance Program to respond to volcanic crises in other countries. This manual covers the operation of the McVCO, serves as a technical reference for those who need to know in more detail how the McVCO works, and covers a number of topics that require explicit treatment or that arise from the deployment of the instrument.

  13. Installation Restoration Program Records Search for McClellan Air Force Base, California.

    Science.gov (United States)

    1981-07-01

    The indexed excerpt consists of fragments of tables from the report rather than an abstract: notes on mosquito control (Dursban insecticide in standing waters; adult mosquito control by ultra-low-volume dispersion of malathion), part of a plant species list (e.g., Solanum nigrum; common cattail, Typha latifolia; creek nettle, Urtica ...), and part of a list of insecticides used at McClellan AFB (Diazinon 4E 47.5% concentrated emulsion, Diazinon 0.5% oil solution, Malathion 57% concentrated emulsion, Malathion 95% technical).

  14. J.B. McLachlan: a biography

    Energy Technology Data Exchange (ETDEWEB)

    Frank, D.

    1999-07-01

    This social history and biography of James Bryson McLachlan (1869-1937) describes McLachlan's leadership as an educator and instigator in organizing Nova Scotia's coal miners during the labour wars of the 1920s. McLachlan's background and childhood, education, reputation, religion, family life, health, and death are described. Included are descriptions of the life of coal miners in Cape Breton, radical left politics in Canada and the organizers involved, the political economy of the coal industry, child labour, churches, coal markets and prices, company towns and housing, mining disasters and fatalities, elections, First World War efforts, the depression, immigrants, and strikes. The labour organizations, companies, churches, and politicians involved in the struggles for union acceptance are discussed. 872 refs., 7 figs., 24 photos.

  15. Three Dimensional Spherical Display Systems and McIDAS: Tools for Science, Education and Outreach

    Science.gov (United States)

    Kohrs, R.; Mooney, M. E.

    2010-12-01

    The Space Science and Engineering Center (SSEC) and Cooperative Institute for Meteorological Satellite Studies (CIMSS) at the University of Wisconsin are now using a 3D spherical display system and their Man computer Interactive Data Access System (McIDAS)-X and McIDAS-V as outreach tools to demonstrate how scientists and forecasters utilize satellite imagery to monitor weather and climate. Our outreach program displays the orbits and data coverage of geostationary and polar satellites and demonstrates how each is beneficial for the remote sensing of Earth. Global composites of visible, infrared and water vapor images illustrate how satellite instruments collect data from different bands of the electromagnetic spectrum to monitor global weather patterns 24 hours a day. Captivating animations on spherical display systems are proving to be much more intuitive than traditional 2D displays, enabling audiences to view satellites orbiting above real-time weather systems circulating the entire globe. Complementing the 3D spherical display system are the UNIX-based McIDAS-X and Java-based McIDAS-V software packages. McIDAS is used to composite the real-time global satellite data and create other weather-related derived products. Client and server techniques used by these software packages provide the opportunity to continually update the real-time content on our globe. The enhanced functionality of McIDAS-V extends our outreach program by allowing in-depth, interactive 4-dimensional views of the imagery previously viewed on the 3D spherical display system. An important goal of our outreach program is the promotion of remote sensing research and technology at SSEC and CIMSS. The 3D spherical display system has quickly become a popular tool to convey the societal benefits of these endeavors. Audiences of all ages instinctively relate to recent weather events, which keeps them engaged in spherical display presentations. McIDAS facilitates further exploration of the science behind the weather.

  16. A Haphazard Reading of McHugh and Barlow (2010)

    Science.gov (United States)

    McHugh, R. Kathryn; Barlow, David H.

    2010-01-01

    Replies to the comments of Eileen Gambrill and Julia H. Littell ("Do haphazard reviews provide sound directions for dissemination efforts?") on the current authors' article "The dissemination and implementation of evidence-based psychological treatments: A review of current efforts" by Kathryn R. McHugh and David H. Barlow. In their commentary, Gambrill...

  17. Virtue ethics and the selection of children with impairments: a reply to Rosalind McDougall.

    Science.gov (United States)

    Saenz, Carla

    2010-11-01

    In 'Parental Virtues: A New Way of Thinking about the Morality of Reproductive Actions' Rosalind McDougall proposes a virtue-based framework to assess the morality of child selection. Applying the virtue-based account to the selection of children with impairments does not lead, according to McDougall, to an unequivocal answer to the morality of selecting impaired children. In 'Impairment, Flourishing, and the Moral Nature of Parenthood,' she also applies the virtue-based account to the discussion of child selection, and claims that couples with an impairment are morally justified in selecting a child with the same impairment. This claim, she maintains, reveals that the flourishing of a child should be understood as requiring environment-specific characteristics. I argue that McDougall's argument begs the question. More importantly, it does not do justice to virtue ethics. I also question to what extent a virtue ethics framework can be successfully applied to discussions about the moral permissibility of reproductive actions. © 2009 Blackwell Publishing Ltd.

  18. Virtue Ethics and the Selection of Children with Impairments A Reply to Rosalind McDougall

    Science.gov (United States)

    Saenz, Carla

    2009-01-01

    In “Parental Virtues: A New Way of Thinking about the Morality of Reproductive Actions” Rosalind McDougall proposes a virtue-based framework to assess the morality of child selection. Applying the virtue-based account to the selection of children with impairments does not lead, according to McDougall, to an unequivocal answer to the morality of selecting impaired children. In “Impairment, Flourishing, and the Moral Nature of Parenthood,” she also applies the virtue-based account to the discussion of child selection, and claims that couples with an impairment are morally justified in selecting a child with the same impairment. This claim, she maintains, reveals that the flourishing of a child should be understood as requiring environment-specific characteristics. I argue that McDougall’s argument begs the question. More importantly, it does not do justice to virtue ethics. I also question to what extent a virtue ethics framework can be successfully applied to discussions about the moral permissibility of reproductive actions. PMID:19508307

  19. Homogeneity evaluation of mesenchymal stem cells based on electrotaxis analysis

    OpenAIRE

    Kim, Min Sung; Lee, Mi Hee; Kwon, Byeong-Ju; Kim, Dohyun; Koo, Min-Ah; Seon, Gyeung Mi; Park, Jong-Chul

    2017-01-01

    Stem cell therapy can restore function to damaged tissue, avoid host rejection, and reduce inflammation throughout the body without the use of immunosuppressive drugs. Established methods identify and isolate stem cells by specific markers using FACS or immunomagnetic cell separation. These procedures for distinguishing populations of stem cells are time-consuming and require extensive preparation. Here we suggest electrotaxis analysis as a new method to evaluate the homogeneity of mesenchyma...

  20. Analytical solutions of time-fractional models for homogeneous Gardner equation and non-homogeneous differential equations

    Directory of Open Access Journals (Sweden)

    Olaniyi Samuel Iyiola

    2014-09-01

    Full Text Available In this paper, we obtain analytical solutions of the homogeneous time-fractional Gardner equation and of non-homogeneous time-fractional models (including the Buckmaster equation) using the q-Homotopy Analysis Method (q-HAM). Our work displays the elegant nature of the application of q-HAM, which can solve not only homogeneous non-linear fractional differential equations but also non-homogeneous ones. The presence of the auxiliary parameter h helps to obtain, in an effective way, approximations comparable to exact solutions. The fraction-factor in this method gives it an edge over other existing analytical methods for non-linear differential equations. Comparisons are made where exact solutions to these models exist. The analysis shows that our analytical solutions converge very rapidly to the exact solutions.
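
    For reference, one common normalization of the equation treated above is the following; coefficient conventions vary between papers, so this is indicative only:

    D_t^{\alpha} u + \left(a\,u + b\,u^{2}\right) u_x + u_{xxx} = 0, \qquad 0 < \alpha \le 1,

    where D_t^{\alpha} denotes the Caputo time-fractional derivative; α = 1 recovers the classical Gardner equation.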

  1. Shape optimization in biomimetics by homogenization modelling

    International Nuclear Information System (INIS)

    Hoppe, Ronald H.W.; Petrova, Svetozara I.

    2003-08-01

    Optimal shape design of microstructured materials has recently attracted a great deal of attention in materials science. The shape and the topology of the microstructure have a significant impact on the macroscopic properties. The present work is devoted to the shape optimization of new biomorphic microcellular ceramics produced from natural wood by biotemplating. We are interested in finding the best material-and-shape combination in order to achieve the optimal prespecified performance of the composite material. The computation of the effective material properties is carried out using the homogenization method. An adaptive mesh-refinement technique based on the computation of recovered stresses is applied in the microstructure to find the homogenized elasticity coefficients. Numerical results show the reliability of the implemented a posteriori error estimator. (author)

  2. Developing standard performance testing procedures for MC and A components at a site

    International Nuclear Information System (INIS)

    Scherer, Carolynn

    2010-01-01

    The condition of a nuclear material control and accountability (MC and A) system and its individual components, as with any system combining technical elements, documentation and the human factor, may be characterized through an aggregate of values for the various parameters that determine the system's ability to perform. The MC and A system's status may be functioning effectively, functioning marginally or not functioning, based on a summary of the values of the individual parameters. This work included a review of the following elements and subsystems or components of a material control and accountability system: (1) MC and A elements: the information subsystem, the measurement subsystem, the NM access subsystem, including a tamper-indicating device (TID) program, and the automated information-gathering subsystem; and (2) elements for detecting NM losses: inventory differences, shipper/receiver differences, confirmatory measurements and differences with accounting data, and TID or seal violations. In order to detect the absence or loss of nuclear material there must be appropriate interactions among the elements and their respective subsystems (from the list above). Additionally, this work includes a review of the status of regulatory requirements for the MC and A system components and potential criteria that support the evaluation of the performance of the listed components. For the listed components, performance testing algorithms and procedures were developed that took the regulatory criteria into consideration. The developed MC and A performance-testing procedures were the basis for a pilot Guide for MC and A Performance Testing at the MBAs of SSC RF IPPE.

  3. Some issues raised by the National Academy of Sciences study for MC and A

    International Nuclear Information System (INIS)

    Tingey, F.H.

    1988-01-01

    Six of the several issues raised by the recent DOE-commissioned National Academy of Sciences study of MC and A are briefly discussed, along with the resulting recommendations for DOE consideration. The activities discussed are: 1. Evaluating System Performance; 2. Management of Inventory Differences; 3. Management of Shipper-Receiver Differences; 4. Managing the MC and A Data Base; 5. Management of Threat and Vulnerability; and 6. MSSA Content and Preparation

  4. Application of quality assurance to MC and A systems

    International Nuclear Information System (INIS)

    Skinner, A.J.; Delvin, W.L.

    1986-01-01

    Application of the principles of quality assurance to MC and A has been done at DOE's Savannah River Operations Office. The principles were applied to the functions within the MC and A Branch, including both the functions used to operate the Branch and those used to review the MC and A activities of DOE/SR's contractor. The purpose of this paper is to discuss that application of quality assurance and to show how the principles of quality assurance relate to the functions of a MC and A system, for both a DOE field office and a contractor. The principles (presented as requirements from the NQA-1 standard) are briefly discussed, a method for applying quality assurance is outlined, application at DOE/SR is shown, and application to a contractor's MC and A system is discussed

  5. Homogeneous Spaces and Equivariant Embeddings

    CERN Document Server

    Timashev, DA

    2011-01-01

    Homogeneous spaces of linear algebraic groups lie at the crossroads of algebraic geometry, the theory of algebraic groups, classical projective and enumerative geometry, harmonic analysis, and representation theory. By standard reasons of algebraic geometry, in order to solve various problems on a homogeneous space it is natural and helpful to compactify it while keeping track of the group action, i.e. to consider equivariant completions or, more generally, open embeddings of a given homogeneous space. Such equivariant embeddings are the subject of this book. We focus on the classification of equivariant embeddings...

  6. How to determine composite material properties using numerical homogenization

    DEFF Research Database (Denmark)

    Andreassen, Erik; Andreasen, Casper Schousboe

    2014-01-01

    Numerical homogenization is an efficient way to determine the effective macroscopic properties, such as the elasticity tensor, of a periodic composite material. In this paper an educational description of the method is provided based on a short, self-contained Matlab implementation. It is shown how the basic code, which computes the effective elasticity tensor of a two-material composite, where one material could be void, is easily extended to include more materials. Furthermore, extensions to homogenization of conductivity, thermal expansion, and fluid permeability are described in detail. The unit...
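
    The simplest unit cell that can be homogenized in closed form, a two-phase laminate, makes a useful sanity check for any such code. The sketch below is an illustrative complement, not the paper's Matlab listing: it returns the Voigt and Reuss averages for loading parallel and perpendicular to the layers.

    def laminate_effective_moduli(E1, E2, f):
        # f = volume fraction of phase 1. Loading parallel to the layers
        # gives the arithmetic (Voigt) average; loading perpendicular to
        # them gives the harmonic (Reuss) average.
        E_parallel = f * E1 + (1.0 - f) * E2
        E_perp = 1.0 / (f / E1 + (1.0 - f) / E2)
        return E_parallel, E_perp

    print(laminate_effective_moduli(E1=200e9, E2=70e9, f=0.5))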

  7. The Jurassic section along McElmo Canyon in southwestern Colorado

    Science.gov (United States)

    O'Sullivan, Robert B.

    1997-01-01

    In McElmo Canyon, Jurassic rocks are 1500-1600 ft thick. Lower Jurassic rocks of the Glen Canyon Group include (in ascending order) Wingate Sandstone, Kayenta Formation and Navajo Sandstone. Middle Jurassic rocks are represented by the San Rafael Group, which includes the Entrada Sandstone and overlying Wanakah Formation. Upper Jurassic rocks comprise the Junction Creek Sandstone overlain by the Morrison Formation. The Burro Canyon Formation, generally considered to be Lower Cretaceous, may be Late Jurassic in the McElmo Canyon area and is discussed with the Jurassic. The Upper Triassic Chinle Formation in the subsurface underlies, and the Upper Cretaceous Dakota Sandstone overlies, the Jurassic section. An unconformity is present at the base of the Glen Canyon Group (J-0), at the base of the San Rafael Group (J-2), and at the base of the Junction Creek Sandstone (J-5). Another unconformity of Cretaceous age is at the base of the Dakota Sandstone. Most of the Jurassic rocks consist of fluviatile, lacustrine and eolian deposits. The basal part of the Entrada Sandstone and the Wanakah Formation may be of marginal marine origin.

  8. Stochastic approach to municipal solid waste landfill life based on the contaminant transit time modeling using the Monte Carlo (MC) simulation

    International Nuclear Information System (INIS)

    Bieda, Bogusław

    2013-01-01

    The paper is concerned with the application and benefits of MC simulation as proposed for estimating the life of a modern municipal solid waste (MSW) landfill. The software Crystal Ball® (CB), a simulation program that helps analyze the uncertainties associated with Microsoft® Excel models by MC simulation, was used to calculate the transit time of contaminants in porous media. The transport of contaminants in soil is represented by the one-dimensional (1D) form of the advection-dispersion equation (ADE). The computer program CONTRANS, written in the MATLAB language, is the foundation for simulating and estimating the thickness of the landfill's compacted clay liner. In order to simplify the task of determining the uncertainty of parameters by MC simulation, the parameters corresponding to the expression Z2 taken from this program were used for the study. The tested parameters are: hydraulic gradient (HG), hydraulic conductivity (HC), porosity (POROS), liner thickness (TH) and diffusion coefficient (EDC). The principal output report provided by CB and presented in the study consists of a frequency chart, a percentiles summary and a statistics summary. Additional CB options provide a sensitivity analysis with tornado diagrams. The data used include available published figures as well as data concerning Mittal Steel Poland (MSP) S.A. in Kraków, Poland. This paper discusses the results and shows that the presented approach is applicable to the design of any MSW landfill compacted clay liner thickness. -- Highlights: ► Numerical simulation of waste in porous media is proposed. ► Statistical outputs based on correct assumptions about the probability distribution are presented. ► The benefits of an MC simulation are examined. ► The uniform probability distribution is studied. ► I report a useful tool applied to determine the life of a modern MSW landfill.
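
    The advective core of such a transit-time calculation is easy to reproduce. The sketch below samples uniform parameter distributions (illustrative ranges, not the study's data) and propagates them through t = L·n_e/(K·i), ignoring dispersion.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    K  = rng.uniform(1e-10, 1e-9, n)   # hydraulic conductivity HC [m/s]
    i  = rng.uniform(1.0, 1.5, n)      # hydraulic gradient HG [-]
    ne = rng.uniform(0.3, 0.5, n)      # effective porosity POROS [-]
    L  = rng.uniform(0.9, 1.1, n)      # liner thickness TH [m]

    # Advective transit time through the compacted clay liner, in years.
    t_years = L * ne / (K * i) / (365.25 * 24 * 3600.0)

    print(f"median : {np.median(t_years):,.1f} years")
    print(f"5th pct: {np.percentile(t_years, 5):,.1f} years")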

  9. Stochastic approach to municipal solid waste landfill life based on the contaminant transit time modeling using the Monte Carlo (MC) simulation

    Energy Technology Data Exchange (ETDEWEB)

    Bieda, Boguslaw, E-mail: bbieda@zarz.agh.edu.pl

    2013-01-01

    The paper is concerned with the application and benefits of MC simulation as proposed for estimating the life of a modern municipal solid waste (MSW) landfill. The software Crystal Ball® (CB), a simulation program that helps analyze the uncertainties associated with Microsoft® Excel models by MC simulation, was used to calculate the transit time of contaminants in porous media. The transport of contaminants in soil is represented by the one-dimensional (1D) form of the advection-dispersion equation (ADE). The computer program CONTRANS, written in the MATLAB language, is the foundation for simulating and estimating the thickness of the landfill's compacted clay liner. In order to simplify the task of determining the uncertainty of parameters by MC simulation, the parameters corresponding to the expression Z2 taken from this program were used for the study. The tested parameters are: hydraulic gradient (HG), hydraulic conductivity (HC), porosity (POROS), liner thickness (TH) and diffusion coefficient (EDC). The principal output report provided by CB and presented in the study consists of a frequency chart, a percentiles summary and a statistics summary. Additional CB options provide a sensitivity analysis with tornado diagrams. The data used include available published figures as well as data concerning Mittal Steel Poland (MSP) S.A. in Kraków, Poland. This paper discusses the results and shows that the presented approach is applicable to the design of any MSW landfill compacted clay liner thickness. -- Highlights: ► Numerical simulation of waste in porous media is proposed. ► Statistical outputs based on correct assumptions about the probability distribution are presented. ► The benefits of an MC simulation are examined. ► The uniform probability distribution is studied. ► I report a useful tool applied to determine the life of a modern MSW landfill.

  10. Time and Situatedness: Merleau-Ponty's Response to McTaggart's Paradox

    Directory of Open Access Journals (Sweden)

    Claudio Javier Cormick

    2014-12-01

    Full Text Available The article seeks to establish a relationship, not yet explored satisfactorily, between Merleau-Ponty's phenomenology of time and a central issue of the analytical "theory of time": McTaggart's paradox. By clarifying the authentic meaning of Merleau-Ponty's "subjectivism" regarding time, against Priest's interpretation (1998), the article points out a convergence between the phenomenological approach and the theses Michael Dummett developed in response to the abovementioned paradox. A "situational" solution to the paradox is attempted on the basis of Dummett's ideas and Bimbenet's interpretation of Merleau-Ponty's "perspectivism".

  11. Computationally Probing the Performance of Hybrid, Heterogeneous, and Homogeneous Iridium-Based Catalysts for Water Oxidation

    Energy Technology Data Exchange (ETDEWEB)

    García-Melchor, Max [SUNCAT Center for Interface Science and Catalysis, Department of Chemical Engineering, Stanford University, Stanford CA (United States); Vilella, Laia [Institute of Chemical Research of Catalonia (ICIQ), The Barcelona Institute of Science and Technology (BIST),Tarragona (Spain); Departament de Quimica, Universitat Autonoma de Barcelona, Barcelona (Spain); López, Núria [Institute of Chemical Research of Catalonia (ICIQ), The Barcelona Institute of Science and Technology (BIST), Tarragona (Spain); Vojvodic, Aleksandra [SUNCAT Center for Interface Science and Catalysis, SLAC National Accelerator Laboratory, Menlo Park CA (United States)

    2016-04-29

    An attractive strategy to improve the performance of water oxidation catalysts would be to anchor a homogeneous molecular catalyst on a heterogeneous solid surface to create a hybrid catalyst. The idea of this combined system is to take advantage of the individual properties of each of the two catalyst components. We use Density Functional Theory to determine the stability and activity of a model hybrid water oxidation catalyst consisting of a dimeric Ir complex attached to the IrO2(110) surface through two oxygen atoms. We find that a homogeneous catalyst can be bound to its matrix oxide without losing significant activity. Hence, designing hybrid systems that benefit from both the high tunability of activity of homogeneous catalysts and the stability of heterogeneous systems seems feasible.

  12. Fast Food McDonald's China Fix

    Institute of Scientific and Technical Information of China (English)

    DAVID HENDRICKSON

    2006-01-01

    Since the opening of its first outlet 16 years ago, McDonald's China operation has on many levels proven enormously successful. Home to more than 750 locations nationwide, the Middle Kingdom today ranks as one of McDonald's ten largest markets, with returns hovering in double digits and raking in billions annually. As lucrative as it may be, however, China has nonetheless developed into a relative sore spot for the world's leading fast food giant.

  13. Internal homogenization: effective permittivity of a coated sphere.

    Science.gov (United States)

    Chettiar, Uday K; Engheta, Nader

    2012-10-08

    The concept of internal homogenization is introduced as a complementary approach to the conventional homogenization schemes, which could be termed external homogenization. The theory for the internal homogenization of the permittivity of subwavelength coated spheres is presented. The effective permittivity derived from the internal homogenization of core-shell particles is discussed for plasmonic and dielectric constituent materials. The effective model provided by the homogenization is a useful design tool for constructing coated particles with desired resonant properties.
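
    The quasi-static version of this equivalence is a standard result, stated here for orientation rather than as the paper's derivation: a core (permittivity ε_c, radius a) coated by a shell (ε_s, outer radius b) scatters, to dipole order, like a homogeneous sphere with

    \epsilon_{\mathrm{eq}} \;=\; \epsilon_s\,
      \frac{\epsilon_c + 2\epsilon_s + 2\rho\,(\epsilon_c - \epsilon_s)}
           {\epsilon_c + 2\epsilon_s - \rho\,(\epsilon_c - \epsilon_s)},
    \qquad \rho = \left(\frac{a}{b}\right)^{3},

    which recovers ε_s as ρ → 0 and ε_c as ρ → 1, consistent with the internal-homogenization idea above.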

  14. Mysteries of Mind and Matter in the Perspective of Colin McGinn’s Philosophy

    Directory of Open Access Journals (Sweden)

    Dmytro Sepetyi

    2016-02-01

    Full Text Available The paper discusses the approach to the mind-body problem that was developed by Colin McGinn and is known as "mysterianism". The basic thesis of this approach, which McGinn opposes to both materialism and dualism, is that consciousness and its relationship to physical reality is an inscrutable mystery that cannot be overcome in principle, because of the insurmountable, constitutive limitations of the human mind. In this paper, a critical analysis of McGinn's approach is given. It is pointed out that McGinn's thesis of the inscrutability of the mind-body relationship is based on the ungrounded idea that a mediating substance is necessary. An argument is made that the mind-body relationship is better understood as interaction that is mediated only by natural psychophysical laws. McGinn's hypothesis of the origin of physical space in a non-spatial reality that has the same nature as consciousness is explained in the context of different interpretations of the Big Bang theory. It is argued that McGinn's hypothesis does not provide a solution to the problem of the origin of human consciousness, because consciousness belongs to mental individuals whose emergence is just as unexplainable by the hypothesis of a non-spatial reality as it is by physical processes.

  15. Homogenization methods for heterogeneous assemblies

    International Nuclear Information System (INIS)

    Wagner, M.R.

    1980-01-01

    The third session of the IAEA Technical Committee Meeting is concerned with the problem of homogenization of heterogeneous assemblies. Six papers will be presented on the theory of homogenization and on practical procedures for deriving homogenized group cross sections and diffusion coefficients. The problem of finding so-called "equivalent" diffusion theory parameters for use in global reactor calculations is of great practical importance. In spite of this, it is fair to say that the present state of the theory of this second homogenization is far from satisfactory. In fact, there is not even a uniquely accepted approach to the problem of deriving equivalent group diffusion parameters. Common agreement exists only about the fact that the conventional flux-weighting technique provides only a first approximation, which might lead to acceptable results in certain cases, but certainly does not guarantee the basic requirement of conservation of reaction rates.

  16. Different Aspects of Regional Development in East-Central Europe

    Directory of Open Access Journals (Sweden)

    PÁL SZABÓ

    2014-12-01

    Full Text Available The aim of this paper is to explore and analyze the main characteristics of East-Central Europe's spatial structure, including its changes during recent years. In many territorial studies there is an intention to define different types of regions, to establish territorial regularities, to create models, etc. In this case, we analysed the regions of East-Central Europe based on comprehensive socio-economic data and described the most important characteristics of the spatial structure of this macroregion from different perspectives. Some results show that the social and economic core areas are highly separated from each other and that the development "image" of East-Central Europe has remained the same when viewed in terms of larger, homogeneous areas, but has become more mosaic-like with the appearance of some separate and improving regions, strengthening the "Bunch of Grapes" model rather than the "Boomerang". Other results show that it is difficult to create a spatial structure model for this macroregion, because the results may depend on the viewpoints.

  17. Inhibition of serotonin transport by (+)McN5652 is noncompetitive

    Energy Technology Data Exchange (ETDEWEB)

    Hummerich, Rene [Biochemical Laboratory, Central Institute of Mental Health, 68159 Mannheim (Germany); Schulze, Oliver [Department of Nuclear Medicine, University Medical Center Hamburg-Eppendorf, D-20246 Hamburg (Germany); Raedler, Thomas [Department of Psychiatry and Psychotherapy, University Medical Center Hamburg-Eppendorf, D-20246 Hamburg (Germany); Mikecz, Pal [Department of Nuclear Medicine, University Medical Center Hamburg-Eppendorf, D-20246 Hamburg (Germany); Reimold, Matthias [Department of Nuclear Medicine, University Hospital Tuebingen, D-72076 Tuebingen (Germany); Brenner, Winfried [Department of Nuclear Medicine, University Medical Center Hamburg-Eppendorf, D-20246 Hamburg (Germany); Clausen, Malte [Department of Nuclear Medicine, University Medical Center Hamburg-Eppendorf, D-20246 Hamburg (Germany); Schloss, Patrick [Biochemical Laboratory, Central Institute of Mental Health, 68159 Mannheim (Germany); Buchert, Ralph [Department of Nuclear Medicine, University Medical Center Hamburg-Eppendorf, D-20246 Hamburg (Germany)]. E-mail: buchert@uke.uni-hamburg.de

    2006-04-15

    Introduction: Imaging of the serotonergic innervation of the brain using positron emission tomography (PET) with the serotonin transporter (SERT) ligand [11C](+)McN5652 might be affected by serotonin in the synaptic cleft if there is relevant interaction between [11C](+)McN5652 and serotonin at the SERT. The aim of the present study therefore was to pharmacologically characterize the interaction of [11C](+)McN5652 and serotonin at the SERT. Methods: In vitro saturation analyses of [3H]serotonin uptake into HEK293 cells stably expressing the human SERT were performed in the absence and presence of unlabelled (+)McN5652. Data were evaluated assuming Michaelis-Menten kinetics. Results: Unlabelled (+)McN5652 significantly reduced the maximal rate of serotonin transport V_max of SERT without affecting the Michaelis-Menten constant K_M. Conclusions: This finding indicates that (+)McN5652 inhibits serotonin transport through the SERT in a noncompetitive manner. This might suggest that [11C](+)McN5652 PET is not significantly affected by endogenous serotonin.

  18. Inhibition of serotonin transport by (+)McN5652 is noncompetitive

    International Nuclear Information System (INIS)

    Hummerich, Rene; Schulze, Oliver; Raedler, Thomas; Mikecz, Pal; Reimold, Matthias; Brenner, Winfried; Clausen, Malte; Schloss, Patrick; Buchert, Ralph

    2006-01-01

    Introduction: Imaging of the serotonergic innervation of the brain using positron emission tomography (PET) with the serotonin transporter (SERT) ligand [11C](+)McN5652 might be affected by serotonin in the synaptic cleft if there is relevant interaction between [11C](+)McN5652 and serotonin at the SERT. The aim of the present study therefore was to pharmacologically characterize the interaction of [11C](+)McN5652 and serotonin at the SERT. Methods: In vitro saturation analyses of [3H]serotonin uptake into HEK293 cells stably expressing the human SERT were performed in the absence and presence of unlabelled (+)McN5652. Data were evaluated assuming Michaelis-Menten kinetics. Results: Unlabelled (+)McN5652 significantly reduced the maximal rate of serotonin transport V_max of SERT without affecting the Michaelis-Menten constant K_M. Conclusions: This finding indicates that (+)McN5652 inhibits serotonin transport through the SERT in a noncompetitive manner. This might suggest that [11C](+)McN5652 PET is not significantly affected by endogenous serotonin

  19. Independent Monte-Carlo dose calculation for MLC based CyberKnife radiotherapy

    Science.gov (United States)

    Mackeprang, P.-H.; Vuong, D.; Volken, W.; Henzen, D.; Schmidhalter, D.; Malthaner, M.; Mueller, S.; Frei, D.; Stampanoni, M. F. M.; Dal Pra, A.; Aebersold, D. M.; Fix, M. K.; Manser, P.

    2018-01-01

    This work aims to develop, implement and validate a Monte Carlo (MC)-based independent dose calculation (IDC) framework to perform patient-specific quality assurance (QA) for multi-leaf collimator (MLC)-based CyberKnife® (Accuray Inc., Sunnyvale, CA) treatment plans. The IDC framework uses an XML-format treatment plan as exported from the treatment planning system (TPS) and DICOM-format patient CT data, an MC beam model using phase spaces, CyberKnife MLC beam modifier transport using the EGS++ class library, a beam sampling and coordinate transformation engine, and dose scoring using DOSXYZnrc. The framework is validated against dose profiles and depth dose curves of single beams with varying field sizes in a water tank, in units of cGy/Monitor Unit, and against a 2D dose distribution of a full prostate treatment plan measured with Gafchromic EBT3 (Ashland Advanced Materials, Bridgewater, NJ) film in a homogeneous water-equivalent slab phantom. The film measurement is compared to IDC results by gamma analysis using 2% (global)/2 mm criteria. Further, the dose distribution of the clinical treatment plan in the patient CT is compared to the TPS calculation by gamma analysis using the same criteria. Dose profiles from the IDC calculation in a homogeneous water phantom agree within 2.3% of the global max dose or 1 mm distance to agreement with measurements for all except the smallest field size. Comparing the film measurement to the calculated dose, 99.9% of all voxels pass gamma analysis; comparing dose calculated by the IDC framework to TPS-calculated dose for the clinical prostate plan shows a 99.0% passing rate. IDC-calculated dose is found to be up to 5.6% lower than dose calculated by the TPS in this case near metal fiducial markers. An MC-based modular IDC framework was successfully developed, implemented and validated against measurements and is now available to perform patient-specific QA by IDC.
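
    Gamma analysis itself is compact enough to sketch. The 1-D version below is a generic implementation of the global 2%/2 mm criterion (not the authors' code): each reference point is scored against the whole evaluated profile.

    import numpy as np

    def gamma_1d(x, dose_ref, dose_eval, dose_tol=0.02, dist_tol=2.0):
        # Global gamma: dose tolerance as a fraction of the reference
        # maximum, distance tolerance in the units of x (e.g. mm).
        d_max = dose_ref.max()
        gamma = np.empty(dose_ref.shape)
        for i, (xi, di) in enumerate(zip(x, dose_ref)):
            dist2 = ((x - xi) / dist_tol) ** 2
            dd2 = ((dose_eval - di) / (dose_tol * d_max)) ** 2
            gamma[i] = np.sqrt(np.min(dist2 + dd2))
        return gamma   # a point passes where gamma <= 1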

  20. Homogeneous versus heterogeneous zeolite nucleation

    NARCIS (Netherlands)

    Dokter, W.H.; Garderen, van H.F.; Beelen, T.P.M.; Santen, van R.A.; Bras, W.

    1995-01-01

    Aggregates of fractal dimension were found in the intermediate gel phases that organize prior to the nucleation and crystallization of silicalite from a homogeneous reaction mixture. Small- and wide-angle X-ray scattering studies prove that for zeolites nucleation may be homogeneous or heterogeneous.

  1. Central Sensitization-Based Classification for Temporomandibular Disorders: A Pathogenetic Hypothesis

    Directory of Open Access Journals (Sweden)

    Annalisa Monaco

    2017-01-01

    Full Text Available Dysregulation of the Autonomic Nervous System (ANS) and of central pain pathways in temporomandibular disorders (TMD) is supported by growing evidence. Authors include some forms of TMD among the central sensitization syndromes (CSS), a group of pathologies characterized by central morphofunctional alterations. The Central Sensitization Inventory (CSI) is useful for clinical diagnosis. Clinical examination and the CSI cannot identify the central site(s) affected in these diseases. Ultralow frequency transcutaneous electrical nerve stimulation (ULFTENS) is extensively used in TMD and in dental clinical practice because of its effects on descending pain modulation pathways. The Diagnostic Criteria for TMD (DC/TMD) are the most accurate tool for the diagnosis and classification of TMD; however, they include the CSI to investigate central aspects of TMD. Preliminary data on sensory ULFTENS show that it is a reliable tool for the study of central and autonomic pathways in TMD. An alternative classification based on the presence of central sensitization and on the individual response to sensory ULFTENS is proposed. TMD may be classified into 4 groups: (a) TMD with Central Sensitization, ULFTENS Responders; (b) TMD with Central Sensitization, ULFTENS Nonresponders; (c) TMD without Central Sensitization, ULFTENS Responders; (d) TMD without Central Sensitization, ULFTENS Nonresponders. This pathogenic classification of TMD may help to differentiate therapy and aetiology.

  2. Benchmarking homogenization algorithms for monthly data

    Directory of Open Access Journals (Sweden)

    V. K. C. Venema

    2012-01-01

    Full Text Available The COST (European Cooperation in Science and Technology Action ES0601: advances in homogenization methods of climate series: an integrated approach (HOME has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative. The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide trend was added.

    Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed using both the individual station series and the network-average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve
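    To make the first two performance metrics concrete, here is a minimal sketch of the centered root mean square error and the trend error for one homogenized series against its true counterpart. The function names and the per-time-step trend unit are illustrative choices, not HOME's benchmark code.

```python
import numpy as np

def crmse(homogenized, truth):
    """Centered RMSE: RMSE after removing each series' own mean."""
    h = homogenized - homogenized.mean()
    t = truth - truth.mean()
    return np.sqrt(np.mean((h - t) ** 2))

def trend_error(homogenized, truth):
    """Difference in fitted linear trend (per time step) between the series."""
    x = np.arange(len(truth))
    slope_h = np.polyfit(x, homogenized, 1)[0]
    slope_t = np.polyfit(x, truth, 1)[0]
    return slope_h - slope_t
```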

  3. Detection of Echinococcus multilocularis by MC-PCR: evaluation of diagnostic sensitivity and specificity without gold standard

    Directory of Open Access Journals (Sweden)

    Helene Wahlström

    2016-03-01

    Full Text Available Introduction: A semi-automated magnetic capture probe-based DNA extraction and real-time PCR method (MC-PCR), allowing for a more efficient large-scale surveillance of Echinococcus multilocularis occurrence, has been developed. The test sensitivity has previously been evaluated using the sedimentation and counting technique (SCT) as a gold standard. However, as the sensitivity of the SCT is not 1, the test characteristics of the MC-PCR were also evaluated using latent class analysis, a methodology that does not require a gold standard. Materials and methods: Test results, MC-PCR and SCT, from a previous evaluation of the MC-PCR using 177 foxes shot in the spring (n=108) and autumn 2012 (n=69) in high-prevalence areas in Switzerland were used. Latent class analysis was used to estimate the test characteristics of the MC-PCR. Although it was not the primary aim of this study, estimates of the test characteristics of the SCT were also obtained. Results and discussion: This study showed that the sensitivity of the MC-PCR was 0.88 [95% posterior credible interval (PCI) 0.80–0.93], which was not significantly different from that of the SCT, 0.83 (95% PCI 0.76–0.88), which is currently considered the gold standard. The specificity of both tests was high: 0.98 (95% PCI 0.94–0.99) for the MC-PCR and 0.99 (95% PCI 0.99–1) for the SCT. In a previous study, using fox scats from a low-prevalence area, the specificity of the MC-PCR was higher, 0.999 (95% PCI 0.997–1). One reason for the lower estimate of the specificity in this study could be that the MC-PCR detects DNA from infected but non-infectious rodents eaten by foxes. When using MC-PCR in low-prevalence areas or areas free from the parasite, a positive result in the MC-PCR should be regarded as a true positive. Conclusion: The sensitivity of the MC-PCR (0.88) was comparable to the sensitivity of the SCT (0.83).
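    For intuition, here is a minimal sketch of Bayesian latent class estimation for two conditionally independent tests without a gold standard, using simple importance sampling from the priors. The cross-classified counts and the Beta priors are hypothetical illustrations, not the study's data or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cross-classified counts for 177 foxes (MC-PCR x SCT):
n_pp, n_pm, n_mp, n_mm = 80, 10, 5, 82

def loglik(prev, se1, sp1, se2, sp2):
    # Cell probabilities assuming the two tests are conditionally
    # independent given the true infection status.
    p_pp = prev * se1 * se2 + (1 - prev) * (1 - sp1) * (1 - sp2)
    p_pm = prev * se1 * (1 - se2) + (1 - prev) * (1 - sp1) * sp2
    p_mp = prev * (1 - se1) * se2 + (1 - prev) * sp1 * (1 - sp2)
    p_mm = prev * (1 - se1) * (1 - se2) + (1 - prev) * sp1 * sp2
    return (n_pp * np.log(p_pp) + n_pm * np.log(p_pm)
            + n_mp * np.log(p_mp) + n_mm * np.log(p_mm))

# Importance sampling from Beta priors (prior choices are assumptions).
N = 200_000
prev = rng.beta(2, 2, N)                          # prevalence
se1, sp1 = rng.beta(8, 2, N), rng.beta(20, 1, N)  # MC-PCR Se, Sp
se2, sp2 = rng.beta(8, 2, N), rng.beta(20, 1, N)  # SCT Se, Sp
ll = loglik(prev, se1, sp1, se2, sp2)
w = np.exp(ll - ll.max())                         # avoid underflow
w /= w.sum()
print("posterior mean Se(MC-PCR):", float(np.sum(w * se1)))
print("posterior mean Sp(MC-PCR):", float(np.sum(w * sp1)))
```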

  4. Poster — Thur Eve — 61: A new framework for MPERT plan optimization using MC-DAO

    Energy Technology Data Exchange (ETDEWEB)

    Baker, M; Lloyd, S AM; Townson, R [University of Victoria, Victoria, British Columbia (Canada); Bush, K [Department of Physics, Stanford University, Palo Alto, CA (United States); Gagne, I M; Zavgorodni, S [Department of Medical Physics, British Columbia Cancer Agency—Vancouver Island Center, Victoria, British Columbia (Canada)

    2014-08-15

    This work combines the inverse planning technique known as Direct Aperture Optimization (DAO) with Intensity Modulated Radiation Therapy (IMRT) and combined electron and photon therapy plans. In particular, determining the conditions under which Modulated Photon/Electron Radiation Therapy (MPERT) produces better dose conformality and sparing of organs at risk than traditional IMRT plans is central to the project. Presented here are the materials and methods used to generate and manipulate the DAO procedure. Included is the introduction of a powerful Java-based toolkit, the Aperture-based Monte Carlo (MC) MPERT Optimizer (AMMO), that serves as a framework for optimization and provides streamlined access to underlying particle transport packages. Comparison of the toolkit's dose calculations to those produced by the Eclipse TPS and the demonstration of a preliminary optimization are presented as first benchmarks. Excellent agreement is illustrated between the Eclipse TPS and AMMO for a 6 MV photon field. The results of a simple optimization show the functioning of the optimization framework, while significant research remains to characterize appropriate constraints.

  5. Delimitation of homogeneous regions in the UNIFESP/EPM healthcare center coverage area based on sociodemographic indicators

    Directory of Open Access Journals (Sweden)

    Karina Yuri Harada

    1999-01-01

    Full Text Available CONTEXT: The drawing up of adequate public health action planning to address the true needs of the population would increase the chances of effectiveness and decrease unnecessary expenses. OBJECTIVE: To identify homogeneous regions in the UNIFESP/EPM healthcare center (HCC) coverage area based on sociodemographic indicators and to relate them to causes of deaths in 1995. DESIGN: Secondary data analysis. SETTING: HCC coverage area; primary care. SAMPLE: Sociodemographic indicators were obtained from special tabulations of the Demographic Census of 1991. MAIN MEASURES: Proportion of children and elderly in the population; family providers' education level (maximum: >15 years; minimum: <1 year) and income (maximum: >20 minimum wages; minimum: <1 minimum wage); proportional mortality distribution. RESULTS: The maximum income permitted the construction of four homogeneous regions, according to income ranking. Although the proportion of children and of elderly did not vary significantly among the regions, minimum income and education showed a statistically significant (p<0.05) difference between the first region (least affluent) and the others. A clear trend of increasing maximum education was observed across the regions. Mortality also differed in the first region, with deaths generated by possibly preventable infections. CONCLUSION: The inequalities observed may contribute to primary health prevention.

  6. Delimitation of homogeneous regions in the UNIFESP/EPM healthcare center coverage area based on sociodemographic indicators.

    Science.gov (United States)

    Harada, K Y; Silva, J G; Schenkman, S; Hayama, E T; Santos, F R; Prado, M C; Pontes, R H

    1999-01-07

    The drawing up of adequate public health action planning to address the true needs of the population would increase the chances of effectiveness and decrease unnecessary expenses. To identify homogeneous regions in the UNIFESP/EPM healthcare center (HCC) coverage area based on sociodemographic indicators and to relate them to causes of deaths in 1995. Secondary data analysis. HCC coverage area; primary care. Sociodemographic indicators were obtained from special tabulations of the Demographic Census of 1991. Proportion of children and elderly in the population; family providers' education level (maximum: > 15 years; minimum: < 1 year) and income (maximum: > 20 minimum wages; minimum: < 1 minimum wage); proportional mortality distribution. The maximum income permitted the construction of four homogeneous regions, according to income ranking. Although the proportion of children and of elderly did not vary significantly among the regions, minimum income and education showed a statistically significant (p < 0.05) difference between the first region (least affluent) and the others. A clear trend of increasing maximum education was observed across the regions. Mortality also differed in the first region, with deaths generated by possibly preventable infections. The inequalities observed may contribute to primary health prevention.

  7. A second stage homogenization method

    International Nuclear Information System (INIS)

    Makai, M.

    1981-01-01

    A second homogenization is needed before the diffusion calculation of the core of large reactors. Such a second-stage homogenization is outlined here. Our starting point is the Floquet theorem, which states that the diffusion equation for a periodic core always has a particular solution of the form e^(jBx) u(x). It is pointed out that the perturbation series expansion of the function u can be derived by solving eigenvalue problems, and the eigenvalues serve to define homogenized cross sections. With the help of these eigenvalues a homogenized diffusion equation can be derived, the solution of which is cos Bx, the macroflux. It is shown that the flux can be expressed as a series in the buckling. The leading term in this series is the well-known Wigner-Seitz formula. Finally three examples are given: periodic absorption, a cell with an absorber pin in the cell centre, and a cell of three regions. (orig.)
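    A compact sketch of the Floquet ansatz and the homogenized relation it implies may help; the one-group diffusion notation below is a generic illustration, not the author's exact multigroup formulation.

```latex
% Floquet ansatz for a lattice of period a, with B the buckling:
\[
  \phi(x) = e^{jBx}\, u(x), \qquad u(x+a) = u(x),
\]
% and the homogenized one-group diffusion equation solved by the macroflux:
\[
  -D_{\mathrm{hom}}\,\frac{d^{2}\Phi}{dx^{2}} + \Sigma_{a,\mathrm{hom}}\,\Phi
  = \frac{1}{k}\,\nu\Sigma_{f,\mathrm{hom}}\,\Phi,
  \qquad \Phi(x) = \cos Bx
  \;\Rightarrow\;
  D_{\mathrm{hom}} B^{2} + \Sigma_{a,\mathrm{hom}}
  = \frac{\nu\Sigma_{f,\mathrm{hom}}}{k}.
\]
```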

  8. Official portrait of Astronaut Ronald E. McNair

    Science.gov (United States)

    1985-01-01

    Official portrait of Astronaut Ronald E. McNair. McNair is in the blue shuttle flight suit, standing in front of a table which holds a model of the Space Shuttle. An American flag is visible behind him.

  9. Sewage sludge solubilization by high-pressure homogenization.

    Science.gov (United States)

    Zhang, Yuxuan; Zhang, Panyue; Guo, Jianbin; Ma, Weifang; Fang, Wei; Ma, Boqiang; Xu, Xiangzhe

    2013-01-01

    The behavior of sludge solubilization using high-pressure homogenization (HPH) treatment was examined by investigating the sludge solid reduction and organics solubilization. The sludge volatile suspended solids (VSS) decreased from 10.58 to 6.67 g/L for the sludge sample with a total solids content (TS) of 1.49% after HPH treatment at a homogenization pressure of 80 MPa with four homogenization cycles; total suspended solids (TSS) correspondingly decreased from 14.26 to 9.91 g/L. About 86.15% of the TSS reduction was attributed to the VSS reduction. Increasing the homogenization pressure from 20 to 80 MPa or the homogenization cycle number from 1 to 4 favored sludge organics solubilization, and the protein and polysaccharide solubilization increased linearly with the soluble chemical oxygen demand (SCOD) solubilization. More proteins were solubilized than polysaccharides. The linear relationship between SCOD solubilization and VSS reduction showed no significant change under different homogenization pressures, homogenization cycles and sludge solid contents. An SCOD of 1.65 g/L was solubilized per 1.00 g/L of VSS reduction for the three experimental sludge samples with a TS of 1.00, 1.49 and 2.48% under all HPH operating conditions. The energy efficiency results showed that HPH treatment at a homogenization pressure of 30 MPa with a single homogenization cycle for the sludge sample with a TS of 2.48% was the most energy efficient.

  10. Development, validation, and factorial comparison of the McGill Self-Efficacy of Learners For Inquiry Engagement (McSELFIE) survey in natural science disciplines

    Science.gov (United States)

    Ibrahim, Ahmed; Aulls, Mark W.; Shore, Bruce M.

    2016-11-01

    Sociocognitive theory [Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice Hall; Bandura, A. (1989). Human agency in social cognitive theory. American Psychologist, 44, 1175-1184. doi:10.1037/0003-066x.44.9.1175; Bandura, A. (1991). Social cognitive theory of self-regulation. Organizational Behavior and Human Decision Processes, 50, 248-287. doi:10.1016/0749-5978(91)90022-L] accords high importance to the mechanisms of human agency and how they are exercised through self-efficacy. In this paper, we developed and validated the McGill Self-Efficacy of Learners For Inquiry Engagement (McSELFIE) instrument with undergraduate students in natural science disciplines. We defined inquiry engagement as carrying out the practices of science (POS), supported by students' personality characteristics (SPCs) and resulting in the achievement of inquiry-learning outcomes (ILOs). Based on these theoretical perspectives, the McSELFIE is a 60-item, learner-focused survey that addresses three components that are theoretically important for engaging in scientific inquiry: (a) SPCs, (b) ILOs, and (c) POS. Evidence for construct and content validity was obtained by using experts' judgments and confirmatory factor analysis with a sample of 110 undergraduate students enrolled in science disciplines. Internal consistency of the factors and instrument was also examined. The McSELFIE instrument is a reliable and valid instrument for measuring science undergraduate students' self-efficacy for inquiry engagement. Matched-pairs analyses were conducted among the instrument's factors. Students reported the highest self-efficacy for openness, applying knowledge, and carrying out investigations. Students reported the lowest self-efficacy for extraversion, understanding metacognitive knowledge, and planning investigations. Theoretical and practical implications are discussed.

  11. Incomplete McCune-Albright Syndrome: A Case Report

    Directory of Open Access Journals (Sweden)

    Nagehan Aslan

    2014-08-01

    Full Text Available Fibrous dysplasia of bone is a genetic, non-inheritable disease that can cause bone pain, bone deformities and fractures. It has a broad clinical spectrum, ranging from benign monostotic fibrous dysplasia to McCune-Albright syndrome. The rare McCune-Albright syndrome is characterized by precocious puberty, café au lait spots and fibrous dysplasia. Herein we present a case who was referred to hospital with pathological fractures and diagnosed with incomplete McCune-Albright syndrome because of the lack of endocrine hyperfunction, and who developed precocious puberty during the clinical course.

  12. Mutations in MC1R Gene Determine Black Coat Color Phenotype in Chinese Sheep

    Directory of Open Access Journals (Sweden)

    Guang-Li Yang

    2013-01-01

    Full Text Available The melanocortin receptor 1 (MC1R) plays a central role in the regulation of animal coat color formation. In this study, we sequenced the complete coding region and parts of the 5′- and 3′-untranslated regions of the MC1R gene in Chinese sheep with completely white (Large-tailed Han sheep), black (Minxian Black-fur sheep), and brown (Kazakh Fat-Rumped sheep) coat colors. The results showed five single nucleotide polymorphisms (SNPs): two non-synonymous mutations previously associated with coat color (c.218T>A, p.Met73Lys; c.361G>A, p.Asp121Asn) and three synonymous mutations (c.429C>T, p.Tyr143Tyr; c.600T>G, p.Leu200Leu; c.735C>T, p.Ile245Ile). All mutations were detected in Minxian Black-fur sheep. However, the two non-synonymous mutations were absent in the other studied breeds (Large-tailed Han, Small-tailed Han, Gansu Alpine Merino, and China Merino), all of which have white coats. A single haplotype, AATGT (haplotype 3), was uniquely associated with black coat color in the Minxian Black-fur breed (P=9.72E-72, chi-square test); the first and second A alleles of this haplotype correspond to positions 218 and 361, respectively. Our results suggest that mutations of the MC1R gene are associated with the black coat color phenotype in Chinese sheep.
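    To illustrate the association test reported above, here is a minimal sketch of a chi-square test on a haplotype-by-phenotype contingency table; the animal counts are hypothetical placeholders, not the study's data.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts cross-classifying haplotype 3 carriage by coat color.
table = [[180, 5],     # haplotype 3 carriers: black, white (hypothetical)
         [12, 210]]    # other haplotypes:     black, white (hypothetical)
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2e}, dof = {dof}")
```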

  13. McJobs and Pieces of Flair: Linking McDonaldization to Alienating Work

    Science.gov (United States)

    Treiber, Linda Ann

    2013-01-01

    This article offers strategies for teaching about rationality, bureaucracy, and social change using George Ritzer's "The McDonaldization of Society" and its ideas about efficiency, predictability, calculability, and control. Student learning is facilitated using a series of strategies: making the familiar strange, explaining…

  14. Imaging of the brain serotonin transporters (SERT) with {sup 18}F-labelled fluoromethyl-McN5652 and PET in humans

    Energy Technology Data Exchange (ETDEWEB)

    Hesse, Swen [University of Leipzig, Department of Nuclear Medicine, Leipzig (Germany); Leipzig University Medical Center, AdiposityDiseases, Leipzig (Germany); Brust, Peter [Helmholtz-Zentrum Dresden-Rossendorf, Institute of Radiopharmacy, Research Site Leipzig, Leipzig (Germany); Maeding, Peter; Zessin, Joerg; Fuechtner, Frank [Helmholtz-Zentrum Dresden-Rossendorf, Institute of Radiopharmacy, Dresden (Germany); Becker, Georg-Alexander; Patt, Marianne; Seese, Anita; Sorger, Dietlind; Meyer, Philipp M.; Habermann, Bernd; Luthardt, Julia; Bresch, Anke; Sabri, Osama [University of Leipzig, Department of Nuclear Medicine, Leipzig (Germany); Lobsien, Donald [University of Leipzig, Department of Neuroradiology, Leipzig (Germany); Laudi, Sven [University of Leipzig, Department of Anaesthesiology and Intensive Care, Leipzig (Germany); Steinbach, Joerg [Helmholtz-Zentrum Dresden-Rossendorf, Institute of Radiopharmacy, Dresden (Germany); Helmholtz-Zentrum Dresden-Rossendorf, Institute of Radiopharmacy, Research Site Leipzig, Leipzig (Germany)

    2012-06-15

    [{sup 11}C]DASB is currently the most frequently used highly selective radiotracer for visualization and quantification of central SERT. Its use, however, is hampered by the short half-life of {sup 11}C, the moderate cortical test-retest reliability, and the inability to quantify endogenous serotonin. Labelling with {sup 18}F allows in principle longer acquisition times for kinetic analysis in brain tissue and may provide higher sensitivity. The aim of our study was to use the new highly SERT-selective {sup 18}F-labelled fluoromethyl analogue of (+)-McN5652 ((+)-[{sup 18}F]FMe-McN5652) in humans for the first time and to evaluate its potential for SERT quantification. The PET data from five healthy volunteers (three men, two women, age 39 ± 10 years), coregistered with individual MRI scans, were semiquantitatively assessed by volume-of-interest analysis using the software package PMOD. Rate constants and total distribution volumes (V{sub T}) were calculated using a two-tissue compartment model and arterial input function measurements corrected with metabolite/plasma data. Standardized uptake region-to-cerebellum ratios as a measure of specific radiotracer accumulation were compared with those of a [{sup 11}C]DASB PET dataset from 21 healthy subjects (10 men, 11 women, age 38 ± 8 years). The two-tissue compartment model provided adequate fits to the data. Estimates of the total distribution volume (V{sub T}) demonstrated good identifiability based on the coefficients of variation (COV) for the volumes of interest in SERT-rich and cortical areas (COV V{sub T} <10%). Compared with [{sup 11}C]DASB PET, there was a tendency towards lower mean uptake values in (+)-[{sup 18}F]FMe-McN5652 PET; however, the standard deviation was also somewhat lower. Altogether, cerebral (+)-[{sup 18}F]FMe-McN5652 uptake corresponded well with the known SERT distribution in humans. The results showed that (+)-[{sup 18}F]FMe-McN5652 is also suitable for in vivo quantification of SERT with PET. Because of
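    As a concrete reference for the kinetic analysis, here is a minimal sketch of the standard two-tissue compartment model and the total distribution volume it yields. The rate constants and the toy arterial input function are hypothetical values for illustration, not fitted results from the study.

```python
import numpy as np
from scipy.integrate import odeint

tgrid = np.linspace(0.0, 90.0, 901)          # minutes
Cp = np.exp(-tgrid / 20.0)                   # toy arterial input function

# Hypothetical rate constants (1/min), for illustration only.
K1, k2, k3, k4 = 0.3, 0.15, 0.05, 0.03

def two_tissue(C, t):
    c1, c2 = C                               # free/nonspecific, specifically bound
    cp = np.interp(t, tgrid, Cp)             # plasma concentration at time t
    dc1 = K1 * cp - (k2 + k3) * c1 + k4 * c2
    dc2 = k3 * c1 - k4 * c2
    return [dc1, dc2]

Ct = odeint(two_tissue, [0.0, 0.0], tgrid).sum(axis=1)  # total tissue curve

VT = K1 / k2 * (1.0 + k3 / k4)               # total distribution volume
print(f"V_T = {VT:.2f}")
```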

  15. Degradation of corn stalk by the composite microbial system of MC1

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The composite microbial system MC1 was used to degrade corn stalk in order to determine the properties of the degraded products as well as the bacterial composition of MC1. Results indicated that the pH of the fermentation broth was typical of lignocellulose degradation by MC1, decreasing in the early phase and increasing in later stages of the degradation. The microbial biomass peaked on day 3 of degradation. MC1 efficiently degraded the corn stalk by nearly 70%, during which its cellulose content decreased by 71.2%, hemicellulose by 76.5% and lignin by 24.6%. The content of water-soluble carbohydrates (WSC) in the fermentation broth increased progressively during the first three days and decreased thereafter, suggesting an accumulation of WSC in the early phase of the degradation process. Total levels of various volatile products peaked on the third day of degradation, and 7 types of volatile products were detected in the fermentation broth. These were ethanol, acetic acid, 1,2-ethanediol, propanoic acid, butanoic acid, 3-methyl-butanoic acid and glycerine. Six major compounds were quantitatively analysed; their contents were ethanol (0.584 g/L), acetic acid (0.735 g/L), 1,2-ethanediol (0.772 g/L), propanoic acid (0.026 g/L), butanoic acid (0.018 g/L) and glycerine (4.203 g/L). Characterization of bacterial cells collected from the culture solution, based on 16S rDNA PCR-DGGE analysis of DNAs, showed that the composition of the bacterial community in MC1 coincided basically with observations from previous studies. This indicates that the structure of MC1 is very stable during degradation of different lignocellulose materials.

  16. Application of the MCNPX-McStas interface for shielding calculations and guide design at ESS

    DEFF Research Database (Denmark)

    Klinkby, Esben Bryndt; Bergbäck Knudsen, Erik; Willendrup, Peter Kjær

    2013-01-01

    Recently, an interface between the Monte Carlo code MCNPX and the neutron ray-tracing code McStas was developed [1]. Based on the expected neutronic performance and guide geometries relevant for the ESS, the combined MCNPX-McStas code is used to calculate dose rates along neutron beam guides. The generation and moderation of neutrons is simulated using a full-scale MCNPX model of the ESS target monolith. Upon entering the beam extraction region, the individual neutron states are handed to McStas via the MCNPX-McStas interface. McStas transports the neutrons through the beam guide and, by using newly developed event-logging capability, the neutron state parameters corresponding to un-reflected neutrons are recorded at each scattering. This information is handed back to MCNPX where it serves as neutron source input for a second MCNPX simulation. This simulation enables calculation of dose rates...

  17. M&C ML: A modeling language for monitoring and control systems

    Energy Technology Data Exchange (ETDEWEB)

    Patwari, Puneet, E-mail: patwari.puneet@tcs.com; Chaudhuri, Subhrojyoti Roy; Natarajan, Swaminathan; Muralikrishna, G

    2016-11-15

    Highlights: • It is challenging to maintain consistency in the current approach to M&C design. • The similarity across various projects makes it natural to propose a solution at the domain level. • The approach to creating a DSL for M&C involves viewing a system through the lenses of various domains. • M&CML provides a standard vocabulary and makes the entire process of M&C solution creation domain-aware. • M&CML provides a holistic view of the control architecture. • M&CML has support for inherent consistency checks, user assistance and third-party support. - Abstract: The use of a Systems Engineering (SE) language such as SysML [1,20] is common within the community of control system designers. However, the design handoff to the subsequent phases of control system development is carried out manually in most cases, without much tool support. The approach to agreeing on the control interface between components is a good example where engineers still rely on either manually created Interface Control Documents (ICD) or one-off tools implemented by individual projects. The Square Kilometer Array (SKA) [2] and the International Thermonuclear Experimental Reactor (ITER) [3] are two good examples of such large projects adopting these approaches. This results in non-uniformity in the overall system design, since individual groups invent their own vocabulary while using a language like SysML, which leads to inconsistencies across the design, interfaces and realized code. To mitigate this, we propose the development of a Monitoring and Control Modeling Language (M&CML), a domain-specific language (DSL) [4,22] for specifying M&C solutions. M&CML starts by defining a vocabulary that borrows concepts from standard practices in the control domain and incorporates a language which ensures uniformity and consistency across the M&C design, interfaces and implementation artifacts. In this paper we discuss this language with an analysis of its usage to point out its benefits.

  18. M&C ML: A modeling language for monitoring and control systems

    International Nuclear Information System (INIS)

    Patwari, Puneet; Chaudhuri, Subhrojyoti Roy; Natarajan, Swaminathan; Muralikrishna, G

    2016-01-01

    Highlights: • It is challenging to maintain consistency in the current approach to M&C design. • The similarity across various projects makes it natural to propose a solution at the domain level. • The approach to creating a DSL for M&C involves viewing a system through the lenses of various domains. • M&CML provides a standard vocabulary and makes the entire process of M&C solution creation domain-aware. • M&CML provides a holistic view of the control architecture. • M&CML has support for inherent consistency checks, user assistance and third-party support. - Abstract: The use of a Systems Engineering (SE) language such as SysML [1,20] is common within the community of control system designers. However, the design handoff to the subsequent phases of control system development is carried out manually in most cases, without much tool support. The approach to agreeing on the control interface between components is a good example where engineers still rely on either manually created Interface Control Documents (ICD) or one-off tools implemented by individual projects. The Square Kilometer Array (SKA) [2] and the International Thermonuclear Experimental Reactor (ITER) [3] are two good examples of such large projects adopting these approaches. This results in non-uniformity in the overall system design, since individual groups invent their own vocabulary while using a language like SysML, which leads to inconsistencies across the design, interfaces and realized code. To mitigate this, we propose the development of a Monitoring and Control Modeling Language (M&CML), a domain-specific language (DSL) [4,22] for specifying M&C solutions. M&CML starts by defining a vocabulary that borrows concepts from standard practices in the control domain and incorporates a language which ensures uniformity and consistency across the M&C design, interfaces and implementation artifacts. In this paper we discuss this language with an analysis of its usage to point out its benefits.

  19. Comparison of different homogenization approaches for elastic–viscoplastic materials

    International Nuclear Information System (INIS)

    Mercier, S; Molinari, A; Berbenni, S; Berveiller, M

    2012-01-01

    Homogenization of linear viscoelastic and non-linear viscoplastic composite materials is considered in this paper. First, we compare two homogenization schemes based on the Mori–Tanaka method, coupled either with the additive interaction (AI) law proposed by Molinari et al (1997 Mech. Mater. 26 43–62) or with a concentration law based on translated fields (TF) originally proposed for the self-consistent scheme by Paquin et al (1999 Arch. Appl. Mech. 69 14–35). These methods are also evaluated against (i) full-field calculations from the literature based on the finite element method and on the fast Fourier transform, (ii) available exact analytical solutions obtained in linear viscoelasticity and (iii) homogenization methods based on variational approaches. Developments of the AI model are obtained for linear and non-linear material responses, while results for the TF method are shown for the linear case. Various configurations are considered: spherical inclusions, aligned fibers, hard and soft inclusions, large material contrasts between phases, volume-preserving versus dilatant anelastic flow, and non-monotonic loading. The agreement between the AI and TF methods is excellent, and the correlation with full-field calculations is in general of quite good quality (with some exceptions for non-linear composites with a large volume fraction of very soft inclusions, for which a discrepancy of about 15% was found for the macroscopic stress). Description of the material behavior with internal variables can be accounted for with the AI and TF approaches, and therefore complex loadings can be easily handled, in contrast with most hereditary approaches. (paper)
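    For reference, the Mori–Tanaka concentration and effective stiffness relations that such schemes build on can be sketched as follows; this is generic two-phase elastic notation, not the authors' exact AI/TF formulation.

```latex
% Mori-Tanaka estimate for a two-phase linear composite:
% C_m, C_i = matrix/inclusion stiffness tensors, f = inclusion volume
% fraction, A^dil = dilute (Eshelby) strain-concentration tensor.
\[
  \mathbf{A}^{\mathrm{MT}}
  = \mathbf{A}^{\mathrm{dil}}
    \left[(1-f)\,\mathbf{I} + f\,\mathbf{A}^{\mathrm{dil}}\right]^{-1},
  \qquad
  \mathbf{C}^{\mathrm{eff}}
  = \mathbf{C}_{m} + f\,(\mathbf{C}_{i}-\mathbf{C}_{m})\,\mathbf{A}^{\mathrm{MT}}.
\]
```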

  20. A literature review on biotic homogenization

    OpenAIRE

    Guangmei Wang; Jingcheng Yang; Chuangdao Jiang; Hongtao Zhao; Zhidong Zhang

    2009-01-01

    Biotic homogenization is the process whereby the genetic, taxonomic and functional similarity of two or more biotas increases over time. As a new research agenda for conservation biogeography, biotic homogenization has become a rapidly emerging topic of interest in ecology and evolution over the past decade. However, research on this topic is rare in China. Herein, we introduce the development of the concept of biotic homogenization, and then discuss methods to quantify its three components (...

  1. Osteoarthritic cartilage is more homogeneous than healthy cartilage

    DEFF Research Database (Denmark)

    Qazi, Arish A; Dam, Erik B; Nielsen, Mads

    2007-01-01

    it evolves as a consequence of disease and thereby can be used as a progression biomarker. MATERIALS AND METHODS: A total of 283 right and left knees from 159 subjects aged 21 to 81 years were scanned using a Turbo 3D T1 sequence on a 0.18-T MRI Esaote scanner. The medial compartment of the tibial cartilage sheet was segmented using a fully automatic voxel classification scheme based on supervised learning. From the segmented cartilage sheet, homogeneity was quantified by measuring entropy from the distribution of signal intensities inside the compartment. Each knee was examined by radiography ... of the region was evaluated by testing for overfitting. Three different regularization techniques were evaluated for reducing overfitting errors. RESULTS: The P values for separating the different groups based on cartilage homogeneity were 2 x 10(-5) (KL 0 versus KL 1) and 1 x 10(-7) (KL 0 versus KL >0). Using...
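    The homogeneity measure described here is the entropy of the intensity distribution inside the segmented compartment; a minimal sketch follows, where the bin count is an illustrative choice rather than the study's setting.

```python
import numpy as np

def cartilage_homogeneity(intensities, bins=64):
    """Entropy of the voxel-intensity histogram; lower entropy means a more
    homogeneous (less varied) signal distribution."""
    hist, _ = np.histogram(intensities, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                        # ignore empty bins
    return -np.sum(p * np.log2(p))      # Shannon entropy in bits
```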

  2. The Travails of Criticality: Understanding Peter McLaren's Revolutionary Vocation. An Article Review of Peter McLaren, "Pedagogy of Insurrection" (New York: Peter Lang, 2015)

    Science.gov (United States)

    Baldacchino, John

    2017-01-01

    This is an article review of Peter McLaren's "Pedagogy of Insurrection" (New York: Peter Lang, 2015). While it seeks to position McLaren's work within the context of critical pedagogy, this paper also assesses McLaren from the wider discussion of Marxist-Hegelian discourse as it evolved within the Left. Engaging with McLaren critically,…

  3. Rapid simultaneous high-resolution mapping of myelin water fraction and relaxation times in human brain using BMC-mcDESPOT.

    Science.gov (United States)

    Bouhrara, Mustapha; Spencer, Richard G

    2017-02-15

    A number of central nervous system (CNS) diseases exhibit changes in myelin content and in the magnetic resonance longitudinal (T1) and transverse (T2) relaxation times, which therefore represent important biomarkers of CNS pathology. Among the methods applied for measurement of myelin water fraction (MWF) and relaxation times, the multicomponent driven equilibrium single pulse observation of T1 and T2 (mcDESPOT) approach is of particular interest. mcDESPOT permits whole-brain mapping of multicomponent T1 and T2, with data acquisition accomplished within a clinically realistic acquisition time. Unfortunately, previous studies have indicated the limited performance of mcDESPOT in the setting of the modest signal-to-noise range of high-resolution mapping, required for the depiction of small structures and to reduce partial volume effects. Recently, we showed that a new Bayesian Monte Carlo (BMC) analysis substantially improved determination of MWF from mcDESPOT imaging data. However, our previous study was limited in that it did not discuss determination of relaxation times. Here, we extend the BMC analysis to the simultaneous determination of whole-brain MWF and relaxation times using the two-component mcDESPOT signal model. Simulation analyses and in-vivo human brain studies indicate the overall greater performance of this approach compared to the stochastic region contraction (SRC) algorithm conventionally used to derive parameter estimates from mcDESPOT data. SRC estimates of the transverse relaxation time of the long-T2 fraction, T2,l, and the longitudinal relaxation time of the short-T1 fraction, T1,s, clustered towards the lower and upper parameter search space limits, respectively, indicating failure of the fitting procedure. We demonstrate that this effect is absent in the BMC analysis. Our results also showed improved parameter estimation for BMC as compared to SRC for high-resolution mapping. Overall we find that the combination of BMC analysis

  4. Assembly homogenization techniques for light water reactor analysis

    International Nuclear Information System (INIS)

    Smith, K.S.

    1986-01-01

    Recent progress in development and application of advanced assembly homogenization methods for light water reactor analysis is reviewed. Practical difficulties arising from conventional flux-weighting approximations are discussed and numerical examples given. The mathematical foundations for homogenization methods are outlined. Two methods, Equivalence Theory and Generalized Equivalence Theory which are theoretically capable of eliminating homogenization error are reviewed. Practical means of obtaining approximate homogenized parameters are presented and numerical examples are used to contrast the two methods. Applications of these techniques to PWR baffle/reflector homogenization and BWR bundle homogenization are discussed. Nodal solutions to realistic reactor problems are compared to fine-mesh PDQ calculations, and the accuracy of the advanced homogenization methods is established. Remaining problem areas are investigated, and directions for future research are suggested. (author)
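    As context for the homogenization error discussed in the review, the conventional flux-weighting prescription can be sketched as follows; the notation is generic and the remark on discontinuity factors paraphrases Generalized Equivalence Theory rather than reproducing the author's equations.

```latex
% Flux-weighted homogenization of a group-g cross section over an
% assembly volume V:
\[
  \hat{\Sigma}_{x,g}
  = \frac{\int_{V} \Sigma_{x,g}(\mathbf{r})\,\phi_{g}(\mathbf{r})\,d\mathbf{r}}
         {\int_{V} \phi_{g}(\mathbf{r})\,d\mathbf{r}}.
\]
% Generalized Equivalence Theory additionally introduces surface
% discontinuity factors so the homogenized nodal solution can reproduce
% the heterogeneous reaction rates and surface currents exactly.
```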

  5. Improving homogeneity by dynamic speed limit systems.

    NARCIS (Netherlands)

    Nes, N. van; Brandenberg, S.; Twisk, D.A.M.

    2010-01-01

    Homogeneity of driving speeds is an important variable in determining road safety; more homogeneous driving speeds increase road safety. This study investigates the effect of introducing dynamic speed limit systems on homogeneity of driving speeds. A total of 46 subjects twice drove a route along 12

  6. Imaging of the central skull base.

    Science.gov (United States)

    Borges, Alexandra

    2009-11-01

    The central skull base (CSB) constitutes a frontier between the extracranial head and neck and the middle cranial fossa. The anatomy of this region is complex, containing most of the bony foramina and canals of the skull base traversed by several neurovascular structures that can act as routes of spread for pathologic processes. Lesions affecting the CSB can be intrinsic to its bony-cartilaginous components; can arise from above, within the intracranial compartment; or can arise from below, within the extracranial head and neck. Cross-sectional imaging is indispensable in the diagnosis, treatment planning, and follow-up of patients with CSB lesions. This review focuses on a systematic approach to this region based on an anatomic division that takes into account the major tissue constituents of the CSB.

  7. Mechanized syringe homogenization of human and animal tissues.

    Science.gov (United States)

    Kurien, Biji T; Porter, Andrew C; Patel, Nisha C; Kurono, Sadamu; Matsumoto, Hiroyuki; Scofield, R Hal

    2004-06-01

    Tissue homogenization is a prerequisite to any fractionation schedule. A plethora of hands-on methods are available to homogenize tissues. Here we report a mechanized method for homogenizing animal and human tissues rapidly and easily. The Bio-Mixer 1200 (manufactured by Innovative Products, Inc., Oklahoma City, OK) utilizes the back-and-forth movement of two motor-driven disposable syringes, connected to each other through a three-way stopcock, to homogenize animal or human tissue. Using this method, we were able to homogenize human or mouse tissues (brain, liver, heart, and salivary glands) in 5 min. From sodium dodecyl sulfate-polyacrylamide gel electrophoresis analysis and a matrix-assisted laser desorption/ionization time-of-flight mass spectrometric enzyme assay for prolidase, we have found that the homogenates obtained were as good as or even better than those obtained using a manual glass-on-Teflon (DuPont, Wilmington, DE) homogenization protocol (all-glass tube and Teflon pestle). Use of the Bio-Mixer 1200 to homogenize animal or human tissue precludes the need to stay in the cold room, as is the case with the other hands-on homogenization methods available, in addition to freeing up time for other experiments.

  8. Homogeneity and thermodynamic identities in geometrothermodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Quevedo, Hernando [Universidad Nacional Autonoma de Mexico, Instituto de Ciencias Nucleares (Mexico); Universita di Roma ' ' La Sapienza' ' , Dipartimento di Fisica, Rome (Italy); ICRANet, Rome (Italy); Quevedo, Maria N. [Universidad Militar Nueva Granada, Departamento de Matematicas, Facultad de Ciencias Basicas, Bogota (Colombia); Sanchez, Alberto [CIIDET, Departamento de Posgrado, Queretaro (Mexico)

    2017-03-15

    We propose a classification of thermodynamic systems in terms of the homogeneity properties of their fundamental equations. Ordinary systems correspond to homogeneous functions and non-ordinary systems are given by generalized homogeneous functions. This affects the explicit form of the Gibbs-Duhem relation and Euler's identity. We show that these generalized relations can be implemented in the formalism of black hole geometrothermodynamics in order to completely fix the arbitrariness present in Legendre invariant metrics. (orig.)
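    The generalized homogeneity underlying this classification can be stated compactly; the identity below is the standard Euler relation for generalized homogeneous functions, written in generic notation as a sketch.

```latex
% If the fundamental equation satisfies
% U(lambda^{a_1} x_1, ..., lambda^{a_n} x_n) = lambda^{b} U(x_1, ..., x_n),
% then differentiating with respect to lambda at lambda = 1 gives
\[
  \sum_{i=1}^{n} a_{i}\, x_{i}\, \frac{\partial U}{\partial x_{i}} \;=\; b\, U ,
\]
% which reduces to the ordinary Euler identity (and the usual
% Gibbs-Duhem relation) when a_1 = ... = a_n = b = 1.
```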

  9. A McCollough Effect Generated at Binocular Site

    Directory of Open Access Journals (Sweden)

    Qiujie Weng

    2011-05-01

    Full Text Available Following exposure to alternating gratings with unique combinations of orientation and color, an achromatic grating appears tinted, with its perceived color contingent on the grating's orientation. This orientation-contingent color aftereffect is called the McCollough effect. The lack of interocular transfer of the McCollough effect suggests that it is primarily established in monocular channels. Here we explored the possibility that the McCollough effect can be induced at a binocular site. During adaptation, a red vertical grating and a green horizontal grating were dichoptically presented to the two eyes. In the ‘binocular rivalry’ condition, these two gratings were presented constantly throughout the adaptation period and subjects experienced rivalry between the two gratings. In the ‘physical alternation’ condition, the two dichoptic gratings physically alternated during adaptation, perceptually similar to binocular rivalry. Interestingly, following dichoptic adaptation in either the rivalry condition or the physical alternation condition, a binocularly viewed achromatic test grating appeared colored depending on its orientation: a vertical grating appeared greenish and a horizontal grating pinkish. In other words, we observed a McCollough effect following dichoptic adaptation, which can only be explained by a binocular site of orientation-contingent color adaptation.

  10. [Revision of McDonald's new diagnostic criteria for multiple sclerosis].

    Science.gov (United States)

    Wiendl, H; Kieseier, B C; Gold, R; Hohlfeld, R; Bendszus, M; Hartung, H-P

    2006-10-01

    In 2001, an international panel suggested new diagnostic criteria for multiple sclerosis (MS). These criteria integrate clinical, imaging (MRI), and paraclinical results in order to facilitate diagnosis. Since then, these so-called McDonald criteria have been broadly accepted and widely propagated. In the meantime a number of publications have dealt with the sensitivity and specificity for MS diagnosis and with implementing these new criteria in clinical practice. Based on these empirical values and newer data on MS, an international expert group recently proposed a revision of the criteria. Substantial changes affect (1) MRI criteria for the dissemination of lesions over time, (2) the role of spinal cord lesions in the MRI and (3) diagnosis of primary progressive MS. In this article we present recent experiences with the McDonald and revised criteria.

  11. Homogenizing Advanced Alloys: Thermodynamic and Kinetic Simulations Followed by Experimental Results

    Science.gov (United States)

    Jablonski, Paul D.; Hawk, Jeffrey A.

    2017-01-01

    Segregation of solute elements occurs in nearly all metal alloys during solidification. The resultant elemental partitioning can severely degrade as-cast material properties and lead to difficulties during post-processing (e.g., hot shortness and incipient melting). Many cast articles are therefore subjected to a homogenization heat treatment in order to minimize segregation and improve their performance. Traditionally, homogenization heat treatments are based upon past practice or time-consuming trial-and-error experiments. Through the use of thermodynamic and kinetic modeling software, NETL has designed a systematic method to optimize homogenization heat treatments. The method allows engineers and researchers to homogenize casting chemistries to levels appropriate for a given application. It also allows heat treatment schedules to be adjusted to fit limitations on in-house equipment (capability, reliability, etc.) while maintaining clear numeric targets for segregation reduction. In this approach, the Scheil module within Thermo-Calc is used to predict the as-cast segregation present within an alloy, and a diffusion-controlled transformations module is then used to model homogenization kinetics as a function of time and temperature. Examples of computationally designed heat treatments and verification of their effects on segregation and properties of real castings are presented.
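    The Scheil module referenced above implements the classical Scheil-Gulliver solidification assumption; as a reference, the standard relation can be sketched as follows, in generic notation rather than NETL's specific model setup.

```latex
% Scheil-Gulliver solute profile in the solid during solidification:
% k = equilibrium partition coefficient, f_s = solid fraction,
% C_0 = nominal alloy content, C_s = solid composition at the interface.
\[
  C_{s} \;=\; k\,C_{0}\,(1 - f_{s})^{\,k-1},
\]
% assuming no diffusion in the solid and complete mixing in the liquid;
% homogenization kinetics are then modeled as diffusion at the annealing
% temperature until the residual segregation falls below a target level.
```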

  12. SU-F-T-193: Evaluation of a GPU-Based Fast Monte Carlo Code for Proton Therapy Biological Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Taleei, R; Qin, N; Jiang, S [UT Southwestern Medical Center, Dallas, TX (United States); Peeler, C [UT MD Anderson Cancer Center, Houston, TX (United States); Jia, X [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2016-06-15

    Purpose: Biological treatment plan optimization is of great interest for proton therapy. It requires extensive Monte Carlo (MC) simulations to compute physical dose and biological quantities. Recently, a gPMC package was developed for rapid MC dose calculations on a GPU platform. This work investigated its suitability for proton therapy biological optimization in terms of accuracy and efficiency. Methods: We performed simulations of a proton pencil beam with energies of 75, 150 and 225 MeV in a homogeneous water phantom using gPMC and FLUKA. Physical dose and energy spectra for each ion type on the central beam axis were scored. Relative Biological Effectiveness (RBE) was calculated using the repair-misrepair-fixation model. Microdosimetry calculations were performed using Monte Carlo Damage Simulation (MCDS). Results: Ranges computed by the two codes agreed within 1 mm. The physical dose difference was less than 2.5% at the Bragg peak. RBE-weighted dose agreed within 5% at the Bragg peak. Differences in microdosimetric quantities such as dose-average lineal energy transfer and specific energy were <10%. The simulation time per source particle with FLUKA was 0.0018 sec, while gPMC was ∼600 times faster. Conclusion: Physical dose computed by FLUKA and gPMC were in good agreement. The RBE differences along the central axis were small, and the RBE-weighted dose difference was found to be acceptable. The combined accuracy and efficiency makes gPMC suitable for proton therapy biological optimization.

  13. Corrections to O(α^7(ln α)mc^2) fine-structure splittings and O(α^6(ln α)mc^2) energy levels in helium

    International Nuclear Information System (INIS)

    Zhang, T.

    1996-01-01

    Fully relativistic formulas for the energy-level shifts arising from no-pair exchange diagrams of two transverse photons plus an arbitrary number of Coulomb photons are derived in closed form within the external-potential Bethe-Salpeter formalism. O(α^7(ln α)mc^2) corrections to the fine-structure splittings of helium are obtained and expressed in terms of expectation values of nonrelativistic operators. O(α^7 mc^2) operators from exchange diagrams are found in nonrelativistic approximation. O(α^6 m^2c^2/M) nucleus-electron operators contributing to the fine-structure splittings are derived. Nonrelativistic operators of O(α^6 mc^2) corrections to the triplet levels of helium are presented. Nonrelativistic operators of O(α^6(ln α)mc^2) corrections to the helium singlet levels and to positronium S levels are derived. O(α^6 m^2c^2/M) hydrogen and O(α^6 mc^2) positronium P levels, and O(α^6(ln α)mc^2) corrections of first order to positronium S levels, are calculated using the derived operators for helium, in agreement with those obtained previously by others, except for one term in corrections to positronium P levels. In addition, the O(α^6 mc^2) Dirac energies for hydrogenic non-S levels are exactly reproduced in a perturbative calculation. copyright 1996 The American Physical Society

  14. Application of stochastic approach based on Monte Carlo (MC) simulation for life cycle inventory (LCI) of the rare earth elements (REEs) in beneficiation rare earth waste from the gold processing: case study

    Science.gov (United States)

    Bieda, Bogusław; Grzesik, Katarzyna

    2017-11-01

    The study proposes a stochastic approach based on Monte Carlo (MC) simulation for the life cycle assessment (LCA) method, limited to a life cycle inventory (LCI) study, for rare earth elements (REEs) recovery from secondary materials, applied to the New Krankberg Mine in Sweden. The MC method is recognized as an important tool in science and can be considered the most effective quantification approach for uncertainties. The use of a stochastic approach helps to characterize uncertainties better than a deterministic method. Uncertainty of data can be expressed through a definition of the probability distribution of that data (e.g. through standard deviation or variance). The data used in this study are obtained from: (i) site-specific measured or calculated data, (ii) values based on literature, (iii) the ecoinvent process "rare earth concentrate, 70% REO, from bastnäsite, at beneficiation". Environmental emissions (e.g. particulates, uranium-238, thorium-232), energy and REEs (La, Ce, Nd, Pr, Sm, Dy, Eu, Tb, Y, Sc, Yb, Lu, Tm, Gd) have been inventoried. The study is based on a reference case for the year 2016. The combination of MC analysis with sensitivity analysis is the best solution for quantifying the uncertainty in the LCI/LCA. The reliability of LCA results may be uncertain to a certain degree, but this uncertainty can be characterized with the help of the MC method.
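    To illustrate the stochastic approach, here is a minimal sketch of MC uncertainty propagation for a single LCI entry; the quantities, lognormal distributions and parameter values are hypothetical assumptions, not the study's inventory data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical LCI entry: emission = activity * emission_factor, with
# lognormal uncertainty on both inputs (distribution choices are assumptions).
N = 100_000
activity = rng.lognormal(mean=np.log(1000.0), sigma=0.10, size=N)  # kg ore processed
factor = rng.lognormal(mean=np.log(0.002), sigma=0.30, size=N)     # kg dust / kg ore

emission = activity * factor
print(f"mean = {emission.mean():.3f} kg, "
      f"2.5-97.5% interval = [{np.percentile(emission, 2.5):.3f}, "
      f"{np.percentile(emission, 97.5):.3f}] kg")
```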

  15. Homogeneity of Prototypical Attributes in Soccer Teams

    Directory of Open Access Journals (Sweden)

    Christian Zepp

    2015-09-01

    Full Text Available Research indicates that the homogeneous perception of prototypical attributes influences several intragroup processes. The aim of the present study was to describe the homogeneous perception of the prototype and to identify specific prototypical subcategories, which are perceived as homogeneous within sport teams. The sample consists of N = 20 soccer teams with a total of N = 278 athletes (age M = 23.5 years, SD = 5.0 years. The results reveal that subcategories describing the cohesiveness of the team and motivational attributes are mentioned homogeneously within sport teams. In addition, gender, identification, team size, and the championship ranking significantly correlate with the homogeneous perception of prototypical attributes. The results are discussed on the basis of theoretical and practical implications.

  16. Ecological effects of contaminants in McCoy Branch, 1989-1990

    Energy Technology Data Exchange (ETDEWEB)

    Ryon, M.G. (ed.)

    1992-01-01

    The 1984 Hazardous and Solid Waste Amendments to the Resource Conservation and Recovery Act (RCRA) required assessment of all current and former solid waste management units. Such an RCRA Facility Investigation (RFI) was required of the Y-12 Plant for its Filled Coal Ash Pond on McCoy Branch. Because the disposal of coal ash in the ash pond, McCoy Branch, and Rogers Quarry was not consistent with the Tennessee Water Quality Act, several remediation steps were implemented or planned for McCoy Branch to address disposal problems. The McCoy Branch RFI plan included provisions for biological monitoring of the McCoy Branch watershed. The objectives of the biological monitoring were to: (1) document changes in biological quality of McCoy Branch after completion of a pipeline and after termination of all discharges to Rogers Quarry, (2) provide guidance on the need for additional remediation, and (3) evaluate the effectiveness of implemented remedial actions. The data from the biological monitoring program will also determine whether the classified uses of McCoy Branch, as identified by the State of Tennessee, are being protected and maintained. This report discusses results from toxicity monitoring with snails, a fish community assessment, and a benthic macroinvertebrate community assessment.

  17. Ground based observations of Pc3-Pc5 geomagnetic pulsation power at Antarctic McMurdo station

    Directory of Open Access Journals (Sweden)

    C. G. Maclennan

    1998-06-01

    Full Text Available The two horizontal geomagnetic components H and D, measured by a fluxgate magnetometer at Antarctic McMurdo station (corrected geomagnetic coordinates 80.0° S, 327.5° E), are analyzed for the period May-June 1994; the spectral powers are calculated and integrated over three frequency intervals corresponding to the nominal Pc3, Pc4 and Pc5 ranges. The time dependence of those integrated powers and their correlations with northern auroral indices and solar wind speed are considered. The observations are compared with previous results reported from Terra Nova Bay station (located near McMurdo at the same corrected geomagnetic latitude) during Antarctic summer intervals. The differences found between the two stations are discussed in terms of the seasonal dependence of geomagnetic field line configurations in the near-cusp region.

  18. Discovery of Mixed Pharmacology Melanocortin-3 Agonists and Melanocortin-4 Receptor Tetrapeptide Antagonist Compounds (TACOs) Based on the Sequence Ac-Xaa1-Arg-(pI)DPhe-Xaa4-NH2.

    Science.gov (United States)

    Doering, Skye R; Freeman, Katie T; Schnell, Sathya M; Haslach, Erica M; Dirain, Marvin; Debevec, Ginamarie; Geer, Phaedra; Santos, Radleigh G; Giulianotti, Marc A; Pinilla, Clemencia; Appel, Jon R; Speth, Robert C; Houghten, Richard A; Haskell-Luevano, Carrie

    2017-05-25

    The centrally expressed melanocortin-3 and -4 receptors (MC3R/MC4R) have been studied as possible targets for weight management therapies, with a preponderance of studies focusing on the MC4R. Herein, a novel tetrapeptide scaffold [Ac-Xaa1-Arg-(pI)DPhe-Xaa4-NH2] is reported. The scaffold was derived from results obtained from an MC3R mixture-based positional scanning campaign. From these results, a set of 48 tetrapeptides were designed and pharmacologically characterized at the mouse melanocortin-1, -3, -4, and -5 receptors. This resulted in the serendipitous discovery of nine compounds that were MC3R agonists (EC50 DPhe-Tic-NH2], 1 [Ac-His-Arg-(pI)DPhe-Tic-NH2], and 41 [Ac-Arg-Arg-(pI)DPhe-DNal(2')-NH2] were more potent (EC50 DPhe-Arg-Trp-NH2. This template contains a sequentially reversed "Arg-(pI)DPhe" motif with respect to the classical "Phe-Arg" melanocortin signaling motif, which results in pharmacology that is first-in-class for the central melanocortin receptors.

  19. Anatomical characterization of central, apical and minimal corneal thickness

    Directory of Open Access Journals (Sweden)

    Federico Saenz-Frances

    2014-08-01

    Full Text Available AIM: To anatomically locate the points of minimum corneal thickness and central corneal thickness (pupil center) in relation to the corneal apex. METHODS: Observational, cross-sectional study of 299 healthy volunteers. Thickness at the corneal apex (AT), minimum corneal thickness (MT) and corneal thickness at the pupil center (PT) were determined using the Pentacam. Distances from the corneal apex to MT (MD) and PT (PD) were calculated, and their quadrant positions (taking the corneal apex as the reference) determined: point of minimum thickness (MC) and point of central thickness (PC), depending on the quadrant position. Two multivariate linear regression models were constructed to examine the influence of age, gender, power of the flattest and steepest corneal axes, position of the flattest axis, corneal volume (determined using the Pentacam) and PT on MD and PD. The effects of these variables on MC and PC were also determined in two multinomial regression models. RESULTS: MT was located at a mean distance of 0.909 mm from the apex (79.4% in the inferior-temporal quadrant). PT was located at a mean distance of 0.156 mm from the apex. The linear regression model for MD indicated it was significantly influenced by corneal volume (B=-0.024; 95% CI: -0.043 to -0.004). No significant relations were identified in the linear regression model for PD or in the multinomial logistic regressions for MC and PC. CONCLUSION: MT was typically located in the inferior-temporal quadrant of the cornea, and its distance to the corneal apex tended to decrease with increasing corneal volume.

  20. Homogenization theory in reactor lattices

    International Nuclear Information System (INIS)

    Benoist, P.

    1986-02-01

    The purpose of the theory of homogenization of reactor lattices is to determine, by means of transport theory, the constants of a homogeneous medium equivalent to a given lattice, which allows the reactor to be treated as a whole by diffusion theory. In this note, the problem is presented with emphasis on simplicity, as far as possible [fr

  1. McArdle disease: a case report and review

    Directory of Open Access Journals (Sweden)

    Leite A

    2012-01-01

    Full Text Available Alberto Leite, Narciso Oliveira, Manuela Rocha, Internal Medicine Department, Hospital de Braga, Portugal. Abstract: McArdle disease (glycogen storage disease type V) is a pure myopathy caused by an inherited deficit of myophosphorylase. The disease exhibits clinical heterogeneity, but patients typically experience exercise intolerance, acute crises of early fatigue, and contractures, sometimes with rhabdomyolysis and myoglobinuria, triggered by static muscle contractions or dynamic exercise. We present the case of a 54-year-old man with a lifelong history of fatigability, worsening on exertion. Laboratory evaluation revealed significant elevations in levels of creatine kinase (7924 U/L), lactate dehydrogenase (624 U/L), and myoglobin (671 ng/mL). A muscle biopsy confirmed the presence of McArdle disease. This case report illustrates how, due to embarrassment, the patient hid his symptoms for many years and was eventually extremely relieved and "liberated" once McArdle disease was diagnosed 40 years later. Keywords: McArdle disease, glycogen storage disease, myophosphorylase

  2. Study of the characteristics of forced homogeneous turbulence using band-pass Fourier filtering

    Energy Technology Data Exchange (ETDEWEB)

    Kareem, Waleed Abdel [Suez Canal University, Suez (Egypt)

    2012-03-15

    Simulations of forced homogeneous isotropic turbulence with resolutions of 128{sup 3} and 256{sup 3} using the lattice Boltzmann method are carried out. The multi-scale vortical structures are identified using band-pass Fourier cutoff filtering. Three fields are extracted from each simulation and their characteristics are studied. The vortical structures are visualized using the Q-identification method. A new lattice segmentation scheme to identify the central axes of the vortical structures is introduced. The central points of each vortex are identified and connected using the direction-cosines technique. Results show that the Q-spectrum of the fine-scale field survives at both low and high wave-numbers, whereas the large- and intermediate-scale Q-spectra survive only up to wave-numbers less than or equal to twice the velocity cutoff wave-numbers used. It is found that the extracted central axes clearly resemble the corresponding vortical structures at each scale. Using the central-axes scheme, the radii and lengths of the vortical structures at each scale are determined and compared. It is also found that the radii of the identified vortical structures at each scale in both simulations are of the order of several times the Kolmogorov microscale.
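    For intuition, here is a minimal sketch of band-pass Fourier cutoff filtering of a periodic 3D field into scale bands; the cutoff wavenumbers and the random test field are illustrative assumptions, not the paper's simulation data.

```python
import numpy as np

def band_pass(u, k_low, k_high):
    """Keep only Fourier modes with k_low <= |k| < k_high (periodic 3D field)."""
    n = u.shape[0]
    uhat = np.fft.fftn(u)
    k = np.fft.fftfreq(n, d=1.0 / n)                 # integer wavenumbers
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    kmag = np.sqrt(kx**2 + ky**2 + kz**2)
    mask = (kmag >= k_low) & (kmag < k_high)
    return np.real(np.fft.ifftn(uhat * mask))

# Example: split one velocity component of a 128^3 field into large,
# intermediate and fine-scale fields (cutoffs are illustrative choices).
u = np.random.rand(128, 128, 128)
large = band_pass(u, 0, 4)
intermediate = band_pass(u, 4, 16)
fine = band_pass(u, 16, 64)
```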

  3. Personal Background Interview of Jim McBarron

    Science.gov (United States)

    McBarron, Jim; Wright, Rebecca

    2012-01-01

    Jim McBarron exhibits a wealth of knowledge gathered from more than 40 years of experience with NASA, EVA, and spacesuits. His biography, the progression of his work at NASA, his impact on EVA and the U.S. spacesuit, and his career accomplishments are of interest to many. Wright, from the JSC History Office, conducted a personal background interview with McBarron. This interview highlighted the influences and decision-making methods that shaped McBarron's technical and management contributions to the space program. Attendees gained insight into the external and internal NASA influences on career progression within the EVA and spacesuit fields, and the types of accomplishments and technical advances that committed individuals can make. He concluded the presentation with a question and answer period that included a brief discussion of close calls and Russian spacesuits.

  4. McDonald’s as a Cultural Brand in the Landscape of Attitudes of Polish Customers

    Directory of Open Access Journals (Sweden)

    Marcin Komańda

    2016-01-01

    Full Text Available Purpose of the article: To analyse the attitudes of Polish customers towards McDonald’s, based on the identification of opposing social attitudes towards globalisation processes and the perception of cultural brands. Methodology/methods: A qualitative analysis of the record of an Internet users’ discussion was conducted; the record of the discussion is regarded as an expression of opinion by an incidental group of respondents. The programs weftQDA 1.0.1 and QSR NVivo 10 were used for the research. Scientific aim: To utilize a postmodern interpretation of the socio-cultural context of running a business for the purposes of strategic management. Findings: The main differences between the attitudes towards McDonald’s related to two problems. Firstly, the discussion concerned what McDonald’s really is (how its service should be classified). Secondly, the thread of the discourse concerned the quality of McDonald’s offer. Further discussion involved the impact of McDonald’s on domestic business, and on the lifestyle and dining habits of contemporary Poles. Conclusions: The landscape of attitudes of Polish customers towards McDonald’s is a source of uncertainty for strategic management within the company. There is a need to pay attention to the national cultural features of Poles and the different attitudes of contemporary society expressed as a postmodern response to globalisation. Each group of problems mentioned may become an opportunity or a threat for McDonald’s business activity in Poland.

  5. Enhancement of anaerobic sludge digestion by high-pressure homogenization.

    Science.gov (United States)

    Zhang, Sheng; Zhang, Panyue; Zhang, Guangming; Fan, Jie; Zhang, Yuxuan

    2012-08-01

    To improve anaerobic sludge digestion efficiency, the effects of high-pressure homogenization (HPH) conditions on anaerobic sludge digestion were investigated. Volatile solids (VS) and total chemical oxygen demand (TCOD) were significantly removed during anaerobic digestion, and VS and TCOD removal increased with increasing homogenization pressure and homogenization cycle number; correspondingly, the cumulative biogas production also increased with increasing homogenization pressure and homogenization cycle number. The optimal homogenization pressure was 50 MPa for one homogenization cycle and 40 MPa for two homogenization cycles. The soluble COD (SCOD) of the sludge supernatant increased significantly with increasing homogenization pressure and homogenization cycle number due to sludge disintegration. The relationship between biogas production and sludge disintegration showed that the cumulative biogas and methane production were mainly enhanced by the sludge disintegration, which accelerated the anaerobic digestion process and improved the methane content of the biogas. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Layered Fiberconcrete with Non-Homogeneous Fibers Distribution

    OpenAIRE

    Lūsis, V; Krasņikovs, A

    2013-01-01

    The aim of the present research is to create a fiberconcrete construction with a non-homogeneous distribution of fibers. Traditionally, fibers are homogeneously dispersed in concrete. At the same time, in many situations fiberconcretes with homogeneously dispersed fibers are not optimal (the majority of the added fibers do not participate in the load-bearing process).

  7. Non-Almost Periodicity of Parallel Transports for Homogeneous Connections

    International Nuclear Information System (INIS)

    Brunnemann, Johannes; Fleischhack, Christian

    2012-01-01

    Let A be the affine space of all connections in an SU(2) principal fibre bundle over ℝ³. The set of homogeneous isotropic connections forms a line l in A. We prove that the parallel transports for general, non-straight paths in the base manifold do not depend almost periodically on l. Consequently, the embedding l ↪ A does not continuously extend to an embedding l̄ ↪ Ā of the respective compactifications. Here, the Bohr compactification l̄ corresponds to the configuration space of homogeneous isotropic loop quantum cosmology and Ā to that of loop quantum gravity. Analogous results are given for the anisotropic case.

  8. Neural plasticity in amplitude of low frequency fluctuation, cortical hub construction, regional homogeneity resulting from working memory training.

    Science.gov (United States)

    Takeuchi, Hikaru; Taki, Yasuyuki; Nouchi, Rui; Sekiguchi, Atsushi; Kotozaki, Yuka; Nakagawa, Seishu; Makoto Miyauchi, Carlos; Sassa, Yuko; Kawashima, Ryuta

    2017-05-03

    Working memory training (WMT) induces changes in cognitive function and various neurological systems. Here, we investigated changes in recently developed resting-state functional magnetic resonance imaging measures of global information processing [degree centrality (DC), the degree of cortical hubs, which may play a central role in information integration in the brain], the magnitude of intrinsic brain activity [fractional amplitude of low-frequency fluctuation (fALFF)], and local connectivity (regional homogeneity) in young adults, who either underwent WMT or received no intervention for 4 weeks. Compared with no intervention, WMT increased DC in the anatomical cluster extending from the anterior cingulate cortex (ACC) to the medial prefrontal cortex (mPFC). Furthermore, WMT increased fALFF in the anatomical cluster including the right dorsolateral prefrontal cortex (DLPFC), frontopolar area and mPFC. WMT increased regional homogeneity in the anatomical cluster that spread from the precuneus to the posterior cingulate cortex and posterior parietal cortex. These results suggest WMT-induced plasticity in spontaneous brain activity and in global and local information processing in areas of the major networks of the brain during rest.

  9. Influence of dose distribution homogeneity on the tumor control probability in heavy-ion radiotherapy

    International Nuclear Information System (INIS)

    Wen Xiaoqiong; Li Qiang; Zhou Guangming; Li Wenjian; Wei Zengquan

    2001-01-01

    In order to estimate the influence of a non-uniform dose distribution on clinical treatment results, the influence of dose distribution homogeneity on the tumor control probability was investigated. Based on the formula deduced previously for the survival fraction of cells irradiated by a non-uniform heavy-ion irradiation field and the theory of tumor control probability, the tumor control probability was calculated for a tumor model exposed to different dose distribution homogeneities. The results show that the tumor control probability for the same total dose decreases if the dose distribution homogeneity gets worse. In clinical treatment, the dose distribution homogeneity should be better than 95%
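
    The direction of this effect can be reproduced with a generic Poisson tumor control probability model (a textbook form, not the authors' heavy-ion survival-fraction formula); the radiosensitivity, clonogen density and dose levels below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def tcp(dose, clonogens, alpha=0.3):
    """Poisson TCP with a simple exponential cell-survival model:
    TCP = exp(-sum_i N_i * SF(D_i)), with SF(D) = exp(-alpha * D)."""
    survivors = clonogens * np.exp(-alpha * dose)   # expected survivors per voxel
    return np.exp(-survivors.sum())                 # P(no clonogen survives)

n_vox = 1000
clonogens = np.full(n_vox, 1e4)
uniform = np.full(n_vox, 54.0)                # 54 Gy everywhere
spread  = rng.normal(54.0, 2.7, n_vox)        # same mean dose, 5% inhomogeneity
print(tcp(uniform, clonogens))                # ~0.40
print(tcp(spread, clonogens))                 # lower: inhomogeneity costs TCP
```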

  10. McStas event logger

    DEFF Research Database (Denmark)

    Bergbäck Knudsen, Erik; Willendrup, Peter Kjær; Klinkby, Esben Bryndt

    2014-01-01

    Functionality is added to the McStas neutron ray-tracing code, which allows individual neutron states before and after a scattering to be temporarily stored, and analysed. This logging mechanism has multiple uses, including studies of longitudinal intensity loss in neutron guides and guide coatin...

  11. Computational Homogenization of Mechanical Properties for Laminate Composites Reinforced with Thin Film Made of Carbon Nanotubes

    Science.gov (United States)

    El Moumen, A.; Tarfaoui, M.; Lafdi, K.

    2018-06-01

    Elastic properties of laminate composites based on Carbon Nanotubes (CNTs), used in military applications, were estimated using homogenization techniques and compared to experimental data. The composite consists of three phases: T300 6k carbon-fiber fabric with a 5HS (satin) weave, a baseline pure epoxy matrix, and CNTs added at 0.5%, 1%, 2% and 4%. A two-step homogenization method based on an RVE model was employed. The objective of this paper is to determine the elastic properties of the structure starting from knowledge of those of its constituents (CNTs, epoxy and carbon-fiber fabric). It is assumed that the composites have a geometric periodicity, so the homogenization model can be represented by a representative volume element (RVE). For the multi-scale analysis, finite element modeling of a unit cell based on the two-step homogenization method is used. The first step gives the properties of the thin film made of epoxy and CNTs, and the second is used for homogenization of the laminate composite. The fabric unit cell is chosen using a set of microscopic observations and is identified by its ability to enclose the characteristic periodic repeat in the fabric weave. The unit-cell model of the 5-harness satin weave fabric textile composite is identified for the numerical approach and its dimensions are chosen based on microstructural measurements. Finally, a good comparison was obtained between the elastic properties predicted using the numerical homogenization approach and the experimental data.
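
    As a rough illustration of the first homogenization step, simple Voigt/Reuss rule-of-mixtures bounds bracket the stiffness of the CNT-doped film; the moduli below are assumed values, and the paper itself uses finite element RVE models rather than these closed-form bounds.

```python
import numpy as np

E_epoxy, E_cnt = 3.5e9, 1.0e12        # Pa; assumed matrix / CNT moduli
for vf in (0.005, 0.01, 0.02, 0.04):  # CNT volume fractions (0.5%..4%)
    e_upper = vf * E_cnt + (1 - vf) * E_epoxy          # Voigt (parallel) bound
    e_lower = 1.0 / (vf / E_cnt + (1 - vf) / E_epoxy)  # Reuss (series) bound
    print(f"vf={vf:.3f}  E_film in [{e_lower:.3e}, {e_upper:.3e}] Pa")
# Step 2 would feed the film properties into the woven-fabric unit-cell (RVE) model.
```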

  12. Shawarmas contre McDo.

    Directory of Open Access Journals (Sweden)

    Charles-Édouard de Suremain

    2008-05-01

    Full Text Available The question of local identities and their articulation with globalisation and standardisation is addressed here through food. After presenting the places where one can eat out in La Paz, I dwell on the particularities of the Bolivian McDonald’s. These data make it possible to follow the genesis of a food anti-model, the shawarma, which reached its peak during the “third world war”, the period beginning in the wake of the attacks of September 11, 2001. The hypothesis is that identity contestations in the food sphere can draw on the logics of food globalisation and rise up against it, while avoiding the pitfalls of food standardisation. Shawarmas versus Macdonald’s. Identities contesting food globalization and standardization (Bolivia): Local identities and their tendency toward globalization and standardization are analyzed here with food as an example. After briefly reviewing the different places where one can eat out in La Paz, the particularities of the Bolivian McDonald’s are outlined. These data allow us to witness the birth of a food anti-model, the shawarma, which reached its pinnacle during the so-called ‘Third World War’, the period following September 11, 2001. The hypothesis is that identity contestations in the food sphere can be based on the logic of food globalization - and at the same time be constructed against it - avoiding food standardization pitfalls.

  13. Polyvinylpyrrolidone-Based Bio-Ink Improves Cell Viability and Homogeneity during Drop-On-Demand Printing

    Directory of Open Access Journals (Sweden)

    Wei Long Ng

    2017-02-01

    Full Text Available Drop-on-demand (DOD) bioprinting has attracted huge attention for numerous biological applications due to its precise control over material volume and deposition pattern in a contactless printing approach. 3D bioprinting is still an emerging field and more work is required to improve the viability and homogeneity of printed cells during the printing process. Here, a general-purpose bio-ink was developed using polyvinylpyrrolidone (PVP) macromolecules. Different PVP-based bio-inks (0%–3% w/v) were prepared and evaluated for their printability; the short-term and long-term viability of the printed cells were first investigated. The Z value of a bio-ink determines its printability; it is the inverse of the Ohnesorge number (Oh), which is the ratio between the Reynolds number and the square root of the Weber number, and is independent of the bio-ink velocity. The viability of printed cells is dependent on the Z values of the bio-inks; the results indicated that the cells can be printed without any significant impairment using a bio-ink with a threshold Z value of ≤9.30 (2% and 2.5% w/v). Next, the cell output was evaluated over a period of 30 min. The results indicated that PVP molecules mitigate cell adhesion and sedimentation during the printing process; the 2.5% w/v PVP bio-ink demonstrated the most consistent cell output over a period of 30 min. Hence, PVP macromolecules can play a critical role in improving cell viability and homogeneity during the bioprinting process.

  14. A Correlated Random Effects Model for Non-homogeneous Markov Processes with Nonignorable Missingness.

    Science.gov (United States)

    Chen, Baojiang; Zhou, Xiao-Hua

    2013-05-01

    Life history data arising in clusters with prespecified assessment time points for patients often feature incomplete data, since patients may choose to visit the clinic based on their needs. Markov process models provide a useful tool for describing disease progression in life history data. The literature mainly focuses on time-homogeneous processes. In this paper we develop methods to deal with non-homogeneous Markov processes with incomplete clustered life history data. A correlated random effects model is developed to deal with the nonignorable missingness, and a time transformation is employed to address the non-homogeneity in the transition model. Maximum likelihood estimation based on the Monte Carlo EM algorithm is advocated for parameter estimation. Simulation studies demonstrate that the proposed method works well in many situations. We also apply this method to an Alzheimer's disease study.
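
    A minimal sketch of the time-transformation device mentioned above: a homogeneous generator Q evaluated in transformed "operational" time h(t) produces non-homogeneous transition matrices in calendar time. The generator entries and the transform h(t) = t^ρ are hypothetical, and the paper's correlated random effects and Monte Carlo EM machinery are not shown.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical 3-state disease model; rows of the generator sum to zero,
# and state 3 is absorbing.
Q = np.array([[-0.20,  0.20,  0.00],
              [ 0.05, -0.15,  0.10],
              [ 0.00,  0.00,  0.00]])

def h(t, rho=0.7):
    """Monotone time transform: operational time runs faster early on."""
    return t ** rho

def transition_matrix(s, t):
    """P(s, t) = expm(Q * (h(t) - h(s))): homogeneous in h-time,
    non-homogeneous in calendar time."""
    return expm(Q * (h(t) - h(s)))

print(transition_matrix(0.0, 1.0))   # early interval
print(transition_matrix(4.0, 5.0))   # later interval of equal calendar length
```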

  15. A Novel Entropy-Based Centrality Approach for Identifying Vital Nodes in Weighted Networks

    Directory of Open Access Journals (Sweden)

    Tong Qiao

    2018-04-01

    Full Text Available Measuring centrality has recently attracted increasing attention, with algorithms ranging from those that simply calculate the number of immediate neighbors and the shortest paths to complicated iterative refinement processes and objective dynamical approaches. Indeed, vital node identification allows us to understand the roles that different nodes play in the structure of a network. However, quantifying centrality in complex networks with various topological structures is not an easy task. In this paper, we introduce a novel definition of entropy-based centrality, which is applicable to weighted directed networks. By design, the total power of a node is divided into two parts: its local power and its indirect power. The local power can be obtained by integrating the structural entropy, which reveals the communication activity and popularity of each node, and the interaction frequency entropy, which indicates its accessibility. In addition, the process of influence propagation can be captured by two-hop subnetworks, resulting in the indirect power. In order to evaluate the performance of the entropy-based centrality, we use four weighted real-world networks with various instance sizes, degree distributions, and densities: adolescent health, Bible, United States (US) airports, and Hep-th. Extensive analytical results demonstrate that the entropy-based centrality outperforms degree centrality, betweenness centrality, closeness centrality, and eigenvector centrality.
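
    A simplified sketch of the two-part idea (entropy of a node's outgoing weights as "local power", plus a discounted contribution from the two-hop neighborhood as "indirect power"); the weighting constant, the toy graph, and the single entropy term are assumptions, not the paper's exact structural and interaction-frequency formulas.

```python
import numpy as np
import networkx as nx

def weight_entropy(G, node):
    """Shannon entropy of the weights on a node's outgoing edges."""
    w = np.array([d["weight"] for _, _, d in G.out_edges(node, data=True)], float)
    if w.size == 0:
        return 0.0
    p = w / w.sum()
    return float(-(p * np.log2(p)).sum())

def entropy_centrality(G, alpha=0.5):
    """Local power plus a discounted two-hop ('indirect') contribution."""
    scores = {}
    for v in G:
        local = weight_entropy(G, v)
        reach = nx.single_source_shortest_path_length(G, v, cutoff=2)
        indirect = sum(weight_entropy(G, u) for u, d in reach.items() if d > 0)
        scores[v] = local + alpha * indirect
    return scores

G = nx.DiGraph()
G.add_weighted_edges_from([(1, 2, 3.0), (1, 3, 1.0), (2, 3, 2.0), (3, 1, 1.0)])
print(entropy_centrality(G))
```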

  16. Dosimetric effects of an air cavity for the SAVI partial breast irradiation applicator

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, Susan L.; Pino, Ramiro [Department of Radiation Oncology, Washington University School of Medicine, St. Louis, Missouri 63110 (United States); Department of Radiation Oncology, Methodist Hospital, Houston, Texas 77030 and Texas Cancer Clinic, San Antonio, Texas 78240 (United States)

    2010-08-15

    Purpose: To investigate the dosimetric effect of the air inside the SAVI partial breast irradiation device. Methods: The authors investigated how the air inside the SAVI partial breast irradiation device changes the delivered dose from the homogeneously calculated dose. Measurements were made with the device filled with air and with water to allow comparison to a homogeneous dose calculation done by the treatment planning system. Measurements were made with an ion chamber, TLDs, and film. Monte Carlo (MC) simulations of the experiment were done using the EGSnrc suite. The MC model was validated by comparing the water-filled calculations to those from a commercial treatment planning system. Results: The magnitude of the dosimetric effect depends on the size of the cavity, the arrangement of sources, and the relative dwell times. For a simple case using only the central catheter of the largest device, MC results indicate that the dose at the prescription point 1 cm away from the air-water boundary is about 9% higher than the homogeneous calculation. Independent measurements in a water phantom with a similar air cavity gave comparable results. MC simulation of a realistic multidwell-position plan showed discrepancies of about 5% on average at the prescription point for the largest device. Conclusions: The dosimetric effect of the air cavity is in the range of 3%-9%. Unless a heterogeneous dose calculation algorithm is used, users should be aware of the possibility of small treatment planning dose errors for this device and make modifications to the treatment delivery, if necessary.

  17. McClintock's challenge in the 21st century

    KAUST Repository

    Fedoroff, Nina V.

    2012-11-13

    In 1950, Barbara McClintock published a Classic PNAS article, "The origin and behavior of mutable loci in maize," which summarized the evidence leading to her discovery of transposition. The article described a number of genome alterations revealed through her studies of the Dissociation locus, the first mobile genetic element she identified. McClintock described the suite of nuclear events, including transposon activation and various chromosome aberrations and rearrangements, that unfolded in the wake of genetic crosses that brought together two broken chromosomes 9. McClintock left future generations with the challenge of understanding how genomes respond to genetic and environmental stresses by mounting adaptive responses that frequently include genome restructuring.

  18. McEvoy, Kieran; McGregor, Lorna, Transitional Justice from below. Grassroots Activism and the Struggle for Change

    Directory of Open Access Journals (Sweden)

    José M. Atiles-Osoria

    2012-10-01

    Full Text Available The volume edited by Kieran McEvoy and Lorna McGregor represents an effort to rethink, redefine and open a debate within the literature and currents of study on transitional justice. Transitional justice has generally been conceived as a conglomerate of juridical-political and socio-economic strategies implemented to deal with human rights violations, with the political violence of the past, and with the processes of state reconstruction pos...

  19. Meelis Lao kuulutas eile sõja McDonald'sile / Peeter Raidla

    Index Scriptorium Estoniae

    Raidla, Peeter, 1955-

    2004-01-01

    Businessman Meelis Lao, citing a rent dispute, had the McDonald's fast-food restaurant on Viru Street closed. See also: McDonald's rent dispute stretches back years; McDonald's handed the matter to the police; Meelis Lao bears the title of problem solver

  20. Homogenization approach in engineering

    International Nuclear Information System (INIS)

    Babuska, I.

    1975-10-01

    Homogenization is an approach which studies the macrobehavior of a medium by its microproperties. Problems with a microstructure play an essential role in such fields as mechanics, chemistry, physics, and reactor engineering. Attention is concentrated on a simple specific model problem to illustrate results and problems typical of the homogenization approach. Only the diffusion problem is treated here, but some statements are made about the elasticity of composite materials. The differential equation is solved for linear cases with and without boundaries and for the nonlinear case. 3 figures, 1 table
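
    For the 1D periodic diffusion problem -(a(x/ε) u')' = f, the classical homogenized coefficient is the harmonic mean of the microscale coefficient over one period, a standard result the following lines illustrate for an assumed two-phase layered medium.

```python
import numpy as np

a_phase = np.array([1.0, 100.0])        # microscale coefficients, equal volume fractions
a_eff = 1.0 / np.mean(1.0 / a_phase)    # harmonic mean: ~1.98, the correct limit
a_naive = np.mean(a_phase)              # arithmetic mean: 50.5, wrong by ~25x
print(a_eff, a_naive)
```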

  1. Astronauts McNair and Stewart prepare for reentry

    Science.gov (United States)

    1984-01-01

    Astronauts Ronald E. McNair and Robert L. Stewart prepare for the re-entry phase of the shuttle Challenger near the end of the 41-B mission. They are stationed behind the crew commander and pilot. Stewart is already wearing his helmet. McNair is stowing some of his gear.

  2. Genetic Homogenization of Composite Materials

    Directory of Open Access Journals (Sweden)

    P. Tobola

    2009-04-01

    Full Text Available The paper focuses on numerical studies of the electromagnetic properties of composite materials used in the construction of small airplanes. The discussion concentrates on the genetic homogenization of composite layers and of composite layers with a slot. The homogenization aims to reduce the CPU-time demands of EMC computational models of electrically large airplanes. First, a methodology for creating a 3-dimensional numerical model of a composite material in CST Microwave Studio is proposed, focusing on sufficient accuracy of the model. Second, a proper implementation of a genetic optimization in Matlab is discussed. Third, the optimization script is coupled with a simplified 2-dimensional homogeneous equivalent model in Comsol Multiphysics, with EMC issues in mind. Results of the computations are experimentally verified.

  3. New developments in the McStas neutron instrument simulation package

    International Nuclear Information System (INIS)

    Willendrup, P K; Knudsen, E B; Klinkby, E; Nielsen, T; Farhi, E; Filges, U; Lefmann, K

    2014-01-01

    The McStas neutron ray-tracing software package is a versatile tool for building accurate simulators of neutron scattering instruments at reactors, short- and long-pulsed spallation sources such as the European Spallation Source. McStas is extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. McStas was founded as a scientific, open-source collaborative code in 1997. This contribution presents the project at its current state and gives an overview of the main new developments in McStas 2.0 (December 2012) and McStas 2.1 (expected fall 2013), including many new components, component parameter uniformisation, partial loss of backward compatibility, updated source brilliance descriptions, developments toward new tools and user interfaces, web interfaces and a new method for estimating beam losses and background from neutron optics.

  4. Drilling and testing specifications for the McGee well

    International Nuclear Information System (INIS)

    Patterson, J.K.

    1982-01-01

    The McGee Well is a part of the Basalt Waste Isolation Project's subsurface site selection and characterization activities. Information from the McGee Well supports site hydrologic characterization and repository design. These test specifications include details for the drilling and testing of the McGee Well: the predicted stratigraphy, the drilling requirements, descriptions of the tests to be conducted, the intervals selected for hydrologic testing, and a schedule of the drilling and testing activities. 19 refs., 10 figs., 7 tabs

  5. Integration of OpenMC methods into MAMMOTH and Serpent

    Energy Technology Data Exchange (ETDEWEB)

    Kerby, Leslie [Idaho National Lab. (INL), Idaho Falls, ID (United States); Idaho State Univ., Idaho Falls, ID (United States); DeHart, Mark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Tumulak, Aaron [Idaho National Lab. (INL), Idaho Falls, ID (United States); Univ. of Michigan, Ann Arbor, MI (United States)

    2016-09-01

    OpenMC, a Monte Carlo particle transport simulation code focused on neutron criticality calculations, contains several methods we wish to emulate in MAMMOTH and Serpent. First, research coupling OpenMC and the Multiphysics Object-Oriented Simulation Environment (MOOSE) has shown promising results. Second, the utilization of Functional Expansion Tallies (FETs) allows for more efficient passing of multiphysics data between OpenMC and MOOSE. Both of these capabilities have been preliminarily implemented in Serpent. Results are discussed and future work is recommended.

  6. Homogenization of steady-state creep of porous metals using three-dimensional microstructural reconstructions

    DEFF Research Database (Denmark)

    Kwok, Kawai; Boccaccini, Dino; Persson, Åsa Helen

    2016-01-01

    The effective steady-state creep response of porous metals is studied by numerical homogenization and analytical modeling in this paper. The numerical homogenization is based on finite element models of three-dimensional microstructures directly reconstructed from tomographic images. The effects ...... model, and closely matched by the Gibson-Ashby compression and the Ramakrishnan-Arunchalam creep models. [All rights reserved Elsevier]....

  7. Stimulus homogeneity enhances implicit learning: evidence from contextual cueing.

    Science.gov (United States)

    Feldmann-Wüstefeld, Tobias; Schubö, Anna

    2014-04-01

    Visual search for a target object is faster if the target is embedded in a repeatedly presented invariant configuration of distractors ('contextual cueing'). It has also been shown that the homogeneity of a context affects the efficiency of visual search: targets receive prioritized processing when presented in a homogeneous context compared to a heterogeneous context, presumably due to grouping processes at early stages of visual processing. The present study investigated in three experiments whether context homogeneity also affects contextual cueing. In Experiment 1, context homogeneity varied on three levels of the task-relevant dimension (orientation) and contextual cueing was most pronounced for context configurations with high orientation homogeneity. When context homogeneity varied on three levels of the task-irrelevant dimension (color) and orientation homogeneity was fixed, no modulation of contextual cueing was observed: high orientation homogeneity led to large contextual cueing effects (Experiment 2) and low orientation homogeneity led to small contextual cueing effects (Experiment 3), irrespective of color homogeneity. Enhanced contextual cueing for homogeneous context configurations suggests that grouping processes affect not only visual search but also implicit learning. We conclude that memory representations of context configurations are more easily acquired when context configurations can be processed as larger, grouped perceptual units. However, this form of implicit perceptual learning is only improved by stimulus homogeneity when stimulus homogeneity facilitates grouping processes on a dimension that is currently relevant in the task. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. McDonald’s Corporation - 2015 (MCD)

    Directory of Open Access Journals (Sweden)

    Alen Badal

    2017-07-01

    Full Text Available McDonald’s Corporation in 2015 is aiming to bring the “Experience of the Future” to consumers, with a special focus on the ‘younger’ generation. Beginning in 2015 and moving forward, McDonald’s has operationalized the functions of its strategy to better serve consumers, with such offerings as trial-testing a build-your-burger strategy with the order served at the table, known as the “Create Your Taste” program. The restaurant chain has introduced the all-day breakfast menu and ‘McPick 2’ for $5.00. Additionally, the company has engaged consumers by way of social media and is interested in having a smartphone application in use. Other roll-outs include processing transactions by way of mobile payment through such channels as Google Wallet, Softcard and Apple Pay. The fast-food giant continues to test a variety of strategies at select locations aimed at increasing shareholder value as a result of both introducing and modifying point-of-sale services and food & beverage offerings.

  9. Ecological effects of contaminants in McCoy Branch, 1991--1993

    Energy Technology Data Exchange (ETDEWEB)

    Ryon, M.G. [ed.]

    1996-09-01

    The 1984 Hazardous and Solid Waste Amendments to the Resource Conservation and Recovery Act (RCRA) required assessment of all current and former solid waste management units. Following guidelines under RCRA and the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), a remedial investigation (RI) was required of the Y-12 Plant for its filled coal ash pond (FCAP) and associated areas on McCoy Branch. The RI process was initiated and assessments were presented. Because the disposal of coal ash in the ash pond, McCoy Branch, and Rogers Quarry was not consistent with the Tennessee Water Quality Act, several remediation steps were implemented for McCoy Branch between 1986 and 1994 to address disposal problems. The required ecological risk assessments of the McCoy Branch watershed included provisions for biological monitoring of the watershed. The objectives of the biological monitoring were to (1) document changes in the biological quality of McCoy Branch after completion of a pipeline bypassing upper McCoy Branch and, further, after termination of all discharges to Rogers Quarry, (2) provide guidance on the need for additional remediation, and (3) evaluate the effectiveness of implemented remedial actions. The data from the biological monitoring program may also determine whether the goals of protection of human health and the environment of McCoy Branch are being accomplished.

  10. Red hair is the null phenotype of MC1R.

    Science.gov (United States)

    Beaumont, Kimberley A; Shekar, Sri N; Cook, Anthony L; Duffy, David L; Sturm, Richard A

    2008-08-01

    The Melanocortin-1 Receptor (MC1R) is a G-protein coupled receptor which is responsible for production of the darker eumelanin pigment and the tanning response. The MC1R gene has many polymorphisms, some of which have been linked to variation in pigmentation phenotypes within human populations. In particular, the p.D84E, p.R151C, p.R160W and p.D294H alleles have been strongly associated with red hair, fair skin and increased skin cancer risk. These red hair colour (RHC) variants are relatively well described and are thought to result in altered receptor function, while still retaining varying levels of signaling ability in vitro. The mouse Mc1r null phenotype is yellow fur colour; the p.R151C, p.R160W and p.D294H alleles were able to partially rescue this phenotype, leading to the question of what the true null phenotype of MC1R would be in humans. Due to the rarity of MC1R null alleles in human populations, they had only been found in the heterozygous state until now. We report here the first case of a homozygous MC1R null individual; phenotypic analysis indicates that red hair and fair skin are found in the absence of MC1R function.

  11. CT and MRI analysis of central nervous system Rosai-Dorfman disease

    International Nuclear Information System (INIS)

    Zhang Jiatang; Lang Senyang; Pu Chuanqiang; Zhu Ruyuan; Wang Dianjun

    2008-01-01

    Objective: To study the CT and MRI imaging features of central nervous system Rosai-Dorfman disease and to improve knowledge of, and differential diagnostic ability for, central nervous system Rosai-Dorfman disease. Methods: The CT and MRI appearances in 4 cases of pathologically proven Rosai-Dorfman disease were retrospectively evaluated, and the literature on central nervous system Rosai-Dorfman disease was reviewed. Results: Two cases had cranial CT scans and 4 cases had cranial MRI scans. On CT, cerebral edema was demonstrated in one case and the other case was normal. MRI showed solitary lesions in the sellar region in 3 cases and multiple lesions in the anterior cranial fossa in 1 case. The lesions exhibited iso- to hypointensity on both T1WI and T2WI images. Following intravenous injection of contrast medium, ring-like enhancement was seen in 2 cases and homogeneous enhancement in 1 case. Nodular enhancement was seen in the case of multiple lesions in the anterior cranial fossa. All lesions were dural-based. Conclusions: In patients with fever, headache, an elevated erythrocyte sedimentation rate (ESR) and a polyclonal increase in γ-globulins, the possibility of central nervous system Rosai-Dorfman disease should be considered when single or multiple dural-based mass lesions, especially in the sellar region, are identified by CT and MRI. (authors)

  12. 75 FR 76394 - Central Electric Power Cooperative, Inc.: Notice of Intent To Prepare an Environmental Impact...

    Science.gov (United States)

    2010-12-08

    ... Elementary School, 8900 Highway 17 North, McClellanville, SC 29458. RUS, Central Electric, and Mangi... Register and in local newspapers. The USFS may issue a separate ROD for the proposal, which may be subject...

  13. Kinematic analysis of melange fabrics: Examples and applications from the McHugh Complex, Kenai Peninsula, Alaska

    Science.gov (United States)

    Kusky, T.M.; Bradley, D.C.

    1999-01-01

    Permian to Cretaceous melange of the McHugh Complex on the Kenai Peninsula, south-central Alaska includes blocks and belts of graywacke, argillite, limestone, chert, basalt, gabbro, and ultramafic rocks, intruded by a variety of igneous rocks. An oceanic plate stratigraphy is repeated hundreds of times across the map area, but most structures at the outcrop scale extend lithological layering. Strong rheological units occur as blocks within a matrix that flowed around the competent blocks during deformation, forming broken formation and melange. Deformation was noncoaxial, and disruption of primary layering was a consequence of general strain driven by plate convergence in a relatively narrow zone between the overriding accretionary wedge and the downgoing, generally thinly sedimented oceanic plate. Soft-sediment deformation processes do not appear to have played a major role in the formation of the melange. A model for deformation at the toe of the wedge is proposed in which layers oriented at low angles to σ1 are contracted in both the brittle and ductile regimes, layers at 30-45° to σ1 are extended in the brittle regime and contracted in the ductile regime, and layers at angles greater than 45° to σ1 are extended in both the brittle and ductile regimes. Imbrication in thrust duplexes occurs at deeper levels within the wedge. Many structures within melange of the McHugh Complex are asymmetric and record kinematic information consistent with the inferred structural setting in an accretionary wedge. A displacement field for the McHugh Complex on the lower Kenai Peninsula includes three belts: an inboard belt of Late Triassic rocks records west-to-east-directed slip of hanging walls, a central belt of predominantly Early Jurassic rocks records north-south directed displacements, and Early Cretaceous rocks in an outboard belt preserve southwest-northeast directed slip vectors. Although precise ages of accretion are unknown, slip directions are compatible with

  14. Insect Biometrics: Optoacoustic Signal Processing and Its Applications to Remote Monitoring of McPhail Type Traps.

    Science.gov (United States)

    Potamitis, Ilyas; Rigakis, Iraklis; Fysarakis, Konstantinos

    2015-01-01

    Monitoring traps are important components of integrated pest management applied against important fruit fly pests, including Bactrocera oleae (Gmelin) and Ceratitis capitata (Wiedemann), Diptera of the Tephritidae family, which cause crop losses calculated in billions of euros per year worldwide. Pests can be controlled with ground pesticide sprays, the efficiency of which depends on knowing the time, location and extent of infestations as early as possible. Trap inspection is currently carried out manually, using the McPhail trap, and mass spraying is decided based on a decision protocol. We introduce the term 'insect biometrics' in the context of entomology as a measure of a characteristic of the insect (in our case, the spectrum of its wingbeat) that allows us to identify its species, and we build devices to help face old enemies with modern means. We turn a McPhail-type trap into an electronic one by installing an array of photoreceptors coupled to an infrared emitter, guarding the entrance of the trap. The beating wings of insects flying into the trap intercept the light, and the light fluctuation is turned into a recording. Custom-made electronics were developed and are placed as an external add-on kit, without altering the internal space of the trap. Counts from the trap are transmitted over a mobile communication network. This trap introduces a new automated remote-monitoring method, different from audio- and vision-based systems. We evaluated our trap on large numbers of insects in the laboratory by enclosing the electronic trap in insectary cages. Our experiments assess its potential to deliver reliable data that can be used to initiate the spraying process at large scales, and also to monitor the impact of spraying, as it eliminates the time-lag between acquiring insect counts and delivering them to a central agency.

  15. McCarthyism and American Opera L’Opéra américain face au McCarthyisme

    Directory of Open Access Journals (Sweden)

    Klaus-Dieter Gross

    2009-11-01

    Full Text Available The anti-communist atmosphere that characterized the United States between the end of the Second World War and the late 1950s, culminating in legal and extra-legal measures taken by the House Un-American Activities Committee and by Senator McCarthy, had a decisive influence on American opera. A few rare works show diffuse support for McCarthyism (Still and Nabokov), while others deny its impact by promoting left-wing ideas (Robinson and Blitzstein). Other works incorporate the fear of the red wave into their themes, but give it no more than secondary importance. Finally, three operas (Bernstein, Floyd and Ward) confront head-on the ritual workings and liberty-killing methods of McCarthyism. The decline of this movement coincided, toward the beginning of the 1960s, with the emergence of a less traditionalist and more abstract operatic style.

  16. Multilevel Monte Carlo Approaches for Numerical Homogenization

    KAUST Repository

    Efendiev, Yalchin R.

    2015-10-01

    In this article, we study the application of multilevel Monte Carlo (MLMC) approaches to numerical random homogenization. Our objective is to compute the expectation of some functionals of the homogenized coefficients, or of the homogenized solutions. This is accomplished within MLMC by considering different sizes of representative volumes (RVEs). Many inexpensive computations with the smallest RVE size are combined with fewer expensive computations performed on larger RVEs. Likewise, when it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison to a standard Monte Carlo method. Numerical results are presented for both one-dimensional and two-dimensional test-cases that illustrate the efficiency of the approach.
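
    A toy sketch of the MLMC mechanics described above: coupled fine/coarse samples per level, with the sample count shrinking on the expensive fine levels. The "RVE" quantity here is a synthetic stand-in whose bias and noise decay with level, not an actual homogenization solve.

```python
import numpy as np

rng = np.random.default_rng(1)

def level_sample(l, omega):
    """Synthetic homogenized quantity on an 'RVE of size 2**l':
    true value 1.0, with bias and noise both shrinking as l grows."""
    return 1.0 + 0.5 ** l * (1.0 + omega)

def mlmc_estimate(max_level, n0=4096):
    est = 0.0
    for l in range(max_level + 1):
        n_l = max(n0 // 4 ** l, 8)            # few samples on expensive levels
        omega = rng.normal(0.0, 1.0, n_l)     # shared randomness couples levels
        fine = level_sample(l, omega)
        coarse = level_sample(l - 1, omega) if l > 0 else 0.0
        est += np.mean(fine - coarse)         # telescoping sum of corrections
    return est

print(mlmc_estimate(max_level=5))             # ~1.0 at a fraction of the cost
```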

  17. Management závěrečného turnaje McDonalds cup 2015

    OpenAIRE

    Kafka, Dominik

    2016-01-01

    Title: Management of the McDonald's Cup 2015 final tournament Objectives: The main objective of this thesis is to provide a detailed analysis of the management of the McDonald's Cup 2015 final tournament known as the Festival of Football, to present its strengths, weaknesses, potential opportunities and threats and then, based on previous analyses, to create a list of suggestions and recommendations leading to the elimination of the weaknesses and threats and thus to development, increase of ...

  18. String pair production in non homogeneous backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Bolognesi, S. [Department of Physics “E. Fermi” University of Pisa, and INFN - Sezione di Pisa,Largo Pontecorvo, 3, Ed. C, 56127 Pisa (Italy); Rabinovici, E. [Racah Institute of Physics, The Hebrew University of Jerusalem,91904 Jerusalem (Israel); Tallarita, G. [Departamento de Ciencias, Facultad de Artes Liberales,Universidad Adolfo Ibáñez, Santiago 7941169 (Chile)

    2016-04-28

    We consider string pair production in non-homogeneous electric backgrounds. We study several particular configurations which can be addressed with the Euclidean world-sheet instanton technique, the analogue of the world-line instanton for particles. In the first case the string is suspended between two D-branes in flat space-time; in the second case the string lives in AdS and terminates on one D-brane (this realizes the holographic Schwinger effect). In some regions of parameter space the result is well approximated by the known analytical formulas, either the particle pair production in non-homogeneous backgrounds or the string pair production in homogeneous backgrounds. In other cases we see effects which are intrinsically stringy and related to the non-homogeneity of the background. Pair production is enhanced already for particles in time-dependent electric field backgrounds, and the string nature enhances this even further. For spatially varying electric background fields, string pair production is less suppressed than the rate of particle pair production. We discuss in some detail how the critical field is affected by the non-homogeneity, for both time- and space-dependent electric field backgrounds. We also comment on what could be an interesting new prediction for the small field limit. The third case we consider is pair production in holographic confining backgrounds with homogeneous and non-homogeneous fields.

  19. String pair production in non homogeneous backgrounds

    International Nuclear Information System (INIS)

    Bolognesi, S.; Rabinovici, E.; Tallarita, G.

    2016-01-01

    We consider string pair production in non-homogeneous electric backgrounds. We study several particular configurations which can be addressed with the Euclidean world-sheet instanton technique, the analogue of the world-line instanton for particles. In the first case the string is suspended between two D-branes in flat space-time; in the second case the string lives in AdS and terminates on one D-brane (this realizes the holographic Schwinger effect). In some regions of parameter space the result is well approximated by the known analytical formulas, either the particle pair production in non-homogeneous backgrounds or the string pair production in homogeneous backgrounds. In other cases we see effects which are intrinsically stringy and related to the non-homogeneity of the background. Pair production is enhanced already for particles in time-dependent electric field backgrounds, and the string nature enhances this even further. For spatially varying electric background fields, string pair production is less suppressed than the rate of particle pair production. We discuss in some detail how the critical field is affected by the non-homogeneity, for both time- and space-dependent electric field backgrounds. We also comment on what could be an interesting new prediction for the small field limit. The third case we consider is pair production in holographic confining backgrounds with homogeneous and non-homogeneous fields.

  20. Midlatitude Continental Convective Clouds Experiment (MC3E)

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, MP; Petersen, WA; Del Genio, AD; Giangrande, SE; Heymsfield, A; Heymsfield, G; Hou, AY; Kollias, P; Orr, B; Rutledge, SA; Schwaller, MR; Zipser, E

    2010-04-01

    Convective processes play a critical role in the Earth’s energy balance through the redistribution of heat and moisture in the atmosphere and their subsequent impacts on the hydrologic cycle. Global observation and accurate representation of these processes in numerical models are vital to improving our current understanding and future simulations of Earth’s climate system. Despite improvements in computing power, current operational weather and global climate models are unable to resolve the natural temporal and spatial scales that are associated with convective and stratiform precipitation processes; therefore, they must turn to parameterization schemes to represent these processes. In turn, the physical basis for these parameterization schemes needs to be evaluated for general application under a variety of atmospheric conditions. Analogously, space-based remote sensing algorithms designed to retrieve related cloud and precipitation information for use in hydrological, climate, and numerical weather prediction applications often rely on physical “parameterizations” that reliably translate indirectly related instrument measurements to the physical quantity of interest (e.g., precipitation rate). Importantly, both spaceborne retrieval algorithms and model convective parameterization schemes traditionally rely on field campaign data sets as a basis for evaluating and improving the physics of their respective approaches. The Midlatitude Continental Convective Clouds Experiment (MC3E) will take place in central Oklahoma during the April–May 2011 period. The experiment is a collaborative effort between the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility and the National Aeronautics and Space Administration’s (NASA) Global Precipitation Measurement (GPM) mission Ground Validation (GV) program. The field campaign leverages the unprecedented observing infrastructure currently available in the central United States

  1. TU-AB-BRC-11: Moving a GPU-OpenCL-Based Monte Carlo (MC) Dose Engine Towards Routine Clinical Use: Automatic Beam Commissioning and Efficient Source Sampling

    Energy Technology Data Exchange (ETDEWEB)

    Tian, Z; Folkerts, M; Jiang, S; Jia, X [UT Southwestern Medical Ctr, Dallas, TX (United States); Li, Y [Beihang University, Beijing (China)

    2016-06-15

    Purpose: We have previously developed a GPU-OpenCL-based MC dose engine named goMC with a built-in analytical linac beam model. To move goMC towards routine clinical use, we have developed an automatic beam-commissioning method and an efficient source-sampling strategy to facilitate dose calculations for real treatment plans. Methods: Our commissioning method automatically adjusts the relative weights among the sub-sources through an optimization process that minimizes the discrepancies between calculated dose and measurements. Six models built for Varian TrueBeam linac photon beams (6MV, 10MV, 15MV, 18MV, 6MVFFF, 10MVFFF) were commissioned using measurement data acquired at our institution. To facilitate dose calculations for real treatment plans, we employed an inverse sampling method to efficiently incorporate MLC leaf-sequencing into source sampling. Specifically, instead of sampling source particles control point by control point and rejecting the particles blocked by the MLC, we assigned a control-point index to each sampled source particle, according to the MLC leaf-open duration of each control point at the pixel where the particle intersects the iso-center plane. Results: Our auto-commissioning method decreased the distance-to-agreement (DTA) of the depth dose in build-up regions by 36.2% on average, bringing it within 1 mm. Lateral profiles were better matched for all beams, with the biggest improvement found at 15MV, for which the root-mean-square difference was reduced from 1.44% to 0.50%. Maximum differences of output factors were reduced to less than 0.7% for all beams, with the largest decrease, from 1.70% to 0.37%, found at 10MVFFF. Our new sampling strategy was tested on a head-and-neck VMAT patient case. Achieving clinically acceptable accuracy, the new strategy could reduce the required history number by a factor of ∼2.8 for a given statistical uncertainty level and hence achieve a similar speed-up factor. Conclusion: Our studies have demonstrated the feasibility and effectiveness of
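
    A sketch of the inverse-sampling idea as described: each source particle draws its control-point index from the normalized cumulative leaf-open durations at its pixel, instead of being sampled per control point and rejected when blocked. The duration array is hypothetical, not goMC's actual data layout.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical leaf-open durations of control points 0..4 at one pixel
# of the iso-center plane (control point 0 is fully blocked there).
open_time = np.array([0.0, 0.3, 1.2, 0.9, 0.1])

cdf = np.cumsum(open_time)
cdf /= cdf[-1]                       # normalized CDF over control points

# Inverse-transform sampling: index probability is proportional to open time.
u = rng.random(10)
cp_index = np.searchsorted(cdf, u)
print(cp_index)                      # mostly 2 and 3, never 0
```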

  2. A Test for Parameter Homogeneity in CO2 Panel EKC Estimations

    International Nuclear Information System (INIS)

    Dijkgraaf, E.; Vollebergh, H.R.J.

    2005-01-01

    This paper casts doubt on empirical results based on panel estimations of an 'inverted-U' relationship between per capita GDP and pollution. Using a new dataset for OECD countries on carbon dioxide emissions for the period 1960-1997, we find that the crucial assumption of homogeneity across countries is problematic. Decisively rejected are model specifications that feature even weaker homogeneity assumptions than are commonly used. Furthermore, our results challenge the existence of an overall Environmental Kuznets Curve for carbon dioxide emissions
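
    The poolability question can be illustrated with a Chow-type F-test comparing a pooled quadratic income-emissions regression against country-specific fits; the synthetic data-generating numbers below are invented for illustration, and this is neither the paper's test statistic nor its OECD dataset.

```python
import numpy as np

rng = np.random.default_rng(3)

n_countries, n_years = 5, 38
x = rng.normal(9.5, 0.5, (n_countries, n_years))                 # log GDP per capita
beta = rng.normal([5.0, -0.25], [0.5, 0.05], (n_countries, 2))   # heterogeneous slopes
y = beta[:, :1] * x + beta[:, 1:] * x ** 2 + rng.normal(0, 0.1, x.shape)

def rss(X, Y):
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return float(((Y - X @ coef) ** 2).sum())

X_pool = np.column_stack([np.ones(x.size), x.ravel(), x.ravel() ** 2])
rss_pooled = rss(X_pool, y.ravel())
rss_split = sum(rss(np.column_stack([np.ones(n_years), x[i], x[i] ** 2]), y[i])
                for i in range(n_countries))

k = 3                                  # parameters per country
q = k * (n_countries - 1)              # number of restrictions under pooling
df = x.size - k * n_countries
F = ((rss_pooled - rss_split) / q) / (rss_split / df)
print(F)    # large F => reject coefficient homogeneity across countries
```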

  3. Rotated Walsh-Hadamard Spreading with Robust Channel Estimation for a Coded MC-CDMA System

    Directory of Open Access Journals (Sweden)

    Raulefs Ronald

    2004-01-01

    Full Text Available We investigate rotated Walsh-Hadamard spreading matrices for a broadband MC-CDMA system with robust channel estimation in the synchronous downlink. The similarities between rotated spreading and signal space diversity are outlined. In a multiuser MC-CDMA system, possible performance improvements depend on the chosen detector, the channel code, and its Hamming distance. By applying rotated spreading instead of a standard Walsh-Hadamard spreading code, a higher throughput can be achieved. As combining the channel code and the spreading code forms a concatenated code, the overall minimum Hamming distance of the concatenated code increases. This asymptotically results in an improvement of the bit error rate at high signal-to-noise ratios. Higher convolutional channel code rates are mostly generated by puncturing good low-rate channel codes. The overall Hamming distance decreases significantly for the punctured channel codes. Higher channel code rates are favorable for MC-CDMA, as MC-CDMA utilizes diversity more efficiently than pure OFDMA. The application of rotated spreading in an MC-CDMA system allows diversity to be exploited even further. We demonstrate that the rotated spreading gain is still present with a robust pilot-aided channel estimator. In a well-designed system, rotated spreading improves performance by about 1 dB when using a maximum likelihood detector with robust channel estimation at the receiver.
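
    A minimal sketch of rotated Walsh-Hadamard spreading: the orthonormal WH matrix is post-multiplied by a diagonal unitary phase rotation, so the spreading stays perfectly invertible over an ideal channel; the rotation angles here are illustrative, not the paper's optimized choice.

```python
import numpy as np
from scipy.linalg import hadamard

N = 8                                       # spreading factor
W = hadamard(N) / np.sqrt(N)                # orthonormal Walsh-Hadamard matrix

theta = np.exp(1j * np.pi * np.arange(N) / (2 * N))   # illustrative angles
W_rot = W @ np.diag(theta)                  # rotated spreading, still unitary

symbols = np.sign(np.random.randn(N)) + 1j * np.sign(np.random.randn(N))  # QPSK
chips = W_rot @ symbols                     # spread
recovered = W_rot.conj().T @ chips          # despread (ideal channel)
assert np.allclose(recovered, symbols)
```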

  4. Homogeneous M2 duals

    International Nuclear Information System (INIS)

    Figueroa-O’Farrill, José; Ungureanu, Mara

    2016-01-01

    Motivated by the search for new gravity duals to M2 branes with N>4 supersymmetry — equivalently, M-theory backgrounds with Killing superalgebra osp(N|4) for N>4 — we classify (except for a small gap) homogeneous M-theory backgrounds with symmetry Lie algebra so(n)⊕so(3,2) for n=5,6,7. We find that there are no new backgrounds with n=6,7 but we do find a number of new (to us) backgrounds with n=5. All backgrounds are metrically products of the form AdS₄×P⁷, with P Riemannian and homogeneous under the action of SO(5), or S⁴×Q⁷ with Q Lorentzian and homogeneous under the action of SO(3,2). At least one of the new backgrounds is supersymmetric (albeit with only N=2) and we show that it can be constructed from a supersymmetric Freund-Rubin background via a Wick rotation. Two of the new backgrounds have only been approximated numerically.

  5. Homogeneous M2 duals

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa-O’Farrill, José [School of Mathematics and Maxwell Institute for Mathematical Sciences,The University of Edinburgh,James Clerk Maxwell Building, The King’s Buildings, Peter Guthrie Tait Road,Edinburgh EH9 3FD, Scotland (United Kingdom); Ungureanu, Mara [Humboldt-Universität zu Berlin, Institut für Mathematik,Unter den Linden 6, 10099 Berlin (Germany)

    2016-01-25

    Motivated by the search for new gravity duals to M2 branes with N>4 supersymmetry — equivalently, M-theory backgrounds with Killing superalgebra osp(N|4) for N>4 — we classify (except for a small gap) homogeneous M-theory backgrounds with symmetry Lie algebra so(n)⊕so(3,2) for n=5,6,7. We find that there are no new backgrounds with n=6,7 but we do find a number of new (to us) backgrounds with n=5. All backgrounds are metrically products of the form AdS₄×P⁷, with P Riemannian and homogeneous under the action of SO(5), or S⁴×Q⁷ with Q Lorentzian and homogeneous under the action of SO(3,2). At least one of the new backgrounds is supersymmetric (albeit with only N=2) and we show that it can be constructed from a supersymmetric Freund-Rubin background via a Wick rotation. Two of the new backgrounds have only been approximated numerically.

  6. Is McMurray's osteotomy obsolete?

    Directory of Open Access Journals (Sweden)

    Phaltankar P

    1995-10-01

    Full Text Available This study of ten cases reviews the method of performing McMurray's displacement osteotomy, and its advantages and disadvantages in the treatment of nonunion of transcervical fracture of the femoral neck with a viable femoral head, in view of the procedure's abandonment in favour of the angulation osteotomy. The good results obtained in this series attest to the usefulness of McMurray's osteotomy for the difficult problem of nonunion of transcervical femoral neck fractures in well-selected cases, with certain advantages over the angulation osteotomy due to the 'armchair effect'.

  7. Concordance-based Kendall's Correlation for Computationally-Light vs. Computationally-Heavy Centrality Metrics: Lower Bound for Correlation

    Directory of Open Access Journals (Sweden)

    Natarajan Meghanathan

    2017-01-01

    Full Text Available We identify three different levels of correlation (pair-wise relative ordering, network-wide ranking and linear regression) that could be assessed between a computationally-light centrality metric and a computationally-heavy centrality metric for real-world networks. The Kendall's concordance-based correlation measure could be used to quantitatively assess how well we could consider the relative ordering of two vertices vi and vj with respect to a computationally-light centrality metric as the relative ordering of the same two vertices with respect to a computationally-heavy centrality metric. We hypothesize that the pair-wise relative ordering (concordance-based) assessment of the correlation between centrality metrics is the strictest of the three levels of correlation and claim that the Kendall's concordance-based correlation coefficient will be lower than the correlation coefficients observed with the more relaxed levels of correlation (the linear regression-based Pearson's product-moment correlation coefficient and the network-wide ranking-based Spearman's correlation coefficient). We validate our hypothesis by evaluating the three correlation coefficients between two sets of centrality metrics: the computationally-light degree and local clustering coefficient complement-based degree centrality metrics, and the computationally-heavy eigenvector centrality, betweenness centrality and closeness centrality metrics, for a diverse collection of 50 real-world networks.
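
    The three correlation levels can be computed directly; the sketch below uses NetworkX's built-in Zachary karate club graph (an unweighted stand-in for the paper's 50 real-world networks), with degree centrality as the light metric and betweenness centrality as the heavy one.

```python
import networkx as nx
from scipy.stats import kendalltau, spearmanr, pearsonr

G = nx.karate_club_graph()
nodes = list(G)
dc = nx.degree_centrality(G)            # computationally light
bc = nx.betweenness_centrality(G)       # computationally heavy
light = [dc[v] for v in nodes]
heavy = [bc[v] for v in nodes]

print("Kendall  (pair-wise concordance):", kendalltau(light, heavy)[0])
print("Spearman (network-wide ranking): ", spearmanr(light, heavy)[0])
print("Pearson  (linear regression):    ", pearsonr(light, heavy)[0])
```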

  8. Two-Dimensional Homogeneous Fermi Gases

    Science.gov (United States)

    Hueck, Klaus; Luick, Niclas; Sobirey, Lennart; Siegl, Jonas; Lompe, Thomas; Moritz, Henning

    2018-02-01

    We report on the experimental realization of homogeneous two-dimensional (2D) Fermi gases trapped in a box potential. In contrast to harmonically trapped gases, these homogeneous 2D systems are ideally suited to probe local as well as nonlocal properties of strongly interacting many-body systems. As a first benchmark experiment, we use a local probe to measure the density of a noninteracting 2D Fermi gas as a function of the chemical potential and find excellent agreement with the corresponding equation of state. We then perform matter wave focusing to extract the momentum distribution of the system and directly observe Pauli blocking in a near unity occupation of momentum states. Finally, we measure the momentum distribution of an interacting homogeneous 2D gas in the crossover between attractively interacting fermions and bosonic dimers.

  9. Diffusion piecewise homogenization via flux discontinuity ratios

    International Nuclear Information System (INIS)

    Sanchez, Richard; Dante, Giorgio; Zmijarevic, Igor

    2013-01-01

    We analyze piecewise homogenization with flux-weighted cross sections and preservation of averaged currents at the boundary of the homogenized domain. Introduction of a set of flux discontinuity ratios (FDR) that preserve reference interface currents leads to preservation of averaged region reaction rates and fluxes. We consider the class of numerical discretizations with one degree of freedom per volume and per surface and prove that when the homogenization and computing meshes are equal there is a unique solution for the FDRs which exactly preserves interface currents. For diffusion sub-meshing we introduce a Jacobian-Free Newton-Krylov method and for all cases considered obtain an 'exact' numerical solution (eight digits for the interface currents). The homogenization is completed by extending the familiar full assembly homogenization via flux discontinuity factors to the sides of regions lying on the boundary of the piecewise homogenized domain. Finally, for the familiar nodal discretization we numerically find that the FDRs obtained with no sub-mesh (nearly at no cost) can be effectively used for whole-core diffusion calculations with sub-mesh. This is not the case, however, for cell-centered finite differences. (authors)

  10. Homogeneous deuterium exchange using rhenium and platinum chloride catalysts

    International Nuclear Information System (INIS)

    Fawdry, R.M.

    1979-01-01

    Previous studies of homogeneous hydrogen isotope exchange are mostly confined to one catalyst, the tetrachloroplatinite salt. Recent reports have indicated that chloride salts of iridium and rhodium may also be homogeneous exchange catalysts similar to the tetrachloroplatinite, but with much lower activities. Exchange by these homogeneous catalysts is frequently accompanied by metal precipitation and the termination of homogeneous exchange, particularly in the case of alkane exchange. The studies presented in this thesis describe two different approaches to overcoming this limitation of homogeneous hydrogen isotope exchange catalysts. The first approach was to improve the stability of an existing homogeneous catalyst and the second was to develop a new homogeneous exchange catalyst which is free of the instability limitation

  11. Effect of homogenization heat treatments on the cast structure and tensile properties of nickel-base superalloy ATI 718Plus in the presence of boron and zirconium additions

    Energy Technology Data Exchange (ETDEWEB)

    Hosseini, Seyed Ali, E-mail: saliho3ini@gmail.com; Madar, Karim Zangeneh; Abbasi, Seyed Mehdi

    2017-03-24

    The effect of homogenization heat treatment on the cast structure, hardness, and tensile properties of the nickel-based superalloy ATI 718Plus in the presence of boron and zirconium additives was investigated. For this purpose, five alloys with different contents of boron (0.00–0.016 wt%) and zirconium (0.0–0.1 wt%) were cast by a double vacuum process (VIM/VAR) and then homogenized at 1075–1175 °C for 5–25 h. Microstructural investigation by OM and SEM and phase analysis by XRD were performed, and then hardness and high-temperature tensile tests were carried out on the homogenized alloys. The results show that the amount of the Laves phase is reduced by increases in homogenization time and temperature. It was also found that increasing the duration of homogenization at 1075 °C improves strength and ductility, while increasing the duration at 1175 °C degrades them, owing to the reduction of the needle-like delta phase on grain boundaries. Boron and zirconium had negative effects on the strength and ductility of the alloy by increasing the amount of Laves phase in the cast structure. With increasing content of these elements in the alloy composition, more time is needed to fully eliminate the Laves phase by homogenization treatment.

  12. A Survey on the Taxonomy of Cluster-Based Routing Protocols for Homogeneous Wireless Sensor Networks

    Science.gov (United States)

    Naeimi, Soroush; Ghafghazi, Hamidreza; Chow, Chee-Onn; Ishii, Hiroshi

    2012-01-01

    The past few years have witnessed increased interest among researchers in cluster-based protocols for homogeneous networks because of their better scalability and higher energy efficiency than other routing protocols. Given the limited capabilities of sensor nodes in terms of energy resources, processing and communication range, cluster-based protocols should be compatible with these constraints in either the setup state or the steady data transmission state. With a focus on these constraints, we classify routing protocols according to their objectives and methods towards addressing the shortcomings of the clustering process at each stage of cluster head selection, cluster formation, data aggregation and data communication. We summarize the techniques and methods used in these categories, while the weaknesses and strengths of each protocol are pointed out in detail. Furthermore, a taxonomy of the protocols in each phase is given to provide a deeper understanding of current clustering approaches. Ultimately, based on the existing research, a summary of the issues and solutions pertaining to the attributes and characteristics of clustering approaches, and some open research areas in cluster-based routing protocols that can be further pursued, are provided. PMID:22969350

  13. DNA methylation-based classification of central nervous system tumours

    DEFF Research Database (Denmark)

    Capper, David; Jones, David T.W.; Sill, Martin

    2018-01-01

    Accurate pathological diagnosis is crucial for optimal management of patients with cancer. For the approximately 100 known tumour types of the central nervous system, standardization of the diagnostic process has been shown to be particularly challenging, with substantial inter-observer variability in the histopathological diagnosis of many tumour types. Here we present a comprehensive approach for the DNA methylation-based classification of central nervous system tumours across all entities and age groups, and demonstrate its application in a routine diagnostic setting. We show...

  14. An Interview with Joe McMann: His Life Lessons

    Science.gov (United States)

    McMann, Joe

    2011-01-01

    Pica Kahn conducted "An Interview with Joe McMann: His Life Lessons" on May 23, 2011. With over 40 years of experience in the aerospace industry, McMann has gained a wealth of knowledge. Many have been interested in his biography, progression of work at NASA, impact on the U.S. spacesuit, and career accomplishments. This interview highlighted the influences and decision-making methods that impacted his technical and management contributions to the space program. McMann shared information about the accomplishments and technical advances that committed individuals can make.

  15. On coincidence of Pettis and McShane integrability

    Czech Academy of Sciences Publication Activity Database

    Fabian, Marián

    2015-01-01

    Roč. 65, č. 1 (2015), s. 83-106 ISSN 0011-4642 R&D Projects: GA ČR(CZ) GAP201/12/0290 Institutional support: RVO:67985840 Keywords : Pettis integral * McShane integral * MC-filling family Subject RIV: BA - General Mathematics Impact factor: 0.284, year: 2015 http://link.springer.com/article/10.1007/s10587-015-0161-x

  16. Hesburger goes on the attack, McDonald's digs trenches / Andres Reimer

    Index Scriptorium Estoniae

    Reimer, Andres

    2005-01-01

    Hesburger wants to triple the number of its restaurants in Estonia. McDonald's is not expanding in Estonia but intends to increase the company's efficiency. The Estonian hamburger restaurant chain Nehatu has dropped out of the competition. Tables: Results; Hamburger restaurants in Estonia; Hamburger and McDonald's worldwide. Annexes: Became a Hesburger partner; Become a McDonald's partner

  17. The homogeneous geometries of real hyperbolic space

    DEFF Research Database (Denmark)

    Castrillón López, Marco; Gadea, Pedro Martínez; Swann, Andrew Francis

    We describe the holonomy algebras of all canonical connections of homogeneous structures on real hyperbolic spaces in all dimensions. The structural results obtained then lead to a determination of the types, in the sense of Tricerri and Vanhecke, of the corresponding homogeneous tensors. We use our analysis to show that the moduli space of homogeneous structures on real hyperbolic space has two connected components.

  18. Taxonomic Review of the Caudatella heterocaudata (McDunnough) and C. hystrix (Traver) Complexes (Insecta: Ephemeroptera: Ephemerellidae)

    Directory of Open Access Journals (Sweden)

    Luke M. Jacobus

    2010-01-01

    Full Text Available Caudatella columbiella (McDunnough, 1935), new combination, (Insecta: Ephemeroptera: Ephemerellidae) is removed from synonymy with Caudatella heterocaudata (McDunnough, 1929), and a new junior synonym is recognized, based on comparative examination of type material and larval exuviae associated with adults from the type locale of C. columbiella (=C. californica (Allen and Edmunds, 1961), new status, new synonym). Caudatella circia (Allen and Edmunds, 1961), new status, is recognized as a strict specific synonym of C. heterocaudata (McDunnough, 1929) (=C. circia (Allen and Edmunds, 1961), new synonym). A neotype is designated for Caudatella hystrix (Traver, 1934), based on a specimen collected in western Montana, USA, during June 2000. Morphological differences between the type specimen of C. hystrix and the type specimens of its two junior synonyms, Ephemerella cascadia Allen and Edmunds, 1961, and E. spinosa Mayo, 1952, are detailed. An identification key for larvae of the genus Caudatella is included.

  19. Spinor structures on homogeneous spaces

    International Nuclear Information System (INIS)

    Lyakhovskii, V.D.; Mudrov, A.I.

    1993-01-01

    For multidimensional models of the interaction of elementary particles, the problem of constructing and classifying spinor fields on homogeneous spaces is exceptionally important. An algebraic criterion for the existence of spinor structures on homogeneous spaces used in multidimensional models is developed. A method of explicit construction of spinor structures is proposed, and its effectiveness is demonstrated in examples. The results are of particular importance for harmonic decomposition of spinor fields

  20. The Fort McMurray Demonstration Project in Social Marketing: theory, design, and evaluation.

    Science.gov (United States)

    Guidotti, T L; Ford, L; Wheeler, M

    2000-02-01

    The Fort McMurray Demonstration Project in Social Marketing is a multifaceted program that applies the techniques of social marketing to health and safety. This paper describes the origins of the project and the principles on which it was based. Venue: Fort McMurray, in the province of Alberta, Canada, was selected because the community had several community initiatives already underway and the project had the opportunity to demonstrate "value added." The project is distinguished from others by a model that attempts to achieve mutually reinforcing effects from social marketing in the community as a whole and from workplace safety promotion in particular. Specific interventions sponsored by the project include a media campaign on cable television, public activities in local schools, a community safety audit, and media appearances by a mascot that provides a visual identity for the project, a dinosaur named "Safetysaurus." The project integrated its activities with other community initiatives. The evaluation component emphasizes outcome measures. A final evaluation based on injury rates and attitudinal surveys is underway. Baseline data from the first round of surveys have been compiled and published. In 1995, Fort McMurray became the first city in North America to be given membership in the World Health Organization's Safe Community Network.

  1. Simulating vegetation response to climate change in the Blue Mountains with MC2 dynamic global vegetation model

    Directory of Open Access Journals (Sweden)

    John B. Kim

    2018-04-01

    Full Text Available Warming temperatures are projected to greatly alter many forests in the Pacific Northwest. MC2 is a dynamic global vegetation model: a climate-aware, process-based, gridded vegetation model. We calibrated and ran MC2 simulations for the Blue Mountains Ecoregion, Oregon, USA, at 30 arc-second spatial resolution, using the best available spatial datasets from land managers for calibration. We ran future simulations using climate projections from four global circulation models (GCMs) under representative concentration pathway 8.5. Under this scenario, forest productivity is projected to increase as the growing season lengthens, and fire occurrence is projected to increase steeply throughout the century, with burned area peaking early to mid-century. Subalpine forests are projected to disappear, and the coniferous forests to contract by 32.8%. Large portions of the dry and mesic forests are projected to convert to woodlands, unless precipitation were to increase. Low levels of change are projected for the Umatilla National Forest, consistently across the four GCMs. For the Wallowa-Whitman and the Malheur National Forests, projected forest conversions vary more across the four GCM-based simulations, reflecting high levels of uncertainty arising from climate. For simulations based on three of the four GCMs, sharply increased fire activity results in decreases in forest carbon stocks by mid-century, and the fire activity catalyzes widespread biome shift across the study area. We document the full cycle of a structured approach to calibrating and running MC2, for transparency and to serve as a template for applications of MC2. Keywords: Climate change, Regional change, Simulation, Calibration, Forests, Fire, Dynamic global vegetation model

  2. Characterization of melanocortin NDP-MSH agonist peptide fragments at the mouse central and peripheral melanocortin receptors.

    Science.gov (United States)

    Haskell-Luevano, C; Holder, J R; Monck, E K; Bauzo, R M

    2001-06-21

    The central melanocortin receptors, melanocortin-4 (MC4R) and melanocortin-3 (MC3R), are involved in the regulation of satiety and energy homeostasis. The MC4R in particular has become a pharmaceutical industry drug target due to its direct involvement in the regulation of food intake and its potential therapeutic application for the treatment of obesity-related diseases. The melanocortin receptors are stimulated by the native ligand, alpha-melanocyte stimulating hormone (alpha-MSH). The potent and enzymatically stable analogue NDP-MSH (Ac-Ser-Tyr-Ser-Nle-Glu-His-DPhe-Arg-Trp-Gly-Lys-Pro-Val-NH(2)) is a lead peptide for the identification of melanocortin amino acids important for receptor molecular recognition and stimulation. We have synthesized nine peptide fragments of NDP-MSH, deleting N- and C-terminal amino acids to determine the "minimally active" sequence of NDP-MSH. Additionally, five peptides were synthesized to study stereochemical inversion at the Phe7 and Trp9 positions in an attempt to increase tetra- and tripeptide potencies. These peptide analogues were pharmacologically characterized at the mouse melanocortin MC1, MC3, MC4, and MC5 receptors. This study identified the Ac-His-DPhe-Arg-Trp-NH(2) tetrapeptide as possessing 10 nM agonist activity at the brain MC4R. The tripeptide Ac-DPhe-Arg-Trp-NH(2) possessed micromolar agonist activities at the MC1R, MC4R, and MC5R, but only slight stimulatory activity was observed at the MC3R (at up to 100 microM concentration). This study also examined the importance of both N- and C-terminal NDP-MSH amino acids at the different melanocortin receptors, providing information for drug design and the identification of putative ligand-receptor interactions.

  3. Biogeography of the anuran amphibians of the central region of Argentina

    Directory of Open Access Journals (Sweden)

    Bridarolli, María E.

    1994-01-01

    Full Text Available The distribution of sixty anuran taxa in central Argentina (28°–36°S, 60°–68°W) is analyzed, as well as its correspondence with natural environments, taking into account phytogeographic formations, geomorphology, climatic zones and zoogeographic regions. An isoline map of anuran diversity was constructed. High diversity occurs in the central-east zone of the study area, coinciding with plain environments and heterogeneous phytogeographic formations; low values are found in homogeneous phytogeographic formations. A dendrogram was obtained following the UPGMA procedure, distinguishing 6 groups of phytogeographic associations based on amphibian distributions. A correspondence between natural environments and the presence of anurans is reported.

  4. Poisson-Jacobi reduction of homogeneous tensors

    International Nuclear Information System (INIS)

    Grabowski, J; Iglesias, D; Marrero, J C; Padron, E; Urbanski, P

    2004-01-01

    The notion of homogeneous tensors is discussed. We show that there is a one-to-one correspondence between multivector fields on a manifold M, homogeneous with respect to a vector field Δ on M, and first-order polydifferential operators on a closed submanifold N of codimension 1 such that Δ is transversal to N. This correspondence relates the Schouten-Nijenhuis bracket of multivector fields on M to the Schouten-Jacobi bracket of first-order polydifferential operators on N and generalizes the Poissonization of Jacobi manifolds. Actually, it can be viewed as a super-Poissonization. This procedure of passing from a homogeneous multivector field to a first-order polydifferential operator can also be understood as a sort of reduction; in the standard case, a half of a Poisson reduction. A dual version of the above correspondence yields in particular the correspondence between Δ-homogeneous symplectic structures on M and contact structures on N

  5. A Comparison of the McMaster and Circumplex Family Assessment Instruments.

    Science.gov (United States)

    Fristad, Mary A.

    1989-01-01

    Compared clinical rating scales and self-report scales from McMaster and Circumplex models of family functioning with families (N=41). Found McMaster instruments had superior sensitivity; greater correspondence between clinical rating scales and family member self-report inventories on McMaster instruments; and lack of support for the curvilinear…

  6. Regional homogeneity of electoral space: comparative analysis (on the material of 100 national cases

    Directory of Open Access Journals (Sweden)

    A. O. Avksentiev

    2015-12-01

    Full Text Available In the article the author examines the dependence of electoral behavior on territorial belonging. The categories of "regional homogeneity" and "electoral space" are conceptualized. It is argued that regional homogeneity is a characteristic of electoral space and can be quantified. The quantitative measurement of a polity's regional homogeneity has a direct connection with the risk of separatism, civil conflict, or legitimacy crises in deviant territories. A formula is proposed for the quantitative evaluation of regional homogeneity, based on standard instruments of statistical analysis, in particular the coefficient of variation. Possible directions of study using this index, both for individual political subjects and for a whole political space (state, region, electoral district), are defined. Appropriate indexes are calculated for the Ukrainian electoral space (returns of the 1991–2015 elections) and 100 other national cases. The dynamics of Ukraine's regional homogeneity is analyzed on the material of 1991–2015 electoral statistics.
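
    The proposed index can be sketched in a few lines; the vote shares below are hypothetical and serve only to show the computation, with a lower coefficient of variation indicating a more regionally homogeneous electoral space.

        # Minimal sketch (hypothetical data): regional homogeneity measured via
        # the coefficient of variation of one party's vote share across regions.
        import statistics

        regional_shares = [41.2, 38.7, 44.1, 12.5, 15.3, 40.8, 39.9]  # percent

        mean = statistics.mean(regional_shares)
        stdev = statistics.pstdev(regional_shares)   # population std deviation
        cv = stdev / mean                            # coefficient of variation

        # A high CV flags territorially polarized support, the separatism-risk
        # signal discussed in the article.
        print(f"mean={mean:.1f}%  stdev={stdev:.1f}  CV={cv:.2f}")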

  7. Angus McBean - Portraits

    NARCIS (Netherlands)

    Pepper, T.

    2007-01-01

    Angus McBean (1904-90) was one of the most extraordinary British photographers of the twentieth century. In a career that spanned the start of the Second World War through the birth of the 'Swinging Sixties' to the 1980s, he became the most prominent theatre photographer of his generation and, along

  8. The human Ago2 MC region does not contain an eIF4E-like mRNA cap binding motif

    Directory of Open Access Journals (Sweden)

    Grishin Nick V

    2009-01-01

    Full Text Available Abstract Background: Argonaute (Ago) proteins interact with small regulatory RNAs to mediate gene regulatory pathways. A recent report by Kiriakidou et al. [1] describes an MC sequence region identified in Ago2 that displays similarity to the cap-binding motif in translation initiation factor 4E (eIF4E). In a cap-bound eIF4E structure, two important aromatic residues of the motif stack on either side of a 7-methylguanosine 5'-triphosphate (m7Gppp) base. The corresponding Ago2 aromatic residues (F450 and F505) were hypothesized to perform the same cap-binding function. However, the detected similarity between the MC sequence and the eIF4E cap-binding motif was questionable. Results: A number of sequence-based and structure-based bioinformatics methods reveal the reported similarity between the Ago2 MC sequence region and the eIF4E cap-binding motif to be spurious. Alternatively, the MC sequence region is confidently assigned to the N-terminus of the Ago piwi module, within the mid domain of experimentally determined prokaryotic Ago structures. Confident mapping of the Ago2 MC sequence region to the piwi mid domain results in a homology-based structure model that positions the identified aromatic residues over 20 Å apart, with one of the aromatic side chains (F450) contributing instead to the hydrophobic core of the domain. Conclusion: Correct functional prediction based on weak sequence similarity requires substantial evolutionary and structural support. The evolutionary context of the Ago mid domain suggested by multiple sequence alignment is limited to a conserved hydrophobicity profile required for the fold and a motif following the MC region that binds guide RNA. Mapping of the MC sequence to the mid domain structure reveals Ago2 aromatics that are incompatible with eIF4E-like mRNA cap-binding, yet display some limited local structure similarities that cause the chance sequence match to eIF4E. Reviewers: This article was reviewed by Arcady Mushegian

  9. Sex-specific allelic transmission bias suggests sexual conflict at MC1R.

    Science.gov (United States)

    Ducret, Valérie; Gaigher, Arnaud; Simon, Céline; Goudet, Jérôme; Roulin, Alexandre

    2016-09-01

    Sexual conflict arises when selection in one sex causes the displacement of the other sex from its phenotypic optimum, leading to an inevitable tension within the genome - called intralocus sexual conflict. Although the autosomal melanocortin-1-receptor gene (MC1R) can generate colour variation in sexually dichromatic species, most previous studies have not considered the possibility that MC1R may be subject to sexual conflict. In the barn owl (Tyto alba), the allele MC1RWHITE is associated with whitish plumage coloration, typical of males, and the allele MC1RRUFOUS is associated with dark rufous coloration, typical of females, although each sex can express any phenotype. Because each colour variant is adapted to specific environmental conditions, the allele MC1RWHITE may be more strongly selected in males and the allele MC1RRUFOUS in females. We therefore investigated whether MC1R genotypes are in excess or deficit in male and female fledglings compared with the expected Hardy-Weinberg proportions. Our results show an overall deficit of 7.5% in the proportion of heterozygotes in males and of 12.9% in females. In males, interannual variation in assortative pairing with respect to MC1R explained the year-specific deviations from Hardy-Weinberg proportions, whereas in females, the deficit was better explained by the interannual variation in the probability of inheriting the MC1RWHITE or MC1RRUFOUS allele. Additionally, we observed that sons inherit the MC1RRUFOUS allele from their fathers on average slightly less often than expected under the first Mendelian law. Transmission ratio distortion may be adaptive in this sexually dichromatic species if males and females are, respectively, selected to display white and rufous plumages. © 2016 John Wiley & Sons Ltd.
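
    The Hardy-Weinberg comparison behind the reported heterozygote deficits can be sketched as follows; the genotype counts are hypothetical, not the study's data.

        # Minimal sketch (hypothetical counts): observed vs. Hardy-Weinberg
        # expected heterozygote proportion at a two-allele locus such as MC1R.
        def heterozygote_deficit(n_ww: int, n_wr: int, n_rr: int) -> float:
            """Relative deficit of WHITE/RUFOUS heterozygotes vs. 2pq."""
            n = n_ww + n_wr + n_rr
            p = (2 * n_ww + n_wr) / (2 * n)   # frequency of the WHITE allele
            q = 1.0 - p                       # frequency of the RUFOUS allele
            expected = 2 * p * q              # Hardy-Weinberg expectation
            observed = n_wr / n
            return 1.0 - observed / expected

        # Hypothetical counts (WHITE/WHITE, WHITE/RUFOUS, RUFOUS/RUFOUS):
        print(f"deficit = {heterozygote_deficit(120, 140, 60):.1%}")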

  10. Homogenization of food samples for gamma spectrometry using tetramethylammonium hydroxide and enzymatic digestion

    International Nuclear Information System (INIS)

    Kimi Nishikawa; Abdul Bari; Abdul Jabbar Khan; Xin Li; Traci Menia; Semkow, T.M.

    2017-01-01

    We have developed a method of food sample preparation for gamma spectrometry involving the use of tetramethylammonium hydroxide (TMAH) and/or enzymes such as α-amylase or cellulase for sample homogenization. We demonstrated the effectiveness of this method using food matrices spiked with 60Co, 131I, 134,137Cs, and 241Am radionuclides, homogenized with TMAH (mixed salad, parmesan cheese, and ground beef); enzymes (α-amylase for bread, and cellulase for baked beans); or α-amylase followed by TMAH (cheeseburgers). Procedures were developed that are the best compromise between the degree of homogenization, accuracy, speed, and minimizing laboratory equipment contamination. Based on calculated sample biases and z-scores, our results suggest that homogenization using TMAH and enzymes would be a useful method of sample preparation for gamma spectrometry samples during radiological emergencies. (author)
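
    The sample bias and z-score figures of merit mentioned above are the conventional ones; a minimal sketch with hypothetical activity values follows.

        # Minimal sketch (generic formulas, hypothetical numbers): relative bias
        # and z-score used to judge a spiked-sample recovery.
        def relative_bias(measured: float, reference: float) -> float:
            return (measured - reference) / reference

        def z_score(measured: float, reference: float, sigma: float) -> float:
            return (measured - reference) / sigma

        measured, reference, sigma = 98.2, 100.0, 2.5   # Bq/kg (illustrative)
        print(f"bias = {relative_bias(measured, reference):+.1%}")
        print(f"z    = {z_score(measured, reference, sigma):+.2f}")  # |z| <= 2 typically acceptable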

  11. Development of performance tests for an automated MC and A system

    International Nuclear Information System (INIS)

    Kok, K.D.; Klingensmith, R.W.; Riffle, W.; Carver, R.D.; Schumacher, S.S.; Lavender, J.C.; Smith, B.W.

    1988-01-01

    Performance tests of MC and A systems using computers are described, and some example scenarios are discussed. Facility personnel or an external inspection group may use performance tests to evaluate MC and A capabilities under actual operating conditions. This paper addresses test elements common to both user groups. Performance tests consist of scenario development, security, and safety measures, conduct, and evaluation. Because exercises often test scenarios against which the MC and A system is protecting, they may require additional security and safety measures. Computer systems enhance MC and A by providing greater speed, increased precision, and automated checks. Performance tests for MC and A systems using computers differ from performance tests employing manual data systems in that the scenarios must anticipate computer actions rather than utilize controllers to adjust the scenario

  12. Homogeneous Poisson structures

    International Nuclear Information System (INIS)

    Shafei Deh Abad, A.; Malek, F.

    1993-09-01

    We provide an algebraic definition of the Schouten product and give a decomposition for any homogeneous Poisson structure in any n-dimensional vector space. A large class of n-homogeneous Poisson structures in R^k is also characterized. (author). 4 refs

  13. CloudMC: a cloud computing application for Monte Carlo simulation.

    Science.gov (United States)

    Miras, H; Jiménez, R; Miras, C; Gomà, C

    2013-04-21

    This work presents CloudMC, a cloud computing application, developed in Windows Azure® (the Microsoft® cloud platform), for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code on which the simulations are based; the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with penelope were performed on different instance (virtual machine) sizes and for different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU on a single instance was completed in 48.6 min when executed on 64 instances in parallel (a speedup of 37×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay-per-usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice.
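
    The reported scaling follows Amdahl's law; the sketch below reproduces the arithmetic, with a serial fraction chosen so that 64 instances give roughly the reported 37× speedup (an illustrative value, not a parameter fitted in the paper).

        # Minimal sketch: Amdahl's-law speedup as a function of instance count.
        def amdahl_speedup(n_instances: int, serial_fraction: float) -> float:
            return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_instances)

        serial_fraction = 0.012   # assumed non-parallelizable fraction
        for n in (1, 8, 16, 32, 64):
            print(f"{n:3d} instances -> speedup {amdahl_speedup(n, serial_fraction):5.1f}x")

        # For comparison, 30 h of single-instance CPU completing in 48.6 min on
        # 64 instances is 30*60/48.6, i.e. about a 37x speedup.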

  14. Implementation Approach for Plug-in Electric Vehicles at Joint Base Lewis McChord. Task 4

    Energy Technology Data Exchange (ETDEWEB)

    Schey, Stephen [Idaho National Lab. (INL), Idaho Falls, ID (United States); Francfort, Jim [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    This study focused on Joint Base Lewis McChord (JBLM), which is located in Washington State. Task 1 consisted of a survey of the non-tactical fleet of vehicles at JBLM to begin the review of vehicle mission assignments and the types of vehicles in service. In Task 2, daily operational characteristics of select vehicles were identified and vehicle movements were recorded in data loggers in order to characterize the vehicles' missions. In Task 3, the results of the data analysis and observations were provided. Individual observations of the selected vehicles provided the basis for recommendations related to PEV adoption (i.e., whether a battery electric vehicle or plug-in hybrid electric vehicle [collectively referred to as PEVs] can fulfill the mission requirements), as well as the basis for recommendations related to placement of PEV charging infrastructure. This report focuses on an implementation plan for the near-term adoption of PEVs into the JBLM fleet.

  15. TU-AB-BRC-02: Accuracy Evaluation of GPU-Based OpenCL Carbon Monte Carlo Package (goCMC) in Biological Dose and Microdosimetry in Comparison to FLUKA Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Taleei, R; Peeler, C; Qin, N; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States)

    2016-06-15

    Purpose: One of the most accurate methods for radiation transport is Monte Carlo (MC) simulation. Long computation time prevents its wide application in the clinic. We have recently developed a fast MC code for carbon ion therapy called GPU-based OpenCL Carbon Monte Carlo (goCMC), and its accuracy in physical dose has been established. Since radiobiology is an indispensable aspect of carbon ion therapy, this study evaluates the accuracy of goCMC in biological dose and microdosimetry by benchmarking it against FLUKA. Methods: We performed simulations of a carbon pencil beam at 150, 300 and 450 MeV/u in a homogeneous water phantom using goCMC and FLUKA. Dose and energy spectra for primary and secondary ions on the central beam axis were recorded. The repair-misrepair-fixation model was employed to calculate Relative Biological Effectiveness (RBE). The Monte Carlo Damage Simulation (MCDS) tool was used to calculate microdosimetry parameters. Results: Physical dose differences on the central axis were <1.6% of the maximum value. Before the Bragg peak, differences in RBE and RBE-weighted dose were <2% and <1%. At the Bragg peak, the differences were 12.5%, caused by a small range discrepancy and the sensitivity of RBE to beam spectra. Consequently, the RBE-weighted dose difference was 11%. Beyond the peak, RBE differences were <20% and primarily caused by differences in the Helium-4 spectrum. However, the RBE-weighted dose agreed within 1% due to the low physical dose. Differences in microdosimetric quantities were small except at the Bragg peak. The simulation time per source particle with FLUKA was 0.08 sec, while goCMC was approximately 1000 times faster. Conclusion: Physical doses computed by FLUKA and goCMC were in good agreement. Although relatively large RBE differences were observed at and beyond the Bragg peak, the RBE-weighted dose differences were considered to be acceptable.

  16. Dependency of non-homogeneity energy dispersion on absorbance line-shape of luminescent polymers

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Marcelo Castanheira da, E-mail: mar_castanheira@yahoo.com.br [Centro de Ciências Biológicas e da Natureza, Universidade Federal do Acre, CP 500, 69915-900 Rio Branco, AC (Brazil); Instituto de Física, Universidade Federal de Uberlândia, CP 593, 38400-902 Uberlândia, MG (Brazil); Santos Silva, H.; Silva, R.A.; Marletta, Alexandre [Instituto de Física, Universidade Federal de Uberlândia, CP 593, 38400-902 Uberlândia, MG (Brazil)

    2013-01-16

    In this paper, we study the importance of the non-homogeneity energy dispersion for the absorption line-shape of luminescent polymers. The optical transition probability was calculated based on the molecular exciton model, Franck–Condon states, a Gaussian distribution of non-entangled chains with conjugation degree n, semi-empirical parameterization of the energy gap, the electric dipole moment, and electron-vibrational mode coupling. Within the approach of the 1/n functional dependence of the energy gap, the inclusion of the non-homogeneity energy dispersion 1/n² is essential to obtain good agreement with experimental data, mainly where the absorption spectra display peak widths of about 65 meV. For unresolved absorption spectra, such as those observed for a large number of conjugated polymers processed via the spin-coating technique, the non-homogeneity energy dispersion parameterization is not significant. The results were supported by applying the model to poly(p-phenylene vinylene) films.

  17. Rolling circle amplification based single-color quantum dots–ruthenium complex assembling dyads for homogeneous and highly selective detection of DNA

    Energy Technology Data Exchange (ETDEWEB)

    Su, Chen; Liu, Yufei; Ye, Tai; Xiang, Xia; Ji, Xinghu; He, Zhike, E-mail: zhkhe@whu.edu.cn

    2015-01-01

    Graphical abstract: A universal, label-free, homogeneous, highly sensitive, and selective fluorescent biosensor for DNA detection is developed by using rolling-circle amplification (RCA) based single-color quantum dots–ruthenium complex (QDs–Ru) assembling dyads. - Highlights: • The single-color QDs–Ru assembling dyads were applied in a homogeneous DNA assay. • This biosensor exhibited high selectivity against base-mismatched sequences. • This biosensor could serve as a universal platform for the detection of ssDNA. • This sensor could be used to detect the target in human serum samples. • This DNA sensor had good selectivity under the interference of other dsDNA. - Abstract: In this work, a new, label-free, homogeneous, highly sensitive, and selective fluorescent biosensor for DNA detection is developed by using rolling-circle amplification (RCA) based single-color quantum dots–ruthenium complex (QDs–Ru) assembling dyads. This strategy includes three steps: (1) the target DNA initiates the RCA reaction and generates linear RCA products; (2) the complementary DNA hybridizes with the RCA products to form long double-stranded DNA (dsDNA); (3) [Ru(phen)2(dppx)]2+ (dppx = 7,8-dimethyldipyrido[3,2-a:2′,3′-c]phenanthroline) intercalates into the long dsDNA with strong fluorescence emission. Due to its strong binding propensity for the long dsDNA, [Ru(phen)2(dppx)]2+ is removed from the surface of the QDs, restoring the fluorescence of the QDs, which had been quenched by [Ru(phen)2(dppx)]2+ through a photoinduced electron transfer process and is overlaid with the fluorescence of the dsDNA-bound Ru(II) polypyridyl complex (Ru-dsDNA). Thus, high fluorescence intensity is observed and is related to the concentration of target. This sensor exhibits not only high sensitivity for hepatitis B virus (HBV) ssDNA with a low detection limit (0.5 pM), but also excellent selectivity in the complex matrix. Moreover

  18. Collision-free gases in spatially homogeneous space-times

    International Nuclear Information System (INIS)

    Maartens, R.; Maharaj, S.D.

    1985-01-01

    The kinematical and dynamical properties of one-component collision-free gases in spatially homogeneous, locally rotationally symmetric (LRS) space-times are analyzed. Following Ray and Zimmerman [Nuovo Cimento B 42, 183 (1977)], it is assumed that the distribution function f of the gas inherits the symmetry of space-time, in order to construct solutions of Liouville's equation. Their further assumption that f be based on Killing vector constants of the motion is shown to be redundant. The Ray and Zimmerman results for Kantowski–Sachs space-time are extended to all spatially homogeneous LRS space-times. It is shown that in all these space-times the kinematic average four-velocity u^i can be tilted relative to the homogeneous hypersurfaces. This differs from the perfect fluid case, in which only one space-time admits tilted u^i, as shown by King and Ellis [Commun. Math. Phys. 31, 209 (1973)]. As a consequence, it is shown that all space-times admit nonzero acceleration and heat flow, while a subclass admits nonzero vorticity. The stress π_ij is proportional to the shear σ_ij by virtue of the invariance of the distribution function. The evolution of tilt and the existence of perfect fluid solutions are also discussed

  19. McMAC: Towards a MAC Protocol with Multi-Constrained QoS Provisioning for Diverse Traffic in Wireless Body Area Networks

    Directory of Open Access Journals (Sweden)

    Muhammad Mostafa Monowar

    2012-11-01

    Full Text Available The emergence of heterogeneous applications with diverse requirements for resource-constrained Wireless Body Area Networks (WBANs) poses significant challenges for provisioning Quality of Service (QoS) with multi-constraints (delay and reliability) while preserving energy efficiency. To address such challenges, this paper proposes McMAC, a MAC protocol with multi-constrained QoS provisioning for diverse traffic classes in WBANs. McMAC classifies traffic based on their multi-constrained QoS demands and introduces a novel superframe structure based on the "transmit-whenever-appropriate" principle, which allows diverse periods for diverse traffic classes according to their respective QoS requirements. Furthermore, a novel emergency packet handling mechanism is proposed to ensure packet delivery with the least possible delay and the highest reliability. McMAC is also modeled analytically, and extensive simulations were performed to evaluate its performance. The results reveal that McMAC achieves the desired delay and reliability guarantee according to the requirements of a particular traffic class while achieving energy efficiency.

  20. McMAC: towards a MAC protocol with multi-constrained QoS provisioning for diverse traffic in Wireless Body Area Networks.

    Science.gov (United States)

    Monowar, Muhammad Mostafa; Hassan, Mohammad Mehedi; Bajaber, Fuad; Al-Hussein, Musaed; Alamri, Atif

    2012-11-12

    The emergence of heterogeneous applications with diverse requirements for resource-constrained Wireless Body Area Networks (WBANs) poses significant challenges for provisioning Quality of Service (QoS) with multi-constraints (delay and reliability) while preserving energy efficiency. To address such challenges, this paper proposes McMAC, a MAC protocol with multi-constrained QoS provisioning for diverse traffic classes in WBANs. McMAC classifies traffic based on their multi-constrained QoS demands and introduces a novel superframe structure based on the "transmit-whenever-appropriate" principle, which allows diverse periods for diverse traffic classes according to their respective QoS requirements. Furthermore, a novel emergency packet handling mechanism is proposed to ensure packet delivery with the least possible delay and the highest reliability. McMAC is also modeled analytically, and extensive simulations were performed to evaluate its performance. The results reveal that McMAC achieves the desired delay and reliability guarantee according to the requirements of a particular traffic class while achieving energy efficiency.

  1. Study on development of ASCFR1.0/MC and initial calculation of moderator temperature effect for ASCFR

    International Nuclear Information System (INIS)

    Li Zhifeng; Yu Tao; Xie Jinsen

    2013-01-01

    In order to develop a temperature-dependent point-wise cross section library for the advanced supercritical water-cooled fast reactor, the JEZEBEL fast neutron benchmark was used to analyze the important parameters of the NJOY code and to compare the effects of different input parameters. The most reasonable parameters were then selected to develop ASCFR1.0/MC, which is based on ENDF/B-VII.1. The Doppler coefficient benchmark was applied to test and verify ASCFR1.0/MC, and its precision was found to be excellent; the resulting library can be used in the analysis and verification of temperature effects in the Advanced Supercritical Fast Reactor (ASCFR). Finally, the moderator temperature effect of the ASCFR was calculated with MCNP using the ASCFR1.0/MC library and was found to be positive. (authors)

  2. Interview with Eric McLuhan

    Directory of Open Access Journals (Sweden)

    McLuhan, Eric

    2011-01-01

    Full Text Available Marshall McLuhan would have turned 100 in 2011. To mark the date, which has already prompted a recent revisiting of the theorist's work, the XI International Communication Seminar, organized by PUCRS, highlighted themes such as man's relations with technologies and the psychological effects of the media to guide the discussions over the three days of the event. But anyone who believes that understanding McLuhan's work is an easy task is mistaken. Even though he worked hard to explain metaphors such as 'the medium is the message' and the 'global village', McLuhan was never completely understood. Either people did not take the trouble to understand him, or they simply could not see the changes that were taking place with the same clarity as the researcher. In an attempt not only to understand the work of the 'prophet of globalization', but also to broaden understandings of technologies and the extensions of man, Eric McLuhan, the theorist's son, took part in the Seminar as a speaker to discuss, among other subjects, the academic perception of his father's work, and to introduce new ideas about television, globalization and ecology

  3. Altered regional homogeneity of spontaneous brain activity in idiopathic trigeminal neuralgia

    Directory of Open Access Journals (Sweden)

    Wang Y

    2015-10-01

    Full Text Available The pathophysiology of idiopathic trigeminal neuralgia (ITN) has conventionally been explained by the neurovascular compression theory. Recent structural brain imaging evidence has suggested an additional central component in ITN pathophysiology. However, far less attention has been given to investigations of the basis of abnormal resting-state brain activity in these patients. The objective of this study was to investigate local brain activity in patients with ITN and its correlation with clinical variables of pain. Resting-state functional magnetic resonance imaging data from 17 patients with ITN and 19 age- and sex-matched healthy controls were analyzed using regional homogeneity (ReHo) analysis, which is a data-driven approach used to measure the regional synchronization of spontaneous brain activity. Patients with ITN had decreased ReHo in the left amygdala, right parahippocampal gyrus, and left cerebellum and increased ReHo in the right inferior temporal gyrus, right thalamus, right inferior parietal lobule, and left postcentral gyrus (corrected). Furthermore, the increase in ReHo in the left precentral gyrus was positively correlated with the visual analog scale score (r=0.54; P=0.002). Our study found abnormal functional homogeneity of intrinsic brain activity in several regions in ITN, suggesting maladaptation to the process of daily pain attacks and a central role for the pathophysiology of ITN. Keywords: trigeminal neuralgia, resting fMRI, brain, chronic pain, local connectivity

  4. A personal view on homogenization

    International Nuclear Information System (INIS)

    Tartar, L.

    1987-02-01

    The evolution of some ideas is first described. Under the name homogenization are collected all the mathematical results that help in understanding the relations between the microstructure of a material and its macroscopic properties. Homogenization results are surveyed through a critically detailed bibliography. The mathematical models considered are systems of partial differential equations, assumed to describe some properties at a scale ε, and we want to understand what happens to the solutions as ε tends to 0

  5. ATLAS Monte Carlo tunes for MC09

    CERN Document Server

    The ATLAS collaboration

    2010-01-01

    This note describes the ATLAS tunes of the underlying event and minimum bias description for the main Monte Carlo generators used in the MC09 production. For the main shower generators, pythia and herwig (with jimmy), the MRST LO* parton distribution functions (PDFs) were used for the first time in ATLAS. Special studies on the performance of these, conceptually new, PDFs for high-pT physics processes at LHC energies are presented. In addition, a tune of jimmy for CTEQ6.6 is presented, for use with MC@NLO.

  6. Mickey Mouse poses with a portrait of Ronald McNair

    Science.gov (United States)

    1999-01-01

    In the gymnasium of Ronald McNair Magnet School in Cocoa, Fla., Mickey Mouse poses with a portrait of NASA astronaut Ronald McNair. The portrait was presented to the school by Walt Disney World during a tribute to McNair. The school had previously been renamed for the fallen astronaut who was one of a crew of seven who lost their lives during an accident following launch of the Space Shuttle Challenger in January 1986.

  7. The great war correspondent: Francis McCullagh, 1874–1956

    OpenAIRE

    Horgan, John

    2009-01-01

    Trotsky of Russia knows Francis McCullagh. So does President Calles of Mexico. Peter, the King of Serbia, was McCullagh’s friend. The headhunters of the upper Amazon list Francis McCullagh as one of their principal deities. The warring tribes of Morocco call him blood brother. A room is always ready for him in the imperial palace of Siam. The latchstrings of hundreds of Siberian peasant huts are out in anticipation of his coming.

  8. Esther McCready, RN: Nursing Advocate for Civil Rights

    Science.gov (United States)

    Pollitt, Phoebe A

    2016-02-15

    More than a decade before the Civil Rights Act of 1964, as an African American teenager from Baltimore, Maryland, Esther McCready challenged the discriminatory admissions policies of the University of Maryland School of Nursing (UMSON). The article explores nurse advocacy and how Esther McCready advocated for herself and greater racial equity in nursing education during a time of civil rights turmoil. Her actions eventually resulted in the formation of numerous schools of nursing for African Americans across the south. This article recounts McCready’s early life experiences and the powerful impact her actions had on creating educational options for nurses during a time when they were severely limited for African American women, including discussion of her student days at UMSON and her journey after nursing school. A review of pertinent legal cases and policies related to segregation and integration of higher education in the mid-twentieth century is presented, along with details of McCready’s continued education and advocacy.

  9. Graphical user interfaces for McClellan Nuclear Radiation Center (MNRC)

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S. A.

    1998-01-01

    McClellan's Nuclear Radiation Center (MNRC) control console is in the process of being replaced due to spurious scrams, outdated software, and obsolete parts. The intent of the new control console is to eliminate the existing problems by installing a UNIX-based computer system with industry-standard interface software and incorporating human factors during all stages of the graphical user interface (GUI) development and control console design

  10. Differential Muscle Involvement in Mice and Humans Affected by McArdle Disease

    DEFF Research Database (Denmark)

    Krag, Thomas O; Pinós, Tomàs; Nielsen, Tue L

    2016-01-01

    McArdle disease (muscle glycogenosis type V) is caused by myophosphorylase deficiency, which leads to impaired glycogen breakdown. We investigated how myophosphorylase deficiency affects muscle physiology, morphology, and glucose metabolism in 20-week-old McArdle mice and compared the findings to those in McArdle disease patients. Muscle contractions in the McArdle mice were affected by structural degeneration due to glycogen accumulation, and glycolytic muscles fatigued prematurely, as occurs in the muscles of McArdle disease patients. Homozygous McArdle mice showed muscle fiber disarray … no substitution for the missing muscle isoform. In the mice, the tibialis anterior (TA) muscles were invariably more damaged than the quadriceps muscles. This may relate to a 7-fold higher level of myophosphorylase in TA compared to quadriceps in wild-type mice and suggests higher glucose turnover in the TA. Thus

  11. A homogeneous cooling scheme investigation for high power slab laser

    Science.gov (United States)

    He, Jianguo; Lin, Weiran; Fan, Zhongwei; Chen, Yanzhong; Ge, Wenqi; Yu, Jin; Liu, Hao; Mo, Zeqiang; Fan, Lianwen; Jia, Dan

    2017-10-01

    Forced convective heat transfer, with its advantages of reliability and durability, is widely used in cooling the laser gain medium. However, a flow-direction-induced temperature gradient always appears. In this paper, a novel cooling configuration based on longitudinal forced convective heat transfer is presented. In comparison with two different types of configurations, it shows more efficient heat transfer and a more homogeneous temperature distribution. The investigation of the flow rate reveals that the higher the flow rate, the better the cooling performance. Furthermore, the simulation results with a 20 L/min flow rate show an adequate temperature level and temperature homogeneity while keeping a lower hydrostatic pressure in the flow path.

  12. Toehold-mediated nonenzymatic amplification circuit on graphene oxide fluorescence switching platform for sensitive and homogeneous microRNA detection

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Ru; Liao, Yuhui; Zhou, Xiaoming, E-mail: zhouxm@scnu.edu.cn; Xing, Da, E-mail: xingda@scnu.edu.cn

    2015-08-12

    A novel graphene oxide (GO) fluorescence switch-based homogeneous system has been developed to solve two problems that are commonly encountered in conventional GO-based biosensors. First, with the assistance of toehold-mediated nonenzymatic amplification (TMNA), the sensitivity of this system greatly surpasses that of previously described GO-based biosensors, which are usually limited to the nM range due to the lack of efficient signal amplification. Second, without enzymatic participation in the amplification, the unreliability of detection resulting from nonspecific desorption of DNA probes from the GO surface by enzymatic protein can be avoided. Moreover, the interaction mechanism between GO and the double-stranded TMNA products, which contain several single-stranded toeholds at the two ends, has also been explored with a combination of atomic force microscopy imaging, zeta potential detection, and fluorescence assays. It has been shown that the hybrids can be anchored to the surface of GO through the end with more unpaired bases, and that the other end, which has a weaker interaction with GO, can escape GO adsorption due to the robustness of the central dsDNA structures. We verified this GO fluorescence switch-based detection system by detecting microRNA 21, an overexpressed non-coding gene in a variety of malignant cells. Rational design of the probes allowed the isothermal nonenzymatic reaction to achieve more than 100-fold amplification efficiency. The detection results showed that our strategy has a detection limit of 10 pM and a detection range of four orders of magnitude. - Highlights: • This paper explored the interaction mechanism of TMNA products with the GO surface. • This homogeneous and isothermal system permits a detection limit of 10 pM for microRNA. • This nonenzymatic strategy can avoid nonspecific desorption caused by enzyme protein. • The interaction model can be used to explore the application ability of the nonenzymatic circuit.

  13. Homogeneous turbulence dynamics

    CERN Document Server

    Sagaut, Pierre

    2018-01-01

    This book provides state-of-the-art results and theories in homogeneous turbulence, including anisotropy and compressibility effects, with extension to quantum turbulence, magneto-hydrodynamic turbulence and turbulence in non-Newtonian fluids. Each chapter is devoted to a given type of interaction (strain, rotation, shear, etc.), and presents and compares experimental data, numerical results, analysis of the Reynolds stress budget equations and advanced multipoint spectral theories. The role of both linear and non-linear mechanisms is emphasized. The link between the statistical properties and the dynamics of coherent structures is also addressed. Despite its restriction to homogeneous turbulence, the book is of interest to all people working in turbulence, since the basic physical mechanisms which are present in all turbulent flows are explained. The reader will find a unified presentation of the results and a clear presentation of existing controversies. Special attention is given to bridge the results obta...

  14. The Obama and McCain story in the American presidential election: who will win? / Jonatan Vseviov

    Index Scriptorium Estoniae

    Vseviov, Jonatan

    2008-01-01

    The author analyzes the election campaigns of US presidential candidates Barack Obama and John McCain, the stories they tell about the past and the future, in which the central character is the candidate himself, and their choices of vice-presidential candidates. Obama's is a narrative of change, a desire to change existing politics, while McCain's is a narrative of experience and independence. See also: Barack Obama; John McCain; Hendrik Vosman. Obama prepares to assemble a cabinet of stars; Evelyn Kaldoja. Michelle Obama is compared to Jackie Kennedy; Beer princess with a sense of mission, Cindy Hensley McCain

  15. Homogenizers of Laser Radiation Fabricated by Laser Engraving of Polymer Films

    Directory of Open Access Journals (Sweden)

    V. K. Balya

    2013-01-01

    Full Text Available The article deals with homogenizers of laser radiation based on crossed micro-prism structures and random diffusers, fabricated by laser ablation of thermo-sensitive films and subsequent nano-imprint copying.

  16. Simulations of NMR pulse sequences during equilibrium and non-equilibrium chemical exchange

    International Nuclear Information System (INIS)

    Helgstrand, Magnus; Haerd, Torleif; Allard, Peter

    2000-01-01

    The McConnell equations combine the differential equations for a simple two-state chemical exchange process with the Bloch differential equations for a classical description of the behavior of nuclear spins in a magnetic field. This equation system provides a useful starting point for the analysis of slow, intermediate and fast chemical exchange studied using a variety of NMR experiments. The McConnell equations are in the mathematical form of an inhomogeneous system of first-order differential equations. Here we rewrite the McConnell equations in a homogeneous form in order to facilitate fast and simple numerical calculation of the solution to the equation system. The McConnell equations can only treat equilibrium chemical exchange. We therefore also present a homogeneous equation system that can handle both equilibrium and non-equilibrium chemical processes correctly, as long as the kinetics is first order. Finally, the same method of rewriting the inhomogeneous form of the McConnell equations into a homogeneous form is applied to a quantum mechanical treatment of a spin system in chemical exchange. In order to illustrate the homogeneous McConnell equations, we have simulated pulse sequences useful for measuring exchange rates in slow, intermediate and fast chemical exchange processes. A stopped-flow NMR experiment was simulated using the equations for non-equilibrium chemical exchange. The quantum mechanical treatment was tested by the simulation of a sensitivity-enhanced 15N-HSQC experiment with pulsed field gradients during slow chemical exchange and by the simulation of the transfer efficiency of a two-dimensional heteronuclear cross-polarization based experiment as a function of both chemical shift difference and exchange rate constants
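
    As a concrete illustration of the homogenization trick described above, the following sketch embeds an inhomogeneous linear system dM/dt = A·M + b into a homogeneous one by appending a constant dummy state, so that evolution over a delay reduces to a single matrix exponential. This is a minimal toy example in Python; the 2x2 rate/relaxation matrix and all numerical values are illustrative placeholders, not the paper's full McConnell matrix.

        import numpy as np
        from scipy.linalg import expm

        def augment(A, b):
            """Embed dM/dt = A M + b into a homogeneous system of size n+1."""
            n = A.shape[0]
            A_aug = np.zeros((n + 1, n + 1))
            A_aug[:n, :n] = A
            A_aug[:n, n] = b   # the inhomogeneous term enters the last column
            return A_aug       # last row is zero, so the extra state stays at 1

        def evolve(M0, A, b, t):
            """Propagate the magnetization for time t via one matrix exponential."""
            x0 = np.append(M0, 1.0)
            return (expm(augment(A, b) * t) @ x0)[:-1]

        # Toy two-site exchange with longitudinal relaxation (illustrative values):
        k_ab, k_ba, R1 = 5.0, 3.0, 1.0                  # rate constants in 1/s
        p_a, p_b = k_ba / (k_ab + k_ba), k_ab / (k_ab + k_ba)
        A = np.array([[-(k_ab + R1), k_ba],
                      [k_ab, -(k_ba + R1)]])
        b = np.array([R1 * p_a, R1 * p_b])              # drives M toward equilibrium
        print(evolve(np.array([0.0, 0.0]), A, b, t=2.0))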

  17. Centralized light-source optical access network based on polarization multiplexing.

    Science.gov (United States)

    Grassi, Fulvio; Mora, José; Ortega, Beatriz; Capmany, José

    2010-03-01

    This paper presents and demonstrates a centralized-light-source optical access network based on an optical polarization multiplexing technique. By using two optical sources in the Central Node that emit orthogonally polarized light for downstream and upstream operations, the Remote Node is kept source-free. EVM values below telecommunication standard requirements were measured experimentally when bidirectional digital signals were transmitted over 10 km of SMF using subcarrier multiplexing in the electrical domain.

  18. Third party testing : new pilot facility for mining processes opens in Fort McKay

    International Nuclear Information System (INIS)

    Jaremko, D.

    2007-01-01

    Fort McKay lies 65 kilometres north of Fort McMurray, Alberta and is the centre of operational oilsands mining activity. As such, it was chosen for a pilot testing facility created by the Geneva-based SGS Group. The reputable facility provides an opportunity for mining producers to advance their processes, including environmental performance, by allowing them to test different processes on their own oilsands. The Northern Lights partnership, led by Synenco Energy, was the first client at the facility. Because the work is outsourced, clients are not obligated to make substantial capital investments in in-house research. The Northern Lights partnership will be using the facility to test extraction processes on bitumen from its leases. Although the Fort McKay facility is SGS's first venture into the oilsands industry, the company operates in more than 140 countries globally, including in the minerals industry, and specializes in inspection, verification, testing and certification. SGS drew on the experience of its minerals extraction business to identify what could be done to help the oilsands industry, using best practices developed from its global operations. The facility lies on the Fort McKay industrial park owned by the Fort McKay First Nation. An existing testing facility called McMurray Resources Research and Testing was expanded by the SGS Group to include environmental analysis capabilities. The modular units, which occupy 6 acres, include refrigerated ore storage to maintain ore integrity; modular ore and materials handling systems; extraction equipment; and zero-discharge process water and waste disposal systems. Froth treatment will be added in the near future to cover the entire upstream side of the mining processing business. A micro-upgrader might be added later to manufacture synthetic crude. 3 figs

  19. Third party testing : new pilot facility for mining processes opens in Fort McKay

    Energy Technology Data Exchange (ETDEWEB)

    Jaremko, D.

    2007-12-15

    Fort McKay lies 65 kilometres north of Fort McMurray, Alberta and is the centre of operational oilsands mining activity. As such, it was chosen for a pilot testing facility created by the Geneva-based SGS Group. The reputable facility provides an opportunity for mining producers to advance their processes, including environmental performance, by allowing them to test different processes on their own oilsands. The Northern Lights partnership, led by Synenco Energy, was the first client at the facility. Because the work is outsourced, clients are not obligated to make substantial capital investments in in-house research. The Northern Lights partnership will be using the facility to test extraction processes on bitumen from its leases. Although the Fort McKay facility is SGS's first venture into the oilsands industry, the company operates in more than 140 countries globally, including in the minerals industry, and specializes in inspection, verification, testing and certification. SGS drew on the experience of its minerals extraction business to identify what could be done to help the oilsands industry, using best practices developed from its global operations. The facility lies on the Fort McKay industrial park owned by the Fort McKay First Nation. An existing testing facility called McMurray Resources Research and Testing was expanded by the SGS Group to include environmental analysis capabilities. The modular units, which occupy 6 acres, include refrigerated ore storage to maintain ore integrity; modular ore and materials handling systems; extraction equipment; and zero-discharge process water and waste disposal systems. Froth treatment will be added in the near future to cover the entire upstream side of the mining processing business. A micro-upgrader might be added later to manufacture synthetic crude. 3 figs.

  20. A generalized model for homogenized reflectors

    International Nuclear Information System (INIS)

    Pogosbekyan, Leonid; Kim, Yeong Il; Kim, Young Jin; Joo, Hyung Kook

    1996-01-01

    A new concept of equivalent homogenization is proposed. The concept employs a new set of homogenized parameters: homogenized cross sections (XS) and an interface matrix (IM), which relates the partial currents at the cell interfaces. The idea of the interface matrix generalizes the idea of discontinuity factors (DFs), proposed and developed by K. Koebke and K. Smith. The method of K. Smith can be simulated within the framework of the new method, while the new method approximates the heterogeneous cell better in the case of steep flux gradients at the cell interfaces. The attractive features of the new concept are: improved accuracy, simplicity of incorporation into existing codes, and numerical expense equal to that of K. Smith's approach. The new concept is useful for: (a) explicit reflector/baffle simulation; (b) control blade simulation; (c) mixed UO2/MOX core simulation. The proposed model has been incorporated in a finite difference code and in the nodal code PANBOX. The numerical results show good accuracy of core calculations and insensitivity of the homogenized parameters with respect to in-core conditions
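
    In generic notation (ours, not necessarily the paper's), the interface matrix couples the vector of outgoing partial currents on the cell faces to the vector of incoming ones; the abstract's remark that K. Smith's scheme can be simulated within the framework then amounts, loosely speaking, to a constrained choice of IM. A minimal sketch in LaTeX:

        \mathbf{J}^{\mathrm{out}} = \mathrm{IM}\,\mathbf{J}^{\mathrm{in}},
        \qquad
        \mathbf{J}^{\mathrm{out}},\ \mathbf{J}^{\mathrm{in}} \in \mathbb{R}^{F}
        \quad \text{(one partial current per face and energy group)}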

  1. Validation of Shielding Analysis Capability of SuperMC with SINBAD

    Directory of Open Access Journals (Sweden)

    Chen Chaobin

    2017-01-01

    Full Text Available Abstract: The shielding analysis capability of SuperMC was validated with the Shielding Integral Benchmark Archive Database (SINBAD). SINBAD was compiled by RSICC and NEA; it includes numerous benchmark experiments performed with the D-T fusion neutron source facilities of OKTAVIAN, FNS, IPPE, etc. The results from the SuperMC simulation were compared with experimental data and MCNP results. Very good agreement, with deviations lower than 1%, was achieved, which suggests that SuperMC is reliable for shielding calculations.

  2. Dose rates from a C-14 source using extrapolation chamber and MC calculations

    International Nuclear Information System (INIS)

    Borg, J.

    1996-05-01

    The extrapolation chamber technique and the Monte Carlo (MC) calculation technique based on the EGS4 system have been studied for application to the determination of dose rates in a low-energy β radiation field, e.g., that from a 14C source. The extrapolation chamber measurement method is the basic method for determination of dose rates in β radiation fields. By applying a number of correction factors and the stopping power ratio, tissue to air, the measured dose rate in an air volume surrounded by tissue-equivalent material is converted into dose to tissue. Various details of the extrapolation chamber measurement method and evaluation procedure have been studied and further developed, and a complete procedure for the experimental determination of dose rates from a 14C source is presented. A number of correction factors and other parameters used in the evaluation procedure for the measured data have been obtained by MC calculations. The whole extrapolation chamber measurement procedure was simulated using the MC method. The measured dose rates showed an increasing deviation from the MC calculated dose rates as the absorber thickness increased. This indicates that the EGS4 code may have some limitations for the transport of very low-energy electrons, i.e., electrons with estimated energies less than 10-20 keV. MC calculations of dose to tissue were performed using two models: a cylindrical tissue phantom and a computer model of the extrapolation chamber. The dose to tissue in the extrapolation chamber model showed an additional buildup dose compared to the dose in the tissue model. (au) 10 tabs., 11 ills., 18 refs
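
    The conversion chain the abstract describes can be sketched as follows. This is a hedged illustration, not the report's actual procedure: the constants, the single lumped product of correction factors, and the simple linear fit standing in for the zero-spacing extrapolation are all assumptions.

        import numpy as np

        W_OVER_E = 33.97        # J/C, mean energy expended in air per ion pair
        S_TISSUE_AIR = 1.12     # stopping power ratio, tissue to air (assumed value)
        RHO_AIR = 1.205e-3      # g/cm^3, air density at reference conditions
        AREA = 7.07             # cm^2, collecting electrode area (assumed value)
        K_CORR = 0.985          # product of the correction factors k_i (assumed)

        def dose_rate_to_tissue(spacings_cm, currents_A):
            """Dose rate to tissue (Gy/s) from ionization current vs. spacing."""
            # The slope of a linear fit over small spacings stands in for the
            # extrapolated zero-spacing slope dI/dd.
            slope, _ = np.polyfit(spacings_cm, currents_A, 1)   # A per cm
            mass_per_cm = RHO_AIR * 1e-3 * AREA                 # kg of air per cm
            dose_rate_air = W_OVER_E * slope / mass_per_cm      # Gy/s in the cavity
            return dose_rate_air * S_TISSUE_AIR * K_CORR

        d = np.array([0.05, 0.10, 0.15, 0.20])                  # cm
        i = np.array([0.8e-12, 1.6e-12, 2.4e-12, 3.2e-12])      # A (toy data)
        print(dose_rate_to_tissue(d, i))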

  3. Dissolution test for homogeneity of mixed oxide fuel pellets

    International Nuclear Information System (INIS)

    Lerch, R.E.

    1979-08-01

    Experiments were performed to determine the relationship between fuel pellet homogeneity and pellet dissolubility. Although, in general, the amount of pellet residue decreased with increased homogeneity, as measured by the pellet figure of merit, the relationship was not absolute. Thus, all pellets with high figure of merit (excellent homogeneity) do not necessarily dissolve completely and all samples that dissolve completely do not necessarily have excellent homogeneity. It was therefore concluded that pellet dissolubility measurements could not be substituted for figure of merit determinations as a measurement of pellet homogeneity. 8 figures, 3 tables

  4. A Test for Parameter Homogeneity in CO2 Panel EKC Estimations

    Energy Technology Data Exchange (ETDEWEB)

    Dijkgraaf, E. [Erasmus University Rotterdam and SEOR, Rotterdam (Netherlands); Vollebergh, H.R.J. [Department of Economics, Erasmus University Rotterdam, PO Box 1738, 3000 DR Rotterdam (Netherlands)

    2005-10-15

    This paper casts doubt on empirical results based on panel estimations of an 'inverted-U' relationship between per capita GDP and pollution. Using a new dataset for OECD countries on carbon dioxide emissions for the period 1960-1997, we find that the crucial assumption of homogeneity across countries is problematic. Model specifications that feature even weaker homogeneity assumptions than are commonly used are decisively rejected. Furthermore, our results challenge the existence of an overall Environmental Kuznets Curve for carbon dioxide emissions.
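
    To make the homogeneity assumption concrete, the sketch below fits the usual quadratic (inverted-U) specification, ln E = a + b1·ln y + b2·(ln y)^2, once on the pooled sample and once per country, and forms a Chow-type F statistic from the residual sums of squares. It is an illustrative simplification: the paper's actual specifications (fixed effects, time trends, and the weaker homogeneity variants it tests) are omitted here.

        import numpy as np

        def rss(y, X):
            """Residual sum of squares of an OLS fit."""
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            r = y - X @ beta
            return float(r @ r)

        def homogeneity_F(ln_e, ln_y, country):
            """Chow-type F test: pooled quadratic EKC vs. country-specific fits."""
            X = np.column_stack([np.ones_like(ln_y), ln_y, ln_y ** 2])
            rss_pooled = rss(ln_e, X)
            groups = np.unique(country)
            rss_free = sum(rss(ln_e[country == c], X[country == c]) for c in groups)
            k, n, g = X.shape[1], ln_e.size, groups.size
            df1, df2 = (g - 1) * k, n - g * k   # restrictions / free-model dof
            return ((rss_pooled - rss_free) / df1) / (rss_free / df2)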

  5. Associations of MC1R Genotype and Patient Phenotypes with BRAF and NRAS Mutations in Melanoma.

    Science.gov (United States)

    Thomas, Nancy E; Edmiston, Sharon N; Kanetsky, Peter A; Busam, Klaus J; Kricker, Anne; Armstrong, Bruce K; Cust, Anne E; Anton-Culver, Hoda; Gruber, Stephen B; Luo, Li; Orlow, Irene; Reiner, Anne S; Gallagher, Richard P; Zanetti, Roberto; Rosso, Stefano; Sacchetto, Lidia; Dwyer, Terence; Parrish, Eloise A; Hao, Honglin; Gibbs, David C; Frank, Jill S; Ollila, David W; Begg, Colin B; Berwick, Marianne; Conway, Kathleen

    2017-12-01

    Associations of MC1R with BRAF mutations in melanoma have been inconsistent between studies. We sought to determine, for 1,227 participants in the international population-based Genes, Environment, and Melanoma (GEM) study, whether MC1R and phenotypes were associated with melanoma BRAF/NRAS subtypes. We used logistic regression adjusted for age, sex, and study design features and examined effect modifications. BRAF+ melanomas were associated with younger age, blond/light brown hair, increased nevi, and less freckling, and NRAS+ melanomas with older age, relative to the wild-type (BRAF-/NRAS-) melanomas (all P < 0.05). Comparing specific BRAF subtypes to the wild type, BRAF V600E was associated with younger age, blond/light brown hair, and increased nevi, and V600K with increased nevi and less freckling (all P < 0.05). MC1R was positively associated with BRAF V600E cases, but only among individuals with darker phototypes or darker hair (P-interaction < 0.05), and inversely associated with BRAF V600K (P-trend = 0.006) with no significant effect modification by phenotypes. These results support distinct etiologies for BRAF V600E, BRAF V600K, NRAS+, and wild-type melanomas. MC1R's associations with BRAF V600E cases, limited to individuals with darker phenotypes, indicate that MC1R genotypes specifically provide information about BRAF V600E melanoma risk in those not considered high risk based on phenotype. Our results also suggest that melanin pathways deserve further study in BRAF V600E melanomagenesis. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  6. MC-PDFT can calculate singlet-triplet splittings of organic diradicals

    Science.gov (United States)

    Stoneburner, Samuel J.; Truhlar, Donald G.; Gagliardi, Laura

    2018-02-01

    The singlet-triplet splittings of a set of diradical organic molecules are calculated using multiconfiguration pair-density functional theory (MC-PDFT), and the results are compared with those obtained by Kohn-Sham density functional theory (KS-DFT) and complete active space second-order perturbation theory (CASPT2) calculations. We found that MC-PDFT, even with small and systematically defined active spaces, is competitive in accuracy with CASPT2, and it yields results with greater accuracy and precision than Kohn-Sham DFT with the parent functional. MC-PDFT also avoids the challenges associated with spin contamination in KS-DFT. It is also shown that MC-PDFT is much less computationally expensive than CASPT2 when applied to larger active spaces, and this illustrates the promise of this method for larger diradical organic systems.

  7. Type of homogenization and fat loss during continuous infusion of human milk.

    Science.gov (United States)

    García-Lara, Nadia Raquel; Escuder-Vieco, Diana; Alonso Díaz, Clara; Vázquez Román, Sara; De la Cruz-Bértolo, Javier; Pallás-Alonso, Carmen Rosa

    2014-11-01

    Substantial fat loss may occur during continuous feeding of human milk (HM). A decrease in fat loss has been described following homogenization. Well-established methods of homogenization of HM for routine use in the neonatal intensive care unit (NICU) would be desirable. We compared the loss of fat based on the use of 3 different methods for homogenizing thawed HM during continuous feeding. Sixteen frozen donor HM samples were thawed, homogenized with ultrasound and separated into 3 aliquots ("baseline agitation," "hourly agitation," and "ultrasound"), and then frozen for 48 hours. Aliquots were thawed again and baseline agitation was applied. Subsequently, the baseline agitation and hourly agitation aliquots were drawn into syringes, while ultrasound was applied to the ultrasound aliquot before it was drawn into a syringe. The syringes were loaded into a pump (2 mL/h; 4 hours). At hourly intervals the hourly agitation infusion was stopped, and the syringe was disconnected and gently shaken. During infusion, samples from the 3 groups were collected hourly for analysis of fat and caloric content. The 3 groups showed similar fat content at the beginning of the infusion. For fat, mean (SD) hourly changes of -0.03 (0.01), -0.09 (0.01), and -0.09 (0.01) g/dL were observed for the hourly agitation, baseline agitation, and ultrasound groups, respectively. The decrease was smaller for the hourly agitation group, indicating that fat loss during continuous feeding is reduced when hourly agitation homogenization is used. © The Author(s) 2014.

  8. Understanding Urban Spatial Structure of Shanghai Central City Based on Mobile Phone Data

    Institute of Scientific and Technical Information of China (English)

    Niu Xinyi; Ding Liang; Song Xiaodong; Zhang Qingfei

    2015-01-01

    Taking Shanghai Central City as its case study, this paper presents an approach to exploring urban spatial structure through mobile phone positioning data. First, based on base station location data and mobile phone signaling data, the paper analyses the number of users connecting to each base station and generates maps of mobile phone user density through kernel density analysis. We then calculate the multi-day average user density at 10:00 and 23:00 on workdays and at 15:00 and 23:00 at weekends for Shanghai Central City. Through spatial aggregation and density classification on the density maps for 10:00 on workdays and 15:00 at weekends, we identify the ranks and functions of public centers within Shanghai Central City. Lastly, we identify residential areas, business office areas, and leisure areas in Shanghai Central City and measure the degree of functional mix by comparing the ratio of day and night user density as well as the nighttime user density on workdays and weekends.
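
    A minimal sketch of the density-mapping step, assuming a simple Gaussian kernel weighted by the user count at each base station; the grid, bandwidth, and coordinates are placeholders, and the paper's GIS toolchain may use a different kernel.

        import numpy as np

        def user_density(stations_xy, user_counts, grid_x, grid_y, bandwidth=500.0):
            """Weighted Gaussian kernel density of users on a regular grid."""
            gx, gy = np.meshgrid(grid_x, grid_y)
            density = np.zeros_like(gx, dtype=float)
            for (x, y), w in zip(stations_xy, user_counts):
                d2 = (gx - x) ** 2 + (gy - y) ** 2
                density += w * np.exp(-0.5 * d2 / bandwidth ** 2)
            return density / (2.0 * np.pi * bandwidth ** 2)   # users per unit area

        # Toy example: two base stations on a 1 km x 1 km grid (metre units).
        grid = np.linspace(0.0, 1000.0, 101)
        dens = user_density([(300.0, 400.0), (700.0, 600.0)], [120, 80], grid, grid)
        print(dens.max())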

  9. Homogenization patterns of the world's freshwater fish faunas.

    Science.gov (United States)

    Villéger, Sébastien; Blanchet, Simon; Beauchard, Olivier; Oberdorff, Thierry; Brosse, Sébastien

    2011-11-01

    The world is currently undergoing an unprecedented decline in biodiversity, which is mainly attributable to human activities. For instance, nonnative species introduction, combined with the extirpation of native species, affects biodiversity patterns, notably by increasing the similarity among species assemblages. This biodiversity change, called taxonomic homogenization, has rarely been assessed at the world scale. Here, we fill this gap by assessing the current homogenization status of one of the most diverse vertebrate groups (i.e., freshwater fishes) at global and regional scales. We demonstrate that current homogenization of the freshwater fish faunas is still low at the world scale (0.5%) but reaches substantial levels (up to 10%) in some highly invaded river basins of the Nearctic and Palearctic realms. In the realms experiencing strong changes, nonnative species introductions rather than native species extirpations drive taxonomic homogenization. Our results suggest that the "Homogocene era" has not yet arrived for freshwater fish faunas at the worldwide scale. However, the distressingly high level of homogenization noted for some biogeographical realms stresses the need for further understanding of the ecological consequences of homogenization processes.
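
    For readers unfamiliar with the metric, taxonomic homogenization is commonly quantified as the change in mean pairwise similarity of species assemblages between a historical and a current snapshot; a positive change means faunas have become more alike. The sketch below uses Jaccard similarity on toy species lists and is only indicative of the general approach, not the paper's exact index.

        def jaccard(a, b):
            """Jaccard similarity of two species sets."""
            a, b = set(a), set(b)
            return len(a & b) / len(a | b) if a | b else 0.0

        def mean_pairwise_similarity(faunas):
            """Mean Jaccard similarity over all basin pairs."""
            pairs = [(i, j) for i in range(len(faunas))
                     for j in range(i + 1, len(faunas))]
            return sum(jaccard(faunas[i], faunas[j]) for i, j in pairs) / len(pairs)

        historical = [{"A", "B"}, {"B", "C"}, {"C", "D"}]
        current = [{"A", "B", "X"}, {"B", "C", "X"}, {"C", "D", "X"}]  # shared invader X
        # Positive difference indicates taxonomic homogenization.
        print(mean_pairwise_similarity(current) - mean_pairwise_similarity(historical))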

  10. Homogen Mur (Homogeneous Wall): a development project

    DEFF Research Database (Denmark)

    Dahl, Torben; Beim, Anne; Sørensen, Peter

    1997-01-01

    Mølletorvet in Slagelse is the first building project in Denmark in which the outer walls are built of homogeneous, load-bearing and insulating clay blocks. The project demonstrates a range of the possibilities that the use of homogeneous block masonry offers with respect to structure, energy performance and architecture.

  11. Effect of heat and homogenization on in vitro digestion of milk.

    Science.gov (United States)

    Tunick, Michael H; Ren, Daxi X; Van Hekken, Diane L; Bonnaillie, Laetitia; Paul, Moushumi; Kwoczak, Raymond; Tomasula, Peggy M

    2016-06-01

    Central to commercial fluid milk processing is the use of high-temperature, short-time (HTST) pasteurization to ensure the safety and quality of milk, and homogenization to prevent creaming of fat-containing milk. Ultra-high-temperature (UHT) sterilization is also applied to milk and is typically used to extend the shelf life of refrigerated, specialty milk products or to provide shelf-stable milk. The structures of the milk proteins and lipids are affected by processing, but little information is available on the effects of the individual processes or sequences of processes on digestibility. In this study, raw whole milk was subjected to homogenization, HTST pasteurization, and homogenization followed by HTST or UHT processing. Raw skim milk was subjected to the same heating regimens. In vitro gastrointestinal digestion using a fasting model was then used to detect the processing-induced changes in the proteins and lipids. Using sodium dodecyl sulfate-PAGE, gastric pepsin digestion of the milk samples showed rapid elimination of the casein and α-lactalbumin bands, persistence of the β-lactoglobulin bands, and appearance of casein and whey peptide bands. The bands for β-lactoglobulin were eliminated within the first 15 min of intestinal pancreatin digestion. The remaining proteins and peptides of the raw, HTST, and UHT skim samples were digested rapidly within the first 15 min of intestinal digestion, but intestinal digestion of raw and HTST-pasteurized whole milk showed some persistence of the peptides throughout digestion. The availability of more lipid droplets upon homogenization, with greater surface area available for interaction with the peptides, led to persistence of the smaller peptide bands and thus slower intestinal digestion when homogenization was followed by HTST pasteurization but not by UHT processing, in which the denatured proteins may be more accessible to the digestive enzymes. Homogenization and heat processing also affected the ζ-potential and free fatty acid release

  12. Autoradiographic imaging of the serotonin transporter, using S-[18F](fluoromethyl)-(+)-McN5652 ([18F]Me-McN) in the brains of several animal species

    International Nuclear Information System (INIS)

    Kretzschmar, M.; Zessin, J.; Brust, P.; Cumming, P.; Bergmann, R.

    2002-01-01

    The [18F]fluoromethyl analogue of (+)-McN5652 ([18F]Me-McN) was recently proposed as a potential new PET tracer [1]. To further validate its use in PET, we studied the binding of [18F]Me-McN in the brains of rats and pigs using autoradiography. The binding was compared with the uptake of the known 5-HT uptake inhibitor [3H]citalopram [2] and the radioligand (+)-[11C]McN5652. The binding of the three compounds was qualitatively identical in the autoradiograms of the individual brains. Intense labelling was observed in regions known to be serotonin uptake sites. The binding was specifically inhibited by the 5-HT uptake inhibitors citalopram and fluoxetine. (orig.)

  13. Homogenous polynomially parameter-dependent H∞ filter designs of discrete-time fuzzy systems.

    Science.gov (United States)

    Zhang, Huaguang; Xie, Xiangpeng; Tong, Shaocheng

    2011-10-01

    This paper proposes a novel H∞ filtering technique for a class of discrete-time fuzzy systems. First, a novel kind of fuzzy H∞ filter, which is homogeneous polynomially parameter-dependent on the membership functions with an arbitrary degree, is developed to guarantee the asymptotic stability and a prescribed H∞ performance of the filtering error system. Second, relaxed conditions for H∞ performance analysis are proposed by using a new fuzzy Lyapunov function and Finsler's lemma with homogeneous polynomial matrix Lagrange multipliers. Then, based on a new kind of slack variable technique, relaxed linear matrix inequality-based H∞ filtering conditions are proposed. Finally, two numerical examples are provided to illustrate the effectiveness of the proposed approach.
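
    For reference, a matrix that is homogeneous polynomially parameter-dependent of degree g on the r membership functions h_1, ..., h_r has the generic form below (our notation, not necessarily the paper's):

        P(h) = \sum_{k_1 + \cdots + k_r = g,\; k_i \ge 0}
               h_1^{k_1} h_2^{k_2} \cdots h_r^{k_r}\, P_{k_1 \cdots k_r},
        \qquad h_i \ge 0, \quad \sum_{i=1}^{r} h_i = 1

    with one decision matrix P_{k_1...k_r} per monomial; increasing the degree g enlarges the set of candidate filter gains and multipliers, which is what makes the resulting LMI conditions progressively less conservative.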

  14. Kinematic analysis of mélange fabrics: examples and applications from the McHugh Complex, Kenai Peninsula, Alaska

    Science.gov (United States)

    Kusky, Timothy M.; Bradley, Dwight C.

    1999-12-01

    Permian to Cretaceous mélange of the McHugh Complex on the Kenai Peninsula, south-central Alaska includes blocks and belts of graywacke, argillite, limestone, chert, basalt, gabbro, and ultramafic rocks, intruded by a variety of igneous rocks. An oceanic plate stratigraphy is repeated hundreds of times across the map area, but most structures at the outcrop scale extend lithological layering. Strong rheological units occur as blocks within a matrix that flowed around the competent blocks during deformation, forming broken formation and mélange. Deformation was noncoaxial, and disruption of primary layering was a consequence of general strain driven by plate convergence in a relatively narrow zone between the overriding accretionary wedge and the downgoing, generally thinly sedimented oceanic plate. Soft-sediment deformation processes do not appear to have played a major role in the formation of the mélange. A model for deformation at the toe of the wedge is proposed in which layers oriented at low angles to σ1 are contracted in both the brittle and ductile regimes, layers at 30-45° to σ1 are extended in the brittle regime and contracted in the ductile regime, and layers at angles greater than 45° to σ1 are extended in both the brittle and ductile regimes. Imbrication in thrust duplexes occurs at deeper levels within the wedge. Many structures within mélange of the McHugh Complex are asymmetric and record kinematic information consistent with the inferred structural setting in an accretionary wedge. A displacement field for the McHugh Complex on the lower Kenai Peninsula includes three belts: an inboard belt of Late Triassic rocks records west-to-east-directed slip of hanging walls, a central belt of predominantly Early Jurassic rocks records north-south directed displacements, and Early Cretaceous rocks in an outboard belt preserve southwest-northeast directed slip vectors. Although precise ages of accretion are unknown, slip directions are compatible with

  15. VERA Pin and Fuel Assembly Depletion Benchmark Calculations by McCARD and DeCART

    Energy Technology Data Exchange (ETDEWEB)

    Park, Ho Jin; Cho, Jin Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Monte Carlo (MC) codes have been developed and used to simulate neutron transport since the MC method was devised in the Manhattan project. Solving the neutron transport problem with the MC method is simple and straightforward to understand. Because there are few essential approximations for the six-dimensional phase space of a neutron (location, energy, and direction) in MC calculations, highly accurate solutions can be obtained through such calculations. Although it is still expensive and challenging to perform a depletion analysis with an MC code, many studies have been conducted to utilize the benefits of the MC method. In this work, McCARD MC and DeCART MOC transport calculations are performed for the VERA pin and fuel assembly (FA) depletion benchmarks. The DeCART depletion calculations are conducted to examine the depletion capability of the newly generated DeCART multi-group cross section library, with MC depletion calculations by McCARD providing the reference solutions. The DeCART depletion calculations give excellent agreement with the McCARD reference results. From the McCARD results, it is observed that the MC depletion results depend on how the burnup interval is split. To quantify the effect of stochastic uncertainty propagation at 40 DTS, uncertainty propagation analyses are performed using a sensitivity and uncertainty (S/U) analysis method and a stochastic sampling (S.S.) method.
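
    The stochastic-sampling (S.S.) idea mentioned above can be illustrated with a toy single-nuclide burnup step: perturb the uncertain input, rerun the depletion, and take the spread of the outputs as the propagated uncertainty. Everything below (the one-group cross section, flux, time step, and sample size) is a placeholder, not McCARD's actual treatment.

        import numpy as np

        rng = np.random.default_rng(0)

        def deplete(sigma_a_barn, n0=1.0, phi=1.0e14, dt=3.15e7):
            """Toy one-nuclide burnup step: dN/dt = -sigma_a * phi * N."""
            return n0 * np.exp(-sigma_a_barn * 1.0e-24 * phi * dt)

        sigma_mean, rel_unc = 2.7, 0.02   # barns, 2 % relative uncertainty (assumed)
        samples = rng.normal(sigma_mean, sigma_mean * rel_unc, size=500)
        end_densities = np.array([deplete(s) for s in samples])
        print(end_densities.mean(), end_densities.std())   # propagated spread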

  16. The impact of a new McDonald's restaurant on eating behaviours and perceptions of local residents: A natural experiment using repeated cross-sectional data.

    Science.gov (United States)

    Thornton, Lukar E; Ball, Kylie; Lamb, Karen E; McCann, Jennifer; Parker, Kate; Crawford, David A

    2016-05-01

    Neighbourhood food environments are posited as an important determinant of eating behaviours; however causality is difficult to establish based on existing studies. Using a natural experiment study design (incorporating repeated cross-sectional data), we tested whether the development of a new McDonald's restaurant increased the frequency of consumption of McDonald's products amongst local residents in the suburbs of Tecoma (site of a new McDonald's restaurant development) and Monbulk (control site) in Victoria, Australia. Across both sites, the reported frequency of McDonald's consumption did not change during the follow-up surveys. In the context explored, the development of a new McDonald's restaurant has not resulted in an increased consumption of McDonald's products. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Time and situationality. The Merleau-Pontian "response" to McTaggart's paradox

    Directory of Open Access Journals (Sweden)

    Claudio Cormick

    2014-01-01

    Full Text Available This article seeks to establish a relation, not yet satisfactorily explored, between the Merleau-Pontian phenomenology of time and a central problem of the analytic "theory of time," McTaggart's paradox. By clarifying, against Priest (1998), the genuine sense of Merleau-Ponty's "subjectivism" with respect to time, it shows how a convergence can be established between the phenomenological approach and the theses developed by Michael Dummett in response to the aforementioned paradox. Drawing on Dummett's remarks and Bimbenet's interpretation of Merleau-Ponty's "perspectivism," a "situational" solution to the paradox is attempted.

  18. 78 FR 17646 - Agency Information Collection Activities; Comment Request; Ronald E. McNair Postbaccalaureate...

    Science.gov (United States)

    2013-03-22

    ...; Comment Request; Ronald E. McNair Postbaccalaureate Achievement Program Annual Performance Report AGENCY... of Collection: Ronald E. McNair Postbaccalaureate Achievement Program Annual Performance Report. OMB...: Ronald E. McNair Postbaccalaureate Achievement (McNair) Program Annual Performance Report Program...

  19. MC1R and the response of melanocytes to ultraviolet radiation

    International Nuclear Information System (INIS)

    Rouzaud, Francois; Kadekaro, Ana Luisa; Abdel-Malek, Zalfa A.; Hearing, Vincent J.

    2005-01-01

    The constitutive color of our skin plays a dramatic role in our photoprotection from solar ultraviolet radiation (UVR) that reaches the Earth and in minimizing DNA damage that gives rise to skin cancer. More than 120 genes have been identified and shown to regulate pigmentation, one of the key genes being melanocortin 1 receptor (MC1R) that encodes the melanocortin 1 receptor (MC1R), a seven-transmembrane G protein-coupled receptor expressed on the surface of melanocytes. Modulation of MC1R function regulates melanin synthesis by melanocytes qualitatively and quantitatively. The MC1R is regulated by the physiological agonists α-melanocyte-stimulating hormone (αMSH) and adrenocorticotropic hormone (ACTH), and antagonist agouti signaling protein (ASP). Activation of the MC1R by binding of an agonist stimulates the synthesis of eumelanin primarily via activation of adenylate cyclase. The significance of cutaneous pigmentation lies in the photoprotective effect of melanin, particularly eumelanin, against sun-induced carcinogenesis. Epidermal melanocytes and keratinocytes respond to UVR by increasing their expression of αMSH and ACTH, which up-regulate the expression of MC1R, and consequently enhance the response of melanocytes to melanocortins. Constitutive skin pigmentation dramatically affects the incidence of skin cancer. The pigmentary phenotype characterized by red hair, fair complexion, inability to tan and tendency to freckle is an independent risk factor for all skin cancers, including melanoma. The MC1R gene is highly polymorphic in human populations, and allelic variation at this locus accounts, to a large extent, for the variation in pigmentary phenotypes and skin phototypes (SPT) in humans. Several allelic variants of the MC1R gene are associated with the red hair and fair skin (RHC) phenotype, and carrying one of these variants is thought to diminish the ability of the epidermis to respond to DNA damage elicited by UVR. The MC1R gene is considered a

  20. MC1R and the response of melanocytes to ultraviolet radiation

    Energy Technology Data Exchange (ETDEWEB)

    Rouzaud, Francois [Laboratory of Cell Biology, National Cancer Institute, National Institutes of Health, Building 37, Room 2132, Bethesda, MD 20892 (United States); Kadekaro, Ana Luisa [Department of Dermatology, University of Cincinnati, College of Medicine, Cincinnati, OH 45267 (United States); Abdel-Malek, Zalfa A. [Department of Dermatology, University of Cincinnati, College of Medicine, Cincinnati, OH 45267 (United States); Hearing, Vincent J. [Laboratory of Cell Biology, National Cancer Institute, National Institutes of Health, Building 37, Room 2132, Bethesda, MD 20892 (United States)]. E-mail: hearingv@nih.gov

    2005-04-01

    The constitutive color of our skin plays a dramatic role in our photoprotection from solar ultraviolet radiation (UVR) that reaches the Earth and in minimizing DNA damage that gives rise to skin cancer. More than 120 genes have been identified and shown to regulate pigmentation, one of the key genes being melanocortin 1 receptor (MC1R) that encodes the melanocortin 1 receptor (MC1R), a seven-transmembrane G protein-coupled receptor expressed on the surface of melanocytes. Modulation of MC1R function regulates melanin synthesis by melanocytes qualitatively and quantitatively. The MC1R is regulated by the physiological agonists α-melanocyte-stimulating hormone (αMSH) and adrenocorticotropic hormone (ACTH), and antagonist agouti signaling protein (ASP). Activation of the MC1R by binding of an agonist stimulates the synthesis of eumelanin primarily via activation of adenylate cyclase. The significance of cutaneous pigmentation lies in the photoprotective effect of melanin, particularly eumelanin, against sun-induced carcinogenesis. Epidermal melanocytes and keratinocytes respond to UVR by increasing their expression of αMSH and ACTH, which up-regulate the expression of MC1R, and consequently enhance the response of melanocytes to melanocortins. Constitutive skin pigmentation dramatically affects the incidence of skin cancer. The pigmentary phenotype characterized by red hair, fair complexion, inability to tan and tendency to freckle is an independent risk factor for all skin cancers, including melanoma. The MC1R gene is highly polymorphic in human populations, and allelic variation at this locus accounts, to a large extent, for the variation in pigmentary phenotypes and skin phototypes (SPT) in humans. Several allelic variants of the MC1R gene are associated with the red hair and fair skin (RHC) phenotype, and carrying one of these variants is thought to diminish the ability of the epidermis to respond to DNA damage elicited by UVR. The MC1R gene is