WorldWideScience

Sample records for model reproduces broad

  1. The diverse broad-band light-curves of Swift GRBs reproduced with the cannonball model

    CERN Document Server

    Dado, Shlomo; De Rújula, A

    2009-01-01

    Two radiation mechanisms, inverse Compton scattering (ICS) and synchrotron radiation (SR), suffice within the cannonball (CB) model of long gamma ray bursts (LGRBs) and X-ray flashes (XRFs) to provide a very simple and accurate description of their observed prompt emission and afterglows. Simple as they are, the two mechanisms and the burst environment generate the rich structure of the light curves at all frequencies and times. This is demonstrated for 33 selected Swift LGRBs and XRFs, which are well sampled from early time until late time and well represent the entire diversity of the broad band light curves of Swift LGRBs and XRFs. Their prompt gamma-ray and X-ray emission is dominated by ICS of glory light. During their fast decline phase, ICS is taken over by SR which dominates their broad band afterglow. The pulse shape and spectral evolution of the gamma-ray peaks and the early-time X-ray flares, and even the delayed optical `humps' in XRFs, are correctly predicted. The canonical and non-canonical X-ra...

  2. Reproducing the hierarchy of disorder for Morpho-inspired, broad-angle color reflection

    Science.gov (United States)

    Song, Bokwang; Johansen, Villads Egede; Sigmund, Ole; Shin, Jung H.

    2017-04-01

    The scales of Morpho butterflies are covered with intricate, hierarchical ridge structures that produce a bright, blue reflection that remains stable across wide viewing angles. This effect has been researched extensively, and much understanding has been achieved using modeling that has focused on the positional disorder among the identical, multilayered ridges as the critical factor for producing angle-independent color. Realizing such positional disorder of identical nanostructures is difficult, which in turn has limited experimental verification of the different physical mechanisms that have been proposed. In this paper, we suggest an alternative model of inter-structural disorder that can achieve the same broad-angle color reflection, and is applicable to wafer-scale fabrication using conventional thin film technologies. Fabrication of a thin film that produces pure, stable blue across a viewing angle of more than 120° is demonstrated, together with a robust, conformal color coating.

  3. Reproducing the hierarchy of disorder for Morpho-inspired, broad-angle color reflection

    DEFF Research Database (Denmark)

    Song, Bokwang; Johansen, Villads Egede; Sigmund, Ole

    2017-01-01

    The scales of Morpho butterflies are covered with intricate, hierarchical ridge structures that produce a bright, blue reflection that remains stable across wide viewing angles. This effect has been researched extensively, and much understanding has been achieved using modeling that has focused o...

  4. Modelling Demand for Broad Money in Australia

    OpenAIRE

    Abbas Valadkhani

    2002-01-01

    The existence of a stable demand for money is very important for the conduct of monetary policy. It is argued that previous work on the demand for money in Australia has not been very satisfactory in a number of ways. This paper examines the long- and short-run determinants of the demand for broad money employing the Johansen cointegration technique and a short-run dynamic model. Using quarterly data for the period 1976:3-2002:2, this paper finds, inter alia, that the demand for broad money i...

  5. Towards reproducible descriptions of neuronal network models.

    Directory of Open Access Journals (Sweden)

    Eilen Nordlie

    2009-08-01

    Full Text Available Progress in science depends on the effective exchange of ideas among scientists. New ideas can be assessed and criticized in a meaningful manner only if they are formulated precisely. This applies to simulation studies as well as to experiments and theories. But after more than 50 years of neuronal network simulations, we still lack a clear and common understanding of the role of computational models in neuroscience as well as established practices for describing network models in publications. This hinders the critical evaluation of network models as well as their re-use. We analyze here 14 research papers proposing neuronal network models of different complexity and find widely varying approaches to model descriptions, with regard to both the means of description and the ordering and placement of material. We further observe great variation in the graphical representation of networks and the notation used in equations. Based on our observations, we propose a good model description practice, composed of guidelines for the organization of publications, a checklist for model descriptions, templates for tables presenting model structure, and guidelines for diagrams of networks. The main purpose of this good practice is to trigger a debate about the communication of neuronal network models in a manner comprehensible to humans, as opposed to machine-readable model description languages. We believe that the good model description practice proposed here, together with a number of other recent initiatives on data-, model-, and software-sharing, may lead to a deeper and more fruitful exchange of ideas among computational neuroscientists in years to come. We further hope that work on standardized ways of describing--and thinking about--complex neuronal networks will lead the scientific community to a clearer understanding of high-level concepts in network dynamics, and will thus lead to deeper insights into the function of the brain.

  6. Photoionisation modelling of the broad line region

    Science.gov (United States)

    King, Anthea

    2016-08-01

    Two of the most fundamental questions regarding the broad line region (BLR) are "what is its structure?" and "how is it moving?" Baldwin et al. (1995) showed that by summing over an ensemble of clouds at differing densities and distances from the ionising source we can easily and naturally produce a spectrum similar to what is observed for AGN. This approach is called the `locally optimally emitting clouds' (LOC) model. This approach can also explain the well-observed stratification of emission lines in the BLR (e.g. Clavel et al. 1991, Peterson et al. 1991, Kollatschny et al. 2001) and the `breathing' of the BLR with changes in the continuum luminosity (Netzer & Mor 1990, Peterson et al. 2014), and is therefore a generally accepted model of the BLR. However, LOC predictions require some assumptions to be made about the distribution of the clouds within the BLR. By comparing photoionization predictions for a distribution of cloud properties with observed spectra, we can infer something about the structure of the BLR and the distribution of clouds. I use existing reverberation mapping data to constrain the structure of the BLR by observing how individual line strengths and ratios of different lines change in high and low luminosity states. I will present my initial constraints and discuss the challenges associated with the method.

  7. Reproducibility of LCA models of crude oil production.

    Science.gov (United States)

    Vafi, Kourosh; Brandt, Adam R

    2014-11-04

    Scientific models are ideally reproducible, with results that converge despite varying methods. In practice, divergence between models often remains due to varied assumptions, incompleteness, or simply because of avoidable flaws. We examine LCA greenhouse gas (GHG) emissions models to test the reproducibility of their estimates for well-to-refinery inlet gate (WTR) GHG emissions. We use the Oil Production Greenhouse gas Emissions Estimator (OPGEE), an open source engineering-based life cycle assessment (LCA) model, as the reference model for this analysis. We analyze seven previous studies based on six models. We examine the reproducibility of prior results by successive experiments that align model assumptions and boundaries. The root-mean-square error (RMSE) between results varies between ∼1 and 8 g CO2 eq/MJ LHV when model inputs are not aligned. After model alignment, RMSE generally decreases only slightly. The proprietary nature of some of the models hinders explanations for divergence between the results. Because verification of the results of LCA GHG emissions is often not possible by direct measurement, we recommend the development of open source models for use in energy policy. Such practice will lead to iterative scientific review, improvement of models, and more reliable understanding of emissions.
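    The RMSE comparison described in this record is straightforward to reproduce in principle. A minimal sketch in Python (the per-field emission intensities below are purely illustrative, not values from the cited studies):

```python
import math

def rmse(a, b):
    """Root-mean-square error between two equal-length sequences."""
    assert len(a) == len(b), "sequences must be the same length"
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# Hypothetical WTR GHG intensities (g CO2 eq/MJ LHV) for five oil fields,
# from a reference model (e.g. OPGEE-like) and a second model under test.
reference = [6.0, 9.5, 12.0, 7.2, 10.1]
other = [5.1, 11.0, 10.4, 8.0, 12.3]

print(round(rmse(reference, other), 2))
```

    Aligning model assumptions and boundaries amounts to recomputing one of the two sequences under the other model's inputs and checking whether the RMSE shrinks.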

  8. Reproducibility Issues : Avoiding Pitfalls in Animal Inflammation Models

    NARCIS (Netherlands)

    Laman, Jon D; Kooistra, Susanne M; Clausen, Björn E

    2017-01-01

    In light of an enhanced awareness of ethical questions and ever increasing costs when working with animals in biomedical research, there is a dedicated and sometimes fierce debate concerning the (lack of) reproducibility of animal models and their relevance for human inflammatory diseases. Despite

  9. Modeling and evaluating repeatability and reproducibility of ordinal classifications

    NARCIS (Netherlands)

    J. de Mast; W.N. van Wieringen

    2010-01-01

    This paper argues that currently available methods for the assessment of the repeatability and reproducibility of ordinal classifications are not satisfactory. The paper aims to study whether we can modify a class of models from Item Response Theory, well established for the study of the reliability

  10. Reproducibility Issues: Avoiding Pitfalls in Animal Inflammation Models.

    Science.gov (United States)

    Laman, Jon D; Kooistra, Susanne M; Clausen, Björn E

    2017-01-01

    In light of an enhanced awareness of ethical questions and ever-increasing costs when working with animals in biomedical research, there is a dedicated and sometimes fierce debate concerning the (lack of) reproducibility of animal models and their relevance for human inflammatory diseases. Despite evident advancements in searching for alternatives, that is, replacing, reducing, and refining animal experiments (the three R's of Russell and Burch, 1959), understanding the complex interactions of the cells of the immune system, the nervous system and the affected tissue/organ during inflammation critically relies on in vivo models. Consequently, scientific advancement and ultimately novel therapeutic interventions depend on improving the reproducibility of animal inflammation models. As a prelude to the remaining hands-on protocols described in this volume, here, we summarize potential pitfalls of preclinical animal research and provide resources and background reading on how to avoid them.

  11. Assessment of Modeling Capability for Reproducing Storm Impacts on TEC

    Science.gov (United States)

    Shim, J. S.; Kuznetsova, M. M.; Rastaetter, L.; Bilitza, D.; Codrescu, M.; Coster, A. J.; Emery, B. A.; Foerster, M.; Foster, B.; Fuller-Rowell, T. J.; Huba, J. D.; Goncharenko, L. P.; Mannucci, A. J.; Namgaladze, A. A.; Pi, X.; Prokhorov, B. E.; Ridley, A. J.; Scherliess, L.; Schunk, R. W.; Sojka, J. J.; Zhu, L.

    2014-12-01

    During geomagnetic storms, the energy transferred from the solar wind to the magnetosphere-ionosphere system adversely affects communication and navigation systems. Quantifying storm impacts on TEC (Total Electron Content) and assessing the capability of models to reproduce those impacts are important for specifying and forecasting space weather. To quantify storm impacts on TEC, we considered several parameters: TEC changes compared to quiet time (the day before the storm), TEC differences between 24-hour intervals, and the maximum increase/decrease during the storm. We investigated the spatial and temporal variations of these parameters during the 2006 AGU storm event (14-15 Dec. 2006) using ground-based GPS TEC measurements in eight selected 5-degree longitude sectors. Latitudinal variations were also studied in the two of the eight sectors where data coverage is relatively better. We obtained modeled TEC from various ionosphere/thermosphere (IT) models. The parameters from the models were compared with each other and with the observed values. We quantified the performance of the models in reproducing the TEC variations during the storm using skill scores. This study has been supported by the Community Coordinated Modeling Center (CCMC) at the Goddard Space Flight Center. Model outputs and observational data used for the study will be permanently posted at the CCMC website (http://ccmc.gsfc.nasa.gov) for the space science communities to use.
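    The record does not specify which skill score is used; one common form compares a model's RMSE against that of a reference prediction. A minimal sketch (the TEC values are invented for illustration):

```python
import math

def rmse(model, obs):
    """Root-mean-square error of a prediction against observations."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def skill_score(model, reference, obs):
    """Skill relative to a reference prediction: 1 is a perfect model,
    0 matches the reference, negative values are worse than the reference."""
    return 1.0 - rmse(model, obs) / rmse(reference, obs)

# Hypothetical TEC values (TECU) at four epochs during a storm.
observed = [20.0, 35.0, 48.0, 30.0]
model_a = [22.0, 33.0, 45.0, 28.0]
quiet_day = [18.0, 20.0, 22.0, 21.0]  # reference: persistence of quiet time

print(round(skill_score(model_a, quiet_day, observed), 2))
```

    Using quiet-time persistence as the reference makes the score measure specifically how well a model captures the storm-time departure, which is the quantity of interest here.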

  12. Reproducibility and Transparency in Ocean-Climate Modeling

    Science.gov (United States)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and the variety of supercomputing technology. We have made progress on this problem in the context of a large project - the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets, we provide the tools for generating these from the sources, rather than providing raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model, we provide (version-controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
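    The checksums-on-output idea can be sketched in a few lines of Python. This is a minimal illustration, not the actual MOM6/SIS2 tooling; the `*.nc` glob is an assumed convention for NetCDF output files:

```python
import hashlib
from pathlib import Path

def checksum(path, algo="sha256", chunk=1 << 16):
    """Hash a file in fixed-size chunks so large model output fits in memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def manifest(outdir):
    """Map each output file name to its checksum. Committing this manifest
    to version control records exactly when an experiment's solution changes,
    whether due to code updates or changes in input data."""
    return {p.name: checksum(p) for p in sorted(Path(outdir).glob("*.nc"))}
```

    A manifest diff in version control then pinpoints the commit at which answers changed, without storing the large outputs themselves.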

  13. Using a 1-D model to reproduce diurnal SST signals

    DEFF Research Database (Denmark)

    Karagali, Ioanna; Høyer, Jacob L.

    2014-01-01

    of measurement. A generally preferred approach to bridge the gap between in situ and remotely obtained measurements is through modelling of the upper ocean temperature. This ESA supported study focuses on the implementation of the 1 dimensional General Ocean Turbulence Model (GOTM), in order to resolve...... profiles, along with the selection of the coefficients for the 2-band parametrisation of light’s penetration in the water column, hold a key role in the agreement of the modelled output with observations. To improve the surface heat budget and the distribution of heat, the code was modified to include...... Institution Upper Ocean Processes Group archive. The successful implementation of the new parametrisations is verified while the model reproduces the diurnal signals seen from in situ measurements. Special focus is given to testing and validation of different set-ups using campaign data from the Atlantic...

  14. Venusian Polar Vortex reproduced by a general circulation model

    Science.gov (United States)

    Ando, Hiroki; Sugimoto, Norihiko; Takagi, Masahiro

    2016-10-01

    Unlike the polar vortices observed in the Earth, Mars and Titan atmospheres, the observed Venus polar vortex is warmer than the mid-latitudes at cloud-top levels (~65 km). This warm polar vortex is zonally surrounded by a cold latitude band located at ~60 degrees latitude, a unique feature of the Venus atmosphere called the 'cold collar' [e.g. Taylor et al. 1980; Piccioni et al. 2007]. Although these structures have been seen in numerous previous observations, their formation mechanism is still unknown. In addition, an axi-asymmetric feature is always seen in the warm polar vortex. It changes temporally and sometimes shows a hot polar dipole or S-shaped structure, as shown by many infrared measurements [e.g. Garate-Lopez et al. 2013; 2015]. However, its vertical structure has not been investigated. To solve these problems, we performed a numerical simulation of the Venus atmospheric circulation using a general circulation model named AFES for Venus [Sugimoto et al. 2014] and reproduced these puzzling features. The reproduced structures of the atmosphere and the axi-asymmetric feature are then compared with previous observational results. In addition, a quasi-periodic zonal-mean zonal wind fluctuation is seen in the Venus polar vortex reproduced in our model. This might explain some observational results [e.g. Luz et al. 2007] and implies that polar vacillation might also occur in the Venus atmosphere, similar to what happens in the Earth's polar atmosphere. We will also show some initial results on this point in this presentation.

  15. Research Spotlight: Improved model reproduces the 2003 European heat wave

    Science.gov (United States)

    Schultz, Colin

    2011-04-01

    In August 2003, record-breaking temperatures raged across much of Europe. In France, maximum temperatures of 37°C (99°F) persisted for 9 days straight, the longest such stretch since 1873. About 40,000 deaths (14,000 in France alone) were attributed to the extreme heat and low humidity. Various climate conditions must come into alignment to produce extreme weather like the 2003 heat wave, and despite a concerted effort, forecasting models have so far been unable to accurately reproduce the event—including the modern European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble modeling system for seasonal forecasts, which went into operation in 2007. (Geophysical Research Letters, doi:10.1029/2010GL046455, 2011)

  16. Reproducibility of UAV-based photogrammetric surface models

    Science.gov (United States)

    Anders, Niels; Smith, Mike; Cammeraat, Erik; Keesstra, Saskia

    2016-04-01

    Soil erosion, rapid geomorphological change and vegetation degradation are major threats to the human and natural environment in many regions. Unmanned Aerial Vehicles (UAVs) and Structure-from-Motion (SfM) photogrammetry are invaluable tools for the collection of highly detailed aerial imagery and subsequent low cost production of 3D landscapes for an assessment of landscape change. Despite the widespread use of UAVs for image acquisition in monitoring applications, the reproducibility of UAV data products has not been explored in detail. This paper investigates this reproducibility by comparing the surface models and orthophotos derived from different UAV flights that vary in flight direction and altitude. The study area is located near Lorca, Murcia, SE Spain, which is a semi-arid medium-relief locale. The area comprises terraced agricultural fields that have been abandoned for about 40 years and have suffered subsequent damage through piping and gully erosion. In this work we focused upon variation in cell size, vertical and horizontal accuracy, and horizontal positioning of recognizable landscape features. The results suggest that flight altitude has a significant impact on reconstructed point density and related cell size, whilst flight direction affects the spatial distribution of vertical accuracy. The horizontal positioning of landscape features is relatively consistent between the different flights. We conclude that UAV data products are suitable for monitoring campaigns for land cover purposes or geomorphological mapping, but special care is required when used for monitoring changes in elevation.

  17. Can a coupled meteorology–chemistry model reproduce the ...

    Science.gov (United States)

    The ability of a coupled meteorology–chemistry model, i.e., Weather Research and Forecast and Community Multiscale Air Quality (WRF-CMAQ), to reproduce the historical trend in aerosol optical depth (AOD) and clear-sky shortwave radiation (SWR) over the Northern Hemisphere has been evaluated through a comparison of 21-year simulated results with observation-derived records from 1990 to 2010. Six satellite-retrieved AOD products including AVHRR, TOMS, SeaWiFS, MISR, MODIS-Terra and MODIS-Aqua as well as long-term historical records from 11 AERONET sites were used for the comparison of AOD trends. Clear-sky SWR products derived by CERES at both the top of atmosphere (TOA) and surface as well as surface SWR data derived from seven SURFRAD sites were used for the comparison of trends in SWR. The model successfully captured increasing AOD trends along with the corresponding increased TOA SWR (upwelling) and decreased surface SWR (downwelling) in both eastern China and the northern Pacific. The model also captured declining AOD trends along with the corresponding decreased TOA SWR (upwelling) and increased surface SWR (downwelling) in the eastern US, Europe and the northern Atlantic for the period of 2000–2010. However, the model underestimated the AOD over regions with substantial natural dust aerosol contributions, such as the Sahara Desert, Arabian Desert, central Atlantic and northern Indian Ocean. Estimates of the aerosol direct radiative effect (DRE) at TOA a

  18. A reproducible nonlethal animal model for studying cyanide poisoning.

    Science.gov (United States)

    Vick, J; Marino, M T; von Bredow, J D; Kaminskis, A; Brewer, T

    2000-12-01

    Previous studies using bolus intravenous injections of sodium cyanide have been used to model the sudden exposure to high concentrations of cyanide that could occur on the battlefield. This study was designed to develop a model that would simulate the type of exposure to cyanide gas that could happen during actual low-level continuous types of exposure and then compare it with the bolus model. Cardiovascular and respiratory recordings taken from anesthetized dogs have been used previously to characterize the lethal effects of cyanide. The intravenous, bolus injection of 2.5 mg/kg sodium cyanide provides a model in which a greater than lethal concentration is attained. In contrast, our model uses a slow, intravenous infusion of cyanide to titrate each animal to its own inherent end point, which coincides with the amount of cyanide needed to induce death through respiratory arrest. In this model, therapeutic intervention can be used to restore respiration and allow for the complete recovery of the animals. After recovery, the same animal can be given a second infusion of cyanide, followed again by treatment and recovery, providing a reproducible end point. This end point can then be expressed as the total amount of cyanide per body weight (mg/kg) required to kill. In this study, the average dose of sodium cyanide among 12 animals was 1.21 mg/kg, which is approximately half the cyanide used in the bolus model. Thus, titration to respiratory arrest followed by resuscitation provides a repetitive-use animal model that can be used to test the efficacy of various forms of pretreatment and/or therapy without the loss of a single animal.

  19. A reproducible brain tumour model established from human glioblastoma biopsies

    Directory of Open Access Journals (Sweden)

    Li Xingang

    2009-12-01

    Abstract Background Establishing clinically relevant animal models of glioblastoma multiforme (GBM) remains a challenge, and many commonly used cell line-based models do not recapitulate the invasive growth patterns of patient GBMs. Previously, we have reported the formation of highly invasive tumour xenografts in nude rats from human GBMs. However, implementing tumour models based on primary tissue requires that these models can be sufficiently standardised with consistently high take rates. Methods In this work, we collected data on growth kinetics from a material of 29 biopsies xenografted in nude rats, and characterised this model with an emphasis on neuropathological and radiological features. Results The tumour take rate for xenografted GBM biopsies was 96% and remained close to 100% at subsequent passages in vivo, whereas only one of four lower-grade tumours engrafted. The average time from transplantation to the onset of symptoms was 125 days ± 11.5 SEM. Histologically, the primary xenografts recapitulated the invasive features of the parent tumours, while endothelial cell proliferation and necrosis were mostly absent. After 4-5 in vivo passages, the tumours became more vascular with necrotic areas, but also appeared more circumscribed. MRI typically revealed changes related to tumour growth several months prior to the onset of symptoms. Conclusions In vivo passaging of patient GBM biopsies produced tumours representative of the patient tumours, with high take rates and a reproducible disease course. The model provides combinations of angiogenic and invasive phenotypes and represents a good alternative to in vitro propagated cell lines for dissecting mechanisms of brain tumour progression.

  20. Establishment of reproducible osteosarcoma rat model using orthotopic implantation technique.

    Science.gov (United States)

    Yu, Zhe; Sun, Honghui; Fan, Qingyu; Long, Hua; Yang, Tongtao; Ma, Bao'an

    2009-05-01

    negligible and the procedure was simple to perform and easily reproduced. It may be a useful tool in the investigation of antiangiogenic and anticancer therapeutics. Ultrasound was found to be a highly accurate tool for tumor diagnosis, localization and measurement and may be recommended for monitoring tumor growth in this model.

  21. A Simple Disk Wind Model for Broad Absorption Line Quasars

    CERN Document Server

    Higginbottom, N; Long, K S; Sim, S A; Matthews, J H

    2013-01-01

    Approximately 20% of quasi-stellar objects (QSOs) exhibit broad, blue-shifted absorption lines in their ultraviolet spectra. Such features provide clear evidence for significant outflows from these systems, most likely in the form of accretion disk winds. These winds may represent the "quasar" mode of feedback that is often invoked in galaxy formation/evolution models, and they are also key to unification scenarios for active galactic nuclei (AGN) and QSOs. To test these ideas, we construct a simple benchmark model of an equatorial, biconical accretion disk wind in a QSO and use a Monte Carlo ionization/radiative transfer code to calculate the ultraviolet spectra as a function of viewing angle. We find that for plausible outflow parameters, sightlines looking directly into the wind cone do produce broad, blue-shifted absorption features in the transitions typically seen in broad absorption line QSOs. However, our benchmark model is intrinsically X-ray weak in order to prevent overionization of the outflow, an...

  22. New model for datasets citation and extraction reproducibility in VAMDC

    CERN Document Server

    Zwölf, Carlo Maria; Dubernet, Marie-Lise

    2016-01-01

    In this paper we present a new paradigm for the identification of datasets extracted from the Virtual Atomic and Molecular Data Centre (VAMDC) e-science infrastructure. Such identification includes information on the origin and version of the datasets, references associated to individual data in the datasets, as well as timestamps linked to the extraction procedure. This paradigm is described through the modifications of the language used to exchange data within the VAMDC and through the services that will implement those modifications. This new paradigm should enforce traceability of datasets, favour reproducibility of datasets extraction, and facilitate the systematic citation of the authors having originally measured and/or calculated the extracted atomic and molecular data.

  23. New model for datasets citation and extraction reproducibility in VAMDC

    Science.gov (United States)

    Zwölf, Carlo Maria; Moreau, Nicolas; Dubernet, Marie-Lise

    2016-09-01

    In this paper we present a new paradigm for the identification of datasets extracted from the Virtual Atomic and Molecular Data Centre (VAMDC) e-science infrastructure. Such identification includes information on the origin and version of the datasets, references associated to individual data in the datasets, as well as timestamps linked to the extraction procedure. This paradigm is described through the modifications of the language used to exchange data within the VAMDC and through the services that will implement those modifications. This new paradigm should enforce traceability of datasets, favor reproducibility of datasets extraction, and facilitate the systematic citation of the authors having originally measured and/or calculated the extracted atomic and molecular data.

  24. Can a regional climate model reproduce observed extreme temperatures?

    Directory of Open Access Journals (Sweden)

    Peter F. Craigmile

    2013-10-01

    Using output from a regional Swedish climate model and observations from the Swedish synoptic observational network, we compare seasonal minimum temperatures from model output and observations using marginal extreme value modeling techniques. We make seasonal comparisons using generalized extreme value models and empirically estimate the shift in the distribution as a function of the regional climate model values, using the Doksum shift function. Spatial and temporal comparisons over south central Sweden are made by building hierarchical Bayesian generalized extreme value models for the observed minima and the regional climate model output. Generally speaking, the regional model is surprisingly well calibrated for minimum temperatures. We do, however, detect a problem with the regional model's ability to produce minimum temperatures close to 0°C. The seasonal spatial effects are quite similar between the data and the regional model. The observations indicate relatively strong warming, especially in the northern region. This signal is present in the regional model, but is not as strong.
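    The marginal extreme value step can be illustrated with a much simpler stand-in than the paper's hierarchical Bayesian models: a method-of-moments Gumbel fit (the shape-zero case of the GEV) applied to negated seasonal minima, since the classical block-maxima machinery describes maxima. The temperature values below are synthetic:

```python
import math
import random

def gumbel_fit(block_maxima):
    """Method-of-moments fit of a Gumbel distribution (GEV with shape 0)."""
    n = len(block_maxima)
    mean = sum(block_maxima) / n
    var = sum((x - mean) ** 2 for x in block_maxima) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi  # scale
    mu = mean - 0.5772156649 * beta      # location (Euler-Mascheroni constant)
    return mu, beta

def return_level(mu, beta, T):
    """Level exceeded on average once every T blocks."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

random.seed(1)
# Synthetic winter minimum temperatures (degrees C) for 50 seasons.
minima = [random.gauss(-18.0, 4.0) for _ in range(50)]

mu, beta = gumbel_fit([-x for x in minima])      # fit negated minima as maxima
cold_10yr = -return_level(mu, beta, 10)          # flip the sign back
print(round(cold_10yr, 1))
```

    Comparing such fitted quantiles between observations and model output is, in spirit, what the shift-function comparison in the record does, with the full GEV adding a shape parameter for the tail.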

  25. A simple disc wind model for broad absorption line quasars

    Science.gov (United States)

    Higginbottom, N.; Knigge, C.; Long, K. S.; Sim, S. A.; Matthews, J. H.

    2013-12-01

    Approximately 20 per cent of quasi-stellar objects (QSOs) exhibit broad, blue-shifted absorption lines in their ultraviolet spectra. Such features provide clear evidence for significant outflows from these systems, most likely in the form of accretion disc winds. These winds may represent the `quasar' mode of feedback that is often invoked in galaxy formation/evolution models, and they are also key to unification scenarios for active galactic nuclei (AGN) and QSOs. To test these ideas, we construct a simple benchmark model of an equatorial, biconical accretion disc wind in a QSO and use a Monte Carlo ionization/radiative transfer code to calculate the ultraviolet spectra as a function of viewing angle. We find that for plausible outflow parameters, sightlines looking directly into the wind cone do produce broad, blue-shifted absorption features in the transitions typically seen in broad absorption line (BAL) QSOs. However, our benchmark model is intrinsically X-ray weak in order to prevent overionization of the outflow, and the wind does not yet produce collisionally excited line emission at the level observed in non-BAL QSOs. As a first step towards addressing these shortcomings, we discuss the sensitivity of our results to changes in the assumed X-ray luminosity and mass-loss rate, Ṁwind. In the context of our adopted geometry, Ṁwind ≈ Ṁacc is required in order to produce significant BAL features. The kinetic luminosity and momentum carried by such outflows would be sufficient to provide significant feedback.

  6. A model project for reproducible papers: critical temperature for the Ising model on a square lattice

    CERN Document Server

    Dolfi, M; Hehn, A; Imriška, J; Pakrouski, K; Rønnow, T F; Troyer, M; Zintchenko, I; Chirigati, F; Freire, J; Shasha, D

    2014-01-01

In this paper we present a simple, yet typical simulation in statistical physics, consisting of large scale Monte Carlo simulations followed by an involved statistical analysis of the results. The purpose is to provide an example publication to explore tools for writing reproducible papers. The simulation estimates the critical temperature where the Ising model on the square lattice becomes magnetic to be Tc/J = 2.26934(6) using a finite size scaling analysis of the crossing points of Binder cumulants. We provide a virtual machine which can be used to reproduce all figures and results.
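A reader can sanity-check the quoted estimate against Onsager's exact result for the 2D square-lattice Ising model, which follows from sinh(2J/kTc) = 1:

```python
import math

# Onsager's exact critical temperature for the 2D square-lattice Ising model:
# sinh(2J/kT_c) = 1  =>  T_c/J = 2 / ln(1 + sqrt(2))
tc_exact = 2.0 / math.log(1.0 + math.sqrt(2.0))

# The paper's finite-size-scaling estimate from Binder-cumulant crossings
tc_paper = 2.26934

print(f"exact Tc/J = {tc_exact:.5f}")   # 2.26919
assert abs(tc_exact - tc_paper) < 5e-4  # estimate agrees to ~1.5e-4
```

The small residual between the Monte Carlo estimate and the exact value is itself the kind of result a reproducible-paper toolchain should let anyone regenerate.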

  7. A structured model of video reproduces primary visual cortical organisation.

    Directory of Open Access Journals (Sweden)

    Pietro Berkes

    2009-09-01

    Full Text Available The visual system must learn to infer the presence of objects and features in the world from the images it encounters, and as such it must, either implicitly or explicitly, model the way these elements interact to create the image. Do the response properties of cells in the mammalian visual system reflect this constraint? To address this question, we constructed a probabilistic model in which the identity and attributes of simple visual elements were represented explicitly and learnt the parameters of this model from unparsed, natural video sequences. After learning, the behaviour and grouping of variables in the probabilistic model corresponded closely to functional and anatomical properties of simple and complex cells in the primary visual cortex (V1. In particular, feature identity variables were activated in a way that resembled the activity of complex cells, while feature attribute variables responded much like simple cells. Furthermore, the grouping of the attributes within the model closely parallelled the reported anatomical grouping of simple cells in cat V1. Thus, this generative model makes explicit an interpretation of complex and simple cells as elements in the segmentation of a visual scene into basic independent features, along with a parametrisation of their moment-by-moment appearances. We speculate that such a segmentation may form the initial stage of a hierarchical system that progressively separates the identity and appearance of more articulated visual elements, culminating in view-invariant object recognition.

  8. Reproducing Phenomenology of Peroxidation Kinetics via Model Optimization

    Science.gov (United States)

    Ruslanov, Anatole D.; Bashylau, Anton V.

    2010-06-01

    We studied mathematical modeling of lipid peroxidation using a biochemical model system of iron (II)-ascorbate-dependent lipid peroxidation of rat hepatocyte mitochondrial fractions. We found that antioxidants extracted from plants demonstrate a high intensity of peroxidation inhibition. We simplified the system of differential equations that describes the kinetics of the mathematical model to a first order equation, which can be solved analytically. Moreover, we endeavor to algorithmically and heuristically recreate the processes and construct an environment that closely resembles the corresponding natural system. Our results demonstrate that it is possible to theoretically predict both the kinetics of oxidation and the intensity of inhibition without resorting to analytical and biochemical research, which is important for cost-effective discovery and development of medical agents with antioxidant action from the medicinal plants.
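The abstract's reduction of the kinetic system to an analytically solvable first-order equation can be illustrated with a hypothetical example (the rate constant and initial concentration below are invented for illustration, not taken from the study): dC/dt = -kC has the closed form C(t) = C0·e^(-kt), which a simple numerical integration should reproduce.

```python
import math

# Hypothetical first-order peroxidation kinetics: dC/dt = -k * C
# (rate constant k and initial substrate level c0 are illustrative)
k, c0, t_end, dt = 0.8, 1.0, 5.0, 1e-4

c = c0
t = 0.0
while t < t_end:          # forward-Euler integration
    c -= k * c * dt
    t += dt

c_exact = c0 * math.exp(-k * t_end)   # closed-form solution
assert abs(c - c_exact) < 1e-3        # numeric and analytic solutions agree
```

Agreement of this kind is what allows the kinetics (and hence inhibition intensity) to be predicted without re-running the full biochemical system.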

  9. Reproducible Infection Model for Clostridium perfringens in Broiler Chickens

    DEFF Research Database (Denmark)

    Pedersen, Karl; Friis-Holm, Lotte Bjerrum; Heuer, Ole Eske

    2008-01-01

    Experiments were carried out to establish an infection and disease model for Clostridium perfringens in broiler chickens. Previous experiments had failed to induce disease and only a transient colonization with challenge strains had been obtained. In the present study, two series of experiments w...

  10. Hydrological Modeling Reproducibility Through Data Management and Adaptors for Model Interoperability

    Science.gov (United States)

    Turner, M. A.

    2015-12-01

    Because of a lack of centralized planning and no widely-adopted standards among hydrological modeling research groups, research communities, and the data management teams meant to support research, there is chaos when it comes to data formats, spatio-temporal resolutions, ontologies, and data availability. All this makes true scientific reproducibility and collaborative integrated modeling impossible without some glue to piece it all together. Our Virtual Watershed Integrated Modeling System provides the tools and modeling framework hydrologists need to accelerate and fortify new scientific investigations by tracking provenance and providing adaptors for integrated, collaborative hydrologic modeling and data management. Under global warming trends where water resources are under increasing stress, reproducible hydrological modeling will be increasingly important to improve transparency and understanding of the scientific facts revealed through modeling. The Virtual Watershed Data Engine is capable of ingesting a wide variety of heterogeneous model inputs, outputs, model configurations, and metadata. We will demonstrate one example, starting from real-time raw weather station data packaged with station metadata. Our integrated modeling system will then create gridded input data via geostatistical methods along with error and uncertainty estimates. These gridded data are then used as input to hydrological models, all of which are available as web services wherever feasible. Models may be integrated in a data-centric way where the outputs too are tracked and used as inputs to "downstream" models. This work is part of an ongoing collaborative Tri-state (New Mexico, Nevada, Idaho) NSF EPSCoR Project, WC-WAVE, comprised of researchers from multiple universities in each of the three states. The tools produced and presented here have been developed collaboratively alongside watershed scientists to address specific modeling problems with an eye on the bigger picture of

  11. Accuracy and reproducibility of measurements on plaster models and digital models created using an intraoral scanner.

    Science.gov (United States)

    Camardella, Leonardo Tavares; Breuning, Hero; de Vasconcellos Vilella, Oswaldo

    2017-05-01

The purpose of the present study was to evaluate the accuracy and reproducibility of measurements made on digital models created using an intraoral color scanner compared to measurements on dental plaster models. This study included impressions of 28 volunteers. Alginate impressions were used to make plaster models, and each volunteer's dentition was scanned with a TRIOS Color intraoral scanner. Two examiners performed measurements on the plaster models using a digital caliper and measured the digital models using Ortho Analyzer software. The examiners measured 52 distances, including tooth diameter and height, overjet, overbite, intercanine and intermolar distances, and the sagittal relationship. The paired t test was used to assess intra-examiner performance and measurement accuracy of the two examiners for both plaster and digital models. The level of clinically relevant differences between the measurements was evaluated according to the threshold used, and a formula was applied to calculate the chance of finding clinically relevant errors in measurements on plaster and digital models. For several parameters, statistically significant differences were found between the measurements on the two different models. However, most of these discrepancies were not considered clinically significant. The measurement of the crown height of the upper central incisors had the highest measurement error for both examiners. Based on the interexaminer performance, reproducibility of the measurements was poor for some of the parameters. Overall, our findings showed that most of the measurements on digital models created using the TRIOS Color scanner and measured with Ortho Analyzer software had a clinically acceptable accuracy compared to the same measurements made with a caliper on plaster models, but the measuring method can affect the reproducibility of the measurements.
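The paired t test used above compares each measurement against its counterpart on the other model type. A minimal sketch of the statistic, on invented caliper-vs-digital measurement pairs (the values below are hypothetical, not from the study):

```python
import math
import statistics

def paired_t(a, b):
    """Paired t statistic: mean of per-pair differences divided by its
    standard error. A large |t| suggests a systematic offset between the
    two measurement methods."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    return statistics.mean(d) / (statistics.stdev(d) / math.sqrt(n))

# Hypothetical caliper vs. digital-model measurements (mm)
caliper = [8.1, 7.9, 10.2, 6.5, 9.0]
digital = [8.0, 8.1, 10.0, 6.4, 8.8]
t = paired_t(caliper, digital)
```

With n - 1 degrees of freedom, the resulting t is compared against the usual critical values to decide whether the methods differ systematically.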

  12. Spatiotemporal exploratory models for broad-scale survey data.

    Science.gov (United States)

    Fink, Daniel; Hochachka, Wesley M; Zuckerberg, Benjamin; Winkler, David W; Shaby, Ben; Munson, M Arthur; Hooker, Giles; Riedewald, Mirek; Sheldon, Daniel; Kelling, Steve

    2010-12-01

The distributions of animal populations change and evolve through time. Migratory species exploit different habitats at different times of the year. Biotic and abiotic features that determine where a species lives vary due to natural and anthropogenic factors. This spatiotemporal variation needs to be accounted for in any modeling of species' distributions. In this paper we introduce a semiparametric model that provides a flexible framework for analyzing dynamic patterns of species occurrence and abundance from broad-scale survey data. The spatiotemporal exploratory model (STEM) adds essential spatiotemporal structure to existing techniques for developing species distribution models through a simple parametric structure without requiring a detailed understanding of the underlying dynamic processes. STEMs use a multi-scale strategy to differentiate between local and global-scale spatiotemporal structure. A user-specified species distribution model accounts for spatial and temporal patterning at the local level. These local patterns are then allowed to "scale up" via ensemble averaging to larger scales. This makes STEMs especially well suited for exploring distributional dynamics arising from a variety of processes. Using data from eBird, an online citizen science bird-monitoring project, we demonstrate that monthly changes in distribution of a migratory species, the Tree Swallow (Tachycineta bicolor), can be more accurately described with a STEM than with a conventional bagged decision tree model in which spatiotemporal structure has not been imposed. We also demonstrate that there is no loss of model predictive power when a STEM is used to describe a spatiotemporal distribution with very little spatiotemporal variation: the distribution of a nonmigratory species, the Northern Cardinal (Cardinalis cardinalis).
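The "local models averaged over an ensemble of partitions" idea can be sketched with a toy stand-in: fit a trivial local model (here, just the mean response within a grid cell) on many randomly offset partitions of space, then average the local predictions. Everything below is illustrative; a real STEM uses full species-distribution models as the local learners.

```python
import random
import statistics

def stem_predict(points, query, n_partitions=50, cell=0.25):
    """Toy STEM-style estimate: average 'local model' predictions over
    randomly offset grid partitions. Each local model is just the mean
    response within the query's grid cell (a stand-in for a full
    species-distribution model)."""
    preds = []
    for _ in range(n_partitions):
        ox, oy = random.random() * cell, random.random() * cell  # random grid offset
        def key(x, y):
            return (int((x + ox) / cell), int((y + oy) / cell))
        qk = key(*query)
        local = [v for x, y, v in points if key(x, y) == qk]
        if local:
            preds.append(statistics.mean(local))
    return statistics.mean(preds)

random.seed(0)
# Synthetic survey: occurrence is high in the western half of a unit square
data = [(random.random(), random.random(), 0.0) for _ in range(400)]
data = [(x, y, 1.0 if x < 0.5 else 0.0) for x, y, _ in data]
west = stem_predict(data, (0.25, 0.5))
east = stem_predict(data, (0.75, 0.5))
```

Averaging over randomly placed partitions smooths out the arbitrary cell boundaries of any single partition, which is the mechanism that lets local structure "scale up".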

  13. Bitwise identical compiling setup: prospective for reproducibility and reliability of earth system modeling

    Directory of Open Access Journals (Sweden)

    R. Li

    2015-11-01

Full Text Available Reproducibility and reliability are fundamental principles of scientific research. A compiling setup that includes a specific compiler version and compiler flags provides essential technical support for Earth system modeling. With the fast development of computer software and hardware, compiling setup has to be updated frequently, which challenges the reproducibility and reliability of Earth system modeling. The existing results of a simulation using an original compiling setup may be irreproducible by a newer compiling setup because trivial round-off errors introduced by the change of compiling setup can potentially trigger significant changes in simulation results. Regarding the reliability, a compiler with millions of lines of codes may have bugs that are easily overlooked due to the uncertainties or unknowns in Earth system modeling. To address these challenges, this study shows that different compiling setups can achieve exactly the same (bitwise identical) results in Earth system modeling, and a set of bitwise identical compiling setups of a model can be used across different compiler versions and different compiler flags. As a result, the original results can be more easily reproduced; for example, the original results with an older compiler version can be reproduced exactly with a newer compiler version. Moreover, this study shows that new test cases can be generated based on the differences of bitwise identical compiling setups between different models, which can help detect software bugs or risks in the codes of models and compilers and finally improve the reliability of Earth system modeling.
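Checking that two compiling setups are bitwise identical reduces, in practice, to comparing checksums of the model-output byte streams. A minimal sketch (the output strings are stand-ins for real model output files):

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 digest of a model-output byte stream; identical digests
    imply bitwise-identical results."""
    return hashlib.sha256(data).hexdigest()

# Stand-ins for outputs of the same model built with different compiling setups
run_a = b"T=283.15 284.02 283.77\n"
run_b = b"T=283.15 284.02 283.77\n"
run_c = b"T=283.15 284.02 283.78\n"   # last digit differs by round-off

assert digest(run_a) == digest(run_b)   # bitwise identical
assert digest(run_a) != digest(run_c)   # round-off difference detected
```

The point of the paper is precisely that such a digest comparison can pass across compiler versions and flag sets, once a bitwise-identical family of setups is identified.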

  14. Accuracy and reproducibility of dental replica models reconstructed by different rapid prototyping techniques

    NARCIS (Netherlands)

    Hazeveld, Aletta; Huddleston Slater, James J. R.; Ren, Yijin

    INTRODUCTION: Rapid prototyping is a fast-developing technique that might play a significant role in the eventual replacement of plaster dental models. The aim of this study was to investigate the accuracy and reproducibility of physical dental models reconstructed from digital data by several rapid

  15. Voxel-level reproducibility assessment of modality independent elastography in a pre-clinical murine model

    Science.gov (United States)

    Flint, Katelyn M.; Weis, Jared A.; Yankeelov, Thomas E.; Miga, Michael I.

    2015-03-01

Changes in tissue mechanical properties, measured non-invasively by elastography methods, have been shown to be an important diagnostic tool, particularly for cancer. Tissue elasticity information, tracked over the course of therapy, may be an important prognostic indicator of tumor response to treatment. While many elastography techniques exist, this work reports on the use of a novel form of elastography that uses image texture to reconstruct elastic property distributions in tissue (i.e., a modality independent elastography (MIE) method) within the context of a pre-clinical breast cancer system [1,2]. The elasticity results have previously shown good correlation with independent mechanical testing [1]. Furthermore, MIE has been successfully utilized to localize and characterize lesions in both phantom experiments and simulation experiments with clinical data [2,3]. However, the reproducibility of this method has not been characterized in previous work. The goal of this study is to evaluate voxel-level reproducibility of MIE in a pre-clinical model of breast cancer. Bland-Altman analysis of co-registered repeat MIE scans in this preliminary study showed a reproducibility index of 24.7% (scaled to a percent of maximum stiffness) at the voxel level. As opposed to many reports in the magnetic resonance elastography (MRE) literature that speak to reproducibility measures of the bulk organ, these results establish MIE reproducibility at the voxel level; i.e., the reproducibility of locally-defined mechanical property measurements throughout the tumor volume.
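The Bland-Altman analysis mentioned above summarizes repeat-scan agreement by the bias (mean of paired differences) and 95% limits of agreement. A minimal sketch on hypothetical voxel-wise stiffness values (the numbers are invented; the study's own reproducibility index is a scaled variant of this idea):

```python
import statistics

def bland_altman(scan1, scan2):
    """Bland-Altman bias and 95% limits of agreement for paired repeat
    measurements (here, stand-ins for voxel-wise stiffness values)."""
    diffs = [a - b for a, b in zip(scan1, scan2)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical repeat MIE stiffness estimates (arbitrary units)
scan1 = [1.0, 1.2, 0.9, 1.5, 1.1, 1.3, 0.8, 1.4]
scan2 = [1.1, 1.1, 0.8, 1.6, 1.0, 1.4, 0.9, 1.3]
bias, (lo, hi) = bland_altman(scan1, scan2)
```

A near-zero bias with narrow limits of agreement indicates good voxel-level reproducibility; scaling the limits by the maximum stiffness gives a percentage index comparable to the 24.7% reported.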

  16. Model Invariance across Genders of the Broad Autism Phenotype Questionnaire

    Science.gov (United States)

    Broderick, Neill; Wade, Jordan L.; Meyer, J. Patrick; Hull, Michael; Reeve, Ronald E.

    2015-01-01

    ASD is one of the most heritable neuropsychiatric disorders, though comprehensive genetic liability remains elusive. To facilitate genetic research, researchers employ the concept of the broad autism phenotype (BAP), a milder presentation of traits in undiagnosed relatives. Research suggests that the BAP Questionnaire (BAPQ) demonstrates…

  17. Broad range of 2050 warming from an observationally constrained large climate model ensemble

    Science.gov (United States)

    Rowlands, Daniel J.; Frame, David J.; Ackerley, Duncan; Aina, Tolu; Booth, Ben B. B.; Christensen, Carl; Collins, Matthew; Faull, Nicholas; Forest, Chris E.; Grandey, Benjamin S.; Gryspeerdt, Edward; Highwood, Eleanor J.; Ingram, William J.; Knight, Sylvia; Lopez, Ana; Massey, Neil; McNamara, Frances; Meinshausen, Nicolai; Piani, Claudio; Rosier, Suzanne M.; Sanderson, Benjamin M.; Smith, Leonard A.; Stone, Dáithí A.; Thurston, Milo; Yamazaki, Kuniko; Hiro Yamazaki, Y.; Allen, Myles R.

    2012-04-01

    Incomplete understanding of three aspects of the climate system--equilibrium climate sensitivity, rate of ocean heat uptake and historical aerosol forcing--and the physical processes underlying them lead to uncertainties in our assessment of the global-mean temperature evolution in the twenty-first century. Explorations of these uncertainties have so far relied on scaling approaches, large ensembles of simplified climate models, or small ensembles of complex coupled atmosphere-ocean general circulation models which under-represent uncertainties in key climate system properties derived from independent sources. Here we present results from a multi-thousand-member perturbed-physics ensemble of transient coupled atmosphere-ocean general circulation model simulations. We find that model versions that reproduce observed surface temperature changes over the past 50 years show global-mean temperature increases of 1.4-3K by 2050, relative to 1961-1990, under a mid-range forcing scenario. This range of warming is broadly consistent with the expert assessment provided by the Intergovernmental Panel on Climate Change Fourth Assessment Report, but extends towards larger warming than observed in ensembles-of-opportunity typically used for climate impact assessments. From our simulations, we conclude that warming by the middle of the twenty-first century that is stronger than earlier estimates is consistent with recent observed temperature changes and a mid-range `no mitigation' scenario for greenhouse-gas emissions.

  18. A standardised and reproducible model of intra-abdominal infection and abscess formation in rats

    NARCIS (Netherlands)

    Bosscha, K; Nieuwenhuijs, VB; Gooszen, AW; van Duijvenbode-Beumer, H; Visser, MR; Verweij, Willem; Akkermans, LMA

    2000-01-01

    Objective: To develop a standardised and reproducible model of intra-abdominal infection and abscess formation in rats. Design: Experimental study. Setting: University hospital, The Netherlands. Subjects: 36 adult male Wistar rats. Interventions: In 32 rats, peritonitis was produced using two differ

  19. A force-based model to reproduce stop-and-go waves in pedestrian dynamics

    CERN Document Server

    Chraibi, Mohcine; Schadschneider, Andreas

    2015-01-01

Stop-and-go waves in single-file movement are a phenomenon that is observed empirically in pedestrian dynamics. It manifests itself by the co-existence of two phases: moving and stopping pedestrians. We show analytically, based on a simplified one-dimensional scenario, that under some conditions the system can have unstable homogeneous solutions. Hence, oscillations in the trajectories and instabilities emerge during simulations. To our knowledge there exists no force-based model which is collision- and oscillation-free while also reproducing phase separation. We develop a new force-based model for pedestrian dynamics able to reproduce qualitatively the phenomenon of phase separation. We investigate analytically the stability condition of the model and define regimes of parameter values where phase separation can be observed. We show by means of simulations that the predefined conditions lead in fact to the expected behavior and validate our model with respect to empirical findings.

  20. Using the mouse to model human disease: increasing validity and reproducibility

    Directory of Open Access Journals (Sweden)

    Monica J. Justice

    2016-02-01

    Full Text Available Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings.

  1. Anatomical Reproducibility of a Head Model Molded by a Three-dimensional Printer.

    Science.gov (United States)

    Kondo, Kosuke; Nemoto, Masaaki; Masuda, Hiroyuki; Okonogi, Shinichi; Nomoto, Jun; Harada, Naoyuki; Sugo, Nobuo; Miyazaki, Chikao

    2015-01-01

    We prepared rapid prototyping models of heads with unruptured cerebral aneurysm based on image data of computed tomography angiography (CTA) using a three-dimensional (3D) printer. The objective of this study was to evaluate the anatomical reproducibility and accuracy of these models by comparison with the CTA images on a monitor. The subjects were 22 patients with unruptured cerebral aneurysm who underwent preoperative CTA. Reproducibility of the microsurgical anatomy of skull bone and arteries, the length and thickness of the main arteries, and the size of cerebral aneurysm were compared between the CTA image and rapid prototyping model. The microsurgical anatomy and arteries were favorably reproduced, apart from a few minute regions, in the rapid prototyping models. No significant difference was noted in the measured lengths of the main arteries between the CTA image and rapid prototyping model, but errors were noted in their thickness (p 3D printer. It was concluded that these models are useful tools for neurosurgical simulation. The thickness of the main arteries and size of cerebral aneurysm should be comprehensively judged including other neuroimaging in consideration of errors.

  2. Cellular automaton model in the fundamental diagram approach reproducing the synchronized outflow of wide moving jams

    Energy Technology Data Exchange (ETDEWEB)

    Tian, Jun-fang, E-mail: tianhustbjtu@hotmail.com [MOE Key Laboratory for Urban Transportation Complex Systems Theory and Technology, Beijing Jiaotong University, Beijing 100044 (China); Yuan, Zhen-zhou; Jia, Bin; Fan, Hong-qiang; Wang, Tao [MOE Key Laboratory for Urban Transportation Complex Systems Theory and Technology, Beijing Jiaotong University, Beijing 100044 (China)

    2012-09-10

Velocity effect and critical velocity are incorporated into the average space gap cellular automaton model [J.F. Tian, et al., Phys. A 391 (2012) 3129], which was able to reproduce many spatiotemporal dynamics reported by the three-phase theory except the synchronized outflow of wide moving jams. The physics of traffic breakdown has been explained. Various congested patterns induced by the on-ramp are reproduced. It is shown that the occurrence of synchronized outflow and of free outflow of wide moving jams is closely related to drivers' time delay in acceleration at the downstream jam front and to the critical velocity, respectively. -- Highlights: ► Velocity effect is added into the average space gap cellular automaton model. ► The physics of traffic breakdown has been explained. ► The probabilistic nature of traffic breakdown is simulated. ► Various congested patterns induced by the on-ramp are reproduced. ► The occurrence of synchronized outflow of jams depends on drivers' time delay.
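The flavor of rule-based traffic cellular automata like the one above can be conveyed with the classic Nagel-Schreckenberg update (acceleration, gap-limited braking, random slowdown, parallel movement) on a ring road. To be clear, this is the standard baseline CA, not the paper's average space gap model; all parameters are illustrative.

```python
import random

def nasch_step(pos, vel, L, vmax=5, p=0.3):
    """One parallel Nagel-Schreckenberg update on a ring of length L.
    (Classic baseline CA, not the average-space-gap model of the paper.)"""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])
    new_vel = vel[:]
    for idx, i in enumerate(order):
        ahead = order[(idx + 1) % n]
        gap = (pos[ahead] - pos[i] - 1) % L
        v = min(vel[i] + 1, vmax)       # acceleration
        v = min(v, gap)                 # braking to avoid collision
        if v > 0 and random.random() < p:
            v -= 1                      # random slowdown
        new_vel[i] = v
    new_pos = [(pos[i] + new_vel[i]) % L for i in range(n)]
    return new_pos, new_vel

random.seed(2)
L, n = 100, 30
pos = sorted(random.sample(range(L), n))
vel = [0] * n
for _ in range(200):
    pos, vel = nasch_step(pos, vel, L)
# collision-free: all cells remain distinct
assert len(set(pos)) == n
```

At this density the random slowdown nucleates backward-propagating jams; richer models such as the paper's add velocity effects and a critical velocity to reproduce the full set of three-phase patterns.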

  3. A Broad Dynamical Model for Pattern Formation by Lateral Inhibition

    CERN Document Server

    Arcak, Murat

    2012-01-01

    Many patterning events in multi-cellular organisms rely on cell-to-cell contact signaling, such as the Notch pathway in metazoans. A particularly interesting phenomenon in this form of communication is lateral inhibition where a cell that adopts a particular fate inhibits its immediate neighbors from doing the same. Dynamical models are of great interest for understanding the circuit topologies involved in lateral inhibition and for predicting the associated patterns. Several simplified models have been employed for Notch signalling pathways in the literature. The objective of this paper is to present an abstract dynamical model that captures the essential features of lateral inhibition and to demonstrate with dynamical systems techniques that these features indeed lead to patterning.
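The core lateral-inhibition mechanism can be illustrated with a two-cell Collier-style sketch: each cell's inhibitory signal d is repressed by its own activity n, and n is activated by the neighbour's d. With sufficiently steep feedback, a near-symmetric start diverges into one high-d and one low-d cell. The Hill functions and parameters below are illustrative choices, not taken from the paper's abstract model.

```python
def lateral_inhibition(steps=20000, dt=0.01):
    """Two-cell Collier-style lateral inhibition sketch (illustrative
    parameters): n_i is activated by the neighbour's d, and d_i is
    repressed by the cell's own n_i."""
    def hill_up(x):                      # activation, steep around x ~ 0.1
        return x * x / (0.01 + x * x)
    def hill_down(x):                    # repression, steep around x ~ 0.1
        return 1.0 / (1.0 + 100.0 * x * x)

    n1, d1, n2, d2 = 0.5, 0.501, 0.5, 0.499   # near-symmetric start
    for _ in range(steps):               # forward-Euler integration
        dn1 = hill_up(d2) - n1
        dn2 = hill_up(d1) - n2
        dd1 = hill_down(n1) - d1
        dd2 = hill_down(n2) - d2
        n1 += dt * dn1
        n2 += dt * dn2
        d1 += dt * dd1
        d2 += dt * dd2
    return d1, d2

d1, d2 = lateral_inhibition()
```

The tiny initial bias toward cell 1 is amplified by the mutual-inhibition loop, which is exactly the patterning feature the abstract's general dynamical framework sets out to capture.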

  4. Hidden-variable models for the spin singlet: I. Non-local theories reproducing quantum mechanics

    CERN Document Server

    Di Lorenzo, Antonio

    2011-01-01

    A non-local hidden variable model reproducing the quantum mechanical probabilities for a spin singlet is presented. The non-locality is concentrated in the distribution of the hidden variables. The model otherwise satisfies both the hypothesis of outcome independence, made in the derivation of Bell inequality, and of compliance with Malus's law, made in the derivation of Leggett inequality. It is shown through the prescription of a protocol that the non-locality can be exploited to send information instantaneously provided that the hidden variables can be measured, even though they cannot be controlled.
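The quantum-mechanical target such a hidden-variable model must reproduce is the singlet correlation E(a, b) = -cos(a - b), which violates the CHSH form of Bell's inequality (|S| ≤ 2 for local models) up to the Tsirelson bound 2√2 — hence the need for non-locality somewhere in the model. A quick numeric check at the standard angle choices:

```python
import math

def E(a, b):
    """Quantum singlet spin correlation for analyzer angles a, b (radians)."""
    return -math.cos(a - b)

# Standard CHSH angle choices
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
assert abs(abs(S) - 2 * math.sqrt(2)) < 1e-12   # Tsirelson bound, exceeds 2
```

Any model satisfying outcome independence and Malus's law locally would be capped at |S| = 2; reproducing 2√2 forces the non-locality into some ingredient, here the distribution of the hidden variables.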

  5. On some problems with reproducing the Standard Model fields and interactions in five-dimensional warped brane world models

    CERN Document Server

    Smolyakov, Mikhail N

    2015-01-01

In the present paper we discuss some problems which arise when the matter, gauge and Higgs fields are allowed to propagate in the bulk of five-dimensional brane world models with a compact extra dimension and their zero Kaluza-Klein modes are supposed to reproduce exactly the Standard Model fields and their interactions.

  6. Reverberation Modeling of the Broad Emission Line Region in NGC 5548

    Science.gov (United States)

    Bottorff, M. C.; Korista, K. T.; Shlosman, I.; Blandford, R. D.

    Long-term observations of broad-line region (BLR) in the Seyfert~1 galaxy NGC~5548 are analyzed and a critical comparison with the predictions of a hydromagnetically-driven outflow model of Emmering, Blandford and Shlosman is provided. This model is used to generate a time series of C~IV line profiles that have responded to a time varying continuum. We include cloud emission anisotropy, cloud obscuration, a CLOUDY-generated emissivity function and a narrow-line component which is added to the BLR component to generate the total line profiles. The model is driven with continuum input based on the monitoring campaigns of NGC~5548 reported in Clavel et al. and Korista et al., and the line strengths, profiles and lags are compared with the observations. The model is able to reproduce the basic features of CIV line variability in this active galactic nucleus, i.e., time evolution of the profile shape and strength of the C~IV emission line without varying the model parameters. The best fit model provides the effective size, the dominant geometry, the emissivity distribution and the 3D velocity field of the C~IV BLR and constrains the mass of the central black hole to about $3\\times 10^7\\ M_{\\odot}$. The inner part of the wind in NGC~5548 appears to be responsible for the anisotropically emitted CIV line, while its outer part remains dusty and molecular, thus having similar spectral characteristics to a molecular torus, although its dynamics is fundamentally different. The model predicts a differential response across the C~IV line profile, producing a red-side-first response in the relative velocity interval of $3,000-6,000 {\\rm km\\ s^{-1}}$ followed by the blue mid-wing and finally by the line core. Given that no adequate method in computing the errors for data lags and centroids exists in the literature, the {\\it data} cross-correlation function provides results which appear inconclusive, making any direct comparison with the model premature. Overall analysis

  7. Reproducibility blues.

    Science.gov (United States)

    Pulverer, Bernd

    2015-11-12

    Research findings advance science only if they are significant, reliable and reproducible. Scientists and journals must publish robust data in a way that renders it optimally reproducible. Reproducibility has to be incentivized and supported by the research infrastructure but without dampening innovation.

  8. Current reinforcement model reproduces center-in-center vein trajectory of Physarum polycephalum.

    Science.gov (United States)

    Akita, Dai; Schenz, Daniel; Kuroda, Shigeru; Sato, Katsuhiko; Ueda, Kei-Ichi; Nakagaki, Toshiyuki

    2017-06-01

    Vein networks span the whole body of the amoeboid organism in the plasmodial slime mould Physarum polycephalum, and the network topology is rearranged within an hour in response to spatio-temporal variations of the environment. It has been reported that this tube morphogenesis is capable of solving mazes, and a mathematical model, named the 'current reinforcement rule', was proposed based on the adaptability of the veins. Although it is known that this model works well for reproducing some key characters of the organism's maze-solving behaviour, one important issue is still open: In the real organism, the thick veins tend to trace the shortest possible route by cutting the corners at the turn of corridors, following a center-in-center trajectory, but it has not yet been examined whether this feature also appears in the mathematical model, using corridors of finite width. In this report, we confirm that the mathematical model reproduces the center-in-center trajectory of veins around corners observed in the maze-solving experiment. © 2017 Japanese Society of Developmental Biologists.
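The current reinforcement rule can be sketched in its simplest setting: two parallel tubes between one source and one sink, with flow splitting by conductance g_i = D_i/L_i and each conductivity adapting as dD/dt = |Q_i| - D_i. The parameters are illustrative; this is the minimal one-junction version of the model, not the corridor geometry studied in the paper.

```python
def current_reinforcement(lengths, total_flow=1.0, dt=0.01, steps=5000):
    """Current reinforcement rule on parallel tubes between one source
    and one sink: flow splits in proportion to conductance g_i = D_i/L_i,
    and each conductivity adapts as dD/dt = |Q_i| - D_i."""
    D = [1.0] * len(lengths)            # equal initial conductivities
    for _ in range(steps):
        g = [d / l for d, l in zip(D, lengths)]
        p = total_flow / sum(g)         # common pressure drop across the tubes
        Q = [gi * p for gi in g]
        D = [d + dt * (abs(q) - d) for d, q in zip(D, Q)]
    return D

# A short tube (length 1) competing with a long one (length 2)
D_short, D_long = current_reinforcement([1.0, 2.0])
assert D_short > 0.9        # short tube carries (nearly) all the flow
assert D_long < 0.1         # long tube decays away
```

The shorter tube's slightly larger share of the flow reinforces its conductivity, which draws still more flow: the positive feedback that, in two dimensions, produces the corner-cutting, center-in-center trajectories the paper examines.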

  9. [Amniocentesis trainer: development of a cheap and reproducible new training model].

    Science.gov (United States)

    Tassin, M; Cordier, A-G; Laher, G; Benachi, A; Mandelbrot, L

    2012-11-01

Amniocentesis is the most common invasive procedure for prenatal diagnosis. It is essential to master this sampling technique prior to performing more complex ultrasound-guided interventions (cordocentesis, drain insertion). Training is a challenge because of the risks associated with the procedure, as well as the impact on the patient's anxiety. An amniocentesis simulator allows for safe training and repeated interventions, thus accelerating the learning curve, and also allows for periodic evaluation of proficiency. We present here a new, simple, and cost-effective amniotrainer model that reproduces real-life conditions, using chicken breast and condoms filled with water.

  10. Extreme Rainfall Events Over Southern Africa: Assessment of a Climate Model to Reproduce Daily Extremes

    Science.gov (United States)

    Williams, C.; Kniveton, D.; Layberry, R.

    2007-12-01

It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to extreme events, due to a number of factors including extensive poverty, disease and political instability. Rainfall variability and the identification of rainfall extremes are functions of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of a state-of-the-art climate model's ability to simulate climate at daily timescales is carried out using satellite derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period from 1993-2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. Once the model's ability to reproduce extremes has been assessed, idealised regions of SST anomalies are used to force the model, with the overall aim of investigating the ways in which SST anomalies influence rainfall extremes over southern Africa. In this paper, results from sensitivity testing of the UK Meteorological Office Hadley Centre's climate model's domain size are firstly presented. Then simulations of current climate from the model, operating in both regional and global mode, are compared to the MIRA dataset at daily timescales. Thirdly, the ability of the model to reproduce daily rainfall extremes will be assessed, again by a comparison with extremes from the MIRA dataset. Finally, the results from the idealised SST experiments are briefly presented, suggesting associations between rainfall extremes and both local and remote SST anomalies.

  11. A novel highly reproducible and lethal nonhuman primate model for orthopox virus infection.

    Directory of Open Access Journals (Sweden)

    Marit Kramski

    Full Text Available The intentional re-introduction of Variola virus (VARV), the agent of smallpox, into the human population is of great concern due to its bio-terroristic potential. Moreover, zoonotic infections with Cowpox virus (CPXV) and Monkeypox virus (MPXV) cause severe diseases in humans. Smallpox vaccines presently available can have severe adverse effects that are no longer acceptable. The efficacy and safety of new vaccines and antiviral drugs for use in humans can only be demonstrated in animal models. The existing nonhuman primate models, using VARV and MPXV, need very high viral doses that have to be applied intravenously or intratracheally to induce a lethal infection in macaques. To overcome these drawbacks, the infectivity and pathogenicity of a particular CPXV was evaluated in the common marmoset (Callithrix jacchus). A CPXV named calpox virus was isolated from a lethal orthopox virus (OPV) outbreak in New World monkeys. We demonstrated that marmosets infected with calpox virus, not only via the intravenous but also the intranasal route, reproducibly develop symptoms resembling smallpox in humans. Infected animals died within 1-3 days after onset of symptoms, even when very low infectious viral doses of 5x10^2 pfu were applied intranasally. Infectious virus was demonstrated in blood, saliva and all organs analyzed. We present the first characterization of a new OPV infection model inducing a disease in common marmosets comparable to smallpox in humans. Intranasal virus inoculation mimicking the natural route of smallpox infection led to reproducible infection. In vivo titration resulted in an MID50 (minimal monkey infectious dose 50%) of 8.3x10^2 pfu of calpox virus, which is approximately 10,000-fold lower than the MPXV and VARV doses applied in the macaque models. Therefore, the calpox virus/marmoset model is a suitable nonhuman primate model for the validation of vaccines and antiviral drugs. Furthermore, this model can help study mechanisms of OPV pathogenesis.

  12. Digital versus plaster study models: how accurate and reproducible are they?

    Science.gov (United States)

    Abizadeh, Neilufar; Moles, David R; O'Neill, Julian; Noar, Joseph H

    2012-09-01

    Objective: To compare measurements of occlusal relationships and arch dimensions taken from digital study models with those taken from plaster models. Design: Laboratory study. Setting: The Orthodontic Department, Kettering General Hospital, Kettering, UK. Methods and materials: One hundred and twelve sets of study models with a range of malocclusions and various degrees of crowding were selected. Occlusal features were measured manually with digital callipers on the plaster models. The same measurements were performed on digital images of the study models. Each method was carried out twice in order to check for intra-operator variability. The repeatability and reproducibility of the methods were assessed. Statistically significant differences between the two methods were found. In 8 of the 16 occlusal features measured, the plaster measurements were more repeatable. However, those differences were not of sufficient magnitude to have clinical relevance. In addition, there were statistically significant systematic differences for 12 of the 16 occlusal features, with the plaster measurements being greater for 11 of these, indicating that the digital model scans were not a true representation of the plaster models. The repeatability of digital models compared with plaster models is satisfactory for clinical applications, although this study demonstrated some systematic differences. Digital study models can therefore be considered for use as an adjunct to clinical assessment of the occlusion, but as yet may not supersede current methods for scientific purposes.

  13. Fourier modeling of the BOLD response to a breath-hold task: Optimization and reproducibility.

    Science.gov (United States)

    Pinto, Joana; Jorge, João; Sousa, Inês; Vilela, Pedro; Figueiredo, Patrícia

    2016-07-15

    Cerebrovascular reactivity (CVR) reflects the capacity of blood vessels to adjust their caliber in order to maintain a steady supply of brain perfusion, and it may provide a sensitive disease biomarker. Measurement of the blood oxygen level dependent (BOLD) response to a hypercapnia-inducing breath-hold (BH) task has been frequently used to map CVR noninvasively using functional magnetic resonance imaging (fMRI). However, the best modeling approach for the accurate quantification of CVR maps remains an open issue. Here, we compare and optimize Fourier models of the BOLD response to a BH task with a preparatory inspiration, and assess the test-retest reproducibility of the associated CVR measurements, in a group of 10 healthy volunteers studied over two fMRI sessions. Linear combinations of sine-cosine pairs at the BH task frequency and its successive harmonics were added sequentially in a nested models approach, and were compared in terms of the adjusted coefficient of determination and corresponding variance explained (VE) of the BOLD signal, as well as the number of voxels exhibiting significant BOLD responses, the estimated CVR values, and their test-retest reproducibility. The brain average VE increased significantly with the Fourier model order, up to the 3rd order. However, the number of responsive voxels increased significantly only up to the 2nd order, and started to decrease from the 3rd order onwards. Moreover, no significant relative underestimation of CVR values was observed beyond the 2nd order. Hence, the 2nd order model was concluded to be the optimal choice for the studied paradigm. This model also yielded the best test-retest reproducibility results, with intra-subject coefficients of variation of 12 and 16% and an intra-class correlation coefficient of 0.74. 
    In conclusion, our results indicate that a Fourier series set consisting of a sine-cosine pair at the BH task frequency and its two harmonics is a suitable model for BOLD-fMRI CVR measurements.
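The nested Fourier modeling described above can be sketched as an ordinary least-squares fit with sine-cosine pairs added order by order, comparing orders via the adjusted coefficient of determination. The snippet below is a minimal illustration on a synthetic signal (the task period, amplitudes and noise level are invented for demonstration, not taken from the study):

```python
import numpy as np

def fourier_design(n_scans, tr, period, order):
    """Intercept plus sine-cosine pairs at the task frequency and its harmonics."""
    t = np.arange(n_scans) * tr
    cols = [np.ones(n_scans)]
    for k in range(1, order + 1):
        w = 2 * np.pi * k * t / period
        cols += [np.sin(w), np.cos(w)]
    return np.column_stack(cols)

# Synthetic "BOLD-like" series with energy at the task frequency and its
# first harmonic, plus noise (illustrative values only).
rng = np.random.default_rng(0)
n, tr, period = 200, 2.0, 60.0
t = np.arange(n) * tr
y = (1.5 * np.sin(2 * np.pi * t / period)
     + 1.0 * np.cos(4 * np.pi * t / period)
     + rng.normal(0.0, 1.0, n))

adj_r2 = {}
for order in (1, 2, 3):
    X = fourier_design(n, tr, period, order)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r2 = 1 - ((y - X @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
    p = X.shape[1]
    adj_r2[order] = 1 - (1 - r2) * (n - 1) / (n - p)  # adjusted R^2
```

With this synthetic signal, the adjusted R^2 rises clearly from 1st to 2nd order and then levels off, mirroring the nested-models comparison used in the paper.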

  14. On the reproducibility of spatiotemporal traffic dynamics with microscopic traffic models

    CERN Document Server

    Knorr, Florian

    2012-01-01

    Traffic flow is a very prominent example of a driven non-equilibrium system. A characteristic phenomenon of traffic dynamics is the spontaneous and abrupt drop of the average velocity on a stretch of road leading to congestion. Such a traffic breakdown corresponds to a boundary-induced phase transition from free flow to congested traffic. In this paper, we study the ability of selected microscopic traffic models to reproduce a traffic breakdown, and we investigate its spatiotemporal dynamics. For our analysis, we use empirical traffic data from stationary loop detectors on a German Autobahn showing a spontaneous breakdown. We then present several methods to assess the results and compare the models with each other. In addition, we will also discuss some important modeling aspects and their impact on the resulting spatiotemporal pattern. The investigation of different downstream boundary conditions, for example, shows that the physical origin of the traffic breakdown may be artificially induced by the setup of the boundaries.

  15. On the reproducibility of spatiotemporal traffic dynamics with microscopic traffic models

    Science.gov (United States)

    Knorr, Florian; Schreckenberg, Michael

    2012-10-01

    Traffic flow is a very prominent example of a driven non-equilibrium system. A characteristic phenomenon of traffic dynamics is the spontaneous and abrupt drop of the average velocity on a stretch of road leading to congestion. Such a traffic breakdown corresponds to a boundary-induced phase transition from free flow to congested traffic. In this paper, we study the ability of selected microscopic traffic models to reproduce a traffic breakdown, and we investigate its spatiotemporal dynamics. For our analysis, we use empirical traffic data from stationary loop detectors on a German Autobahn showing a spontaneous breakdown. We then present several methods to assess the results and compare the models with each other. In addition, we will also discuss some important modeling aspects and their impact on the resulting spatiotemporal pattern. The investigation of different downstream boundary conditions, for example, shows that the physical origin of the traffic breakdown may be artificially induced by the setup of the boundaries.

  16. Reproducibility, reliability and validity of measurements obtained from Cecile3 digital models

    Directory of Open Access Journals (Sweden)

    Gustavo Adolfo Watanabe-Kanno

    2009-09-01

    Full Text Available The aim of this study was to determine the reproducibility, reliability and validity of measurements in digital models compared to plaster models. Fifteen pairs of plaster models were obtained from orthodontic patients with permanent dentition before treatment. These were digitized to be evaluated with the program Cécile3 v2.554.2 beta. Two examiners measured three times the mesiodistal width of all the teeth present, the intercanine, interpremolar and intermolar distances, and overjet and overbite. The plaster models were measured using a digital vernier. The Student's t-test for paired samples and the intraclass correlation coefficient (ICC) were used for statistical analysis. The ICCs of the digital models were 0.84 ± 0.15 (intra-examiner) and 0.80 ± 0.19 (inter-examiner). The average mean difference of the digital models was 0.23 ± 0.14 and 0.24 ± 0.11 for each examiner, respectively. When the two types of measurements were compared, the values obtained from the digital models were lower than those obtained from the plaster models (p < 0.05), although the differences were considered clinically insignificant (differences < 0.1 mm). The Cécile digital models are a clinically acceptable alternative for use in Orthodontics.
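Intraclass correlation coefficients like those reported above can be estimated from a one-way random-effects ANOVA decomposition. The sketch below uses invented tooth-width measurements (20 casts measured twice; the numbers are hypothetical, not the study's data):

```python
import numpy as np

def icc_oneway(x):
    """One-way random-effects ICC(1,1) for an (n subjects x k repeats) array."""
    n, k = x.shape
    grand = x.mean()
    # Between-subject and within-subject mean squares
    msb = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    msw = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

rng = np.random.default_rng(3)
true_widths = rng.normal(30.0, 3.0, 20)                     # hypothetical mm values
meas = true_widths[:, None] + rng.normal(0.0, 0.3, (20, 2)) # two repeat sessions

icc = icc_oneway(meas)
```

With a measurement error far smaller than the between-cast spread, the estimated ICC lands close to 1, as in the intra-examiner values quoted above.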

  17. Quantitative Evaluation of Ionosphere Models for Reproducing Regional TEC During Geomagnetic Storms

    Science.gov (United States)

    Shim, J. S.; Kuznetsova, M.; Rastaetter, L.; Bilitza, D.; Codrescu, M.; Coster, A. J.; Emery, B.; Foster, B.; Fuller-Rowell, T. J.; Goncharenko, L. P.; Huba, J.; Mitchell, C. N.; Ridley, A. J.; Fedrizzi, M.; Scherliess, L.; Schunk, R. W.; Sojka, J. J.; Zhu, L.

    2015-12-01

    TEC (Total Electron Content) is one of the key parameters in the description of the ionospheric variability that influences the accuracy of navigation and communication systems. To assess the current TEC modeling capability of ionospheric models during geomagnetic storms, and to establish a baseline against which future improvement can be compared, we quantified the ionospheric models' performance by comparing modeled vertical TEC values with ground-based GPS TEC measurements and Multi-Instrument Data Analysis System (MIDAS) TEC. The comparison focused on the North America and Europe sectors during two selected storm events: the 2006 AGU storm (14-15 Dec. 2006) and the 2013 March storm (17-19 Mar. 2013). The ionospheric models used for this study range from empirical to physics-based, and physics-based data assimilation models. We investigated spatial and temporal variations of TEC during the storms. In addition, we considered several parameters to quantify storm impacts on TEC: TEC changes compared to quiet time, the rate of TEC change, and the maximum increase/decrease during the storms. In this presentation, we focus on preliminary results of the comparison of the models' performance in reproducing the storm-time TEC variations using these parameters and skill scores. This study has been supported by the Community Coordinated Modeling Center (CCMC) at the Goddard Space Flight Center. Model outputs and observational data used for the study will be permanently posted at the CCMC website (http://ccmc.gsfc.nasa.gov) for the space science communities to use.

  18. Assessment of the reliability of reproducing two-dimensional resistivity models using an image processing technique.

    Science.gov (United States)

    Ishola, Kehinde S; Nawawi, Mohd Nm; Abdullah, Khiruddin; Sabri, Ali Idriss Aboubakar; Adiat, Kola Abdulnafiu

    2014-01-01

    This study attempts to combine the results of geophysical images obtained from three commonly used electrode configurations using an image processing technique in order to assess their capabilities to reproduce two-dimensional (2-D) resistivity models. All the inverse resistivity models were processed using the PCI Geomatica software package, commonly used for remote sensing data sets. Preprocessing of the 2-D inverse models was carried out to facilitate further processing and statistical analyses. Four raster layers were created: three were used for the input images and the fourth was used as the output of the combined images. The data sets were merged using a basic statistical approach. Interpreted results show that all images resolved and reconstructed the essential features of the models. An assessment of the accuracy of the images for the four geologic models was performed using four criteria: the mean absolute error and mean percentage absolute error, the resistivity values of the reconstructed blocks, and their displacements from the true models. Generally, the blocks of the images of the maximum approach give the least estimated errors. Also, the displacement of the reconstructed blocks from the true blocks is the least, and the reconstructed resistivities of the blocks are closer to the true blocks than with any other combination used. Thus, it is corroborated that when inverse resistivity models are combined, more reliable and detailed information about the geologic models is obtained than by using individual data sets.

  19. Classical signal model reproducing quantum probabilities for single and coincidence detections

    Science.gov (United States)

    Khrennikov, Andrei; Nilsson, Börje; Nordebo, Sven

    2012-05-01

    We present a simple classical (random) signal model reproducing Born's rule. The crucial point of our approach is that the presence of a detector's threshold and calibration procedure has to be treated not as a mere experimental technicality, but as a basic counterpart of the theoretical model. We call this approach the threshold signal detection model (TSD). The experiment on coincidence detection performed by Grangier in 1986 [22] played a crucial role in the rejection of (semi-)classical field models in favour of quantum mechanics (QM): the impossibility to resolve the wave-particle duality in favour of a purely wave model. QM predicts that the relative probability of coincidence detection, the coefficient g(2)(0), is zero (for one-photon states), but in (semi-)classical models g(2)(0) >= 1. In TSD the coefficient g(2)(0) decreases as 1/ε_d², where ε_d > 0 is the detection threshold. Hence, by increasing this threshold an experimenter can make the coefficient g(2)(0) essentially less than 1. The TSD prediction can be tested experimentally in new Grangier-type experiments presenting a detailed monitoring of the dependence of the coefficient g(2)(0) on the detection threshold. Structurally, our model has some similarity with the prequantum model of Grössing et al. Subquantum stochasticity is composed of two counterparts: a stationary process in the space of internal degrees of freedom, and a random-walk-type motion describing the temporal dynamics.
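The coefficient g(2)(0) at issue is estimated from click records as the ratio of the coincidence rate to the product of the single-detector rates. The snippet below is a minimal, hypothetical simulation (deliberately not the authors' TSD model) of a classically fluctuating intensity behind a 50/50 split, which illustrates the classical bound g(2)(0) >= 1 that the Grangier-type argument relies on:

```python
import numpy as np

rng = np.random.default_rng(1)
n_windows = 200_000  # coincidence windows

# Classical field: intensity fluctuates from window to window
# (exponential = thermal-like); a 50/50 split feeds two detectors.
intensity = rng.exponential(1.0, n_windows)
p_click = np.clip(0.1 * intensity, 0.0, 1.0)  # per-window click probability
click1 = rng.random(n_windows) < p_click       # detector 1
click2 = rng.random(n_windows) < p_click       # detector 2 (independent noise)

# g2(0) = coincidence rate / product of single rates
g2 = (click1 & click2).mean() / (click1.mean() * click2.mean())
```

For exponentially distributed (thermal-like) intensity this estimator comes out near 2, comfortably above the classical lower bound of 1; a single-photon source, by contrast, would give g2(0) close to 0.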

  20. Accuracy and reproducibility of linear measurements of resin, plaster, digital and printed study-models.

    Science.gov (United States)

    Saleh, Waleed K; Ariffin, Emy; Sherriff, Martyn; Bister, Dirk

    2015-01-01

    To compare the accuracy and reproducibility of measurements of on-screen three-dimensional (3D) digital surface models captured by a 3Shape R700™ laser scanner with measurements made using a digital caliper on acrylic models, plaster models or model replicas. Four sets of typodont models were used. Acrylic models, alginate impressions, plaster models and physical replicas were measured. The 3Shape R700™ laser-scanning device with 3Shape™ software was used for scans and measurements. Linear measurements were recorded for selected landmarks, on each of the physical models and on the 3D digital surface models, on ten separate occasions by a single examiner. Comparing measurements taken on the physical models, the mean difference was 0.32 mm (SD 0.15 mm). For the different methods (physical versus digital) the mean difference was 0.112 mm (SD 0.15 mm). None of the values showed a statistically significant difference (p > 0.05) for the plaster and acrylic models. The comparison of measurements on the physical models showed no significant difference. The 3Shape R700™ is a reliable device for capturing surface details of models in a digital format. When comparing measurements taken manually and digitally there was no statistically significant difference. The Objet Eden 250™ 3D prints proved to be as accurate as the original acrylic models, plaster models or alginate impressions, as was shown by the accuracy of the measurements taken. This confirms that using virtual study models can be a reliable method, replacing traditional plaster models.

  1. Reproducibility of VPCT parameters in the normal pancreas: comparison of two different kinetic calculation models.

    Science.gov (United States)

    Kaufmann, Sascha; Schulze, Maximilian; Horger, Thomas; Oelker, Aenne; Nikolaou, Konstantin; Horger, Marius

    2015-09-01

    To assess the reproducibility of volume computed tomographic perfusion (VPCT) measurements in normal pancreatic tissue using two different kinetic perfusion calculation models at three different time points. Institutional ethical board approval was obtained for retrospective analysis of pancreas perfusion data sets generated by our prospective study for liver response monitoring to local therapy in patients with unresectable hepatocellular carcinoma, which was approved by the institutional review board. VPCT of the entire pancreas was performed in 41 patients (mean age, 64.8 years) using 26 consecutive volume measurements and intravenous injection of 50 mL of iodinated contrast at a flow rate of 5 mL/s. Blood volume (BV) and blood flow (BF) were calculated using two mathematical methods: maximum slope + Patlak analysis versus the deconvolution method. Pancreas perfusion was calculated using two volumes of interest. The median interval between the first and second VPCT was 2 days, and between the second and third VPCT 82 days. Variability was assessed with within-patient coefficients of variation (CVs) and Bland-Altman analyses. Interobserver agreement for all perfusion parameters was calculated using intraclass correlation coefficients (ICCs). BF and BV values varied widely by method of analysis, as did within-patient CVs for BF and BV at the second versus the first VPCT: 22.4%/50.4% (method 1) and 24.6%/24.0% (method 2) measured in the pancreatic head and 18.4%/62.6% (method 1) and 23.8%/28.1% (method 2) measured in the pancreatic corpus; and at the third versus the first VPCT: 21.7%/61.8% (method 1) and 25.7%/34.5% (method 2) measured also in the pancreatic head and 19.1%/66.1% (method 1) and 22.0%/31.8% (method 2) measured in the pancreatic corpus, respectively. Interobserver agreement measured with ICC shows fair-to-good reproducibility. VPCT performed with the presented examinational protocol is reproducible and can be used for monitoring
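Within-patient coefficients of variation and Bland-Altman limits of agreement of the kind reported above can be computed as follows. This is a minimal sketch on simulated perfusion values (41 patients, two sessions; all numbers are illustrative, not the study's data):

```python
import numpy as np

def within_subject_cv(a, b):
    """Within-subject CV (%) for two repeated measurements per patient."""
    sw2 = ((a - b) ** 2 / 2).mean()  # within-subject variance for 2 repeats
    return 100.0 * np.sqrt(sw2) / np.concatenate([a, b]).mean()

def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement."""
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

rng = np.random.default_rng(2)
bf_first = rng.normal(100.0, 15.0, 41)            # hypothetical BF, first VPCT
bf_second = bf_first + rng.normal(0.0, 20.0, 41)  # second VPCT, repeat noise

cv = within_subject_cv(bf_first, bf_second)
bias, (low, high) = bland_altman(bf_first, bf_second)
```

With repeat noise of this size the within-patient CV lands in the mid-teens, the same order of magnitude as the BF values quoted in the abstract.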

  2. Classical signal model reproducing quantum probabilities for single and coincidence detections

    CERN Document Server

    Khrennikov, Andrei; Nordebo, Sven

    2011-01-01

    We present a simple classical (random) signal model reproducing Born's rule. The crucial point of our approach is that the presence of detector's threshold and calibration procedure have to be treated not as simply experimental technicalities, but as the basic counterparts of the theoretical model. We call this approach threshold signal detection model (TSD). The experiment on coincidence detection which was done by Grangier in 1986 \\cite{Grangier} played a crucial role in rejection of (semi-)classical field models in favor of quantum mechanics (QM): impossibility to resolve the wave-particle duality in favor of a purely wave model. QM predicts that the relative probability of coincidence detection, the coefficient $g^{(2)}(0),$ is zero (for one photon states), but in (semi-)classical models $g^{(2)}(0)\\geq 1.$ In TSD the coefficient $g^{(2)}(0)$ decreases as $1/{\\cal E}_d^2,$ where ${\\cal E}_d>0$ is the detection threshold. Hence, by increasing this threshold an experimenter can make the coefficient $g^{(2)}...

  3. An analytical nonlinear model for laminate multiferroic composites reproducing the DC magnetic bias dependent magnetoelectric properties.

    Science.gov (United States)

    Lin, Lizhi; Wan, Yongping; Li, Faxin

    2012-07-01

    In this work, we propose an analytical nonlinear model for laminate multiferroic composites in which the magnetic-field-induced strain in the magnetostrictive phase is described by a standard square law taking the stress effect into account, whereas the ferroelectric phase retains a linear piezoelectric response. Furthermore, differing from previous models which assume uniform deformation, we take into account the stress attenuation and adopt non-uniform deformation along the layer thickness in both the piezoelectric and magnetostrictive phases. Analysis of this model on the L-T and L-L modes of sandwiched Terfenol-D/lead zirconate titanate/Terfenol-D composites can well reproduce the observed dc magnetic field (H(dc)) dependent magnetoelectric coefficients, which all reach their maxima at an H(dc) of about 500 Oe. The model also suggests that stress attenuation along the layer thickness in practical composites should be taken into account. Furthermore, the model indicates that a high volume fraction of the magnetostrictive phase is required to obtain giant magnetoelectric coupling, coinciding with existing models.

  4. Assessment of a climate model to reproduce rainfall variability and extremes over Southern Africa

    Science.gov (United States)

    Williams, C. J. R.; Kniveton, D. R.; Layberry, R.

    2010-01-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The sub-continent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes is a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the ability of a state-of-the-art climate model to simulate climate at daily timescales is assessed using satellite-derived rainfall data from the Microwave Infrared Rainfall Algorithm (MIRA). This dataset covers the period from 1993 to 2002 and the whole of southern Africa at a spatial resolution of 0.1° longitude/latitude. This paper concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of present-day rainfall variability over southern Africa and is not intended to discuss possible future changes in climate as these have been documented elsewhere. Simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. Secondly, the ability of the model to reproduce daily rainfall extremes is assessed, again by a comparison with extremes from the MIRA dataset.

  5. Models that include supercoiling of topological domains reproduce several known features of interphase chromosomes.

    Science.gov (United States)

    Benedetti, Fabrizio; Dorier, Julien; Burnier, Yannis; Stasiak, Andrzej

    2014-03-01

    Understanding the structure of interphase chromosomes is essential to elucidate regulatory mechanisms of gene expression. During recent years, high-throughput DNA sequencing expanded the power of chromosome conformation capture (3C) methods, which provide information about the reciprocal spatial proximity of chromosomal loci. Since 2012, it has been known that the entire chromatin in interphase chromosomes is organized into regions with strongly increased frequency of internal contacts. These regions, with an average size of ∼1 Mb, were named topological domains. More recent studies demonstrated the presence of unconstrained supercoiling in interphase chromosomes. Using Brownian dynamics simulations, we show here that by including supercoiling in models of topological domains one can reproduce, and thus provide possible explanations of, several experimentally observed characteristics of interphase chromosomes, such as their complex contact maps.

  6. A novel, stable and reproducible acute lung injury model induced by oleic acid in immature piglet

    Institute of Scientific and Technical Information of China (English)

    ZHU Yao-bin; LING Feng; ZHANG Yan-bo; LIU Ai-jun; LIU Dong-hai; QIAO Chen-hui; WANG Qiang; LIU Ying-long

    2011-01-01

    Background: Young children are susceptible to pulmonary injury, and acute lung injury (ALI) often results in high mortality and financial costs in pediatric patients. A good ALI model will help us to gain a better understanding of the real pathophysiological picture and to evaluate novel treatment approaches to acute respiratory distress syndrome (ARDS) more accurately and liberally. This study aimed to establish a hemodynamically stable and reproducible model of ALI in piglets induced by oleic acid. Methods: Six Chinese mini-piglets were used to establish ALI models with oleic acid. Hemodynamic and pulmonary function data were measured. Histopathological assessment was performed. Results: Mean blood pressure, heart rate (HR), cardiac output (CO), central venous pressure (CVP) and left atrial pressure (LAP) decreased sharply after oleic acid was given, while the mean pulmonary arterial pressure (MPAP) increased in comparison with baseline (P < 0.05). pH, arterial partial pressure of O2 (PaO2), PaO2/inspired O2 fraction (FiO2) and lung compliance decreased, while PaCO2 and airway pressure increased in comparison with baseline (P < 0.05). The lung histology showed severe inflammation, hyaline membranes, and intra-alveolar and interstitial hemorrhage. Conclusion: This experiment established a stable model which allows for a diversity of studies on early lung injury.

  7. Demography-based adaptive network model reproduces the spatial organization of human linguistic groups

    Science.gov (United States)

    Capitán, José A.; Manrubia, Susanna

    2015-12-01

    The distribution of human linguistic groups presents a number of interesting and nontrivial patterns. The distributions of the number of speakers per language and the area each group covers follow log-normal distributions, while population and area fulfill an allometric relationship. The topology of networks of spatial contacts between different linguistic groups has been recently characterized, showing atypical properties of the degree distribution and clustering, among others. Human demography, spatial conflicts, and the construction of networks of contacts between linguistic groups are mutually dependent processes. Here we introduce an adaptive network model that takes all of them into account and successfully reproduces, using only four model parameters, not only those features of linguistic groups already described in the literature, but also correlations between demographic and topological properties uncovered in this work. Besides their relevance when modeling and understanding processes related to human biogeography, our adaptive network model admits a number of generalizations that broaden its scope and make it suitable to represent interactions between agents based on population dynamics and competition for space.

  8. Exploring predictive and reproducible modeling with the single-subject FIAC dataset.

    Science.gov (United States)

    Chen, Xu; Pereira, Francisco; Lee, Wayne; Strother, Stephen; Mitchell, Tom

    2006-05-01

    Predictive modeling of functional magnetic resonance imaging (fMRI) has the potential to expand the amount of information extracted and to enhance our understanding of brain systems by predicting brain states, rather than emphasizing the standard spatial mapping. Based on the block datasets of Functional Imaging Analysis Contest (FIAC) Subject 3, we demonstrate the potential and pitfalls of predictive modeling in fMRI analysis by investigating the performance of five models (linear discriminant analysis, logistic regression, linear support vector machine, Gaussian naive Bayes, and a variant) as a function of preprocessing steps and feature selection methods. We found that: (1) independent of the model, temporal detrending and feature selection assisted in building a more accurate predictive model; (2) the linear support vector machine and logistic regression often performed better than either of the Gaussian naive Bayes models in terms of the optimal prediction accuracy; and (3) the optimal prediction accuracy obtained in a feature space using principal components was typically lower than that obtained in a voxel space, given the same model and same preprocessing. We show that due to the existence of artifacts from different sources, high prediction accuracy alone does not guarantee that a classifier is learning a pattern of brain activity that might be usefully visualized, although cross-validation methods do provide fairly unbiased estimates of true prediction accuracy. The trade-off between the prediction accuracy and the reproducibility of the spatial pattern should be carefully considered in predictive modeling of fMRI. We suggest that unless the experimental goal is brain-state classification of new scans on well-defined spatial features, prediction alone should not be used as an optimization procedure in fMRI data analysis.

  9. A stable and reproducible human blood-brain barrier model derived from hematopoietic stem cells.

    Directory of Open Access Journals (Sweden)

    Romeo Cecchelli

    Full Text Available The human blood-brain barrier (BBB) is a selective barrier formed by human brain endothelial cells (hBECs), which is important to ensure adequate neuronal function and protect the central nervous system (CNS) from disease. The development of human in vitro BBB models is thus of utmost importance for drug discovery programs related to CNS diseases. Here, we describe a method to generate a human BBB model using cord blood-derived hematopoietic stem cells. The cells were initially differentiated into ECs, followed by the induction of BBB properties by co-culture with pericytes. The brain-like endothelial cells (BLECs) express tight junctions and transporters typically observed in brain endothelium and maintain expression of most in vivo BBB properties for at least 20 days. The model is very reproducible, since it can be generated from stem cells isolated from different donors and in different laboratories, and could be used to predict the CNS distribution of compounds in humans. Finally, we provide evidence that the Wnt/β-catenin signaling pathway mediates in part the BBB-inductive properties of pericytes.

  10. Can a global model reproduce observed trends in summertime surface ozone levels?

    Directory of Open Access Journals (Sweden)

    S. Koumoutsaris

    2012-01-01

    Full Text Available Quantifying trends in surface ozone concentrations is critical for assessing pollution control strategies. Here we use observations and results from a global chemical transport model to examine the trends (1991-2005) in daily maximum 8-hour average concentrations of summertime surface ozone at rural sites in Europe and the United States. We find a decrease in observed ozone concentrations at the high end of the probability distribution at many of the sites in both regions. The model attributes these trends to a decrease in local anthropogenic ozone precursors, although the simulated decreasing trends are overestimated in comparison with the observed ones. The low end of the observed distribution shows small upward trends over Europe and the western US and downward trends in the eastern US. The model cannot reproduce these observed trends, especially over Europe and the western US. In particular, simulated changes between the low and high ends of the distributions in these two regions are not significant. Sensitivity simulations indicate that emissions from distant source regions do not significantly affect ozone trends at either end of the distribution. This contrasts with previously available results, which indicated that increasing ozone trends at the low percentiles may reflect an increase in the ozone background associated with increasing remote sources of ozone precursors. Possible reasons for the discrepancies between observed and simulated trends are discussed.
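
The percentile-trend analysis described above can be sketched on synthetic data. The series below is constructed (not observed) so that the high tail declines while the low tail drifts upward, matching the qualitative pattern reported for the observations:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic illustration only (not the paper's data): 15 summers of daily
# MDA8 ozone built so the high tail declines while the low tail drifts up.
years = np.arange(1991, 2006)
summers = []
for t in range(len(years)):
    low = rng.normal(35.0 + 0.4 * t, 5.0, size=46)    # low end of distribution
    high = rng.normal(75.0 - 0.8 * t, 5.0, size=46)   # high end of distribution
    summers.append(np.concatenate([low, high]))

def percentile_trend(q):
    """Least-squares slope (ppb per year) of the q-th percentile across years."""
    vals = [np.percentile(s, q) for s in summers]
    return float(np.polyfit(years, vals, 1)[0])

trend_high = percentile_trend(95)   # expected negative (high end declines)
trend_low = percentile_trend(5)     # expected positive (low end creeps up)
```

Fitting trends separately at the two ends of the distribution, as here, is what lets observed high-percentile declines and low-percentile increases be diagnosed independently.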

  11. Animal models that best reproduce the clinical manifestations of human intoxication with organophosphorus compounds.

    Science.gov (United States)

    Pereira, Edna F R; Aracava, Yasco; DeTolla, Louis J; Beecham, E Jeffrey; Basinger, G William; Wakayama, Edgar J; Albuquerque, Edson X

    2014-08-01

    The translational capacity of data generated in preclinical toxicological studies is contingent upon several factors, including the appropriateness of the animal model. The primary objectives of this article are: 1) to analyze the natural history of acute and delayed signs and symptoms that develop following an acute exposure of humans to organophosphorus (OP) compounds, with an emphasis on nerve agents; 2) to identify animal models of the clinical manifestations of human exposure to OPs; and 3) to review the mechanisms that contribute to the immediate and delayed OP neurotoxicity. As discussed in this study, clinical manifestations of an acute exposure of humans to OP compounds can be faithfully reproduced in rodents and nonhuman primates. These manifestations include an acute cholinergic crisis in addition to signs of neurotoxicity that develop long after the OP exposure, particularly chronic neurologic deficits consisting of anxiety-related behavior and cognitive deficits, structural brain damage, and increased slow electroencephalographic frequencies. Because guinea pigs and nonhuman primates, like humans, have low levels of circulating carboxylesterases-the enzymes that metabolize and inactivate OP compounds-they stand out as appropriate animal models for studies of OP intoxication. These are critical points for the development of safe and effective therapeutic interventions against OP poisoning because approval of such therapies by the Food and Drug Administration is likely to rely on the Animal Efficacy Rule, which allows exclusive use of animal data as evidence of the effectiveness of a drug against pathologic conditions that cannot be ethically or feasibly tested in humans.

  12. A Semi-Analytic dynamical friction model that reproduces core stalling

    CERN Document Server

    Petts, James A; Read, Justin I

    2015-01-01

    We present a new semi-analytic model for dynamical friction based on Chandrasekhar's formalism. The key novelty is the introduction of physically motivated, radially varying, maximum and minimum impact parameters. With these, our model gives an excellent match to full N-body simulations for isotropic background density distributions, both cuspy and shallow, without any fine-tuning of the model parameters. In particular, we are able to reproduce the dramatic core-stalling effect that occurs in shallow/constant density cores, for the first time. This gives us new physical insight into the core-stalling phenomenon. We show that core stalling occurs in the limit in which the product of the Coulomb logarithm and the local fraction of stars with velocity lower than the infalling body tends to zero. For cuspy backgrounds, this occurs when the infalling mass approaches the enclosed background mass. For cored backgrounds, it occurs at larger distances from the centre, due to a combination of a rapidly increasing minim...

  13. Stochastic model of financial markets reproducing scaling and memory in volatility return intervals

    Science.gov (United States)

    Gontis, V.; Havlin, S.; Kononovicius, A.; Podobnik, B.; Stanley, H. E.

    2016-11-01

    We investigate the volatility return intervals in the NYSE and FOREX markets. We explain previous empirical findings using a model based on the interacting-agent hypothesis instead of the widely used efficient market hypothesis. We derive macroscopic equations based on the microscopic herding interactions of agents and find that they are able to reproduce various stylized facts of different markets and different assets with the same set of model parameters. We show that the power-law properties and the scaling of return intervals and other financial variables have a similar origin and could be a result of a general class of non-linear stochastic differential equations derived from a master equation of an agent system that is coupled by herding interactions. Specifically, we find that this approach enables us to recover the volatility return interval statistics as well as volatility probability and spectral densities for the NYSE and FOREX markets, for different assets, and for different time-scales. We also find that the historical S&P500 monthly series exhibits the same volatility return interval properties recovered by our proposed model. Our statistical results suggest that human herding is so strong that it persists even when other evolving fluctuations perturb the financial system.

  14. Reproducibility of summertime diurnal precipitation over northern Eurasia simulated by CMIP5 climate models

    Science.gov (United States)

    Hirota, N.; Takayabu, Y. N.

    2015-12-01

    The reproducibility of diurnal precipitation over northern Eurasia simulated by CMIP5 climate models in their historical runs was evaluated in comparison with station data (NCDC-9813) and satellite data (GSMaP-V5). We first calculated diurnal cycles by averaging precipitation at each local solar time (LST) in June-July-August during 1981-2000 over the continent of northern Eurasia (0-180E, 45-90N). Then we examined the occurrence time of maximum precipitation and the contribution of diurnally varying precipitation to the total precipitation. The contribution of diurnal precipitation was about 21% in both NCDC-9813 and GSMaP-V5. The maximum precipitation occurred at 18 LST in NCDC-9813 but 16 LST in GSMaP-V5, indicating some uncertainty even in the observational datasets. The diurnal contribution of the CMIP5 models varied widely, from 11% to 62%, and their timing of the precipitation maximum ranged from 11 LST to 20 LST. Interestingly, the contribution and the timing had a strong negative correlation of -0.65: the models with larger diurnal precipitation showed a precipitation maximum earlier, around noon. Next, we compared the sensitivity of precipitation to surface temperature and tropospheric humidity between the 5 models with large diurnal precipitation (LDMs) and the 5 models with small diurnal precipitation (SDMs). Precipitation in LDMs showed high sensitivity to surface temperature, indicating a close relationship with local instability. On the other hand, synoptic disturbances were more active in SDMs, with a dominant role for large-scale condensation, and precipitation in SDMs was more closely related to tropospheric moisture. Therefore, the relative importance of local instability and synoptic disturbances is suggested to be an important factor in determining the contribution and timing of diurnal precipitation. Acknowledgment: This study is supported by the Green Network of Excellence (GRENE) Program of the Ministry of Education, Culture, Sports, Science and Technology
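
The two diagnostics used above, the timing of the precipitation maximum in local solar time and the diurnal contribution, can be computed as follows on toy hourly data. The definition of "diurnal contribution" below is one simple convention, not necessarily the one used by the authors:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy hourly precipitation (mm/h) for one grid point over 92 summer days,
# with an imposed afternoon peak near 16 LST on top of random background.
hours = np.arange(24)
days = 92
afternoon_bump = 0.10 * np.exp(-0.5 * ((hours - 16) / 2.5) ** 2)
precip = rng.exponential(0.05, size=(days, 24)) + afternoon_bump

# Diurnal cycle: mean precipitation at each local solar time.
cycle = precip.mean(axis=0)
peak_lst = int(hours[np.argmax(cycle)])

# Contribution of the diurnally varying part to the total, using one simple
# convention (mean absolute deviation from the daily mean over the mean);
# other definitions exist in the literature.
contribution = float(np.abs(cycle - cycle.mean()).mean() / cycle.mean())
```

Applying the same two diagnostics to station, satellite, and model output is what allows the timing/contribution scatter across CMIP5 models to be compared on one axis.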

  15. Commentary on the integration of model sharing and reproducibility analysis to scholarly publishing workflow in computational biomechanics.

    Science.gov (United States)

    Erdemir, Ahmet; Guess, Trent M; Halloran, Jason P; Modenese, Luca; Reinbolt, Jeffrey A; Thelen, Darryl G; Umberger, Brian R

    2016-10-01

    The overall goal of this paper is to demonstrate that dissemination of models and analyses for assessing the reproducibility of simulation results can be incorporated in the scientific review process in biomechanics. As part of a special issue on model sharing and reproducibility in the IEEE Transactions on Biomedical Engineering, two manuscripts on computational biomechanics were submitted: Rajagopal et al., IEEE Trans. Biomed. Eng., 2016 and Schmitz and Piovesan, IEEE Trans. Biomed. Eng., 2016. Models used in these studies were shared with the scientific reviewers and the public. In addition to the standard review of the manuscripts, the reviewers downloaded the models and performed simulations that reproduced results reported in the studies. There was general agreement between the simulation results of the authors and those of the reviewers. Discrepancies were resolved during the necessary revisions. The manuscripts and instructions for download and simulation were updated in response to the reviewers' feedback, incorporating changes that might otherwise have been missed had explicit model sharing and simulation reproducibility analysis not been conducted in the review process. An increased burden on the authors and the reviewers, to facilitate model sharing and to repeat simulations, was noted. When the authors of computational biomechanics studies provide access to models and data, the scientific reviewers can download and thoroughly explore the model, perform simulations, and evaluate simulation reproducibility beyond the traditional manuscript-only review process. Model sharing and reproducibility analysis in scholarly publishing will result in a more rigorous review process, which will enhance the quality of modeling and simulation studies and inform future users of computational models.

  16. Fast bootstrapping and permutation testing for assessing reproducibility and interpretability of multivariate fMRI decoding models.

    Directory of Open Access Journals (Sweden)

    Bryan R Conroy

    Full Text Available Multivariate decoding models are increasingly being applied to functional magnetic resonance imaging (fMRI) data to interpret the distributed neural activity in the human brain. These models are typically formulated to optimize an objective function that maximizes decoding accuracy. For decoding models trained on full-brain data, this can result in multiple models that yield the same classification accuracy, though some may be more reproducible than others; that is, small changes to the training set may result in very different voxels being selected. This issue of reproducibility can be partially controlled by regularizing the decoding model. Regularization, along with the cross-validation used to estimate decoding accuracy, typically requires retraining many related decoding models (often on the order of thousands). In this paper we describe an approach that uses a combination of bootstrapping and permutation testing to construct both a measure of cross-validated prediction accuracy and model reproducibility of the learned brain maps. This requires re-training our classification method on many re-sampled versions of the fMRI data. Given the size of fMRI datasets, this is normally a time-consuming process. Our approach leverages an algorithm called fast simultaneous training of generalized linear models (FaSTGLZ) to create a family of classifiers in the space of accuracy vs. reproducibility. The convex hull of this family of classifiers can be used to identify a subset of Pareto-optimal classifiers, with a single optimal classifier selectable based on the relative cost of accuracy vs. reproducibility. We demonstrate our approach using full-brain analysis of elastic-net classifiers trained to discriminate stimulus type in an auditory and visual oddball event-related fMRI design.
Our approach and results argue for a computational approach to fMRI decoding models in which the value of the interpretation of the decoding model ultimately depends upon optimizing a
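
The final selection step, keeping Pareto-optimal classifiers in the accuracy-reproducibility plane and picking one according to a relative cost, can be sketched as follows; the model names and scores are hypothetical:

```python
# Hypothetical (accuracy, reproducibility) scores for a family of classifiers.
models = {
    "A": (0.90, 0.40),
    "B": (0.88, 0.55),
    "C": (0.85, 0.70),
    "D": (0.80, 0.68),   # dominated by C in both criteria
    "E": (0.75, 0.80),
}

def pareto_front(points):
    """Names of models not dominated in both accuracy and reproducibility."""
    front = []
    for name, (acc, rep) in points.items():
        dominated = any(a >= acc and r >= rep and (a, r) != (acc, rep)
                        for a, r in points.values())
        if not dominated:
            front.append(name)
    return sorted(front)

def select(points, cost_ratio):
    """Single optimum for a given relative cost of reproducibility vs accuracy
    (maximise accuracy + cost_ratio * reproducibility)."""
    return max(points, key=lambda k: points[k][0] + cost_ratio * points[k][1])
```

With `cost_ratio = 0` the most accurate model wins; as the ratio grows, the selection slides along the front toward the most reproducible model.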

  17. Rainfall variability and extremes over southern Africa: Assessment of a climate model to reproduce daily extremes

    Science.gov (United States)

    Williams, C. J. R.; Kniveton, D. R.; Layberry, R.

    2009-04-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to, and ill-equipped (in terms of adaptation) for, extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes is a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the ability of a state-of-the-art climate model to simulate climate at daily timescales is assessed using satellite-derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period 1993-2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. The ability of a climate model to simulate current climate provides some indication of how much confidence can be applied to its future predictions. In this paper, simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. This concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of rainfall variability over southern Africa. 
Secondly, the ability of the model to reproduce daily rainfall extremes will

  18. Composite model to reproduce the mechanical behaviour of methane hydrate bearing soils

    Science.gov (United States)

    De la Fuente, Maria

    2016-04-01

    Methane hydrate bearing sediments (MHBS) are naturally-occurring materials containing different components in the pores that may suffer phase changes under relatively small temperature and pressure variations, for conditions typically prevailing a few hundred meters below sea level. Their modelling needs to account for heat and mass balance equations of the different components, and several strategies already exist to combine them (e.g., Rutqvist & Moridis, 2009; Sánchez et al., 2014). These equations have to be completed by restrictions and constitutive laws reproducing the phenomenology of heat and fluid flows, phase change conditions and the mechanical response. While the formulation of the non-mechanical laws generally includes explicitly the mass fraction of methane in each phase, which allows for a natural update of parameters during phase changes, mechanical laws are, in most cases, stated for the whole solid skeleton (Uchida et al., 2012; Soga et al., 2006). In this paper, a mechanical model is proposed to cope with the response of MHBS. It is based on a composite approach that allows defining the thermo-hydro-mechanical response of the mineral skeleton and solid hydrates independently. The global stress-strain-temperature response of the solid phase (grains + hydrate) is then obtained by combining both responses according to an energy principle, following the work by Pinyol et al. (2007). In this way, dissociation of MH can be assessed on the basis of the stress state and temperature prevailing locally within the hydrate component. Besides, its structuring effect is naturally accounted for by the model according to patterns of MH inclusions within soil pores. This paper describes the fundamental hypotheses behind the model and its formulation. Its performance is assessed by comparison with laboratory data presented in the literature. An analysis of MHBS response to several stress-temperature paths representing potential field cases is finally presented. References

  19. Can a stepwise steady flow computational fluid dynamics model reproduce unsteady particulate matter separation for common unit operations?

    Science.gov (United States)

    Pathapati, Subbu-Srikanth; Sansalone, John J

    2011-07-01

    Computational fluid dynamics (CFD) is emerging as a model for resolving the fate of particulate matter (PM) by unit operations subject to rainfall-runoff loadings. However, compared to steady flow CFD models, there are greater computational requirements for unsteady hydrodynamics and PM loading models. Therefore, this study examines whether a stepwise steady flow CFD model can reproduce PM separation by common unit operations loaded by unsteady flow and PM loadings, thereby reducing computational effort. Utilizing monitored unit operation data from unsteady events as a metric, this study compares the two CFD modeling approaches for a hydrodynamic separator (HS), a primary clarifier (PC) tank, and a volumetric clarifying filtration system (VCF). Results indicate that while unsteady CFD models reproduce PM separation of each unit operation, stepwise steady CFD models result in significant deviation for the HS and PC models as compared to monitored data, overestimating the physical size requirements of each unit required to reproduce monitored PM separation results. In contrast, the stepwise steady flow approach reproduces PM separation by the VCF, a combined gravitational sedimentation and media filtration unit operation that provides attenuation of turbulent energy and flow velocity.

  20. A novel, recovery, and reproducible minimally invasive cardiopulmonary bypass model with lung injury in rats

    Institute of Scientific and Technical Information of China (English)

    LI Ling-ke; CHENG Wei; LIU Dong-hai; ZHANG Jing; ZHU Yao-bin; QIAO Chen-hui; ZHANG Yan-bo

    2013-01-01

    Background Cardiopulmonary bypass (CPB) has been shown to be associated with a systemic inflammatory response leading to postoperative organ dysfunction. Elucidating the underlying mechanisms and developing protective strategies for the pathophysiological consequences of CPB have been hampered by the absence of a satisfactory recovery animal model. The purpose of this study was to establish a good rat model of CPB to study the pathophysiology of potential complications. Methods Twenty adult male Sprague-Dawley rats weighing 450-560 g were randomly divided into a CPB group (n=10) and a control group (n=10). All rats were anaesthetized and mechanically ventilated. The carotid artery and jugular vein were cannulated. The blood was drained from the right atrium via the right jugular vein and transferred by a miniaturized roller pump to a hollow-fiber oxygenator and back to the rat via the left carotid artery. Priming consisted of 8 ml of homologous blood and 8 ml of colloid. The surface of the hollow-fiber oxygenator was 0.075 m2. CPB was conducted for 60 minutes at a flow rate of 100-120 ml·kg-1·min-1 in the CPB group. The oxygen flow/perfusion flow ratio was 0.8 to 1.0, and the mean arterial pressure remained 60-80 mmHg. Blood gas analysis, hemodynamic investigations, and lung histology were subsequently examined. Results All CPB rats recovered from the operative process without incident. Normal cardiac function after successful weaning was confirmed by electrocardiography and blood pressure measurements. Mean arterial pressure remained stable. The results of blood gas analysis at different times were within the normal range. Levels of IL-1β and TNF-α were higher in the lung tissue in the CPB group (P<0.005). Histological examination revealed marked increases in interstitial congestion, edema, and inflammation in the CPB group. Conclusion This novel, recovery, and reproducible minimally invasive CPB model may open the field for various studies on the pathophysiological process of CPB and systemic

  1. Assessing reproducibility by the within-subject coefficient of variation with random effects models.

    Science.gov (United States)

    Quan, H; Shih, W J

    1996-12-01

    In this paper we consider the use of within-subject coefficient of variation (WCV) for assessing the reproducibility or reliability of a measurement. Application to assessing reproducibility of biochemical markers for measuring bone turnover is described and the comparison with intraclass correlation is discussed. Both maximum likelihood and moment confidence intervals of WCV are obtained through their corresponding asymptotic distributions. Normal and log-normal cases are considered. In general, WCV is preferred when the measurement scale bears intrinsic meaning and is not subject to arbitrary shifting. The intraclass correlation may be preferred when a fixed population of subjects can be well identified.
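
A minimal moment-type estimate of the WCV from replicate measurements might look like this (toy data; the paper's maximum likelihood estimator and confidence intervals are not reproduced here):

```python
import numpy as np

# Toy replicate measurements of a biochemical marker: rows are subjects,
# columns are repeated measurements on the same subject.
x = np.array([
    [10.1,  9.8, 10.3],
    [15.2, 14.7, 15.5],
    [ 8.9,  9.4,  9.1],
    [12.0, 12.6, 11.8],
    [20.3, 19.5, 20.1],
])

# Moment estimate of the within-subject coefficient of variation: square
# root of the average within-subject variance, divided by the overall mean.
within_var = x.var(axis=1, ddof=1).mean()
wcv = float(np.sqrt(within_var) / x.mean())
```

Because the WCV normalizes by the mean, it is only meaningful when the measurement scale has an intrinsic zero and is not subject to arbitrary shifting, as the abstract notes.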

  2. Modeling and experimental verification for a broad beam light transport in optical tomography.

    Science.gov (United States)

    Janunts, Edgar; Pöschinger, Thomas; Eisa, Fabian; Langenbucher, Achim

    2010-01-01

    This paper describes a general theoretical model for computing a broad beam excitation light transport in a 3D diffusion medium. The model is based on the diffusion approximation of the radiative transport equation. An analytical approach for the light propagation is presented by deriving a corresponding Green's function. A finite cylindrical domain with a rectangular cross section was considered as a 3D homogeneous phantom model. The results of the model are compared with corresponding experimental data. The measurements are done on solid and liquid phantoms replicating tissue-like optical properties.

  3. The Kinematics of Quasar Broad Emission Line Regions Using a Disk-Wind Model

    Science.gov (United States)

    Yong, Suk Yee; Webster, Rachel L.; King, Anthea L.; Bate, Nicholas F.; O'Dowd, Matthew J.; Labrie, Kathleen

    2017-09-01

    The structure and kinematics of the broad line region in quasars are still unknown. One popular model is the disk-wind model, which offers a geometric unification of a quasar based on the viewing angle. We construct a simple kinematical disk-wind model with a narrow outflowing wind angle. The model is combined with radiative transfer in the Sobolev, or high-velocity, limit. We examine how the viewing angle affects the observed characteristics of the emission line. The line profiles are found to exhibit distinct properties depending on the orientation, wind opening angle, and region of the wind where the emission arises.

  4. Reproducible long-term disc degeneration in a large animal model

    NARCIS (Netherlands)

    Hoogendoorn, R.J.W.; Helder, M.N.; Kroeze, R.J.; Bank, R.A.; Smit, T.H.; Wuisman, P.I.J.M.

    2008-01-01

    STUDY DESIGN. Twelve goats were chemically degenerated and the development of the degenerative signs was followed for 26 weeks to evaluate the progression of the induced degeneration. The results were also compared with a previous study to determine the reproducibility. OBJECTIVES. The purpose of th

  5. Reproducing the Wechsler Intelligence Scale for Children-Fifth Edition: Factor Model Results

    Science.gov (United States)

    Beaujean, A. Alexander

    2016-01-01

    One of the ways to increase the reproducibility of research is for authors to provide a sufficient description of the data analytic procedures so that others can replicate the results. The publishers of the Wechsler Intelligence Scale for Children-Fifth Edition (WISC-V) do not follow these guidelines when reporting their confirmatory factor…

  6. Assessment of the potential forecasting skill of a global hydrological model in reproducing the occurrence of monthly flow extremes

    NARCIS (Netherlands)

    Candogan Yossef, N.A.N.N.; Beek, L.P.H. van; Kwadijk, J.C.J.; Bierkens, M.F.P.

    2012-01-01

    As an initial step in assessing the prospect of using global hydrological models (GHMs) for hydrological forecasting, this study investigates the skill of the GHM PCRGLOBWB in reproducing the occurrence of past extremes in monthly discharge on a global scale. Global terrestrial hydrology from 1958

  7. Can a coupled meteorology–chemistry model reproduce the historical trend in aerosol direct radiative effects over the Northern Hemisphere?

    Science.gov (United States)

    The ability of a coupled meteorology–chemistry model, i.e., Weather Research and Forecast and Community Multiscale Air Quality (WRF-CMAQ), to reproduce the historical trend in aerosol optical depth (AOD) and clear-sky shortwave radiation (SWR) over the Northern Hemisphere h...

  9. Reproducibility of scratch assays is affected by the initial degree of confluence: Experiments, modelling and model selection.

    Science.gov (United States)

    Jin, Wang; Shah, Esha T; Penington, Catherine J; McCue, Scott W; Chopin, Lisa K; Simpson, Matthew J

    2016-02-01

    Scratch assays are difficult to reproduce. Here we identify a previously overlooked source of variability which could partially explain this difficulty. We analyse a suite of scratch assays in which we vary the initial degree of confluence (initial cell density). Our results indicate that the rate of re-colonisation is very sensitive to the initial density. To quantify the relative roles of cell migration and proliferation, we calibrate the solution of the Fisher-Kolmogorov model to cell density profiles to provide estimates of the cell diffusivity, D, and the cell proliferation rate, λ. This procedure indicates that the estimates of D and λ are very sensitive to the initial density. This dependence suggests that the Fisher-Kolmogorov model does not accurately represent the details of the collective cell spreading process, since this model assumes that D and λ are constants that ought to be independent of the initial density. Since higher initial cell density leads to enhanced spreading, we also calibrate the solution of the Porous-Fisher model to the data as this model assumes that the cell flux is an increasing function of the cell density. Estimates of D and λ associated with the Porous-Fisher model are less sensitive to the initial density, suggesting that the Porous-Fisher model provides a better description of the experiments.
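
The Fisher-Kolmogorov model mentioned above can be illustrated with a small finite-difference simulation. The paper's calibration of D and λ to density profiles is not reproduced; the sketch only shows the travelling front whose speed is set by those two parameters (grid and parameters are illustrative):

```python
import numpy as np

# 1-D Fisher-Kolmogorov equation du/dt = D d2u/dx2 + lam * u * (1 - u),
# solved with an explicit finite-difference scheme. The travelling front
# should move at close to the classical speed 2*sqrt(D*lam).
D, lam = 1.0, 1.0
dx, dt = 0.5, 0.05                    # dt < dx**2 / (2*D) for stability
x = np.arange(0.0, 200.0, dx)
u = (x < 10.0).astype(float)          # initially confluent region on the left

def front_position(u):
    """Location where the density first drops below 0.5 (u is decreasing)."""
    return x[np.searchsorted(-u, -0.5)]

times, positions = [], []
t = 0.0
for _ in range(int(60.0 / dt)):
    lap = (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx**2
    lap[0] = lap[-1] = 0.0            # pin the ends (u = 1 on left, 0 on right)
    u = u + dt * (D * lap + lam * u * (1.0 - u))
    t += dt
    if t > 20.0:                      # discard the initial transient
        times.append(t)
        positions.append(front_position(u))

speed = float(np.polyfit(times, positions, 1)[0])
theory = 2.0 * np.sqrt(D * lam)
```

Because the front speed depends only on the product D·λ, fitting this model to a single spreading front cannot cleanly separate migration from proliferation, which is one reason calibrated estimates are so sensitive to the initial density.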

  10. Diagnostics for the structure of AGNs’ broad line regions with reverberation mapping data: confirmation of the two-component broad line region model

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    We re-examine the ten Reverberation Mapping(RM) sources with public data based on the two-component model of the Broad Line Region(BLR).In fitting their broad Hβ Mlines,six of them only need one Gaussian component,one of them has a double-peak profile,one has an irregular profile,and only two of them need two components,i.e.,a Very Broad Gaussian Component(VBGC) and an Inter-Mediate Gaussian Component(IMGC).The Gaussian components are assumed to come from two distinct regions in the two-component model;they are the Very Broad Line Region(VBLR) and the Inter-Mediate Line region(IMLR).The two sources with a two-component profile are Mrk 509 and NGC 4051.The time lags of the two components of both sources satisfy tIMLR/tVBLR=V 2VBLR/V 2IMLR,where tIMLR and tVBLR are the lags of the two components while VIMLR and VVBLR represent the mean gas velocities of the two regions,supporting the two-component model of the BLR of Active Galactic Nuclei(AGNs).The fact that most of these ten sources only have the VBGC confirms the assumption that RM mainly measures the radius of the VBLR;consequently,the radius obtained from the R-L relationship mainly represents the radius of VBLR.Moreover,NGC 4051,with a lag of about 5 days in the one component model,is an outlier on the R-L relationship as shown in Kaspi et al.(2005);however this problem disappears in our two-component model with lags of about 2 and 6 days for the VBGC and IMGC,respectively.

  11. Some problems with reproducing the Standard Model fields and interactions in five-dimensional warped brane world models

    Science.gov (United States)

    Smolyakov, Mikhail N.; Volobuev, Igor P.

    2016-01-01

    In this paper we examine, from a purely theoretical point of view and in a model-independent way, the case when matter, gauge and Higgs fields are allowed to propagate in the bulk of five-dimensional brane world models with a compact extra dimension, and the Standard Model fields and their interactions are supposed to be reproduced by the corresponding zero Kaluza-Klein modes. An unexpected result is that, in order to avoid possible pathological behavior in the fermion sector, it is necessary to impose constraints on the fermion field Lagrangian. In the case when the fermion zero modes are supposed to be localized at one of the branes, these constraints imply an additional relation between the vacuum profile of the Higgs field and the form of the background metric. Moreover, this relation between the vacuum profile of the Higgs field and the form of the background metric results in the exact reproduction of the gauge boson and fermion sectors of the Standard Model by the corresponding zero-mode four-dimensional effective theory in all the physically relevant cases allowed by the absence of pathologies. Meanwhile, deviations from these conditions can lead either back to pathological behavior in the fermion sector or to a variance between the resulting zero-mode four-dimensional effective theory and the Standard Model, which, depending on the model at hand, may, in principle, result in constraints putting the theory out of reach of present-day experiments.

  12. Constraining the Absolute Orientation of Eta Carinae's Binary Orbit: A 3-D Dynamical Model for the Broad [Fe III] Emission

    CERN Document Server

    Madura, Thomas I; Owocki, Stanley P; Groh, Jose H; Okazaki, Atsuo T; Russell, Christopher M P

    2011-01-01

    We present a three-dimensional (3-D) dynamical model for the broad [Fe III] emission observed in Eta Carinae using the Hubble Space Telescope/Space Telescope Imaging Spectrograph (HST/STIS). This model is based on full 3-D Smoothed Particle Hydrodynamics (SPH) simulations of Eta Car's binary colliding winds. Radiative transfer codes are used to generate synthetic spectro-images of [Fe III] emission line structures at various observed orbital phases and STIS slit position angles (PAs). Through a parameter study that varies the orbital inclination i, the PA θ that the orbital plane projection of the line-of-sight makes with the apastron side of the semi-major axis, and the PA on the sky of the orbital axis, we are able, for the first time, to tightly constrain the absolute 3-D orientation of the binary orbit. To simultaneously reproduce the blue-shifted emission arcs observed at orbital phase 0.976, STIS slit PA = +38 degrees, and the temporal variations in emission seen at negative slit PAs, the binary ...

  13. Elusive reproducibility.

    Science.gov (United States)

    Gori, Gio Batta

    2014-08-01

    Reproducibility remains a mirage for many biomedical studies because inherent experimental uncertainties generate idiosyncratic outcomes. The authentication and error rates of primary empirical data are often elusive, while multifactorial confounders beset experimental setups. Substantive methodological remedies are difficult to conceive, signifying that many biomedical studies yield more or less plausible results, depending on the attending uncertainties. Real life applications of those results remain problematic, with important exceptions for counterfactual field validations of strong experimental signals, notably for some vaccines and drugs, and for certain safety and occupational measures. It is argued that industrial, commercial and public policies and regulations could not ethically rely on unreliable biomedical results; rather, they should be rationally grounded on transparent cost-benefit tradeoffs.

  14. Amplifying modeling for broad bandwidth pulse in Nd:glass based on hybrid-broaden mechanism

    Energy Technology Data Exchange (ETDEWEB)

    Sujingqin; Lanqin, L; Wenyi, W; Feng, J; Xiaofeng, W; Xiaomin, Z [Research Center of Laser Fusion, China Academy of Engineering Physics, P. O. Box 919-988, Mianyang, China, 621900 (China); Bin, L [School of Computer and Communication Engineering, Southwest Jiaotong University, Chengdu. China, 610031 (China)], E-mail: sujingqin@tom.com

    2008-05-15

    In this paper, the cross-relaxation time is proposed as the means of combining the homogeneous and inhomogeneous broadening mechanisms in a model of broad-bandwidth pulse amplification. A corresponding rate equation, which describes the response of the inverse population on the upper and lower energy levels of the gain medium to the different frequency components of the pulse, is also put forward; gain saturation and energy relaxation effects are included in this equation. A code named CPAP has been developed to simulate the amplification of broad-bandwidth pulses in a multi-pass laser system. The amplifying capability of the multi-pass laser system is evaluated, and gain narrowing and temporal-shape distortion are investigated for different pulse bandwidths and cross-relaxation times of the gain medium. The results can benefit the design of high-energy PW laser systems at LFRC, CAEP.

  15. Impact of soil parameter and physical process on reproducibility of hydrological processes by land surface model in semiarid grassland

    Science.gov (United States)

    Miyazaki, S.; Yorozu, K.; Asanuma, J.; Kondo, M.; Saito, K.

    2014-12-01

    The land surface model (LSM) represents land-atmosphere interaction in the earth system models used for climate change research. In this study, we evaluated the impact of soil parameters and physical processes on the reproducibility of hydrological processes by the LSM Minimal Advanced Treatments of Surface Interaction and RunOff (MATSIRO; Takata et al., 2003, GPC), forced by meteorological data observed at grassland sites in the semiarid climates of China and Mongolia. MATSIRO was tested in offline mode over the semiarid grassland sites at Tongyu (44.42 deg. N, 122.87 deg. E, altitude 184 m) in China, and Kherlen Bayan Ulaan (KBU; 47.21 deg. N, 108.74 deg. E, altitude 1,235 m) and Arvaikheer (46.23 deg. N, 102.82 deg. E, altitude 1,813 m) in Mongolia. Although all the sites are semiarid grassland, their climate conditions differ: the annual air temperature and precipitation are 5.7 deg. C and 388 mm (Tongyu), 1.2 deg. C and 180 mm (KBU), and 0.4 deg. C and 245 mm (Arvaikheer), which allows us to evaluate the effect of climate conditions on model performance. Three kinds of experiments were carried out: a run with the default parameters (CTL), a run with the observed parameters for soil physics and hydrology and vegetation (OBS), and a run with a refined MATSIRO that accounts for the effect of ice in the thermal parameters and for unfrozen water below freezing, using the same parameters as the OBS run (OBSr). The validation data were provided by CEOP (http://www.ceop.net/), RAISE (http://raise.suiri.tsukuba.ac.jp/), and GAME-AAN (Miyazaki et al., 2004, JGR) for Tongyu, KBU, and Arvaikheer, respectively. The net radiation, soil temperature (Ts), and latent heat flux (LE) were well reproduced by the OBS and OBSr runs. The change of soil physical and hydraulic parameters affected the reproducibility of soil temperature (Ts) and soil moisture (SM), as well as the energy flux components, especially the sensible heat flux (H) and soil heat flux (G). The reason for the great improvement on the

  16. Assessing the relative effectiveness of statistical downscaling and distribution mapping in reproducing rainfall statistics based on climate model results

    Science.gov (United States)

    Langousis, Andreas; Mamalakis, Antonios; Deidda, Roberto; Marrocu, Marino

    2016-01-01

    To improve the skill of climate models (CMs) in reproducing the statistics of daily rainfall at a basin level, two types of statistical approaches have been suggested. One is statistical correction of CM rainfall outputs based on historical series of precipitation. The other, usually referred to as statistical rainfall downscaling, is the use of stochastic models to conditionally simulate rainfall series, based on large-scale atmospheric forcing from CMs. While promising, the latter approach has attracted less attention in recent years, since the developed downscaling schemes involved complex weather identification procedures while demonstrating limited success in reproducing several statistical features of rainfall. In a recent effort, Langousis and Kaleris () developed a statistical framework for simulation of daily rainfall intensities conditional on upper-air variables, which is simpler to implement and more accurately reproduces several statistical properties of actual rainfall records. Here we study the relative performance of: (a) direct statistical correction of CM rainfall outputs using nonparametric distribution mapping, and (b) the statistical downscaling scheme of Langousis and Kaleris (), in reproducing the historical rainfall statistics, including rainfall extremes, at a regional level. This is done for an intermediate-sized catchment in Italy, i.e., the Flumendosa catchment, using rainfall and atmospheric data from four CMs of the ENSEMBLES project. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the CM used and the characteristics of the calibration period. This is particularly the case for yearly rainfall maxima.
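    The nonparametric distribution mapping mentioned in approach (a) can be sketched as an empirical quantile mapping: each model value is assigned its non-exceedance probability in the model climate, and the observed empirical quantile at that probability is returned. This is a minimal illustration with invented data, not the specific implementation used in the study.

    ```python
    # Minimal empirical quantile-mapping sketch (illustrative, not the
    # study's implementation): correct a CM rainfall value so that its
    # empirical CDF matches the observed one.
    import bisect

    def quantile_map(model_hist, obs_hist, model_value):
        """Map a model value onto the observed distribution via empirical CDFs."""
        mh = sorted(model_hist)
        oh = sorted(obs_hist)
        # empirical non-exceedance probability in the model climate
        p = bisect.bisect_left(mh, model_value) / len(mh)
        # read the same probability off the observed empirical quantile function
        idx = min(int(p * len(oh)), len(oh) - 1)
        return oh[idx]
    ```

    For example, if the observed record is systematically twice the model record, a model value of 5 maps to the observed value 10.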

  17. How well do CMIP5 climate models reproduce explosive cyclones in the extratropics of the Northern Hemisphere?

    Science.gov (United States)

    Seiler, C.; Zwiers, F. W.

    2016-02-01

    Extratropical explosive cyclones are rapidly intensifying low pressure systems with severe wind speeds and heavy precipitation, affecting livelihoods and infrastructure primarily in coastal and marine environments. This study evaluates how well the most recent generation of climate models reproduces extratropical explosive cyclones in the Northern Hemisphere for the period 1980-2005. An objective-feature tracking algorithm is used to identify and track cyclones from 25 climate models and three reanalysis products. Model biases are compared to biases in the sea surface temperature (SST) gradient, the polar jet stream, the Eady growth rate, and model resolution. Most models accurately reproduce the spatial distribution of explosive cyclones when compared to reanalysis data ( R = 0.94), with high frequencies along the Kuroshio Current and the Gulf Stream. Three quarters of the models however significantly underpredict explosive cyclone frequencies, by a third on average and by two thirds in the worst case. This frequency bias is significantly correlated with jet stream speed in the inter-model spread ( R ≥ 0.51), which in the Atlantic is correlated with a negative meridional SST gradient ( R = -0.56). The importance of the jet stream versus other variables considered in this study also applies to the interannual variability of explosive cyclone frequency. Furthermore, models with fewer explosive cyclones tend to underpredict the corresponding deepening rates ( R ≥ 0.88). A follow-up study will assess the impacts of climate change on explosive cyclones, and evaluate how model biases presented in this study affect the projections.
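    Explosive cyclones are conventionally identified with a Bergeron-type deepening criterion: a central-pressure fall of at least 24 hPa in 24 h, with the threshold scaled by sin(latitude)/sin(60°). The sketch below assumes this common definition; the exact criterion used by the study's tracking algorithm may differ in detail.

    ```python
    # Hedged sketch of the Bergeron criterion for explosive cyclogenesis:
    # deepening rate >= 24 hPa per 24 h, latitude-normalized by sin(lat)/sin(60).
    import math

    def is_explosive(dp_hpa, dt_hours, lat_deg):
        """True if the pressure drop dp_hpa over dt_hours qualifies as explosive."""
        threshold = 24.0 * math.sin(math.radians(abs(lat_deg))) / math.sin(math.radians(60.0))
        rate = dp_hpa * 24.0 / dt_hours  # hPa per 24 h
        return rate >= threshold
    ```

    At 60° latitude the threshold is exactly 24 hPa/24 h; at lower latitudes it is smaller, so weaker deepening can still qualify.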

  18. Spatially nondegenerate four-wave mixing in a broad area semiconductor laser: Modeling

    DEFF Research Database (Denmark)

    Jensen, Søren Blaaberg; Tromborg, Bjarne; Petersen, P. M.

    We present a numerical model of spatially nondegenerate four-wave mixing in a bulk broad-area semiconductor laser with an external reflector and a spatial filter. The external reflector provides feedback with an off-axis direction of propagation; such a configuration has experimentally been seen... Coupled equations describe the field components in the cavity, and a rate equation is used to describe the carrier density of the semiconductor material. The interference pattern of the four field components inside the cavity induces a periodic spatial modulation of the carrier density and thus of the complex...

  19. A novel, comprehensive, and reproducible porcine model for determining the timing of bruises in forensic pathology

    DEFF Research Database (Denmark)

    Barington, Kristiane; Jensen, Henrik Elvang

    2016-01-01

    Methods: The mechanical device was able to apply a single reproducible stroke with a plastic tube that was equivalent to being struck by a man. In each of 10 anesthetized pigs, four strokes that resulted in bruises were inflicted on the back; in addition, 2 control pigs were included in the study. The pigs were euthanized consecutively from 1 to 10 h after the infliction of bruises. Following gross evaluation, skin and muscle tissues were sampled for histology in order to identify gross and histological parameters that may be useful in determining the age of a bruise. Results: Grossly, the bruises appeared uniform and identical to the tramline bruises seen in humans and pigs subjected to blunt trauma. Histologically, the number of neutrophils in the subcutis, the number of macrophages in the muscle tissue, and the localization of neutrophils and macrophages in muscle tissue showed a time...

  20. Assessment of the performance of numerical modeling in reproducing a replenishment of sediments in a water-worked channel

    Science.gov (United States)

    Juez, C.; Battisacco, E.; Schleiss, A. J.; Franca, M. J.

    2016-06-01

    The artificial replenishment of sediment is used as a method to re-establish sediment continuity downstream of a dam. However, the impact of this technique on the hydraulic conditions, and the resulting bed morphology, is yet to be understood. Several numerical tools have been developed in recent years for modeling sediment transport and morphology evolution which can be used for this application. These models range from 1D to 3D approaches: the first is too simplistic for the simulation of such a complex geometry; the latter often requires a prohibitive computational effort. 2D models, however, are computationally efficient and in these cases may already provide sufficiently accurate predictions of the morphology evolution caused by the sediment replenishment in a river. Here, the 2D shallow water equations in combination with the Exner equation are solved by means of a weakly coupled strategy. The classical friction approach considered for reproducing the bed channel roughness has been modified to take into account the morphological effect of replenishment, which provokes a fining of the channel bed. Computational outcomes are compared with four sets of experimental data obtained from several replenishment configurations studied in the laboratory. The experiments differ in terms of placement volume and configuration. A set of analysis parameters is proposed for the experimental-numerical comparison, with particular attention to the spreading, covered surface, and travel distance of the placed replenishment grains. The numerical tool is reliable in reproducing the overall tendency shown by the experimental data. The effect of roughness fining is better reproduced with the approach proposed herein. However, it is also highlighted that the sediment clusters found in the experiment are not well reproduced numerically in the regions of the channel with a limited number of sediment grains.
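    The Exner equation couples bed evolution to the divergence of bedload flux, dz/dt = -1/(1-λ) · dq_s/dx, with λ the bed porosity. A minimal 1D upwind update, purely illustrative of the weak coupling (the flow solver that would supply q_s is omitted, and all numbers are invented):

    ```python
    # Illustrative 1D Exner bed update (upwind differences). The bedload
    # flux q_s would come from the hydrodynamic solver in a real weakly
    # coupled scheme; here it is simply prescribed.
    def exner_step(z, q_s, dx, dt, porosity=0.4):
        """One explicit time step of dz/dt = -1/(1-porosity) * dq_s/dx."""
        n = len(z)
        z_new = z[:]
        for i in range(1, n):
            z_new[i] = z[i] - dt / (1.0 - porosity) * (q_s[i] - q_s[i - 1]) / dx
        return z_new
    ```

    A uniform flux leaves the bed unchanged; a flux gradient erodes or deposits locally, which is the mechanism behind the simulated spreading of replenishment material.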

  1. Constraining the Absolute Orientation of eta Carinae's Binary Orbit: A 3-D Dynamical Model for the Broad [Fe III] Emission

    Science.gov (United States)

    Madura, T. I.; Gull, T. R.; Owocki, S. P.; Groh, J. H.; Okazaki, A. T.; Russell, C. M. P.

    2011-01-01

    We present a three-dimensional (3-D) dynamical model for the broad [Fe III] emission observed in Eta Carinae using the Hubble Space Telescope/Space Telescope Imaging Spectrograph (HST/STIS). This model is based on full 3-D Smoothed Particle Hydrodynamics (SPH) simulations of Eta Car's binary colliding winds. Radiative transfer codes are used to generate synthetic spectro-images of [Fe III] emission line structures at various observed orbital phases and STIS slit position angles (PAs). Through a parameter study that varies the orbital inclination i, the PA(theta) that the orbital plane projection of the line-of-sight makes with the apastron side of the semi-major axis, and the PA on the sky of the orbital axis, we are able, for the first time, to tightly constrain the absolute 3-D orientation of the binary orbit. To simultaneously reproduce the blue-shifted emission arcs observed at orbital phase 0.976, STIS slit PA = +38deg, and the temporal variations in emission seen at negative slit PAs, the binary needs to have an i approx. = 130deg to 145deg, Theta approx. = -15deg to +30deg, and an orbital axis projected on the sky at a P A approx. = 302deg to 327deg east of north. This represents a system with an orbital axis that is closely aligned with the inferred polar axis of the Homunculus nebula, in 3-D. The companion star, Eta(sub B), thus orbits clockwise on the sky and is on the observer's side of the system at apastron. This orientation has important implications for theories for the formation of the Homunculus and helps lay the groundwork for orbital modeling to determine the stellar masses.

  2. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data.

    Science.gov (United States)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-07

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to find the least squares between the measured and calculated values over time, which may encounter some problems such as the overfitting of model parameters and a lack of reproducibility, especially when handling noisy data or error data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model directly dealing with noisy data but not trying to smooth the noise in the image. Also, due to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, which could find a balance between fitting to historical data and to the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.
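    The grid parameter search mentioned in the abstract can be illustrated on a toy kinetic model. The exponential form C(t) = K1·exp(-k2·t) below is a simplified stand-in for the paper's compartment models, and the search is run serially rather than multi-threaded; everything here is an assumption for illustration.

    ```python
    # Toy grid parameter search for a one-tissue-like model C(t) = K1*exp(-k2*t).
    # Stand-in for the paper's multi-thread grid search over compartment models.
    import math

    def grid_fit(times, curve, k1_grid, k2_grid):
        """Return the (K1, k2) pair on the grid minimizing the sum of squared errors."""
        best = None
        for k1 in k1_grid:
            for k2 in k2_grid:
                sse = sum((c - k1 * math.exp(-k2 * t)) ** 2
                          for t, c in zip(times, curve))
                if best is None or sse < best[0]:
                    best = (sse, k1, k2)
        return best[1], best[2]
    ```

    Because the search only ever evaluates points on a fixed grid, it cannot overfit arbitrarily, which is one way such methods trade a little accuracy for reproducibility on noisy curves.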

  4. [Applicability analysis of spatially explicit model of leaf litter in evergreen broad-leaved forests].

    Science.gov (United States)

    Zhao, Qing-Qing; Liu, He-Ming; Jonard, Mathieu; Wang, Zhang-Hua; Wang, Xi-Hua

    2014-11-01

    The spatially explicit model of leaf litter can help to understand its dispersal process, which is very important for predicting the distribution pattern of leaves on the ground surface. In this paper, a spatially explicit model of leaf litter was developed for 20 tree species using litter trap data from the mapped forest plot in an evergreen broad-leaved forest in Tiantong, Zhejiang Province, eastern China, and the applicability of the model was analyzed. The model assumed an allometric equation between diameter at breast height (DBH) and leaf litter amount, with the leaf litter declining exponentially with distance. Model parameters were estimated by the maximum likelihood method. Results showed that the predicted and measured leaf litter amounts were significantly correlated, but the prediction accuracies varied widely across tree species, averaging 49.3% and ranging from 16.0% to 74.0%. Model quality for each tree species was significantly correlated with the standard deviation of the leaf litter amount per trap, the DBH of the tree species, and the average leaf dry mass of the tree species. Several measures could improve the forecast precision of the model, such as positioning the litter traps according to the distribution of the trees so as to cover the different DBH classes and distances from the parent trees, determining the optimal dispersal function for each tree species, and optimizing the existing dispersal function.
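    The model structure described above, an allometric source term times an exponential distance kernel, can be sketched directly. The parameter names a, b, c and the omission of any kernel normalization are illustrative assumptions, not the fitted values from the study.

    ```python
    # Illustrative sketch of the leaf-litter dispersal model: expected litter
    # in a trap = sum over trees of (a * DBH**b) * exp(-c * distance).
    # a, b, c are hypothetical parameters; the study fits them by maximum likelihood.
    import math

    def predicted_litter(trap_xy, trees, a, b, c):
        """Expected leaf litter mass in one trap.

        trees: iterable of (x, y, dbh) tuples for the mapped stems."""
        x0, y0 = trap_xy
        total = 0.0
        for x, y, dbh in trees:
            d = math.hypot(x - x0, y - y0)
            total += a * dbh ** b * math.exp(-c * d)
        return total
    ```

    Maximum-likelihood fitting would then compare these predictions with the trap counts to estimate a, b, and c per species.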

  5. Nanoporous Anodic Alumina 3D FDTD Modelling for a Broad Range of Inter-pore Distances

    Science.gov (United States)

    Bertó-Roselló, Francesc; Xifré-Pérez, Elisabet; Ferré-Borrull, Josep; Pallarès, Josep; Marsal, Lluis F.

    2016-08-01

    The capability of the finite difference time domain (FDTD) method for the numerical modelling of the optical properties of nanoporous anodic alumina (NAA) in a broad range of inter-pore distances is evaluated. FDTD permits taking into account in the same numerical framework all the structural features of NAA, such as the texturization of the interfaces or the incorporation of electrolyte anions in the aluminium oxide host. The evaluation is carried out by comparing reflectance measurements from two samples with two very different inter-pore distances with the simulation results. Results show that considering the texturization is crucial to obtain good agreement with the measurements. On the other hand, including the anionic layer in the model leads to a second-order contribution to the reflectance spectrum.

  6. Cellular automaton model with dynamical 2D speed-gap relation reproduces empirical and experimental features of traffic flow

    CERN Document Server

    Tian, Junfang; Ma, Shoufeng; Zhu, Chenqiang; Jiang, Rui; Ding, YaoXian

    2015-01-01

    This paper proposes an improved cellular automaton traffic flow model based on the brake light model, which takes into account that the desired time gap of vehicles is remarkably larger than one second. Although the hypothetical steady state of vehicles in the deterministic limit corresponds to a unique relationship between speeds and gaps in the proposed model, the traffic states of vehicles dynamically span a two-dimensional region in the plane of speed versus gap, due to the various randomizations. It is shown that the model is able to well reproduce (i) the free flow, synchronized flow, jam as well as the transitions among the three phases; (ii) the evolution features of disturbances and the spatiotemporal patterns in a car-following platoon; (iii) the empirical time series of traffic speed obtained from NGSIM data. Therefore, we argue that a model can potentially reproduce the empirical and experimental features of traffic flow, provided that the traffic states are able to dynamically span a 2D speed-gap...
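    The brake-light model the paper extends is itself an elaboration of the classic Nagel-Schreckenberg cellular automaton, whose update rule (accelerate, brake to the gap, randomize, move) is easy to state in full. The sketch below shows that simpler ancestor on a periodic road, as an assumption-laden illustration rather than the proposed model.

    ```python
    # One update step of the Nagel-Schreckenberg traffic CA (the simple
    # ancestor of the brake-light model discussed in the record above).
    import random

    def nasch_step(positions, speeds, vmax, p_slow, road_len, rng):
        """Accelerate, brake to the gap ahead, random slowdown, then move.

        Periodic road of road_len cells; one car per cell at most."""
        n = len(positions)
        order = sorted(range(n), key=lambda i: positions[i])
        new_speeds = speeds[:]
        for k, i in enumerate(order):
            j = order[(k + 1) % n]                       # car ahead (wraps around)
            gap = (positions[j] - positions[i] - 1) % road_len
            v = min(speeds[i] + 1, vmax, gap)            # accelerate, then brake
            if v > 0 and rng.random() < p_slow:          # random slowdown
                v -= 1
            new_speeds[i] = v
        new_positions = [(positions[i] + new_speeds[i]) % road_len for i in range(n)]
        return new_positions, new_speeds
    ```

    In this classic rule the steady state collapses onto a one-dimensional speed-gap relation; the paper's point is precisely that randomizations spreading states over a 2D speed-gap region are needed to reproduce synchronized flow.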

  7. Reproducible ion-current-based approach for 24-plex comparison of the tissue proteomes of hibernating versus normal myocardium in swine models.

    Science.gov (United States)

    Qu, Jun; Young, Rebeccah; Page, Brian J; Shen, Xiaomeng; Tata, Nazneen; Li, Jun; Duan, Xiaotao; Fallavollita, James A; Canty, John M

    2014-05-02

    Hibernating myocardium is an adaptive response to repetitive myocardial ischemia that is clinically common, but the mechanism of adaptation is poorly understood. Here we compared the proteomes of hibernating versus normal myocardium in a porcine model with 24 biological replicates. Using the ion-current-based proteomic strategy optimized in this study to expand upon previous proteomic work, we identified differentially expressed proteins in new molecular pathways of cardiovascular interest. The methodological strategy includes efficient extraction with detergent cocktail; precipitation/digestion procedure with high, quantitative peptide recovery; reproducible nano-LC/MS analysis on a long, heated column packed with small particles; and quantification based on ion-current peak areas. Under the optimized conditions, high efficiency and reproducibility were achieved for each step, which enabled a reliable comparison of the 24 myocardial samples. To achieve confident discovery of differentially regulated proteins in hibernating myocardium, we used highly stringent criteria to define "quantifiable proteins". These included the filtering criteria of low peptide FDR and S/N > 10 for peptide ion currents, and each protein was quantified independently from ≥2 distinct peptides. For a broad methodological validation, the quantitative results were compared with a parallel, well-validated 2D-DIGE analysis of the same model. Excellent agreement between the two orthogonal methods was observed (R = 0.74), and the ion-current-based method quantified almost one order of magnitude more proteins. In hibernating myocardium, 225 significantly altered proteins were discovered with a low false-discovery rate (∼3%). These proteins are involved in biological processes including metabolism, apoptosis, stress response, contraction, cytoskeleton, transcription, and translation. This provides compelling evidence that hibernating myocardium adapts to chronic ischemia. The major metabolic
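    The "quantifiable protein" filter described above (peptide S/N > 10, ≥2 distinct peptides per protein) can be expressed compactly. The record layout and function name below are hypothetical, and the FDR filter is omitted for brevity.

    ```python
    # Sketch of the stringent quantifiability filter: keep a protein only if
    # >= 2 distinct peptides pass the S/N threshold. (FDR filtering, which
    # the study also applies, is omitted here.)
    def quantifiable_proteins(peptide_records, min_sn=10.0, min_peptides=2):
        """peptide_records: iterable of (protein, peptide, signal_to_noise)."""
        passing = {}
        for protein, peptide, sn in peptide_records:
            if sn > min_sn:
                passing.setdefault(protein, set()).add(peptide)
        return {p for p, peps in passing.items() if len(peps) >= min_peptides}
    ```

    Requiring independent quantification from multiple peptides is what makes the downstream 24-sample comparison robust to single-peptide artifacts.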

  8. An exact arithmetic toolbox for a consistent and reproducible structural analysis of metabolic network models.

    Science.gov (United States)

    Chindelevitch, Leonid; Trigg, Jason; Regev, Aviv; Berger, Bonnie

    2014-10-07

    Constraint-based models are currently the only methodology that allows the study of metabolism at the whole-genome scale. Flux balance analysis is commonly used to analyse constraint-based models. Curiously, the results of this analysis vary with the software being run, a situation that we show can be remedied by using exact rather than floating-point arithmetic. Here we introduce MONGOOSE, a toolbox for analysing the structure of constraint-based metabolic models in exact arithmetic. We apply MONGOOSE to the analysis of 98 existing metabolic network models and find that the biomass reaction is surprisingly blocked (unable to sustain non-zero flux) in nearly half of them. We propose a principled approach for unblocking these reactions and extend it to the problems of identifying essential and synthetic lethal reactions and minimal media. Our structural insights enable a systematic study of constraint-based metabolic models, yielding a deeper understanding of their possibilities and limitations.
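    The key point of the abstract, that floating-point arithmetic makes structural analysis software-dependent while exact rational arithmetic does not, can be illustrated with a tiny exact row reduction of a stoichiometric matrix. This is a minimal sketch using Python's `fractions`, not the MONGOOSE implementation.

    ```python
    # Minimal reduced-row-echelon-form routine in exact rational arithmetic,
    # the kind of computation an exact toolbox performs on the stoichiometric
    # matrix to decide, e.g., which reactions are blocked. Illustrative only.
    from fractions import Fraction

    def rref(matrix):
        """Return the RREF of an integer/rational matrix, with no rounding error."""
        m = [[Fraction(x) for x in row] for row in matrix]
        rows, cols = len(m), len(m[0])
        r = 0
        for c in range(cols):
            pivot = next((i for i in range(r, rows) if m[i][c] != 0), None)
            if pivot is None:
                continue                      # no pivot in this column
            m[r], m[pivot] = m[pivot], m[r]   # swap pivot row into place
            inv = m[r][c]
            m[r] = [x / inv for x in m[r]]    # normalize pivot row exactly
            for i in range(rows):
                if i != r and m[i][c] != 0:
                    f = m[i][c]
                    m[i] = [a - f * b for a, b in zip(m[i], m[r])]
            r += 1
            if r == rows:
                break
        return m
    ```

    Because every entry stays a `Fraction`, rank decisions (and hence blocked-reaction calls) cannot flip due to floating-point tolerance settings.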

  9. A novel porcine model of ataxia telangiectasia reproduces neurological features and motor deficits of human disease.

    Science.gov (United States)

    Beraldi, Rosanna; Chan, Chun-Hung; Rogers, Christopher S; Kovács, Attila D; Meyerholz, David K; Trantzas, Constantin; Lambertz, Allyn M; Darbro, Benjamin W; Weber, Krystal L; White, Katherine A M; Rheeden, Richard V; Kruer, Michael C; Dacken, Brian A; Wang, Xiao-Jun; Davis, Bryan T; Rohret, Judy A; Struzynski, Jason T; Rohret, Frank A; Weimer, Jill M; Pearce, David A

    2015-11-15

    Ataxia telangiectasia (AT) is a progressive multisystem disorder caused by mutations in the AT-mutated (ATM) gene. AT is a neurodegenerative disease primarily characterized by cerebellar degeneration in children leading to motor impairment. The disease progresses with other clinical manifestations including oculocutaneous telangiectasia, immune disorders, and increased susceptibility to cancer and respiratory infections. Although genetic investigations and physiological models have established the linkage of ATM with AT onset, the mechanisms linking ATM to neurodegeneration remain undetermined, hindering therapeutic development. Several murine models of AT have been successfully generated showing some of the clinical manifestations of the disease; however, they do not fully recapitulate the hallmark neurological phenotype, thus highlighting the need for a more suitable animal model. We engineered a novel porcine model of AT to better phenocopy the disease and bridge the gap between human and current animal models. The initial characterization of AT pigs revealed early cerebellar lesions including loss of Purkinje cells (PCs) and altered cytoarchitecture, suggesting a developmental etiology for AT, and could advocate for early therapies for AT patients. In addition, similar to patients, AT pigs show growth retardation and develop motor deficit phenotypes. By using the porcine system to model human AT, we established the first animal model showing PC loss and motor features of the human disease. The novel AT pig provides new opportunities to unmask functions and roles of ATM in AT disease and in physiological conditions.

  10. Endovascular Broad-Neck Aneurysm Creation in a Porcine Model Using a Vascular Plug

    Energy Technology Data Exchange (ETDEWEB)

    Muehlenbruch, Georg, E-mail: gmuehlenbruch@ukaachen.de; Nikoubashman, Omid; Steffen, Bjoern; Dadak, Mete [RWTH Aachen University, Department of Diagnostic and Interventional Neuroradiology, University Hospital (Germany); Palmowski, Moritz [RWTH Aachen University, Department of Nuclear Medicine, University Hospital (Germany); Wiesmann, Martin [RWTH Aachen University, Department of Diagnostic and Interventional Neuroradiology, University Hospital (Germany)

    2013-02-15

    Ruptured cerebral arterial aneurysms require prompt treatment by either surgical clipping or endovascular coiling. Training for these sophisticated endovascular procedures is essential and ideally performed in animals before their use in humans. Simulators and established animal models have shown drawbacks with respect to degree of reality, size of the animal model and aneurysm, or time and effort needed for aneurysm creation. We therefore aimed to establish a realistic and readily available aneurysm model. Five anticoagulated domestic pigs underwent endovascular intervention through right femoral access. A total of 12 broad-neck aneurysms were created in the carotid, subclavian, and renal arteries using the Amplatzer vascular plug. With dedicated vessel selection, cubic, tubular, and side-branch aneurysms could be created. Three of the 12 implanted occluders, two of them implanted over a side branch of the main vessel, did not induce complete vessel occlusion. However, all aneurysms remained free of intraluminal thrombus formation and were available for embolization training during a surveillance period of 6 h. Two aneurysms underwent successful exemplary treatment: one was stent-assisted, and one was performed with conventional endovascular coil embolization. The new porcine aneurysm model proved to be a straightforward approach that offers a wide range of training and scientific applications that might help further improve endovascular coil embolization therapy in patients with cerebral aneurysms.

  11. Can a global model reproduce observed trends in summertime surface ozone levels?

    OpenAIRE

    S. Koumoutsaris; I. Bey

    2012-01-01

    Quantifying trends in surface ozone concentrations is critical for assessing pollution control strategies. Here we use observations and results from a global chemical transport model to examine the trends (1991–2005) in daily maximum 8-hour average concentrations in summertime surface ozone at rural sites in Europe and the United States. We find a decrease in observed ozone concentrations at the high end of the probability distribution at many of the sites in both regions. The model attribut...

  12. Validation of EURO-CORDEX regional climate models in reproducing the variability of precipitation extremes in Romania

    Science.gov (United States)

    Dumitrescu, Alexandru; Busuioc, Aristita

    2016-04-01

    EURO-CORDEX is the European branch of the international CORDEX initiative that aims to provide improved regional climate change projections for Europe. The main objective of this paper is to document the performance of the individual models in reproducing the variability of precipitation extremes in Romania. Here an ensemble of three EURO-CORDEX regional climate models (RCMs) (scenario RCP4.5) is analysed and inter-compared: DMI-HIRHAM5, KNMI-RACMO2.2 and MPI-REMO. Compared to previous studies, in which RCM validation for the Romanian climate has mainly been performed on the mean state and at station scale, a more quantitative approach to precipitation extremes is proposed. In this respect, to allow a more reliable comparison with observations, a high-resolution daily precipitation gridded data set was used as the observational reference (CLIMHYDEX project). The comparison between the RCM outputs and observed grid point values has been made by calculating three extreme precipitation indices, recommended by the Expert Team on Climate Change Detection Indices (ETCCDI), for the 1976-2005 period: R10MM, annual count of days when precipitation ≥10 mm; RX5DAY, annual maximum 5-day precipitation; and R95P, the fraction of annual total precipitation due to daily precipitation > the 95th percentile. The RCMs' capability to reproduce the mean state of these variables, as well as the main modes of their spatial variability (given by the first three EOF patterns), is analysed. The investigation confirms the ability of the RCMs to simulate the main features of precipitation extreme variability over Romania, but some deficiencies in reproducing their regional characteristics were found (for example, overestimation of the mean state, especially over the extra-Carpathian regions).
This work has been realised within the research project "Changes in climate extremes and associated impact in hydrological events in Romania" (CLIMHYDEX), code PN II-ID-2011-2-0073, financed by the Romanian
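The three ETCCDI indices named in the abstract are straightforward to compute from a daily precipitation series. A minimal Python sketch follows; note that the 95th-percentile threshold for R95P would normally be derived from a base-period climatology, and is passed in here as an assumed, precomputed value.

```python
# Sketch of the three ETCCDI precipitation indices used in the study,
# computed from one year of daily precipitation totals (mm/day).

def r10mm(daily_precip):
    """R10MM: annual count of days with precipitation >= 10 mm."""
    return sum(1 for p in daily_precip if p >= 10.0)

def rx5day(daily_precip):
    """RX5DAY: annual maximum consecutive 5-day precipitation total."""
    return max(sum(daily_precip[i:i + 5])
               for i in range(len(daily_precip) - 4))

def r95p(daily_precip, p95_threshold):
    """R95P: fraction of annual total due to days above the 95th percentile.

    `p95_threshold` is assumed to be precomputed from a base period.
    """
    total = sum(daily_precip)
    heavy = sum(p for p in daily_precip if p > p95_threshold)
    return heavy / total if total > 0 else 0.0
```

Gridded evaluation then amounts to applying these per grid cell and year before comparing model and observed fields.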

  13. Augmenting a Large-Scale Hydrology Model to Reproduce Groundwater Variability

    Science.gov (United States)

    Stampoulis, D.; Reager, J. T., II; Andreadis, K.; Famiglietti, J. S.

    2016-12-01

    To understand the influence of groundwater on terrestrial ecosystems and society, global assessment of groundwater temporal fluctuations is required. A water table was initialized in the Variable Infiltration Capacity (VIC) hydrologic model in a semi-realistic approach to account for groundwater variability. Global water table depth data derived from observations at nearly 2 million well sites compiled from government archives and published literature, as well as groundwater model simulations, were used to create a new soil layer of varying depth for each model grid cell. The new 4-layer version of VIC, hereafter named VIC-4L, was run with and without assimilating NASA's Gravity Recovery and Climate Experiment (GRACE) observations. The results were compared with simulations using the original VIC version (named VIC-3L) with GRACE assimilation, while all runs were compared with well data.

  14. Energy and nutrient deposition and excretion in the reproducing sow: model development and evaluation

    DEFF Research Database (Denmark)

    Hansen, A V; Strathe, A B; Theil, Peter Kappel;

    2014-01-01

    Air and nutrient emissions from swine operations raise environmental concerns. During the reproduction phase, sows consume and excrete large quantities of nutrients. The objective of this study was to develop a mathematical model to describe energy and nutrient partitioning and predict manure...... excretion and composition and methane emissions on a daily basis. The model was structured to contain gestation and lactation modules, which can be run separately or sequentially, with outputs from the gestation module used as inputs to the lactation module. In the gestating module, energy and protein...... production, and maternal growth with body tissue losses constrained within biological limits. Global sensitivity analysis showed that nonlinearity in the parameters was small. The model outputs considered were the total protein and fat deposition, average urinary and fecal N excretion, average methane...

  15. An exponent tunable network model for reproducing density driven superlinear relation

    CERN Document Server

    Qin, Yuhao; Xu, Lida; Gao, Zi-You

    2014-01-01

Previous works have shown the universality of allometric scalings under density and total value at the city level, but our understanding of the size effects of regions on them is still poor. Here, we revisit the scaling relations between gross domestic production (GDP) and population (POP) under total and density value. We first reveal that superlinear scaling is a general feature under density value across different regions. The scaling exponent $\beta$ under density value falls into the range $(1.0, 2.0]$, which unexpectedly goes beyond the range observed by Pan et al. (Nat. Commun. vol. 4, p. 1961 (2013)). To deal with the wider range, we propose a network model based on a 2D lattice space with the spatial correlation factor $\alpha$ as parameter. Numerical experiments show that the generated scaling exponent $\beta$ in our model is fully tunable by the spatial correlation factor $\alpha$. We conjecture that our model provides a general platform for extensive urban and regional studies.
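The scaling exponent in relations of the form GDP ~ POP^beta is conventionally estimated as the slope of an ordinary least-squares fit in log-log space. A minimal sketch, with purely illustrative data generated to have a known exponent:

```python
# Estimate the allometric scaling exponent beta by OLS on log-transformed
# values: log(gdp) = beta * log(pop) + c.  Data are synthetic.
import math

def scaling_exponent(pop, gdp):
    """Slope of log(gdp) against log(pop), i.e. the exponent beta."""
    xs = [math.log(p) for p in pop]
    ys = [math.log(g) for g in gdp]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var
```

A superlinear fit (beta > 1) on density values is what the abstract reports across regions.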

  16. A simple branching model that reproduces language family and language population distributions

    Science.gov (United States)

    Schwämmle, Veit; de Oliveira, Paulo Murilo Castro

    2009-07-01

Human history leaves fingerprints in human languages. Little is known about language evolution, and its study is of great importance. Here we construct a simple stochastic model and compare its results to statistical data on real languages. The model is based on the recent finding that language changes occur independently of population size. We find agreement with the data by additionally assuming that languages may be distinguished by having at least one among a finite, small number of different features. This finite set is also used to define the distance between two languages, similarly to the linguistics tradition since Swadesh.

  17. An exact arithmetic toolbox for a consistent and reproducible structural analysis of metabolic network models

    National Research Council Canada - National Science Library

    Chindelevitch, Leonid; Trigg, Jason; Regev, Aviv; Berger, Bonnie

    2014-01-01

    .... Flux balance analysis is commonly used to analyse constraint-based models. Curiously, the results of this analysis vary with the software being run, a situation that we show can be remedied by using exact rather than floating-point arithmetic...
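The abstract's central observation, that floating-point and exact arithmetic can disagree on structural questions such as whether a row of a matrix is linearly dependent, can be illustrated with a rank computation carried out over the rationals. This is a sketch of the idea only; the matrices below are illustrative, not real stoichiometric matrices.

```python
# Rank via Gaussian elimination using exact rational arithmetic
# (fractions.Fraction), so the "is this row dependent?" question has a
# reproducible answer independent of floating-point rounding.
from fractions import Fraction

def rank_exact(matrix):
    """Rank of a rectangular integer/rational matrix, computed exactly."""
    m = [[Fraction(x) for x in row] for row in matrix]
    rows, cols = len(m), len(m[0])
    rank, col = 0, 0
    while rank < rows and col < cols:
        # Find a pivot row with a nonzero entry in this column.
        pivot = next((r for r in range(rank, rows) if m[r][col] != 0), None)
        if pivot is None:
            col += 1
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        # Eliminate the column from all other rows.
        for r in range(rows):
            if r != rank and m[r][col] != 0:
                f = m[r][col] / m[rank][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[rank])]
        rank += 1
        col += 1
    return rank
```

With floats, near-cancellation can leave tiny nonzero residues where an exact computation yields zero, which is one way tool-dependent results arise.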

  18. Reproducible infection model for Clostridium perfringens in broiler chickens

    DEFF Research Database (Denmark)

    Pedersen, Karl; Friis-Holm, Lotte Bjerrum; Heuer, Ole Eske

    2008-01-01

    Experiments were carried out to establish an infection and disease model for Clostridium perfringens in broiler chickens. Previous experiments had failed to induce disease and only a transient colonization with challenge strains had been obtained. In the present study, two series of experiments w...

  19. Establishing a Reproducible Hypertrophic Scar following Thermal Injury: A Porcine Model

    Directory of Open Access Journals (Sweden)

    Scott J. Rapp, MD

    2015-02-01

    Conclusions: Deep partial-thickness thermal injury to the back of domestic swine produces an immature hypertrophic scar by 10 weeks following burn with thickness appearing to coincide with the location along the dorsal axis. With minimal pig to pig variation, we describe our technique to provide a testable immature scar model.

  20. Accuracy and reproducibility of dental measurements on tomographic digital models: a systematic review and meta-analysis.

    Science.gov (United States)

    Ferreira, Jamille B; Christovam, Ilana O; Alencar, David S; da Motta, Andréa F J; Mattos, Claudia T; Cury-Saramago, Adriana

    2017-04-26

The aim of this systematic review with meta-analysis was to assess the accuracy and reproducibility of dental measurements obtained from digital study models generated from CBCT compared with those acquired from plaster models. The electronic databases Cochrane Library, Medline (via PubMed), Scopus, VHL, Web of Science, and System for Information on Grey Literature in Europe were screened to identify articles from 1998 until February 2016. The inclusion criteria were: prospective and retrospective clinical trials in humans; validation and/or comparison articles of dental study models obtained from CBCT and plaster models; and articles that used dental linear measurements as an assessment tool. The methodological quality of the studies was assessed with the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool. A meta-analysis was performed to validate all comparative measurements. The database search identified a total of 3160 items and 554 duplicates were excluded. After reading titles and abstracts, 12 articles were selected. Five articles were included after reading in full. The methodological quality obtained through QUADAS-2 was poor to moderate. In the meta-analysis, there were statistical differences between the mesiodistal widths of mandibular incisors, maxillary canines and premolars, and the overall Bolton analysis. The measurements considered accurate were therefore maxillary and mandibular crowding, intermolar width, and the mesiodistal widths of maxillary incisors, mandibular canines and premolars, and molars in both arches. Digital models obtained from CBCT were not accurate for all measures assessed. The differences were clinically acceptable for all dental linear measurements except maxillary arch perimeter. Digital models are reproducible for all measurements when intraexaminer assessment is considered, and need improvement in interexaminer evaluation.
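The pooling step behind such a meta-analysis of measurement differences is typically a fixed-effect, inverse-variance weighted mean difference. A minimal sketch follows; the per-study values (mean difference and standard error) are invented for illustration and are not the review's data.

```python
# Fixed-effect inverse-variance pooling of per-study mean differences.
import math

def pooled_mean_difference(diffs, ses):
    """Return the inverse-variance weighted mean difference and its SE.

    diffs: per-study mean differences (e.g. digital minus plaster, mm)
    ses:   per-study standard errors of those differences
    """
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * d for w, d in zip(weights, diffs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, se_pooled
```

A pooled difference whose confidence interval excludes zero is what "statistical difference" means for a given measurement in the abstract; clinical acceptability is then judged against a separate tolerance.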

  1. Validation and reproducibility assessment of modality independent elastography in a pre-clinical model of breast cancer

    Science.gov (United States)

    Weis, Jared A.; Kim, Dong K.; Yankeelov, Thomas E.; Miga, Michael I.

    2014-03-01

    Clinical observations have long suggested that cancer progression is accompanied by extracellular matrix remodeling and concomitant increases in mechanical stiffness. Due to the strong association of mechanics and tumor progression, there has been considerable interest in incorporating methodologies to diagnose cancer through the use of mechanical stiffness imaging biomarkers, resulting in commercially available US and MR elastography products. Extension of this approach towards monitoring longitudinal changes in mechanical properties along a course of cancer therapy may provide means for assessing early response to therapy; therefore a systematic study of the elasticity biomarker in characterizing cancer for therapeutic monitoring is needed. The elastography method we employ, modality independent elastography (MIE), can be described as a model-based inverse image-analysis method that reconstructs elasticity images using two acquired image volumes in a pre/post state of compression. In this work, we present preliminary data towards validation and reproducibility assessment of our elasticity biomarker in a pre-clinical model of breast cancer. The goal of this study is to determine the accuracy and reproducibility of MIE and therefore the magnitude of changes required to determine statistical differences during therapy. Our preliminary results suggest that the MIE method can accurately and robustly assess mechanical properties in a pre-clinical system and provide considerable enthusiasm for the extension of this technique towards monitoring therapy-induced changes to breast cancer tissue architecture.

  2. Experimental and Numerical Models of Complex Clinical Scenarios; Strategies to Improve Relevance and Reproducibility of Joint Replacement Research.

    Science.gov (United States)

    Bechtold, Joan E; Swider, Pascal; Goreham-Voss, Curtis; Soballe, Kjeld

    2016-02-01

This research review aims to focus attention on the effect of specific surgical and host factors on implant fixation, and the importance of accounting for them in experimental and numerical models. These factors affect (a) eventual clinical applicability and (b) reproducibility of findings across research groups. Proper function and longevity of orthopedic joint replacement implants rely on secure fixation to the surrounding bone. Technology and surgical technique have improved over the last 50 years, and robust ingrowth and decades of implant survival are now routinely achieved for healthy patients and first-time (primary) implantation. Second-time (revision) implantation presents with bone loss, with interfacial bone gaps in areas vital for secure mechanical fixation. Patients with medical comorbidities such as infection, smoking, congestive heart failure, kidney disease, and diabetes have a diminished healing response, poorer implant fixation, and greater revision risk. It is these more difficult clinical scenarios that require research to evaluate more advanced treatment approaches. Such treatments can include osteogenic or antimicrobial implant coatings, allo- or autogenous cellular or tissue-based approaches, local and systemic drug delivery, and surgical approaches. Regarding implant-related approaches, most experimental and numerical models do not generally impose conditions that represent mechanical instability at the implant interface, or recalcitrant healing. Many treatments will work well in forgiving settings, but fail in complex human settings with disease, bone loss, or previous surgery. Ethical considerations mandate that we justify and limit the number of animals tested, which restricts experimental permutations of treatments. Numerical models provide flexibility to evaluate multiple parameters and combinations, but generally need to employ simplifying assumptions. The objectives of this paper are to (a) highlight the importance of mechanical

  3. Evaluation of Nitinol staples for the Lapidus arthrodesis in a reproducible biomechanical model

    Directory of Open Access Journals (Sweden)

    Nicholas Alexander Russell

    2015-12-01

Full Text Available While the Lapidus procedure is a widely accepted technique for treatment of hallux valgus, the optimal fixation method to maintain joint stability remains controversial. The purpose of this study was to evaluate the biomechanical properties of new Shape Memory Alloy staples arranged in different configurations in a repeatable 1st Tarsometatarsal arthrodesis model. Ten sawbones models of the whole foot (n=5 per group) were reconstructed using a single dorsal staple or two staples in a delta configuration. Each construct was mechanically tested in dorsal four-point bending, medial four-point bending, dorsal three-point bending and plantar cantilever bending with the staples activated at 37°C. The peak load, stiffness and plantar gapping were determined for each test. Pressure sensors were used to measure the contact force and area of the joint footprint in each group. There was a significant (p < 0.05) increase in peak load in the two-staple constructs compared to the single-staple constructs for all testing modalities. Stiffness also increased significantly in all tests except dorsal four-point bending. Pressure sensor readings showed a significantly higher contact force at time zero and contact area following loading in the two-staple constructs (p < 0.05). Both groups completely recovered any plantar gapping following unloading and restored their initial contact footprint. The biomechanical integrity and repeatability of the models was demonstrated with no construct failures due to hardware or model breakdown. Shape memory alloy staples provide fixation with the ability to dynamically apply and maintain compression across a simulated arthrodesis following a range of loading conditions.

  4. Reproducibility of the heat/capsaicin skin sensitization model in healthy volunteers

    Directory of Open Access Journals (Sweden)

    Cavallone LF

    2013-11-01

Full Text Available Laura F Cavallone,1 Karen Frey,1 Michael C Montana,1 Jeremy Joyal,1 Karen J Regina,1 Karin L Petersen,2 Robert W Gereau IV1; 1Department of Anesthesiology, Washington University in St Louis, School of Medicine, St Louis, MO, USA; 2California Pacific Medical Center Research Institute, San Francisco, CA, USA. Introduction: Heat/capsaicin skin sensitization is a well-characterized human experimental model to induce hyperalgesia and allodynia. Using this model, gabapentin, among other drugs, was shown to significantly reduce cutaneous hyperalgesia compared to placebo. Since the larger thermal probes used in the original studies to produce heat sensitization are now commercially unavailable, we decided to assess whether previous findings could be replicated with a currently available smaller probe (heated area 9 cm2 versus 12.5–15.7 cm2). Study design and methods: After Institutional Review Board approval, 15 adult healthy volunteers participated in two study sessions, scheduled 1 week apart (Part A). In both sessions, subjects were exposed to the heat/capsaicin cutaneous sensitization model. Areas of hypersensitivity to brush stroke and von Frey (VF) filament stimulation were measured at baseline and after rekindling of skin sensitization. Another group of 15 volunteers was exposed to an identical schedule and set of sensitization procedures, but, in each session, received either gabapentin or placebo (Part B). Results: Unlike previous reports, a similar reduction of areas of hyperalgesia was observed in all groups/sessions. Fading of areas of hyperalgesia over time was observed in Part A. In Part B, there was no difference in area reduction after gabapentin compared to placebo. Conclusion: When using smaller thermal probes than originally proposed, modifications of other parameters of sensitization and/or the rekindling process may be needed to allow the heat/capsaicin sensitization protocol to be used as initially intended. Standardization and validation of

  5. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    Science.gov (United States)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON and KML, and for using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing ultimately extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use Data Assimilation to ingest real-time weather data into wildfire simulations, and data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability to provide a framework for scalable processing.
geoKepler workflows can be executed via an iPython notebook as a part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from

  6. The link between the Barents Sea and ENSO events reproduced by NEMO model

    Directory of Open Access Journals (Sweden)

    V. N. Stepanov

    2012-05-01

Full Text Available An analysis of observational data in the Barents Sea along a meridian at 33°30´ E between 70°30´ and 72°30´ N has reported a negative correlation between El Niño/La Niña-Southern Oscillation (ENSO) events and water temperature in the top 200 m: the temperature drops by about 0.5 °C during warm ENSO events, while during cold ENSO events the top 200 m layer of the Barents Sea is warmer. Results from 1- and 1/4-degree global NEMO models show a similar response for the whole Barents Sea. During the strong warm ENSO event in 1997–1998, an anticyclonic atmospheric circulation became established over the Barents Sea in place of the usual cyclonic circulation. This change enhanced heat losses in the Barents Sea and substantially influenced the Barents Sea inflow from the North Atlantic via changes in ocean currents. Under normal conditions a warm current enters the Barents Sea from the North Atlantic along the Scandinavian peninsula; after the 1997–1998 event, however, this current was weakened.

    During 1997–1998 the model annual mean temperature in the Barents Sea is decreased by about 0.8 °C, also resulting in a higher sea ice volume. In contrast during the cold ENSO events in 1999–2000 and 2007–2008 the model shows a lower sea ice volume, and higher annual mean temperatures in the upper layer of the Barents Sea of about 0.7 °C.

    An analysis of model data shows that the Barents Sea inflow is the main source for the variability of Barents Sea heat content, and is forced by changing pressure and winds in the North Atlantic. However, surface heat-exchange with atmosphere can also play a dominant role in the Barents Sea annual heat balance, especially for the subsequent year after ENSO events.
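The sign of the reported ENSO-Barents link is, at its simplest, a correlation between an ENSO index and upper-layer temperature anomalies. The sketch below uses a plain Pearson correlation on synthetic series constructed only to illustrate the anti-correlation the abstract describes; these are not campaign or model values.

```python
# Pearson correlation between an ENSO index and Barents Sea upper-layer
# temperature anomalies (synthetic, illustrative data).
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Warm ENSO years (positive index) paired with cooler Barents anomalies (°C):
enso = [1.5, -0.5, 0.2, -1.0, 2.0, -1.5]
barents_anom = [-0.5, 0.3, 0.0, 0.4, -0.8, 0.6]
```

A strongly negative coefficient on such series is the statistical signature of the "warm ENSO, cooler Barents" relationship.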

  7. ISO observations and models of galaxies with Hidden Broad Line Regions

    CERN Document Server

    Efstathiou, A

    2005-01-01

    In this paper we present ISO mid-infrared spectrophotometry and far-infrared photometry of galaxies with Hidden Broad Line Regions (HBLR). We also present radiative transfer models of their spectral energy distributions which enable us to separate the contributions from the dusty disc of the AGN and the dusty starbursts. We find that the combination of tapered discs (discs whose thickness increases with distance from the central source in the inner part but stays constant in the outer part) and starbursts provide good fits to the data. The tapered discs dominate in the mid-infrared part of the spectrum and the starbursts in the far-infrared. After correcting the AGN luminosity for anisotropic emission we find that the ratio of the AGN luminosity to the starburst luminosity, L(AGN)/L(SB), ranges from about unity for IRAS14454-4343 to about 13 for IRAS01475-0740. Our results suggest that the warm IRAS colours of HBLR are due to the relatively high L(AGN)/L(SB). Our fits are consistent with the unified model and...

  8. A computational model incorporating neural stem cell dynamics reproduces glioma incidence across the lifespan in the human population.

    Directory of Open Access Journals (Sweden)

    Roman Bauer

Full Text Available Glioma is the most common form of primary brain tumor. Demographically, the risk of occurrence increases until old age. Here we present a novel computational model to reproduce the probability of glioma incidence across the lifespan. Previous mathematical models explaining glioma incidence are framed in a rather abstract way, and do not directly relate to empirical findings. To decrease this gap between theory and experimental observations, we incorporate recent data on cellular and molecular factors underlying gliomagenesis. Since evidence implicates the adult neural stem cell as the likely cell-of-origin of glioma, we have incorporated empirically-determined estimates of neural stem cell number, cell division rate, mutation rate and oncogenic potential into our model. We demonstrate that our model yields results which match actual demographic data in the human population. In particular, this model accounts for the observed peak incidence of glioma at approximately 80 years of age, without the need to assert differential susceptibility throughout the population. Overall, our model supports the hypothesis that glioma is caused by randomly-occurring oncogenic mutations within the neural stem cell population. Based on this model, we assess the influence of the (experimentally indicated) decrease in the number of neural stem cells and increase of cell division rate during aging. Our model provides multiple testable predictions, and suggests that different temporal sequences of oncogenic mutations can lead to tumorigenesis. Finally, we conclude that four or five oncogenic mutations are sufficient for the formation of glioma.
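The core mechanism described, random accumulation of a handful of oncogenic mutations in a stem cell pool that shrinks with age, can be sketched with a deterministic Poisson calculation. All parameter values below (k = 5 required mutations, mutation rate, pool-decay rate) are illustrative assumptions chosen to show the shape of the curve, not the paper's fitted values.

```python
# Relative glioma risk at age t as (declining neural stem cell pool) x
# (probability that a cell has accumulated >= k oncogenic mutations,
# modelled as a Poisson process with rate lam per cell per year).
import math

def p_at_least_k(lam, t, k):
    """P(Poisson(lam*t) >= k): a cell carries >= k mutations by age t."""
    mu = lam * t
    return 1.0 - sum(math.exp(-mu) * mu ** i / math.factorial(i)
                     for i in range(k))

def relative_risk(t, k=5, lam=0.04, decay=0.03):
    """Shrinking stem-cell pool times per-cell transformation probability."""
    pool = math.exp(-decay * t)  # normalized stem cell count at age t
    return pool * p_at_least_k(lam, t, k)

risks = {age: relative_risk(age) for age in range(0, 101, 10)}
peak_age = max(risks, key=risks.get)
```

The product of a rising per-cell probability and a falling cell count yields a peaked incidence curve, which with these illustrative parameters peaks around age 80, qualitatively matching the demographic pattern in the abstract.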

  9. Can a global model chemical mechanism reproduce NO, NO2, and O3 measurements above a tropical rainforest?

    Directory of Open Access Journals (Sweden)

    C. N. Hewitt

    2009-12-01

Full Text Available A cross-platform field campaign, OP3, was conducted in the state of Sabah in Malaysian Borneo between April and July of 2008. Among the suite of observations recorded, the campaign included measurements of NOx and O3, crucial outputs of any model chemistry mechanism. We describe the measurements of these species made from both the ground site and aircraft. We examine the output from the global model p-TOMCAT at two resolutions for this location during the April campaign period. The models exhibit reasonable ability in capturing the NOx diurnal cycle, but ozone is overestimated. We use a box model containing the same chemical mechanism to explore the weaknesses in the global model and the ability of the simplified global model chemical mechanism to capture the chemistry at the rainforest site. We achieve a good fit to the data for all three species (NO, NO2, and O3), though the model is much more sensitive to changes in the treatment of physical processes than to changes in the chemical mechanism. Indeed, without some parameterization of the nighttime mixing between the boundary layer and the free troposphere, a time-dependent box model will not reproduce the observations. The final simulation uses this mixing parameterization for NO and NO2 but not O3, as determined by the vertical structure of each species, and matches the measurements well.
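One constraint any NO-NO2-O3 mechanism must respect in daytime is the photostationary state: NO2 photolysis (rate j) balancing the NO + O3 reaction (rate constant k). A minimal sketch follows; the j and k values are typical illustrative magnitudes for midday tropical conditions, not campaign-derived numbers.

```python
# Daytime photostationary state: j * [NO2] = k * [NO] * [O3],
# so the steady-state ozone concentration is j*[NO2] / (k*[NO]).
# Concentrations in molecules cm^-3; j in s^-1; k in cm^3 s^-1.

def o3_photostationary(no, no2, j=8e-3, k=1.8e-14):
    """Steady-state O3 implied by the NO/NO2 ratio (illustrative rates)."""
    return j * no2 / (k * no)
```

Comparing measured O3 against this steady-state value is a standard consistency check on the NO/NO2 partitioning a mechanism produces.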

  10. Can model observers be developed to reproduce radiologists' diagnostic performances? Our study says not so fast!

    Science.gov (United States)

    Lee, Juhun; Nishikawa, Robert M.; Reiser, Ingrid; Boone, John M.

    2016-03-01

    The purpose of this study was to determine radiologists' diagnostic performances on different image reconstruction algorithms that could be used to optimize image-based model observers. We included a total of 102 pathology proven breast computed tomography (CT) cases (62 malignant). An iterative image reconstruction (IIR) algorithm was used to obtain 24 reconstructions with different image appearance for each image. Using quantitative image feature analysis, three IIRs and one clinical reconstruction of 50 lesions (25 malignant) were selected for a reader study. The reconstructions spanned a range of smooth-low noise to sharp-high noise image appearance. The trained classifiers' AUCs on the above reconstructions ranged from 0.61 (for smooth reconstruction) to 0.95 (for sharp reconstruction). Six experienced MQSA radiologists read 200 cases (50 lesions times 4 reconstructions) and provided the likelihood of malignancy of each lesion. Radiologists' diagnostic performances (AUC) ranged from 0.7 to 0.89. However, there was no agreement among the six radiologists on which image appearance was the best, in terms of radiologists' having the highest diagnostic performances. Specifically, two radiologists indicated sharper image appearance was diagnostically superior, another two radiologists indicated smoother image appearance was diagnostically superior, and another two radiologists indicated all image appearances were diagnostically similar to each other. Due to the poor agreement among radiologists on the diagnostic ranking of images, it may not be possible to develop a model observer for this particular imaging task.
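The diagnostic performance figures quoted (AUCs from likelihood-of-malignancy scores) are equivalent to the Mann-Whitney U statistic: the probability that a randomly chosen malignant case is scored above a randomly chosen benign one, with ties counted as half. The scores below are illustrative only.

```python
# AUC of a reader's likelihood-of-malignancy scores, computed directly
# as the normalized Mann-Whitney U statistic.

def auc(benign_scores, malignant_scores):
    """P(score(malignant) > score(benign)), ties counted as 0.5."""
    wins = 0.0
    for m in malignant_scores:
        for b in benign_scores:
            if m > b:
                wins += 1.0
            elif m == b:
                wins += 0.5
    return wins / (len(benign_scores) * len(malignant_scores))
```

Ranking reconstructions by per-reader AUC computed this way is how the disagreement among the six radiologists would surface.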

  11. The ability of a GCM-forced hydrological model to reproduce global discharge variability

    Directory of Open Access Journals (Sweden)

    F. C. Sperna Weiland

    2010-08-01

Full Text Available Data from General Circulation Models (GCMs) are often used to investigate hydrological impacts of climate change. However, GCM data are known to have large biases, especially for precipitation. In this study the usefulness of GCM data for hydrological studies, with a focus on discharge variability and extremes, was tested by using bias-corrected daily climate data of the 20CM3 control experiment from a selection of twelve GCMs as input to the global hydrological model PCR-GLOBWB. Results of these runs were compared with discharge observations of the GRDC and with discharges calculated from model runs based on two meteorological datasets constructed from the observation-based CRU TS 2.1 and ERA-40 reanalysis. In the first dataset the CRU TS 2.1 monthly timeseries were downscaled to daily timeseries using the ERA-40 dataset (ERA6190). This dataset served as a best guess of the past climate and was used to analyze the performance of PCR-GLOBWB. The second dataset was created from the ERA-40 timeseries bias-corrected with the CRU TS 2.1 dataset using the same bias-correction method as applied to the GCM datasets (ERACLM). Through this dataset the influence of the bias-correction method was quantified. The bias-correction was limited to monthly mean values of precipitation, potential evaporation and temperature, as our focus was on the reproduction of inter- and intra-annual variability.

    After bias-correction the spread in discharge results of the GCM based runs decreased and results were similar to results of the ERA-40 based runs, especially for rivers with a strong seasonal pattern. Overall the bias-correction method resulted in a slight reduction of global runoff and the method performed less well in arid and mountainous regions. However, deviations between GCM results and GRDC statistics did decrease for Q, Q90 and IAV. After bias-correction consistency amongst
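A bias-correction limited to monthly mean values, as described, typically amounts to scaling each day by the ratio of observed to modelled climatological monthly means, which preserves the day-to-day variability. A toy sketch, with month handling simplified to a per-day month index and invented climatologies:

```python
# Monthly-mean scaling bias correction for daily GCM precipitation:
# each day is multiplied by obs_clim[m] / gcm_clim[m] for its month m.

def bias_correct(daily_gcm, month_of_day, gcm_clim, obs_clim):
    """Scale each day's value by its month's obs/model climatology ratio."""
    factors = {m: obs_clim[m] / gcm_clim[m] for m in gcm_clim}
    return [p * factors[month_of_day[i]] for i, p in enumerate(daily_gcm)]
```

For temperature an additive (offset) correction would be used instead of a ratio; the multiplicative form shown is the usual choice for precipitation.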

  12. Validation of the 3D Skin Comet assay using full thickness skin models: transferability and reproducibility

    Directory of Open Access Journals (Sweden)

    Kerstin Reisinger

    2015-06-01

Full Text Available The 3D Skin Comet assay was developed to improve the in vitro prediction of the genotoxic potential of dermally applied chemicals. For this purpose, a classical read-out for genotoxicity (i.e. comet formation) was combined with reconstructed 3D skin models as well-established test systems. Five laboratories (BASF, BfR (Federal Institute for Risk Assessment), Henkel, Procter & Gamble and TNO Triskelion) started to validate this assay using the Phenion® Full-Thickness (FT) Skin Model and 8 coded chemicals, with financial support from Cosmetics Europe and the German Ministry of Education & Research. There was an excellent overall predictivity of the expected genotoxicity (>90%). Four labs correctly identified all chemicals and the fifth correctly identified 80% of the chemicals. Background DNA damage was low and values for solvent (acetone) and positive (methyl methanesulfonate, MMS) controls were comparable among labs. Inclusion of the DNA-polymerase inhibitor aphidicolin (APC) in the protocol improved the predictivity of the assay, since it enabled robust detection of pro-mutagens, e.g. 7,12-dimethylbenz[a]anthracene and benzo[a]pyrene. Therefore, all negative findings are now confirmed by additional APC experiments to come to a final conclusion. Furthermore, MMC, which cross-links DNA strands through covalent binding, was detected with the standard protocol, in which it gave weak but statistically significant responses. Stronger responses, however, were obtained using a cross-linker-specific protocol in which MMC reduced the migration of MMS-induced DNA damage. These data support the use of the Phenion® FT model in the Comet assay: no false-positives and only one false-negative finding in a single lab. Testing will continue to obtain data for 30 chemicals. Once validated, the 3D Skin Comet assay is foreseen to be used as a follow-up test for positive results from the current in vitro genotoxicity test battery.

  13. PAMELA positron and electron spectra are reproduced by 3-dimensional cosmic-ray modeling

    CERN Document Server

    Gaggero, Daniele; Maccione, Luca; Di Bernardo, Giuseppe; Evoli, Carmelo

    2013-01-01

The PAMELA collaboration recently released the $e^+$ absolute spectrum between 1 and 300 GeV in addition to the positron fraction and $e^-$ spectrum previously measured in the same time period. We use the newly developed 3-dimensional upgrade of the DRAGON code and the charge-dependent solar modulation HelioProp code to consistently describe those data. We obtain very good fits of all data sets if an $e^+ + e^-$ hard extra-component peaked at 1 TeV is added to a softer $e^-$ background and the secondary $e^\pm$ produced by the spallation of cosmic-ray proton and helium nuclei. All sources are assumed to follow a realistic spiral-arm spatial distribution. Remarkably, PAMELA data do not display any need of a charge-asymmetric extra-component. Finally, plain diffusion, or low re-acceleration, propagation models which are tuned against nuclear data nicely describe PAMELA lepton data with no need to introduce a low-energy break in the proton and helium spectra.

  14. A Detailed Data-Driven Network Model of Prefrontal Cortex Reproduces Key Features of In Vivo Activity.

    Science.gov (United States)

    Hass, Joachim; Hertäg, Loreen; Durstewitz, Daniel

    2016-05-01

    The prefrontal cortex is centrally involved in a wide range of cognitive functions and in their impairment in psychiatric disorders. Yet the computational principles that govern the dynamics of prefrontal neural networks, and that link their physiological, biochemical and anatomical properties to cognitive functions, are not well understood. Computational models can help to bridge the gap between these different levels of description, provided they are sufficiently constrained by experimental data and capable of predicting key properties of the intact cortex. Here, we present a detailed network model of the prefrontal cortex, based on a simple, computationally efficient single-neuron model (simpAdEx), with all parameters derived from in vitro electrophysiological and anatomical data. Without additional tuning, this model was shown to quantitatively reproduce a wide range of measures from in vivo electrophysiological recordings, to a degree where simulated and experimentally observed activities were statistically indistinguishable. These measures include spike train statistics, membrane potential fluctuations, local field potentials, and the transmission of transient stimulus information across layers. We further demonstrate that model predictions are robust against moderate changes in key parameters, and that synaptic heterogeneity is a crucial ingredient for the quantitative reproduction of in vivo-like electrophysiological behavior. Thus, we have produced a physiologically highly valid (in a quantitative sense), yet computationally efficient, PFC network model, which helped to identify key properties underlying spike-time dynamics as observed in vivo, and which can be harnessed for in-depth investigation of the links between physiology and cognition.
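
    As a rough illustration of the model class involved, the following is a minimal forward-Euler sketch of a generic adaptive exponential integrate-and-fire (AdEx) neuron, the family to which simpAdEx belongs. The parameter values and the step current are common textbook choices, not the simpAdEx fits derived in the paper.

    ```python
    import math

    def adex_spike_times(i_pa=1000.0, t_max_ms=500.0, dt=0.05):
        """Forward-Euler integration of a generic AdEx neuron.

        Units: mV, ms, pA, pF, nS. Parameter values are common
        textbook choices, not the simpAdEx fits used in the paper.
        """
        c, g_l, e_l = 281.0, 30.0, -70.6    # capacitance, leak conductance
        v_t, d_t = -50.4, 2.0               # threshold, slope factor
        tau_w, a, b = 144.0, 4.0, 80.5      # adaptation dynamics
        v_reset, v_peak = -70.6, 0.0
        v, w = e_l, 0.0
        spikes = []
        t = 0.0
        while t < t_max_ms:
            dv = (-g_l * (v - e_l) + g_l * d_t * math.exp((v - v_t) / d_t)
                  - w + i_pa) / c
            dw = (a * (v - e_l) - w) / tau_w
            v += dt * dv
            w += dt * dw
            if v >= v_peak:                 # spike-and-reset rule
                spikes.append(t)
                v = v_reset
                w += b                      # spike-triggered adaptation
            t += dt
        return spikes

    spikes = adex_spike_times()
    print(len(spikes))  # tonic firing in response to the step current
    ```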

  15. Broad-band colours and overall photometric properties of template galaxy models from stellar population synthesis

    Science.gov (United States)

    Buzzoni, Alberto

    2005-08-01

    the observation (and the interpretation) of high-redshift surveys. In addition to broad-band colours, the modelling of Balmer line emission in disc-dominated systems shows that striking emission lines, like Hα, can very effectively track stellar birth rate in a galaxy. For these features to be useful age tracers as well, however, one should first assess the real change of b versus time on the basis of supplementary (and physically independent) arguments.

  16. The statistics of repeating patterns of cortical activity can be reproduced by a model network of stochastic binary neurons.

    Science.gov (United States)

    Roxin, Alex; Hakim, Vincent; Brunel, Nicolas

    2008-10-15

    Calcium imaging of the spontaneous activity in cortical slices has revealed repeating spatiotemporal patterns of transitions between so-called down states and up states (Ikegaya et al., 2004). Here we fit a model network of stochastic binary neurons to data from these experiments, and in doing so reproduce the distributions of such patterns. We use two versions of this model: (1) an unconnected network in which neurons are activated as independent Poisson processes; and (2) a network with an interaction matrix, estimated from the data, representing effective interactions between the neurons. The unconnected model (model 1) is sufficient to account for the statistics of repeating patterns in 11 of the 15 datasets studied. Model 2, with interactions between neurons, is required to account for pattern statistics of the remaining four. Three of these four datasets are the ones that contain the largest number of transitions, suggesting that long datasets are in general necessary to render interactions statistically visible. We then study the topology of the matrix of interactions estimated for these four datasets. For three of the four datasets, we find sparse matrices with long-tailed degree distributions and an overrepresentation of certain network motifs. The remaining dataset exhibits a strongly interconnected, spatially localized subgroup of neurons. In all cases, we find that interactions between neurons facilitate the generation of long patterns that do not repeat exactly.
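
    To illustrate the unconnected variant (model 1), the sketch below draws binary activity frames from independent Bernoulli neurons, a discrete-time surrogate for independent Poisson processes, and counts how many activity words repeat exactly. The neuron count, rate and duration are illustrative assumptions, not values from the datasets.

    ```python
    import collections
    import random

    # Model 1 surrogate: each neuron is active independently in each
    # frame with a fixed probability (no interaction matrix).
    def simulate_words(n_neurons=8, n_frames=500, p_active=0.1, seed=42):
        rng = random.Random(seed)
        return [tuple(int(rng.random() < p_active) for _ in range(n_neurons))
                for _ in range(n_frames)]

    def repeat_counts(words):
        """Number of distinct activity words that occur more than once."""
        counts = collections.Counter(words)
        return sum(1 for c in counts.values() if c > 1)

    words = simulate_words()
    print(repeat_counts(words))  # repeats arise even without interactions
    ```

    Even this independent model produces many exactly repeating words, which is why interaction terms become statistically visible only in long recordings.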

  17. Isokinetic eccentric exercise as a model to induce and reproduce pathophysiological alterations related to delayed onset muscle soreness

    DEFF Research Database (Denmark)

    Lund, Henrik; Vestergaard-Poulsen, P; Kanstrup, I.L.

    1998-01-01

    Physiological alterations following unaccustomed eccentric exercise in an isokinetic dynamometer of the right m. quadriceps until exhaustion were studied, in order to create a model in which the physiological responses to physiotherapy could be measured. In experiment I (exp. I), seven selected...... parameters were measured bilaterally in 7 healthy subjects at day 0 as a control value. Then after a standardized bout of eccentric exercise the same parameters were measured daily for the following 7 d (test values). The measured parameters were: the ratio of phosphocreatine to inorganic phosphate (PCr...... (133Xenon washout technique). This was repeated in experiment II (exp. II) 6-12 months later in order to study reproducibility. In experiment III (exp. III), the normal fluctuations over 8 d of the seven parameters were measured, without intervention with eccentric exercise in 6 other subjects. All...

  18. Efficient and Reproducible Myogenic Differentiation from Human iPS Cells: Prospects for Modeling Miyoshi Myopathy In Vitro

    Science.gov (United States)

    Tanaka, Akihito; Woltjen, Knut; Miyake, Katsuya; Hotta, Akitsu; Ikeya, Makoto; Yamamoto, Takuya; Nishino, Tokiko; Shoji, Emi; Sehara-Fujisawa, Atsuko; Manabe, Yasuko; Fujii, Nobuharu; Hanaoka, Kazunori; Era, Takumi; Yamashita, Satoshi; Isobe, Ken-ichi; Kimura, En; Sakurai, Hidetoshi

    2013-01-01

    The establishment of human induced pluripotent stem cells (hiPSCs) has enabled the production of in vitro, patient-specific cell models of human disease. In vitro recreation of disease pathology from patient-derived hiPSCs depends on efficient differentiation protocols producing relevant adult cell types. However, myogenic differentiation of hiPSCs has faced obstacles, namely, low efficiency and/or poor reproducibility. Here, we report the rapid, efficient, and reproducible differentiation of hiPSCs into mature myocytes. We demonstrated that inducible expression of myogenic differentiation1 (MYOD1) in immature hiPSCs for at least 5 days drives cells along the myogenic lineage, with efficiencies reaching 70–90%. Myogenic differentiation driven by MYOD1 occurred even in immature, almost completely undifferentiated hiPSCs, without mesodermal transition. Myocytes induced in this manner reach maturity within 2 weeks of differentiation as assessed by marker gene expression and functional properties, including in vitro and in vivo cell fusion and twitching in response to electrical stimulation. Miyoshi Myopathy (MM) is a congenital distal myopathy caused by defective muscle membrane repair due to mutations in DYSFERLIN. Using our induced differentiation technique, we successfully recreated the pathological condition of MM in vitro, demonstrating defective membrane repair in hiPSC-derived myotubes from an MM patient and phenotypic rescue by expression of full-length DYSFERLIN (DYSF). These findings not only facilitate the pathological investigation of MM, but could potentially be applied in modeling of other human muscular diseases by using patient-derived hiPSCs. PMID:23626698

  19. A short-term mouse model that reproduces the immunopathological features of rhinovirus-induced exacerbation of COPD.

    Science.gov (United States)

    Singanayagam, Aran; Glanville, Nicholas; Walton, Ross P; Aniscenko, Julia; Pearson, Rebecca M; Pinkerton, James W; Horvat, Jay C; Hansbro, Philip M; Bartlett, Nathan W; Johnston, Sebastian L

    2015-08-01

    Viral exacerbations of chronic obstructive pulmonary disease (COPD), commonly caused by rhinovirus (RV) infections, are poorly controlled by current therapies. This is due to a lack of understanding of the underlying immunopathological mechanisms. Human studies have identified a number of key immune responses that are associated with RV-induced exacerbations including neutrophilic inflammation, expression of inflammatory cytokines and deficiencies in innate anti-viral interferon. Animal models of COPD exacerbation are required to determine the contribution of these responses to disease pathogenesis. We aimed to develop a short-term mouse model that reproduced the hallmark features of RV-induced exacerbation of COPD. Evaluation of complex protocols involving multiple dose elastase and lipopolysaccharide (LPS) administration combined with RV1B infection showed suppression rather than enhancement of inflammatory parameters compared with control mice infected with RV1B alone. Therefore, these approaches did not accurately model the enhanced inflammation associated with RV infection in patients with COPD compared with healthy subjects. In contrast, a single elastase treatment followed by RV infection led to heightened airway neutrophilic and lymphocytic inflammation, increased expression of tumour necrosis factor (TNF)-α, C-X-C motif chemokine 10 (CXCL10)/IP-10 (interferon γ-induced protein 10) and CCL5 [chemokine (C-C motif) ligand 5]/RANTES (regulated on activation, normal T-cell expressed and secreted), mucus hypersecretion and preliminary evidence for increased airway hyper-responsiveness compared with mice treated with elastase or RV infection alone. In summary, we have developed a new mouse model of RV-induced COPD exacerbation that mimics many of the inflammatory features of human disease. This model, in conjunction with human models of disease, will provide an essential tool for studying disease mechanisms and allow testing of novel therapies with potential to

  20. [NDVI difference rate recognition model of deciduous broad-leaved forest based on HJ-CCD remote sensing data].

    Science.gov (United States)

    Wang, Yan; Tian, Qing-Jiu; Huang, Yan; Wei, Hong-Wei

    2013-04-01

    The present paper takes Chuzhou in Anhui Province as the research area and deciduous broad-leaved forest as the research object. A recognition model for deciduous broad-leaved forest was constructed using the NDVI difference rate between the leaf-expansion and the flowering and fruit-bearing stages, and the model was applied to HJ-CCD remote sensing images acquired on April 1, 2012 and May 4, 2012. The spatial distribution map of deciduous broad-leaved forest was then extracted effectively, and the extraction results were verified and evaluated. The results show the validity of the NDVI difference rate extraction method proposed in this paper and also verify the applicability of HJ-CCD data for vegetation classification and recognition.
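
    The NDVI difference-rate rule described above can be sketched as follows; the band reflectances and the decision threshold are illustrative assumptions, not values from the paper.

    ```python
    # Sketch of NDVI difference-rate classification between two dates.
    # Band values and the decision threshold are illustrative, not the
    # paper's calibrated values.

    def ndvi(nir, red):
        """Normalized Difference Vegetation Index for one pixel."""
        return (nir - red) / (nir + red)

    def ndvi_difference_rate(ndvi_t1, ndvi_t2):
        """Relative NDVI change between two acquisition dates."""
        return (ndvi_t2 - ndvi_t1) / ndvi_t1

    def is_deciduous_broadleaf(nir1, red1, nir2, red2, threshold=0.5):
        """Flag pixels whose NDVI rises sharply between leaf expansion
        (t1) and flowering/fruit-bearing (t2); threshold is hypothetical."""
        rate = ndvi_difference_rate(ndvi(nir1, red1), ndvi(nir2, red2))
        return rate > threshold

    # Example pixel showing strong green-up between the two dates
    print(is_deciduous_broadleaf(nir1=0.30, red1=0.25, nir2=0.55, red2=0.10))
    ```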

  1. Reproducing Electric Field Observations during Magnetic Storms by means of Rigorous 3-D Modelling and Distortion Matrix Co-estimation

    Science.gov (United States)

    Püthe, Christoph; Manoj, Chandrasekharan; Kuvshinov, Alexey

    2015-04-01

    Electric fields induced in the conducting Earth during magnetic storms drive currents in power transmission grids, telecommunication lines and buried pipelines. These geomagnetically induced currents (GIC) can cause severe service disruptions. The prediction of GIC is thus of great importance for the public and industry. A key step in predicting the hazard to technological systems during magnetic storms is the calculation of the geoelectric field. To address this issue for mid-latitude regions, we developed a method that involves 3-D modelling of induction processes in a heterogeneous Earth and the construction of a model of the magnetospheric source. The latter is described by low-degree spherical harmonics; its temporal evolution is derived from observatory magnetic data. Time series of the electric field can be computed for every location on Earth's surface. The actual electric field, however, is known to be perturbed by galvanic effects, arising from very local near-surface heterogeneities or topography, which cannot be included in the conductivity model. Galvanic effects are commonly accounted for with a real-valued, time-independent distortion matrix, which linearly relates measured and computed electric fields. Using data from various magnetic storms that occurred between 2000 and 2003, we estimated distortion matrices for observatory sites onshore and on the ocean bottom. Strong correlations between modelled and measured fields validate our method. The distortion matrix estimates prove to be reliable, as they are accurately reproduced for different magnetic storms. We further show that 3-D modelling is crucial for a correct separation of galvanic and inductive effects and for a precise prediction of electric field time series during magnetic storms. Since the required computational resources are negligible, our approach is suitable for real-time prediction of GIC. For this purpose, a reliable forecast of the source field, e.g. based on data from satellites
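
    The distortion-matrix step can be sketched as a least-squares fit of a real 2x2 matrix D relating computed to measured horizontal electric fields, E_meas(t) ≈ D · E_comp(t), fitted row by row over time samples. The field samples below are synthetic illustration data, not observatory records.

    ```python
    # Least-squares estimation of a real, time-independent 2x2 distortion
    # matrix D from paired computed/measured field samples (synthetic data).

    def estimate_distortion(e_comp, e_meas):
        """Fit each row of D independently via the 2x2 normal equations."""
        sxx = sxy = syy = 0.0
        b = [[0.0, 0.0], [0.0, 0.0]]
        for (x1, x2), (y1, y2) in zip(e_comp, e_meas):
            sxx += x1 * x1
            sxy += x1 * x2
            syy += x2 * x2
            for i, y in enumerate((y1, y2)):
                b[i][0] += y * x1
                b[i][1] += y * x2
        det = sxx * syy - sxy * sxy
        rows = []
        for i in range(2):
            d1 = (b[i][0] * syy - b[i][1] * sxy) / det
            d2 = (b[i][1] * sxx - b[i][0] * sxy) / det
            rows.append([d1, d2])
        return rows

    # Synthetic check: data generated with a known distortion matrix
    true_d = [[1.2, 0.3], [-0.1, 0.8]]
    e_comp = [(1.0, 0.0), (0.0, 1.0), (0.5, -0.4), (2.0, 1.5)]
    e_meas = [(true_d[0][0] * x + true_d[0][1] * y,
               true_d[1][0] * x + true_d[1][1] * y) for x, y in e_comp]
    d_hat = estimate_distortion(e_comp, e_meas)
    print(d_hat)  # recovers approximately [[1.2, 0.3], [-0.1, 0.8]]
    ```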

  2. Can CFMIP2 models reproduce the leading modes of cloud vertical structure in the CALIPSO-GOCCP observations?

    Science.gov (United States)

    Wang, Fang; Yang, Song

    2017-02-01

    Using principal component (PC) analysis, three leading modes of cloud vertical structure (CVS) are revealed by the GCM-Oriented CALIPSO Cloud Product (GOCCP), i.e. the tropical high, subtropical anticyclonic and extratropical cyclonic cloud modes (THCM, SACM and ECCM, respectively). THCM mainly reflects the contrast between tropical high clouds and clouds in middle/high latitudes. SACM is closely associated with middle-high clouds in tropical convective cores, few-cloud regimes in subtropical anticyclonic regions and stratocumulus over subtropical eastern oceans. ECCM mainly corresponds to clouds along extratropical cyclonic regions. Models of phase 2 of the Cloud Feedback Model Intercomparison Project (CFMIP2) reproduce the THCM well, but SACM and ECCM are generally poorly simulated compared to GOCCP. Standardized PCs corresponding to CVS modes are generally captured, whereas original PCs (OPCs) are consistently underestimated (overestimated) for THCM (SACM and ECCM) by CFMIP2 models. The effects of CVS modes on relative cloud radiative forcing (RSCRF/RLCRF) (RSCRF being calculated at the surface and RLCRF at the top of the atmosphere) are studied by means of the principal component regression method. Results show that, per unit global mean OPC, CFMIP2 models tend to overestimate the RSCRF/RLCRF radiative effects (REs) of ECCM, and to underestimate (or simulate with the opposite sign) those of THCM and SACM, compared to observations. These RE biases may be attributed to two factors: one is the underestimation (overestimation) of low/middle clouds (high clouds) in simulated global mean cloud profiles, i.e. stronger (weaker) REs per unit of low/middle (high) cloud; the other is eigenvector biases in the CVS modes (especially for SACM and ECCM). It is suggested that much more attention should be paid to the improvement of CVS, especially cloud parameterization associated with particular physical processes (e.g. downwelling regimes with the Hadley circulation, extratropical storm tracks and others), which

  3. Can a coupled meteorology-chemistry model reproduce the historical trend in aerosol direct radiative effects over the Northern Hemisphere?

    Directory of Open Access Journals (Sweden)

    J. Xing

    2015-05-01

    Full Text Available The ability of a coupled meteorology-chemistry model, i.e., WRF-CMAQ, to reproduce the historical trend in AOD and clear-sky short-wave radiation (SWR) over the Northern Hemisphere has been evaluated through a comparison of 21 years of simulated results with observation-derived records from 1990 to 2010. Six satellite-retrieved AOD products, including AVHRR, TOMS, SeaWiFS, MISR, and MODIS-Terra and -Aqua, as well as long-term historical records from 11 AERONET sites, were used for the comparison of AOD trends. Clear-sky SWR products derived by CERES at both TOA and the surface, as well as surface SWR data derived from seven SURFRAD sites, were used for the comparison of trends in SWR. The model successfully captured increasing AOD trends along with the corresponding increased TOA SWR (upwelling) and decreased surface SWR (downwelling) in both eastern China and the northern Pacific. The model also captured declining AOD trends along with the corresponding decreased TOA SWR (upwelling) and increased surface SWR (downwelling) in the eastern US, Europe and the northern Atlantic for the period 2000–2010. However, the model underestimated the AOD over regions with substantial natural dust aerosol contributions, such as the Sahara Desert, Arabian Desert, central Atlantic and northern Indian Ocean. Estimates of the aerosol direct radiative effect (DRE) at TOA are comparable with those derived from measurements. Compared to GCMs, the model exhibits better estimates of the surface aerosol direct radiative efficiency (Eτ). However, surface DRE tends to be underestimated due to the underestimated AOD over land and dust regions. Further investigation of TOA Eτ estimates, as well as of the dust module used for estimating windblown-dust emissions, is needed.

  4. A Bloch-McConnell simulator with pharmacokinetic modeling to explore accuracy and reproducibility in the measurement of hyperpolarized pyruvate

    Science.gov (United States)

    Walker, Christopher M.; Bankson, James A.

    2015-03-01

    Magnetic resonance imaging (MRI) of hyperpolarized (HP) agents has the potential to probe in vivo metabolism with sensitivity and specificity that were not previously possible. Biological conversion of HP agents, specifically in cancer, has been shown to correlate with the presence of disease, its stage, and response to therapy. For such metabolic biomarkers derived from MRI of hyperpolarized agents to be clinically impactful, they need to be validated and well characterized. However, imaging of HP substrates is distinct from conventional MRI, due to the non-renewable nature of transient HP magnetization. Moreover, due to current practical limitations in the generation and evolution of hyperpolarized agents, it is not feasible to fully characterize measurement and processing strategies experimentally. In this work we use a custom Bloch-McConnell simulator with pharmacokinetic modeling to characterize the performance of specific magnetic resonance spectroscopy sequences over a range of biological conditions. We performed numerical simulations to evaluate the effect of sequence parameters over a range of chemical conversion rates. Each simulation was analyzed repeatedly with the addition of noise in order to determine the accuracy and reproducibility of measurements. Results indicate that under both closed and perfused conditions, acquisition parameters can affect measurements in a tissue-dependent manner, suggesting that great care needs to be taken when designing studies involving hyperpolarized agents. More modeling studies will be needed to determine what effect sequence parameters have on more advanced acquisitions and processing methods.
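
    A minimal sketch of the pharmacokinetic side of such a simulator, assuming a simple two-site exchange (pyruvate → lactate) with T1 decay and repeated RF sampling. The rate constant, relaxation times and flip-angle schedule are illustrative assumptions, not the simulator's actual configuration.

    ```python
    import math

    # Toy two-site exchange model of hyperpolarized pyruvate -> lactate
    # kinetics with T1 decay and RF depletion. All parameter values are
    # illustrative assumptions.
    def simulate(kpl=0.05, t1_pyr=43.0, t1_lac=33.0, flip_deg=10.0,
                 tr=2.0, n_excitations=30):
        """Euler integration of longitudinal magnetization with sampling.

        Each excitation tips magnetization by `flip_deg`, yielding a
        signal proportional to Mz*sin(flip) and leaving Mz*cos(flip).
        """
        pz, lz = 1.0, 0.0                    # start fully polarized pyruvate
        cos_f = math.cos(math.radians(flip_deg))
        sin_f = math.sin(math.radians(flip_deg))
        pyr_sig, lac_sig = [], []
        dt = 0.01
        steps = int(tr / dt)
        for _ in range(n_excitations):
            pyr_sig.append(pz * sin_f)
            lac_sig.append(lz * sin_f)
            pz *= cos_f                      # RF depletion of Mz
            lz *= cos_f
            for _ in range(steps):           # free evolution between pulses
                dpz = -pz / t1_pyr - kpl * pz
                dlz = kpl * pz - lz / t1_lac
                pz += dt * dpz
                lz += dt * dlz
        return pyr_sig, lac_sig

    pyr, lac = simulate()
    # A common metabolic biomarker: the normalized lactate signal
    ratio = sum(lac) / (sum(lac) + sum(pyr))
    print(ratio)
    ```

    Sweeping `kpl`, `flip_deg` or `tr` in such a sketch mimics the paper's question of how acquisition parameters bias the measured conversion metric.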

  5. An Effective and Reproducible Model of Ventricular Fibrillation in Crossbred Yorkshire Swine (Sus scrofa) for Use in Physiologic Research.

    Science.gov (United States)

    Burgert, James M; Johnson, Arthur D; Garcia-Blanco, Jose C; Craig, W John; O'Sullivan, Joseph C

    2015-10-01

    Transcutaneous electrical induction (TCEI) has been used to induce ventricular fibrillation (VF) in laboratory swine for physiologic and resuscitation research. Many studies do not describe the method of TCEI in detail, thus making replication by future investigators difficult. Here we describe a detailed method of electrically inducing VF that was used successfully in a prospective, experimental resuscitation study. Specifically, an electrical current was passed through the heart to induce VF in crossbred Yorkshire swine (n = 30); the current was generated by using two 22-gauge spinal needles, with one placed above and one below the heart, and three 9V batteries connected in series. VF developed in 28 of the 30 pigs (93%) within 10 s of beginning the procedure. In the remaining 2 swine, VF was induced successfully after medial redirection of the superior parasternal needle. The TCEI method is simple, reproducible, and cost-effective. TCEI may be especially valuable to researchers with limited access to funding, sophisticated equipment, or colleagues experienced in interventional cardiology techniques. The TCEI method might be most appropriate for pharmacologic studies requiring VF, VF resulting from the R-on-T phenomenon (as in prolonged QT syndrome), and VF arising from other ectopic or reentrant causes. However, the TCEI method does not accurately model the most common cause of VF, acute coronary occlusive disease. Researchers must consider the limitations of TCEI that may affect internal and external validity of collected data, when designing experiments using this model of VF.

  6. A rat tail temporary static compression model reproduces different stages of intervertebral disc degeneration with decreased notochordal cell phenotype.

    Science.gov (United States)

    Hirata, Hiroaki; Yurube, Takashi; Kakutani, Kenichiro; Maeno, Koichiro; Takada, Toru; Yamamoto, Junya; Kurakawa, Takuto; Akisue, Toshihiro; Kuroda, Ryosuke; Kurosaka, Masahiro; Nishida, Kotaro

    2014-03-01

    The intervertebral disc nucleus pulposus (NP) has two phenotypically distinct cell types: notochordal cells (NCs) and non-notochordal chondrocyte-like cells. In human discs, NCs are lost during adolescence, which is also when discs begin to show degenerative signs. However, little evidence exists regarding the link between NC disappearance and the pathogenesis of disc degeneration. To clarify this, a rat tail disc degeneration model induced by static compression at 1.3 MPa for 0, 1, or 7 days was designed and assessed for up to 56 postoperative days. Radiography, MRI, and histomorphology showed degenerative disc findings in response to the compression period. Immunofluorescence displayed that the number of DAPI-positive NP cells decreased with compression; particularly, the decrease was notable in larger, vacuolated, cytokeratin-8- and galectin-3-co-positive cells, identified as NCs. The proportion of TUNEL-positive cells, which predominantly comprised non-NCs, increased with compression. Quantitative PCR demonstrated isolated mRNA up-regulation of ADAMTS-5 in the 1-day loaded group and MMP-3 in the 7-day loaded group. Aggrecan-1 and collagen type 2α-1 mRNA levels were down-regulated in both groups. This rat tail temporary static compression model, which exhibits decreased NC phenotype, increased apoptotic cell death, and imbalanced catabolic and anabolic gene expression, reproduces different stages of intervertebral disc degeneration.

  7. An update on the rotenone models of Parkinson's disease: their ability to reproduce the features of clinical disease and model gene-environment interactions.

    Science.gov (United States)

    Johnson, Michaela E; Bobrovskaya, Larisa

    2015-01-01

    Parkinson's disease (PD) is the second most common neurodegenerative disorder that is characterized by two major neuropathological hallmarks: the degeneration of dopaminergic neurons in the substantia nigra (SN) and the presence of Lewy bodies in the surviving SN neurons, as well as other regions of the central and peripheral nervous system. Animal models have been invaluable tools for investigating the underlying mechanisms of the pathogenesis of PD and testing new potential symptomatic, neuroprotective and neurorestorative therapies. However, the usefulness of these models is dependent on how precisely they replicate the features of clinical PD with some studies now employing combined gene-environment models to replicate more of the affected pathways. The rotenone model of PD has become of great interest following the seminal paper by the Greenamyre group in 2000 (Betarbet et al., 2000). This paper reported for the first time that systemic rotenone was able to reproduce the two pathological hallmarks of PD as well as certain parkinsonian motor deficits. Since 2000, many research groups have actively used the rotenone model worldwide. This paper will review rotenone models, focusing upon their ability to reproduce the two pathological hallmarks of PD, motor deficits, extranigral pathology and non-motor symptoms. We will also summarize the recent advances in neuroprotective therapies, focusing on those that investigated non-motor symptoms and review rotenone models used in combination with PD genetic models to investigate gene-environment interactions.

  8. QSAR model reproducibility and applicability: a case study of rate constants of hydroxyl radical reaction models applied to polybrominated diphenyl ethers and (benzo-)triazoles.

    Science.gov (United States)

    Roy, Partha Pratim; Kovarich, Simona; Gramatica, Paola

    2011-08-01

    The crucial importance of the three central OECD principles for quantitative structure-activity relationship (QSAR) model validation is highlighted in a case study of tropospheric degradation of volatile organic compounds (VOCs) by OH, applied to two CADASTER chemical classes (PBDEs and (benzo-)triazoles). The application of any QSAR model to chemicals without experimental data largely depends on the model's reproducibility by the user. The reproducibility of an unambiguous algorithm (OECD Principle 2) is guaranteed by redeveloping the MLR models based on both an updated version of the DRAGON software for molecular descriptor calculation and some freely available online descriptors. The Genetic Algorithm has confirmed its ability to always select the most informative descriptors, independently of the input pool of variables. The ability of the GA-selected descriptors to model chemicals not used in model development is verified by three different splittings (random by response, K-ANN and K-means clustering), thus ensuring the external predictivity of the new models independently of the training/prediction set composition (OECD Principle 5). The relevance of checking the structural applicability domain becomes very evident on comparing the predictions for CADASTER chemicals, using the new models proposed herein, with those obtained by EPI Suite. Copyright © 2011 Wiley Periodicals, Inc.
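
    The external-validation idea behind OECD Principle 5 can be sketched as follows: fit a regression on a training split and score it only on held-out chemicals via an external Q². The single synthetic "descriptor" and response below are illustrative, not CADASTER data.

    ```python
    import random
    import statistics

    # External validation sketch: train on one split, evaluate Q2 on a
    # held-out split. Data are synthetic (one descriptor, linear response).

    def fit_ols(xs, ys):
        """Ordinary least squares for y = a*x + b."""
        mx, my = statistics.fmean(xs), statistics.fmean(ys)
        a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
        return a, my - a * mx

    def q2_external(model, xs, ys, y_train_mean):
        """External Q2: 1 - PRESS / SS, with SS about the training mean."""
        a, b = model
        press = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
        ss = sum((y - y_train_mean) ** 2 for y in ys)
        return 1 - press / ss

    rng = random.Random(7)
    x = [rng.uniform(0, 5) for _ in range(40)]
    y = [2.0 * xi - 1.0 + rng.gauss(0, 0.2) for xi in x]  # synthetic response
    train_x, train_y = x[:30], y[:30]
    test_x, test_y = x[30:], y[30:]
    model = fit_ols(train_x, train_y)
    q2 = q2_external(model, test_x, test_y, statistics.fmean(train_y))
    print(q2)
    ```

    A model that only memorizes its training set would score poorly here, which is the point of judging predictivity on chemicals never used in development.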

  9. A rat model of post-traumatic stress disorder reproduces the hippocampal deficits seen in the human syndrome

    Directory of Open Access Journals (Sweden)

    Sonal Goswami

    2012-06-01

    Full Text Available Despite recent progress, the causes and pathophysiology of post-traumatic stress disorder (PTSD) remain poorly understood, partly because of ethical limitations inherent to human studies. One approach to circumvent this obstacle is to study PTSD in a valid animal model of the human syndrome. In one such model, extreme and long-lasting behavioral manifestations of anxiety develop in a subset of Lewis rats after exposure to an intense predatory threat that mimics the type of life-and-death situation known to precipitate PTSD in humans. This study aimed to assess whether the hippocampus-associated deficits observed in the human syndrome are reproduced in this rodent model. Prior to predatory threat, different groups of rats were each tested on one of three object recognition memory tasks that varied in the types of contextual clues (i.e., requiring the hippocampus or not) that the rats could use to identify novel items. After task completion, the rats were subjected to predatory threat and, one week later, tested on the elevated plus maze. Based on their exploratory behavior in the plus maze, rats were then classified as resilient or PTSD-like and their performance on the pre-threat object recognition tasks was compared. The performance of PTSD-like rats was inferior to that of resilient rats, but only when subjects relied on an allocentric frame of reference to identify novel items, a process thought to be critically dependent on the hippocampus. Therefore, these results suggest that even prior to trauma, PTSD-like rats show a deficit in hippocampus-dependent functions, as reported in twin studies of human PTSD.

  10. Reproducing the organic matter model of anthropogenic dark earth of Amazonia and testing the ecotoxicity of functionalized charcoal compounds

    Directory of Open Access Journals (Sweden)

    Carolina Rodrigues Linhares

    2012-05-01

    Full Text Available The objective of this work was to obtain organic compounds similar to the ones found in the organic matter of anthropogenic dark earth of Amazonia (ADE) using a chemical functionalization procedure on activated charcoal, as well as to determine their ecotoxicity. Based on the study of the organic matter from ADE, an organic model was proposed and an attempt to reproduce it is described. Activated charcoal was oxidized with sodium hypochlorite at different concentrations. Nuclear magnetic resonance was performed to verify whether the spectra of the obtained products were similar to those of humic acids from ADE. The similarity between spectra indicated that the obtained products were polycondensed aromatic structures with carboxyl groups: a soil amendment that can contribute to soil fertility and to its sustainable use. An ecotoxicological test with Daphnia similis was performed on the more soluble fraction (fulvic acids) of the produced soil amendment. Aryl chloride was formed during the synthesis of the organic compounds from activated charcoal functionalization and was partially removed through a purification process. However, it is probable that some aryl chloride remained in the final product, since the ecotoxicological test indicated that the chemically functionalized soil amendment is moderately toxic.

  11. Enhancement of accuracy and reproducibility of parametric modeling for estimating abnormal intra-QRS potentials in signal-averaged electrocardiograms.

    Science.gov (United States)

    Lin, Chun-Cheng

    2008-09-01

    This work analyzes and attempts to enhance the accuracy and reproducibility of parametric modeling in the discrete cosine transform (DCT) domain for the estimation of abnormal intra-QRS potentials (AIQP) in signal-averaged electrocardiograms. One hundred sets of white noise with a flat frequency response were introduced to simulate the unpredictable, broadband AIQP when quantitatively analyzing estimation error. Further, a high-frequency AIQP parameter was defined to minimize estimation error caused by the overlap between normal QRS and AIQP in low-frequency DCT coefficients. Seventy-two patients from Taiwan were recruited for the study, comprising 30 patients with ventricular tachycardia (VT) and 42 without VT. Analytical results showed that VT patients had a significant decrease in the estimated AIQP. The global diagnostic performance (area under the receiver operating characteristic curve) of AIQP rose from 73.0% to 84.2% in lead Y, and from 58.3% to 79.1% in lead Z, when the high-frequency range fell from 100% to 80%. The combination of AIQP and ventricular late potentials further enhanced performance to 92.9% (specificity=90.5%, sensitivity=90%). Therefore, the significantly reduced AIQP in VT patients, possibly also including dominant unpredictable potentials within the normal QRS complex, may be new promising evidence of ventricular arrhythmias.
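
    The DCT-domain idea can be sketched as follows: approximate the smooth QRS waveform by its low-order DCT coefficients and take the residual high-frequency content as an AIQP estimate. The surrogate signal, noise level and cutoff order are illustrative assumptions, not the paper's protocol.

    ```python
    import math
    import random

    # Parametric-modeling sketch: a smooth QRS surrogate concentrates its
    # energy in low-order DCT coefficients, so broadband "abnormal"
    # content shows up in the high-order residual.

    def dct2(x):
        """Orthonormal DCT-II of a real sequence."""
        n = len(x)
        out = []
        for k in range(n):
            s = sum(v * math.cos(math.pi * (i + 0.5) * k / n)
                    for i, v in enumerate(x))
            scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
            out.append(scale * s)
        return out

    def aiqp_estimate(signal, low_order=16):
        """RMS of DCT coefficients above `low_order` (hypothetical cutoff)."""
        hf = dct2(signal)[low_order:]
        return math.sqrt(sum(c * c for c in hf) / len(hf))

    random.seed(0)
    n = 64
    qrs = [math.exp(-((i - n / 2) ** 2) / 50.0) for i in range(n)]
    noisy = [v + 0.01 * random.gauss(0, 1) for v in qrs]  # add broadband AIQP

    clean_hf = aiqp_estimate(qrs)
    abnormal_hf = aiqp_estimate(noisy)
    print(clean_hf, abnormal_hf)  # the broadband component raises the estimate
    ```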

  12. Assessment of an ensemble of ocean-atmosphere coupled and uncoupled regional climate models to reproduce the climatology of Mediterranean cyclones

    Science.gov (United States)

    Flaounas, Emmanouil; Kelemen, Fanni Dora; Wernli, Heini; Gaertner, Miguel Angel; Reale, Marco; Sanchez-Gomez, Emilia; Lionello, Piero; Calmanti, Sandro; Podrascanin, Zorica; Somot, Samuel; Akhtar, Naveed; Romera, Raquel; Conte, Dario

    2016-11-01

    This study aims to assess the skill of regional climate models (RCMs) at reproducing the climatology of Mediterranean cyclones. Seven RCMs are considered, five of which were also coupled with an oceanic model. All simulations were forced at the lateral boundaries by the ERA-Interim reanalysis for a common 20-year period (1989-2008). Six different cyclone tracking methods were applied to all twelve RCM simulations and to the ERA-Interim reanalysis in order to assess the RCMs from the perspective of different cyclone definitions. All RCMs reproduce the main areas of high cyclone occurrence in the region south of the Alps, in the Adriatic, Ionian and Aegean Seas, as well as in the areas close to Cyprus and to the Atlas Mountains. The RCMs tend to underestimate intense cyclone occurrences over the Mediterranean Sea, reproducing only 24-40 % of the systems identified in the reanalysis. The use of grid nudging in one of the RCMs is shown to be beneficial, reproducing about 60 % of the intense cyclones and better tracking the seasonal cycle of intense cyclogenesis. Finally, the most intense cyclones tend to be similarly reproduced in coupled and uncoupled model simulations, suggesting that modeling atmosphere-ocean coupled processes has only a weak impact on the climatology and intensity of Mediterranean cyclones.

  13. A computational model for histone mark propagation reproduces the distribution of heterochromatin in different human cell types.

    Science.gov (United States)

    Schwämmle, Veit; Jensen, Ole Nørregaard

    2013-01-01

    Chromatin is a highly compact and dynamic nuclear structure that consists of DNA and associated proteins. The main organizational unit is the nucleosome, which consists of a histone octamer with DNA wrapped around it. Histone proteins are implicated in the regulation of eukaryotic genes and carry numerous reversible post-translational modifications that control DNA-protein interactions and the recruitment of chromatin binding proteins. Heterochromatin, the transcriptionally inactive part of the genome, is densely packed and contains histone H3 methylated at Lys 9 (H3K9me). The propagation of H3K9me along the nucleosomes of the chromatin fiber is antagonized by methylation of H3 lysine 4 (H3K4me) and acetylation of several lysines, marks which are associated with euchromatin and active genes. We show that these antagonistic histone modifications form opposing domains on a coarse scale. The histone marks are assumed to be initiated within distinct nucleation sites in the DNA and to propagate bi-directionally. We propose a simple computer model that simulates the distribution of heterochromatin in human chromosomes. The simulations are in agreement with previously reported experimental observations from two different human cell lines. We reproduced different types of barriers between heterochromatin and euchromatin, providing a unified model for their function. The effects of changes in the nucleation site distribution and in propagation rates were studied. The former occurs mainly with the aim of (de-)activation of single genes or gene groups, whereas the latter can control the transcriptional programs of entire chromosomes. Generally, the regulatory program of gene transcription is controlled by the distribution of nucleation sites along the DNA strand.
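
    The nucleation-and-bidirectional-propagation mechanism described above can be illustrated with a toy 1D lattice simulation. All seed positions, rates, and the first-mark-wins antagonism rule are hypothetical simplifications, not the paper's model:

    ```python
    import numpy as np

    def simulate_marks(n_sites=200, k9_seeds=(50,), k4_seeds=(150,),
                       steps=100, p_spread=0.5, seed=0):
        """Toy 1D model: marks nucleate at fixed sites and spread to
        neighbours; a nucleosome keeps whichever mark reaches it first,
        so the two marks form antagonistic domains with a boundary."""
        rng = np.random.default_rng(seed)
        state = np.zeros(n_sites, dtype=int)  # 0 unmarked, 1 H3K9me, 2 H3K4me
        for s in k9_seeds:
            state[s] = 1
        for s in k4_seeds:
            state[s] = 2
        for _ in range(steps):
            marked = np.flatnonzero(state)
            for i in rng.permutation(marked):
                for j in (i - 1, i + 1):      # bidirectional propagation
                    if 0 <= j < n_sites and state[j] == 0 and rng.random() < p_spread:
                        state[j] = state[i]   # antagonism: no overwriting
        return state

    state = simulate_marks()
    ```

    Moving the seeds or changing `p_spread` per mark shifts the domain boundary, mirroring the paper's experiments on nucleation-site distribution and propagation rates.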

  14. Reproducibility and accuracy of linear measurements on dental models derived from cone-beam computed tomography compared with digital dental casts

    NARCIS (Netherlands)

    Waard, O. de; Rangel, F.A.; Fudalej, P.S.; Bronkhorst, E.M.; Kuijpers-Jagtman, A.M.; Breuning, K.H.

    2014-01-01

    INTRODUCTION: The aim of this study was to determine the reproducibility and accuracy of linear measurements on 2 types of dental models derived from cone-beam computed tomography (CBCT) scans: CBCT images, and Anatomodels (InVivoDental, San Jose, Calif); these were compared with digital models gene

  15. Randomised reproducing graphs

    CERN Document Server

    Jordan, Jonathan

    2011-01-01

    We introduce a model for a growing random graph based on simultaneous reproduction of the vertices. The model can be thought of as a generalisation of the reproducing graphs of Southwell and Cannings and of Bonato et al. to allow for a random element, and there are three parameters, $\alpha$, $\beta$ and $\gamma$, which are the probabilities of edges appearing between different types of vertices. We show that as the probabilities associated with the model vary there are a number of phase transitions, in particular concerning the degree sequence. If $(1+\alpha)(1+\gamma)>1$ then the degree of a typical vertex grows to infinity, and the proportion of vertices having any fixed degree $d$ tends to zero. We also give some results on the number of edges and on the spectral gap.
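
    A reproduction step of this kind can be simulated directly. The edge rules below are an illustrative interpretation of the roles of the three parameters (child-parent edges with probability alpha, child to parent's neighbour with beta, child to child of neighbouring parents with gamma), not Jordan's exact definition:

    ```python
    import random

    def reproduce(adj, alpha, beta, gamma, rng):
        """One generation of a randomised reproducing graph (illustrative
        reading): every vertex v spawns a child at index n + v; edges
        appear child-parent w.p. alpha, child-to-parent's-neighbour w.p.
        beta, and child-to-child of neighbouring parents w.p. gamma."""
        n = len(adj)
        for _ in range(n):
            adj.append(set())                 # child of v has index n + v
        for v in range(n):
            c = n + v
            if rng.random() < alpha:          # child-parent edge
                adj[c].add(v); adj[v].add(c)
            for u in adj[v] & set(range(n)):  # parent's pre-existing neighbours
                if rng.random() < beta:
                    adj[c].add(u); adj[u].add(c)
                if rng.random() < gamma:
                    adj[c].add(n + u); adj[n + u].add(c)
        return adj

    rng = random.Random(1)
    adj = [set()]                             # start from a single vertex
    for _ in range(6):
        adj = reproduce(adj, 0.9, 0.3, 0.3, rng)
    n_edges = sum(len(s) for s in adj) // 2
    ```

    Since every vertex reproduces each generation, the vertex count doubles per step, and the growth of typical degrees with the parameters can be explored empirically.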

  16. Modeling Turkish M2 broad money demand: a portfolio-based approach using implications for monetary policy

    OpenAIRE

    Levent, Korap

    2008-01-01

    In this paper, a money demand model upon M2 broad monetary aggregate for the Turkish economy is examined in a portfolio-based approach considering various alternative cost measures to hold money. Employing multivariate co-integration methodology of the same order integrated variables, our estimation results indicate that there exists a theoretically plausible co-integrating vector in the long-run money demand variable space. The main alternative costs to demand for money are found as the depr...

  17. A methodology for model-based greenhouse design: Part 1, a greenhouse climate model for a broad range of designs and climates

    NARCIS (Netherlands)

    Vanthoor, B.H.E.; Stanghellini, C.; Henten, van E.J.; Visser, de P.H.B.

    2011-01-01

    With the aim of developing a model-based method to design greenhouses for a broad range of climatic and economic conditions, a greenhouse climate model has been developed and validated. This model describes the effects of the outdoor climate and greenhouse design on the indoor greenhouse climate. Fo

  18. The Need for Reproducibility

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Laboratory

    2016-06-27

    The purpose of this presentation is to consider issues of reproducibility: specifically, whether bitwise reproducible computation is possible, whether computational research in DOE improves its publication process, and whether reproducible results can be achieved apart from the peer review process.

  19. Artificial neural network model to predict slag viscosity over a broad range of temperatures and slag compositions

    Energy Technology Data Exchange (ETDEWEB)

    Duchesne, Marc A. [Chemical and Biological Engineering Department, University of Ottawa, 161 Louis Pasteur, Ottawa, Ont. (Canada); CanmetENERGY, 1 Haanel Drive, Ottawa, Ontario (Canada); Macchi, Arturo [Chemical and Biological Engineering Department, University of Ottawa, 161 Louis Pasteur, Ottawa, Ont. (Canada); Lu, Dennis Y.; Hughes, Robin W.; McCalden, David; Anthony, Edward J. [CanmetENERGY, 1 Haanel Drive, Ottawa, Ontario (Canada)

    2010-08-15

    Threshold slag viscosity heuristics are often used for the initial assessment of coal gasification projects. Slag viscosity predictions are also required for advanced combustion and gasification models. Due to the unsatisfactory performance of theoretical equations, an artificial neural network model was developed to predict slag viscosity over a broad range of temperatures and slag compositions. This model outperforms other available slag viscosity models, achieving an average error factor of 5.05, lower than the best obtained with the other models. Genesee coal ash viscosity predictions were made to investigate the effect of adding Canadian limestone and dolomite. The results indicate that magnesium in the fluxing agent provides a greater viscosity reduction than calcium over the threshold slag tapping temperature range. (author)
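
    A minimal version of such a network — a single-hidden-layer regressor trained with plain gradient descent — can be sketched in NumPy. The Arrhenius-like data-generating law, the composition variable, and all scalings below are invented for illustration and are not the paper's dataset or architecture:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for slag data: log-viscosity follows an
    # Arrhenius-like law in temperature with a composition-dependent slope.
    T = rng.uniform(1300.0, 1800.0, 400)       # temperature, K
    x = rng.uniform(0.3, 0.7, 400)             # e.g. SiO2 mass fraction (hypothetical)
    y = -4.0 + (8000.0 + 6000.0 * x) / T       # log10(viscosity), toy model

    X = np.column_stack([(T - 1550.0) / 250.0, (x - 0.5) / 0.2])  # scaled inputs

    # One-hidden-layer MLP, full-batch gradient descent on squared error.
    W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
    W2 = rng.normal(0, 0.5, 16);      b2 = 0.0
    lr = 0.05
    for _ in range(3000):
        h = np.tanh(X @ W1 + b1)
        err = h @ W2 + b2 - y
        gW2 = h.T @ err / len(y); gb2 = err.mean()
        gh = np.outer(err, W2) * (1 - h ** 2)   # backprop through tanh
        gW1 = X.T @ gh / len(y); gb1 = gh.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

    rmse = float(np.sqrt(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)))
    ```

    After training, the network's RMSE should fall well below the standard deviation of the targets, i.e. it learns structure beyond the mean.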

  20. Parcels versus pixels: modeling agricultural land use across broad geographic regions using parcel-based field boundaries

    Science.gov (United States)

    Sohl, Terry L.; Dornbierer, Jordan; Wika, Steve; Sayler, Kristi L.; Quenzer, Robert

    2017-01-01

    Land use and land cover (LULC) change occurs at a local level within contiguous ownership and management units (parcels), yet LULC models primarily use pixel-based spatial frameworks. The few parcel-based models in use overwhelmingly focus on small geographic areas, limiting the ability to assess LULC change impacts at regional to national scales. We developed a modified version of the Forecasting Scenarios of land use change model to project parcel-based agricultural change across a large region in the United States Great Plains. An agricultural biofuel scenario was modeled from 2012 to 2030, using real parcel boundaries based on contiguous ownership and land management units. The resulting LULC projection provides a vastly improved representation of landscape pattern over existing pixel-based models, while simultaneously providing an unprecedented combination of thematic detail and broad geographic extent. The conceptual approach is practical and scalable, with potential use for national-scale projections.

  1. Development of a Three-Dimensional Hand Model Using Three-Dimensional Stereophotogrammetry: Assessment of Image Reproducibility.

    Directory of Open Access Journals (Sweden)

    Inge A Hoevenaren

    Using three-dimensional (3D) stereophotogrammetry, precise images and reconstructions of the human body can be produced. Over the last few years, this technique has mainly been developed in the field of maxillofacial reconstructive surgery, creating fusion images with computed tomography (CT) data for precise planning and prediction of treatment outcome. However, in hand surgery 3D stereophotogrammetry is not yet used in clinical settings. A total of 34 three-dimensional hand photographs were analyzed to investigate the reproducibility. For every individual, 3D photographs were captured at two different time points (baseline, T0, and one week later, T1). The reproducibility of two different registration methods was analyzed. Furthermore, as a first clinical pilot testing our registration method, the differences between 3D photos of men and women were compared in a distance map. The absolute mean registration error for the complete hand was 1.46 mm. This reduced to an error of 0.56 mm when the region was isolated to the palm of the hand. When comparing hands of both sexes, the male hand was larger (broader base and longer fingers) than the female hand. This study shows that 3D stereophotogrammetry can produce reproducible images of the hand without harmful side effects for the patient, proving to be a reliable method for soft tissue analysis. Its potential use in everyday practice of hand surgery needs to be further explored.
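
    A registration error of the kind reported (mean point-to-point distance after aligning the T0 and T1 surfaces) can be computed with a rigid Kabsch alignment. The point clouds below are synthetic stand-ins for the 3D hand photographs, not real data:

    ```python
    import numpy as np

    def kabsch_error(P, Q):
        """Mean point-to-point distance (same units as the input, e.g. mm)
        after optimal rigid registration of point set P onto Q (Kabsch)."""
        Pc, Qc = P - P.mean(0), Q - Q.mean(0)
        U, _, Vt = np.linalg.svd(Pc.T @ Qc)
        d = np.sign(np.linalg.det(U @ Vt))       # guard against reflections
        R = U @ np.diag([1.0, 1.0, d]) @ Vt
        return float(np.mean(np.linalg.norm(Pc @ R - Qc, axis=1)))

    rng = np.random.default_rng(0)
    baseline = rng.uniform(0.0, 100.0, (500, 3))      # T0 surface points, mm
    theta = 0.3
    rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                    [np.sin(theta),  np.cos(theta), 0.0],
                    [0.0, 0.0, 1.0]])
    # T1: same surface, repositioned, plus ~0.5 mm measurement noise
    followup = baseline @ rot.T + 5.0 + rng.normal(0, 0.5, (500, 3))
    err = kabsch_error(baseline, followup)
    ```

    The rigid motion is factored out entirely, so the residual reflects only the measurement noise, analogous to the sub-millimetre errors reported for the palm.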

  2. Block and parallel modelling of broad domain nonlinear continuous mapping based on NN

    Institute of Scientific and Technical Information of China (English)

    Yang Guowei; Tu Xuyan; Wang Shoujue

    2006-01-01

    The necessity of block and parallel modeling of nonlinear continuous mappings with neural networks (NN) is first established quantitatively. A practical approach for the block and parallel modeling of nonlinear continuous mappings with NN is then proposed. Finally, an example shows that the method can be implemented with suitable existing software. Experimental results for the model on the 3-D Mexican straw hat function indicate that block and parallel modeling based on NN is more precise and computationally faster than direct modeling, and it provides a concrete example and development of the large-scale general model established by Tu Xuyan.

  3. Broad-band modelling of short gamma-ray bursts with energy injection from magnetar spin-down and its implications for radio detectability

    NARCIS (Netherlands)

    B.P. Gompertz; A.J. van der Horst; P.T. O'Brien; G.A. Wynn; K. Wiersema

    2015-01-01

    The magnetar model has been proposed to explain the apparent energy injection in the X-ray light curves of short gamma-ray bursts (SGRBs), but its implications across the full broad-band spectrum are not well explored. We investigate the broad-band modelling of four SGRBs with evidence for energy in

  4. Broad-band colors and overall photometric properties of template galaxy models from stellar population synthesis

    OpenAIRE

    Buzzoni, Alberto

    2005-01-01

    We present here a new set of evolutionary population synthesis models for template galaxies along the Hubble morphological sequence. The models, that account for the individual evolution of the bulge, disk, and halo components, provide basic morphological features, along with bolometric luminosity and color evolution (including Johnson/Cousins "UBVRcIcJHK", Gunn "gri", and Washington "CMT1T2" photometric systems) between 1 and 15 Gyr. Luminosity contribution from residual gas is also evaluate...

  5. From Peer-Reviewed to Peer-Reproduced in Scholarly Publishing: The Complementary Roles of Data Models and Workflows in Bioinformatics.

    Science.gov (United States)

    González-Beltrán, Alejandra; Li, Peter; Zhao, Jun; Avila-Garcia, Maria Susana; Roos, Marco; Thompson, Mark; van der Horst, Eelke; Kaliyaperumal, Rajaram; Luo, Ruibang; Lee, Tin-Lap; Lam, Tak-Wah; Edmunds, Scott C; Sansone, Susanna-Assunta; Rocca-Serra, Philippe

    2015-01-01

    Reproducing the results from a scientific paper can be challenging due to the absence of data and the computational tools required for their analysis. In addition, details relating to the procedures used to obtain the published results can be difficult to discern due to the use of natural language when reporting how experiments have been performed. The Investigation/Study/Assay (ISA), Nanopublications (NP), and Research Objects (RO) models are conceptual data modelling frameworks that can structure such information from scientific papers. Computational workflow platforms can also be used to reproduce analyses of data in a principled manner. We assessed the extent to which ISA, NP, and RO models, together with the Galaxy workflow system, can capture the experimental processes and reproduce the findings of a previously published paper reporting on the development of SOAPdenovo2, a de novo genome assembler. Executable workflows were developed using Galaxy, which reproduced results that were consistent with the published findings. A structured representation of the information in the SOAPdenovo2 paper was produced by combining the use of ISA, NP, and RO models. By structuring the information in the published paper using these data and scientific workflow modelling frameworks, it was possible to explicitly declare elements of experimental design, variables, and findings. The models served as guides in the curation of scientific information, and this led to the identification of inconsistencies in the original published paper, thereby allowing its authors to publish corrections in the form of an erratum. SOAPdenovo2 scripts, data, and results are available through the GigaScience Database: http://dx.doi.org/10.5524/100044; the workflows are available from GigaGalaxy: http://galaxy.cbiit.cuhk.edu.hk; and the representations using the ISA, NP, and RO models are available through the SOAPdenovo2 case study website http://isa-tools.github.io/soapdenovo2/.

  6. Assessing the status and trend of bat populations across broad geographic regions with dynamic distribution models

    Science.gov (United States)

    Rodhouse, Thomas J.; Ormsbee, Patricia C.; Irvine, Kathryn M.; Vierling, Lee A.; Szewczak, Joseph M.; Vierling, Kerri T.

    2012-01-01

    Bats face unprecedented threats from habitat loss, climate change, disease, and wind power development, and populations of many species are in decline. A better ability to quantify bat population status and trend is urgently needed in order to develop effective conservation strategies. We used a Bayesian autoregressive approach to develop dynamic distribution models for Myotis lucifugus, the little brown bat, across a large portion of northwestern USA, using a four-year detection history matrix obtained from a regional monitoring program. This widespread and abundant species has experienced precipitous local population declines in northeastern USA resulting from the novel disease white-nose syndrome, and is facing likely range-wide declines. Our models were temporally dynamic and accounted for imperfect detection. Drawing on species–energy theory, we included measures of net primary productivity (NPP) and forest cover in models, predicting that M. lucifugus occurrence probabilities would covary positively along those gradients.
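
    Central to such dynamic distribution models is accounting for imperfect detection: naive occupancy from raw detections understates true occupancy. This can be illustrated with a toy simulation; the occupancy and detection probabilities are hypothetical, and the closed-form correction below is a simplification of the paper's Bayesian autoregressive model:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    psi, p, n_sites, n_visits = 0.6, 0.4, 500, 4
    z = rng.random(n_sites) < psi                           # true occupancy state
    y = (rng.random((n_sites, n_visits)) < p) & z[:, None]  # detection histories

    # Naive estimate: fraction of sites with at least one detection.
    naive = y.any(axis=1).mean()

    # Correction: P(detected at least once | occupied) = 1 - (1 - p)^K,
    # assuming p is known (in practice it is estimated jointly with psi).
    p_star = 1.0 - (1.0 - p) ** n_visits
    corrected = naive / p_star
    ```

    With repeat visits, the corrected estimate recovers the true occupancy probability that the naive estimate misses.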

  7. Broad-band colors and overall photometric properties of template galaxy models from stellar population synthesis

    CERN Document Server

    Buzzoni, A

    2005-01-01

    We present here a new set of evolutionary population synthesis models for template galaxies along the Hubble morphological sequence. The models, that account for the individual evolution of the bulge, disk, and halo components, provide basic morphological features, along with bolometric luminosity and color evolution (including Johnson/Cousins "UBVRcIcJHK", Gunn "gri", and Washington "CMT1T2" photometric systems) between 1 and 15 Gyr. Luminosity contribution from residual gas is also evaluated, both in terms of nebular continuum and Balmer-line enhancement.

  8. Optimized Variational 1D Boussinesq Modelling for broad-band waves over flat bottom

    NARCIS (Netherlands)

    Lakhturov, I.; Adytia, D.; Groesen, van E.

    2012-01-01

    The Variational Boussinesq Model (VBM) for waves above a layer of ideal fluid conserves mass, momentum, energy, and has decreased dimensionality compared to the full problem. It is derived from the Hamiltonian formulation via an approximation of the kinetic energy, and can provide approximate disper

  9. Optimized variational Boussinesq modelling; part 1: Broad-band waves over flat bottom

    NARCIS (Netherlands)

    Lakhturov, I.; Groesen, van E.

    2010-01-01

    The Variational Boussinesq Model (VBM) for waves above a layer of ideal fluid conserves mass, momentum, energy, and has decreased dimensionality compared to the full problem. It is derived from the Hamiltonian formulation via an approximation of the kinetic energy, and can provide approximate disper

  10. Models for Broad Area Event Identification and Yield Estimation: Multiple Coda Types

    Science.gov (United States)

    2011-09-01

    microearthquakes accompanying hydraulic fracturing in granitic rock, Bull. Seism. Soc. Am., 81, 553-575, 1991. Fisk, M. and S. R. Taylor, (2007...146882, pp. 13. Yang, X., T. Lay, X.-B. Xie, and M. S. Thorne (2007). Geometric spreading of Pn and Sn in a spherical Earth model, Bull. Seism. Soc

  11. Optimized Variational 1D Boussinesq Modelling for broad-band waves over flat bottom

    NARCIS (Netherlands)

    Lakhturov, I.; Adytia, D.; van Groesen, Embrecht W.C.

    The Variational Boussinesq Model (VBM) for waves above a layer of ideal fluid conserves mass, momentum, energy, and has decreased dimensionality compared to the full problem. It is derived from the Hamiltonian formulation via an approximation of the kinetic energy, and can provide approximate

  12. Current models broadly neglect specific needs of biodiversity conservation in protected areas under climate change

    Directory of Open Access Journals (Sweden)

    Moloney Kirk A

    2011-05-01

    Background: Protected areas are the most common and important instrument for the conservation of biological diversity and are called for under the United Nations' Convention on Biological Diversity. Growing human population densities, intensified land-use, invasive species and increasing habitat fragmentation threaten ecosystems worldwide, and protected areas are often the only refuge for endangered species. Climate change is posing an additional threat that may also impact ecosystems currently under protection. Therefore, it is of crucial importance to include the potential impact of climate change when designing future nature conservation strategies and implementing protected area management. This approach would go beyond reactive crisis management and, by necessity, would include anticipatory risk assessments. One avenue for doing so is provided by simulation models that take advantage of the increase in computing capacity and performance over the last two decades. Here we review the literature to determine the state-of-the-art in modeling terrestrial protected areas under climate change, with the aim of evaluating and detecting trends and gaps in the current approaches being employed, as well as providing a useful overview and guidelines for future research. Results: Most studies apply statistical, bioclimatic envelope models and focus primarily on plant species as compared to other taxa. Very few studies utilize a mechanistic, process-based approach and none examine biotic interactions like predation and competition. Important factors like land-use, habitat fragmentation, invasion and dispersal are rarely incorporated, restricting the informative value of the resulting predictions considerably.
    Conclusion: The general impression that emerges is that biodiversity conservation in protected areas could benefit from the application of modern modeling approaches to a greater extent than is currently reflected in the

  13. Beam-based model of broad-band impedance of the Diamond Light Source

    Science.gov (United States)

    Smaluk, Victor; Martin, Ian; Fielder, Richard; Bartolini, Riccardo

    2015-06-01

    In an electron storage ring, the interaction between a single-bunch beam and the vacuum chamber impedance affects beam parameters that can be measured rather precisely, allowing beam-based numerical models of the longitudinal and transverse impedances to be developed. At the Diamond Light Source (DLS), a set of measured data has been used to obtain the model parameters, including the current-dependent shift of the betatron tunes and synchronous phase, chromatic damping rates, and bunch lengthening. A MATLAB code for multiparticle tracking has been developed. The tracking results and analytical estimations are quite consistent with the measured data. Since Diamond has the shortest natural bunch length among all light sources in standard operation, these studies of collective effects with short bunches are relevant to many facilities, including the next generation of light sources.

  14. Reproducibility study of [{sup 18}F]FPP(RGD){sub 2} uptake in murine models of human tumor xenografts

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Edwin; Liu, Shuangdong; Chin, Frederick; Cheng, Zhen [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Gowrishankar, Gayatri; Yaghoubi, Shahriar [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Wedgeworth, James Patrick [Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Berndorff, Dietmar; Gekeler, Volker [Bayer Schering Pharma AG, Global Drug Discovery, Berlin (Germany); Gambhir, Sanjiv S. [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Canary Center at Stanford for Cancer Early Detection, Nuclear Medicine, Departments of Radiology and Bioengineering, Molecular Imaging Program at Stanford, Stanford, CA (United States)

    2011-04-15

    An {sup 18}F-labeled PEGylated arginine-glycine-aspartic acid (RGD) dimer [{sup 18}F]FPP(RGD){sub 2} has been used to image tumor {alpha}{sub v}{beta}{sub 3} integrin levels in preclinical and clinical studies. Serial positron emission tomography (PET) studies may be useful for monitoring antiangiogenic therapy response or for drug screening; however, the reproducibility of serial scans has not been determined for this PET probe. The purpose of this study was to determine the reproducibility of the integrin {alpha}{sub v}{beta}{sub 3}-targeted PET probe, [{sup 18}F ]FPP(RGD){sub 2} using small animal PET. Human HCT116 colon cancer xenografts were implanted into nude mice (n = 12) in the breast and scapular region and grown to mean diameters of 5-15 mm for approximately 2.5 weeks. A 3-min acquisition was performed on a small animal PET scanner approximately 1 h after administration of [{sup 18}F]FPP(RGD){sub 2} (1.9-3.8 MBq, 50-100 {mu}Ci) via the tail vein. A second small animal PET scan was performed approximately 6 h later after reinjection of the probe to assess for reproducibility. Images were analyzed by drawing an ellipsoidal region of interest (ROI) around the tumor xenograft activity. Percentage injected dose per gram (%ID/g) values were calculated from the mean or maximum activity in the ROIs. Coefficients of variation and differences in %ID/g values between studies from the same day were calculated to determine the reproducibility. The coefficient of variation (mean {+-}SD) for %ID{sub mean}/g and %ID{sub max}/g values between [{sup 18}F]FPP(RGD){sub 2} small animal PET scans performed 6 h apart on the same day were 11.1 {+-} 7.6% and 10.4 {+-} 9.3%, respectively. The corresponding differences in %ID{sub mean}/g and %ID{sub max}/g values between scans were -0.025 {+-} 0.067 and -0.039 {+-} 0.426. Immunofluorescence studies revealed a direct relationship between extent of {alpha}{sub {nu}}{beta}{sub 3} integrin expression in tumors and tumor vasculature
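
    The test-retest summary statistics used here (per-tumour coefficient of variation and mean difference between same-day scans) can be computed as follows; the %ID/g values below are hypothetical, not the study's data:

    ```python
    import numpy as np

    def test_retest_cov(scan1, scan2):
        """Per-subject coefficient of variation (%) between two %ID/g
        measurements, as used to summarise test-retest reproducibility.
        For two values, the sample SD is |a - b| / sqrt(2)."""
        scan1 = np.asarray(scan1, float)
        scan2 = np.asarray(scan2, float)
        pair_mean = (scan1 + scan2) / 2.0
        pair_sd = np.abs(scan1 - scan2) / np.sqrt(2.0)
        cov = 100.0 * pair_sd / pair_mean
        return cov.mean(), cov.std(), (scan2 - scan1).mean()

    # Hypothetical %ID/g values for 6 tumours scanned twice on the same day
    s1 = [2.1, 3.4, 1.8, 2.9, 2.2, 3.0]
    s2 = [2.3, 3.1, 1.9, 2.8, 2.4, 2.9]
    mean_cov, sd_cov, bias = test_retest_cov(s1, s2)
    ```

    A mean CoV of ~10% with near-zero bias, as reported in the study, indicates that same-day serial scans are reproducible enough for therapy monitoring.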

  15. Diagnostic Power of Broad Emission Line Profiles in Searches for Binary Supermassive Black Holes: Comparison of Models with Observations

    Science.gov (United States)

    Nguyen, Khai; Bogdanovic, Tamara; Eracleous, Michael; Runnoe, Jessie C.; Sigurdsson, Steinn

    2017-01-01

    Motivated by observational searches for sub-parsec supermassive black hole binaries (SBHBs) we develop a semi-analytic model to describe the spectral emission line signatures of these systems. We are particularly interested in modeling the profiles of the broad emission lines, which have been used as a tool to search for SBHBs. The goal of this work is to test one of the leading models of binary accretion flows in the literature: SBHB in a circumbinary disk. In this context, we model SBHB accretion flows as a set of three accretion disks: two mini-disks that are gravitationally bound to the individual black holes and a circumbinary disk that forms a common envelope about a gravitationally bound binary. Our first generation model shows that emission line profiles tend to have different statistical properties depending on the semi-major axis, mass ratio, eccentricity of the binary, and the alignment of the triple-disk system, and can in principle be used to constrain the statistical distribution of these parameters. We present the results of a second generation model, which improves upon the treatment of radiative transfer by taking into account the effect of line-driven winds on the properties of the model emission line profiles. This improvement allows a preliminary comparison of the model profiles with the observed SBHB candidates and AGN population in general.

  16. Multi-epitope Models Explain How Pre-existing Antibodies Affect the Generation of Broadly Protective Responses to Influenza.

    Directory of Open Access Journals (Sweden)

    Veronika I Zarnitsyna

    2016-06-01

    The development of next-generation influenza vaccines that elicit strain-transcendent immunity against both seasonal and pandemic viruses is a key public health goal. Targeting the evolutionarily conserved epitopes on the stem of influenza's major surface molecule, hemagglutinin (HA), is an appealing prospect, and novel vaccine formulations show promising results in animal model systems. However, studies in humans indicate that natural infection and vaccination result in limited boosting of antibodies to the stem of HA, and the level of stem-specific antibody elicited is insufficient to provide broad strain-transcendent immunity. Here, we use mathematical models of the humoral immune response to explore how pre-existing immunity affects the ability of vaccines to boost antibodies to the head and stem of HA in humans and, in particular, how it leads to the apparent lack of boosting of broadly cross-reactive antibodies to the stem epitopes. We consider hypotheses where binding of antibody to an epitope: (i) results in more rapid clearance of the antigen; (ii) leads to the formation of antigen-antibody complexes which inhibit B cell activation through an Fcγ receptor-mediated mechanism; or (iii) masks the epitope and prevents the stimulation and proliferation of specific B cells. We find that only epitope masking, and not the former two mechanisms, is key to recapitulating the patterns in the data. We discuss the ramifications of our findings for the development of vaccines against both seasonal and pandemic influenza.
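
    The epitope-masking mechanism the study singles out can be caricatured with a two-epitope growth model: B cells specific to an epitope proliferate in proportion to the fraction of that epitope left unmasked by pre-existing antibody. The equations, rates, and parameter values below are illustrative, not the paper's model:

    ```python
    def boost(pre_a, pre_b, days=10.0, dt=0.01):
        """Toy two-epitope masking model (Euler integration): the
        proliferation rate of each B-cell pool is scaled by the fraction
        of its epitope not covered by pre-existing antibody."""
        K = 1.0                    # antibody level giving 50% masking (hypothetical)
        r = 1.0                    # maximal proliferation rate per day
        b_a, b_b = 1.0, 1.0        # initial B-cell pools
        for _ in range(int(days / dt)):
            free_a = 1.0 / (1.0 + pre_a / K)   # unmasked fraction, epitope A
            free_b = 1.0 / (1.0 + pre_b / K)   # unmasked fraction, epitope B
            b_a += dt * r * free_a * b_a
            b_b += dt * r * free_b * b_b
        return b_a, b_b

    # Epitope A faces abundant pre-existing antibody, epitope B very little:
    # the masked response barely grows while the unmasked one expands strongly.
    masked, unmasked = boost(pre_a=10.0, pre_b=0.1)
    ```

    This reproduces the qualitative pattern the models are used to explain: an epitope already covered by antibody is poorly boosted on re-exposure.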

  17. A right to reproduce?

    Science.gov (United States)

    Quigley, Muireann

    2010-10-01

    How should we conceive of a right to reproduce? And, morally speaking, what might be said to justify such a right? These are just two questions of interest that are raised by the technologies of assisted reproduction. This paper analyses the possible legitimate grounds for a right to reproduce within the two main theories of rights; interest theory and choice theory.

  18. Magni Reproducibility Example

    DEFF Research Database (Denmark)

    2016-01-01

    An example of how to use the magni.reproducibility package for storing metadata along with results from a computational experiment. The example is based on simulating the Mandelbrot set.

  19. Long-term stability, reproducibility, and statistical sensitivity of a telemetry-instrumented dog model: A 27-month longitudinal assessment.

    Science.gov (United States)

    Fryer, Ryan M; Ng, Khing Jow; Chi, Liguo; Jin, Xidong; Reinhart, Glenn A

    2015-01-01

    ICH guidelines, as well as best-practice and ethical considerations, provide strong rationale for the use of telemetry-instrumented dog colonies for cardiovascular safety assessment. However, few studies have investigated the long-term stability of cardiovascular function at baseline, reproducibility in response to pharmacologic challenge, and maintenance of statistical sensitivity to define the usable life of the colony. These questions were addressed in 3 identical studies spanning 27 months, performed in the same colony of dogs. Telemetry-instrumented dogs (n=4) received a single dose of dl-sotalol (10 mg/kg, p.o.), a β1-adrenergic and IKr blocker, or vehicle, in 3 separate studies spanning 27 months. Systemic hemodynamics, cardiovascular function, and ECG parameters were monitored for 18 h post-dose; plasma drug concentrations (Cp) were measured at 1, 3, 5, and 24 h post-dose. Baseline hemodynamic/ECG values were consistent across the 27-month study, with the exception of modest age-dependent decreases in heart rate and the corresponding QT-interval. dl-Sotalol elicited highly reproducible effects in each study. Reductions in heart rate after dl-sotalol treatment ranged between -22 and -32 beats/min, and slight differences in magnitude could be ascribed to variability in dl-sotalol Cp (range=3230-5087 ng/mL); dl-sotalol also reduced LV-dP/dtmax 13-22%. dl-Sotalol increased the slope of the PR-RR relationship, suggesting inhibition of AV-conduction. Increases in the heart-rate corrected QT-interval were not significantly different across the 3 studies, and results of a power analysis demonstrated that the detection limit for QTc values was not diminished throughout the 27-month period and across a range of power assumptions, despite modest age-dependent changes in heart rate. These results demonstrate the long-term stability of a telemetry dog colony as evidenced by a stability of baseline values, consistently reproducible response to pharmacologic challenge and no
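
    Heart-rate correction of the QT interval, as analysed in the study, is commonly done with the Bazett or Fridericia formulas; a small helper illustrates both (the study does not state which correction it used, so the choice below is ours):

    ```python
    def qtc(qt_ms, rr_s, method="fridericia"):
        """Heart-rate corrected QT interval (ms). Bazett divides the raw
        QT by the square root of the RR interval (s); Fridericia divides
        by its cube root, which over-corrects less at high heart rates."""
        if method == "bazett":
            return qt_ms / rr_s ** 0.5
        if method == "fridericia":
            return qt_ms / rr_s ** (1.0 / 3.0)
        raise ValueError(f"unknown method: {method}")

    # Hypothetical dog at 100 beats/min (RR = 0.6 s) with a raw QT of 220 ms
    rr = 60.0 / 100.0
    corrected = qtc(220.0, rr)
    ```

    Because RR < 1 s at elevated heart rates, both corrections raise the reported value above the raw QT, which is why age-dependent heart-rate drift must be accounted for when comparing QTc across years.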

  20. Measurement of cerebral blood flow by intravenous xenon-133 technique and a mobile system. Reproducibility using the Obrist model compared to total curve analysis

    DEFF Research Database (Denmark)

    Schroeder, T; Holstein, P; Lassen, N A

    1986-01-01

    …and side-to-side asymmetry. Data were analysed according to the Obrist model and the results compared with those obtained using a model correcting for the air-passage artifact. Reproducibility was of the same order of magnitude as reported using stationary equipment. The side-to-side CBF asymmetry… differences, but in low-flow situations the artifact model yielded significantly more stable results. The present apparatus, equipped with 3-5 detectors covering each hemisphere, offers the opportunity of performing serial CBF measurements in situations not otherwise feasible.
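    An Obrist-type analysis of a xenon-133 clearance curve amounts to fitting a sum of two exponentials, one for the fast (gray matter) and one for the slow (white matter) compartment. The sketch below fits synthetic, noise-free data; the rate constants and weights are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Two-compartment clearance model: fast (gray) and slow (white) terms.
def biexp(t, a1, k1, a2, k2):
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

t = np.linspace(0.0, 10.0, 60)             # minutes after peak
counts = biexp(t, 0.7, 0.8, 0.3, 0.1)      # synthetic detector curve

popt, _ = curve_fit(biexp, t, counts, p0=[0.5, 1.0, 0.5, 0.05])
k_fast, k_slow = sorted([popt[1], popt[3]], reverse=True)
print(f"k_fast = {k_fast:.2f} /min, k_slow = {k_slow:.2f} /min")
```

    With real, noisy clearance data the fit would additionally need weighting and a term correcting for recirculating tracer in the air passages, which is the "artifact model" the abstract compares against.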

  1. Molecular modeling assisted hapten design to produce broad selectivity antibodies for fluoroquinolone antibiotics.

    Science.gov (United States)

    Pinacho, Daniel G; Sánchez-Baeza, Francisco; Marco, M-Pilar

    2012-05-15

    Antibodies with a wide recognition profile of fluoroquinolone antibiotics have been produced based on chemical criteria, theoretical studies, and molecular modeling assisted hapten design. The immunizing hapten preserves the most important and characteristic epitopes of this antibiotic family. The studies have taken into consideration the zwitterionic character of most of the fluoroquinolones and the relative concentration of the different species in equilibrium at physiologic pH. The hapten is prepared in the form of a stable prehapten through a 5-step synthetic pathway. Immediately before conjugation, the immunizing hapten is obtained by removing the diphenylmethane protecting group. The specificity of the antibodies obtained is directed toward the common area defined by the fluorine atom at position 6 and the β-ketoacid moiety. The ELISA developed is able to recognize, with very good detectability, important fluoroquinolones used in the veterinary field such as ciprofloxacin (CPFX, IC(50), 0.35 μg L(-1)), enrofloxacin (ERFX, IC(50), 0.65 μg L(-1)), danofloxacin (DNFX, IC(50), 7.31 μg L(-1)), difloxacin (DFX, IC(50), 0.91 μg L(-1)), sarafloxacin (SRFX, IC(50), 0.96 μg L(-1)), norfloxacin (NRFX, IC(50), 0.78 μg L(-1)), ofloxacin (OFX, IC(50), 1.84 μg L(-1)), flumequine (Flume, IC(50), 3.91 μg L(-1)), marbofloxacin (MBFX, IC(50), 4.30 μg L(-1)), and oxolinic acid (OXO, IC(50), 23.53 μg L(-1)). The results presented here demonstrate that the antibody affinity is strongly affected by the presence of divalent cations, owing to their complexation with the fluoroquinolone molecules. Moreover, the outcome of the effect of pH on the immunochemical assays suggests that the selectivity could be modulated with pH, owing to the zwitterionic character of the fluoroquinolones and as a function of their different pK(a) values.
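    IC(50) values like those quoted above are typically extracted from competitive ELISA calibration curves with a four-parameter logistic (4PL) fit. A minimal sketch, using synthetic signal data rather than the paper's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Four-parameter logistic (4PL) curve for a competitive immunoassay.
def four_pl(x, top, bottom, ic50, slope):
    return bottom + (top - bottom) / (1.0 + (x / ic50) ** slope)

conc = np.array([0.01, 0.1, 0.35, 1.0, 10.0, 100.0])  # µg/L, illustrative
signal = four_pl(conc, 1.0, 0.05, 0.35, 1.0)           # synthetic readings

popt, _ = curve_fit(four_pl, conc, signal, p0=[1.0, 0.0, 1.0, 1.0])
print(f"Fitted IC50: {popt[2]:.2f} µg/L")
```

    Real calibration data would carry replicate noise, so the fitted IC(50) would come with a confidence interval rather than recovering the input exactly as here.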

  2. Towards Reproducibility in Computational Hydrology

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei

    2016-04-01

    The ability to reproduce published scientific findings is a foundational principle of scientific research. Independent observation helps to verify the legitimacy of individual findings; build upon sound observations so that we can evolve hypotheses (and models) of how catchments function; and move them from specific circumstances to more general theory. The rise of computational research has brought increased focus on the issue of reproducibility across the broader scientific literature. This is because publications based on computational research typically do not contain sufficient information to enable the results to be reproduced, and therefore verified. Given the rise of computational analysis in hydrology over the past 30 years, to what extent is reproducibility, or a lack thereof, a problem in hydrology? Whilst much hydrological code is accessible, the actual code and workflow that produced, and therefore document the provenance of, published scientific findings are rarely available. We argue that, in order to advance and make more robust the process of hypothesis testing and knowledge creation within the computational hydrological community, we need to build on existing open data initiatives and adopt common standards and infrastructures to: first, make code re-useable and easy to find through consistent use of metadata; second, publish well-documented workflows that combine re-useable code with data to enable published scientific findings to be reproduced; finally, use unique persistent identifiers (e.g. DOIs) to reference re-useable and reproducible code, thereby clearly showing the provenance of published scientific findings. Whilst extra effort is required to make work reproducible, there are benefits to both the individual and the broader community in doing so, which will improve the credibility of the science in the face of the need for societies to adapt to changing hydrological environments.
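    One concrete way to implement the metadata-plus-identifier practice the authors advocate is to record, alongside each published result, a hash of the exact code and a persistent identifier for the data. The field names and the placeholder DOI below are illustrative, not a community standard:

```python
import hashlib
import json
import platform
import sys
import tempfile
from datetime import datetime, timezone

def provenance_record(code_path: str, data_doi: str) -> dict:
    """Tie a result to the exact code (by hash) and data (by DOI)."""
    with open(code_path, "rb") as f:
        code_hash = hashlib.sha256(f.read()).hexdigest()
    return {
        "code_sha256": code_hash,
        "data_doi": data_doi,
        "python_version": sys.version.split()[0],
        "platform": platform.platform(),
        "run_at": datetime.now(timezone.utc).isoformat(),
    }

# Demonstrate on a throwaway script; the DOI is a placeholder.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as tmp:
    tmp.write("print('hydrological analysis goes here')\n")
record = provenance_record(tmp.name, "10.5281/zenodo.0000000")
print(json.dumps(record, indent=2))
```

    Publishing such a record with the paper lets a reader verify they are running byte-identical code against the cited dataset.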

  3. Reproducible Research in Speech Sciences

    Directory of Open Access Journals (Sweden)

    Kálmán Abari

    2012-11-01

    Reproducible research is the minimum standard of scientific claims in cases when independent replication proves to be difficult. With a special combination of available software tools, we provide a reproducibility recipe for experimental research conducted in some fields of speech sciences. We have based our model on the triad of the R environment, the EMU-format speech database, and the executable publication. We present the use of three typesetting systems (LaTeX, Markdown, Org) with the help of a mini research project.

  4. Reproducibility in Seismic Imaging

    Directory of Open Access Journals (Sweden)

    González-Verdejo O.

    2012-04-01

    Within the field of exploration seismology, there is interest at the national level in integrating reproducibility into applied, educational and research activities related to seismic processing and imaging. This reproducibility implies the description and organization of the elements involved in numerical experiments, so that a researcher, teacher or student can study, verify, repeat, and modify them independently. In this work, we document and adapt reproducibility in seismic processing and imaging to spread this concept and its benefits, and to encourage the use of open source software in this area within our academic and professional environment. We present an enhanced seismic imaging example, of interest in both academic and professional environments, using Mexican seismic data. As a result of this research, we show that it is possible to assimilate, adapt and transfer technology at low cost, using open source software and following a reproducible research scheme.

  5. Three-dimensional surgical modelling with an open-source software protocol: study of precision and reproducibility in mandibular reconstruction with the fibula free flap.

    Science.gov (United States)

    Ganry, L; Quilichini, J; Bandini, C M; Leyder, P; Hersant, B; Meningaud, J P

    2017-08-01

    Very few surgical teams currently use totally independent and free solutions to perform three-dimensional (3D) surgical modelling for osseous free flaps in reconstructive surgery. This study assessed the precision and technical reproducibility of a 3D surgical modelling protocol using free open-source software in mandibular reconstruction with fibula free flaps and surgical guides. Precision was assessed through comparisons of the 3D surgical guide to the sterilized 3D-printed guide, determining accuracy to the millimetre level. Reproducibility was assessed in three surgical cases by volumetric comparison to the millimetre level. For the 3D surgical modelling, a difference of less than 0.1 mm was observed. Almost no deformations occurred; precision was between 0.1 mm and 0.4 mm, and the average precision of the complete reconstructed mandible was less than 1 mm. The open-source software protocol demonstrated high accuracy without complications. However, the precision of the surgical case depends on the surgeon's 3D surgical modelling. Therefore, surgeons need training in the use of this protocol before applying it to surgical cases; this constitutes a limitation. Further studies should address the transfer of expertise. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  6. Cocaine addiction related reproducible brain regions of abnormal default-mode network functional connectivity: a group ICA study with different model orders.

    Science.gov (United States)

    Ding, Xiaoyu; Lee, Seong-Whan

    2013-08-26

    Model order selection in group independent component analysis (ICA) has a significant effect on the obtained components. This study investigated the reproducible brain regions of abnormal default-mode network (DMN) functional connectivity related with cocaine addiction through different model order settings in group ICA. Resting-state fMRI data from 24 cocaine addicts and 24 healthy controls were temporally concatenated and processed by group ICA using model orders of 10, 20, 30, 40, and 50, respectively. For each model order, the group ICA approach was repeated 100 times using the ICASSO toolbox and after clustering the obtained components, centrotype-based anterior and posterior DMN components were selected for further analysis. Individual DMN components were obtained through back-reconstruction and converted to z-score maps. A whole brain mixed effects factorial ANOVA was performed to explore the differences in resting-state DMN functional connectivity between cocaine addicts and healthy controls. The hippocampus, which showed decreased functional connectivity in cocaine addicts for all the tested model orders, might be considered as a reproducible abnormal region in DMN associated with cocaine addiction. This finding suggests that using group ICA to examine the functional connectivity of the hippocampus in the resting-state DMN may provide an additional insight potentially relevant for cocaine-related diagnoses and treatments. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
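    The ICASSO procedure used in the study (repeating ICA many times and keeping components that recur across runs) can be illustrated on synthetic data with scikit-learn's FastICA. This is a sketch of the stability idea only, not the fMRI group-ICA pipeline:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic mixed sources stand in for the data; the point is the
# repeated-run stability check, not the signals themselves.
rng = np.random.default_rng(0)
sources = rng.laplace(size=(3, 500))           # 3 independent sources
mixing = rng.normal(size=(10, 3))              # 10 observed channels
X = (mixing @ sources).T                       # 500 samples x 10 channels

runs = []
for seed in range(20):
    ica = FastICA(n_components=3, random_state=seed, max_iter=1000)
    ica.fit(X)
    runs.append(ica.components_)

# Stability: best absolute correlation of each component with run 0's,
# which absorbs the sign and permutation ambiguity of ICA.
def best_match_corr(comp, ref_set):
    return max(abs(np.corrcoef(comp, ref)[0, 1]) for ref in ref_set)

stability = float(np.mean([best_match_corr(c, runs[0])
                           for run in runs[1:] for c in run]))
print(f"Mean cross-run component similarity: {stability:.2f}")
```

    Components whose cross-run similarity stays high, like the hippocampal DMN finding above, are the ones worth interpreting; unstable components are likely artifacts of a particular random initialization or model order.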

  7. Spatial-temporal reproducibility assessment of global seasonal forecasting system version 5 model for Dam Inflow forecasting

    Science.gov (United States)

    Moon, S.; Suh, A. S.; Soohee, H.

    2016-12-01

    The GloSea5 (Global Seasonal forecasting system version 5) is provided and operated by the KMA (Korea Meteorological Administration). GloSea5 provides Forecast (FCST) and Hindcast (HCST) data, and its horizontal resolution is about 60 km (0.83° x 0.56°) in the mid-latitudes. To use these data in watershed-scale water management, GloSea5 requires spatial-temporal downscaling. Statistical downscaling was therefore used to correct systematic biases in the variables and to improve data reliability. HCST data are provided in ensemble format, and the highest statistical correlation (R2 = 0.60, RMSE = 88.92, NSE = 0.57) of ensemble precipitation was reported for the Yongdam Dam watershed on the #6 grid. Additionally, the original GloSea5 (600.1 mm) showed the greatest difference (-26.5%) compared to observations (816.1 mm) during the summer flood season, whereas the downscaled GloSea5 had an error of only -3.1%. Most of the underestimation corresponded to precipitation during the flood season, and the downscaled GloSea5 substantially restored these precipitation levels. Per the analysis of spatial autocorrelation using seasonal Moran's I, the spatial distribution was statistically significant. These results reduce the uncertainty of the original GloSea5 and substantiate its spatial-temporal accuracy and validity. This reproducibility assessment will play an important role as basic data for watershed-scale water management.
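    The skill scores reported above (R2, RMSE, NSE) are straightforward to compute. A sketch with synthetic monthly precipitation values, not the Yongdam Dam series:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the obs mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

obs = np.array([120.0, 80.0, 300.0, 45.0, 210.0])   # monthly precip, mm
sim = np.array([100.0, 95.0, 260.0, 60.0, 230.0])   # downscaled model, mm

print(f"NSE  = {nse(obs, sim):.2f}")
print(f"RMSE = {rmse(obs, sim):.1f} mm")
```

    NSE penalizes errors relative to the observed variance, which is why it complements RMSE when flood-season months dominate the error budget.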

  8. The Proximal Medial Sural Nerve Biopsy Model: A Standardised and Reproducible Baseline Clinical Model for the Translational Evaluation of Bioengineered Nerve Guides

    Directory of Open Access Journals (Sweden)

    Ahmet Bozkurt

    2014-01-01

    Autologous nerve transplantation (ANT) is the clinical gold standard for the reconstruction of peripheral nerve defects. A large number of bioengineered nerve guides have been tested under laboratory conditions as an alternative to the ANT. The step from experimental studies to implementation of the device in the clinical setting is often substantial, and the outcome is unpredictable. This is mainly linked to the heterogeneity of clinical peripheral nerve injuries, which is very different from standardized animal studies. In search of a reproducible human model for the implantation of bioengineered nerve guides, we propose the reconstruction of sural nerve defects after routine nerve biopsy as a first or baseline study. Our concept uses the medial sural nerve of patients undergoing diagnostic nerve biopsy (≥2 cm). The biopsy-induced nerve gap was immediately reconstructed by implantation of the novel microstructured nerve guide, Neuromaix, as part of an ongoing first-in-human study. Here we present (i) a detailed list of inclusion and exclusion criteria, (ii) a detailed description of the surgical procedure, and (iii) a follow-up concept with multimodal sensory evaluation techniques. The proximal medial sural nerve biopsy model can serve as a preliminary or baseline nerve lesion model. In a subsequent step, newly developed nerve guides could be tested in more unpredictable and challenging clinical peripheral nerve lesions (e.g., following trauma), which have reduced comparability due to the different nature of the injuries (e.g., site of injury and length of nerve gap).

  9. Whole-body skeletal imaging in mice utilizing microPET: optimization of reproducibility and applications in animal models of bone disease

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Frank [The Crump Institute for Molecular Imaging, Department of Molecular and Medical Pharmacology, University of California School of Medicine, 700 Westwood Blvd., Los Angeles, CA 90095 (United States); Department of Nuclear Medicine, Ludwig-Maximilians-University, Munich (Germany); Lee, Yu-Po; Lieberman, Jay R. [Department of Orthopedic Surgery, University of California School of Medicine, Los Angeles, California (United States); Loening, Andreas M.; Chatziioannou, Arion [The Crump Institute for Molecular Imaging, Department of Molecular and Medical Pharmacology, University of California School of Medicine, 700 Westwood Blvd., Los Angeles, CA 90095 (United States); Freedland, Stephen J.; Belldegrun, Arie S. [Department of Urology, University of California School of Medicine, Los Angeles, California (United States); Leahy, Richard [University of Southern California School of Bioengineering, Los Angeles, California (United States); Sawyers, Charles L. [Department of Medicine, University of California School of Medicine, Los Angeles, California (United States); Gambhir, Sanjiv S. [The Crump Institute for Molecular Imaging, Department of Molecular and Medical Pharmacology, University of California School of Medicine, 700 Westwood Blvd., Los Angeles, CA 90095 (United States); UCLA-Jonsson Comprehensive Cancer Center and Department of Biomathematics, University of California School of Medicine, Los Angeles, California (United States)

    2002-09-01

    The aims were to optimize reproducibility and establish [18F]fluoride ion bone scanning in mice, using a dedicated small animal positron emission tomography (PET) scanner (microPET), and to correlate functional findings with anatomical imaging using computed tomography (microCAT). Optimal tracer uptake time for [18F]fluoride ion was determined by performing dynamic microPET scans. Quantitative reproducibility was measured using region of interest (ROI)-based counts normalized to (a) the injected dose, (b) the integral of the heart time-activity curve, or (c) an ROI over the whole skeleton. Bone lesions were repetitively imaged. Functional images were correlated with X-ray and microCAT. The plateau of [18F]fluoride uptake occurs 60 min after injection. The highest reproducibility was achieved by normalizing to an ROI over the whole skeleton, with a mean percent coefficient of variation [(SD/mean) x 100] of <15%-20%. Benign and malignant bone lesions were successfully imaged repeatedly. Preliminary correlation of microPET with microCAT demonstrated the high sensitivity of microPET and the ability of microCAT to detect small osteolytic lesions. Whole-body [18F]fluoride ion bone imaging using microPET is reproducible and can be used to serially monitor normal and pathological changes to the mouse skeleton. Morphological imaging with microCAT is useful to display correlative changes in anatomy. Detailed in vivo studies of the murine skeleton in various small animal models of bone diseases should now be possible.
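    The reproducibility metric in the study, the percent coefficient of variation of ROI counts after normalization to a whole-skeleton ROI, can be sketched as follows with illustrative (not measured) counts:

```python
import numpy as np

# Percent coefficient of variation [(SD/mean) x 100] across repeat scans.
def percent_cv(values):
    values = np.asarray(values, dtype=float)
    return float(values.std(ddof=1) / values.mean() * 100.0)

roi_counts = np.array([15.2, 14.1, 16.0])           # one lesion, 3 scans
skeleton_counts = np.array([410.0, 395.0, 430.0])   # whole-skeleton ROI
normalized = roi_counts / skeleton_counts           # normalization step

print(f"%CV of normalized uptake: {percent_cv(normalized):.1f}%")
```

    Because injected-dose and scanner-sensitivity fluctuations affect the lesion and skeleton ROIs alike, the ratio cancels them, which is why this normalization gave the study's best (<15%-20%) reproducibility.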

  10. Failed Radiatively Accelerated Dusty Outflow Model of the Broad Line Region in Active Galactic Nuclei. I. Analytical Solution

    Science.gov (United States)

    Czerny, B.; Li, Yan-Rong; Hryniewicz, K.; Panda, S.; Wildy, C.; Sniegowska, M.; Wang, J.-M.; Sredzinska, J.; Karas, V.

    2017-09-01

    The physical origin of the broad line region in active galactic nuclei is still unclear despite many years of observational studies. The reason is that the region is unresolved, and the reverberation mapping results imply a complex velocity field. We adopt a theory-motivated approach to identify the principal mechanism responsible for this complex phenomenon. We consider the possibility that the role of dust is essential. We assume that the local radiation pressure acting on the dust in the accretion disk atmosphere launches the outflow of material, but higher above the disk the irradiation from the central parts causes dust evaporation and a subsequent fallback. This failed radiatively accelerated dusty outflow is expected to represent the material forming low-ionization lines. In this paper we formulate simple analytical equations to describe the cloud motion, including the evaporation phase. The model is fully described by just the basic parameters of black hole mass, accretion rate, black hole spin, and viewing angle. We study how generic spectral line profiles correspond to this dynamic. We show that the virial factor calculated from our model strongly depends on the black hole mass in the case of enhanced dust opacity, and thus correlates with the line width. This could explain why the virial factor measured in galaxies with pseudobulges differs from that obtained for objects with classical bulges, although the trend predicted by the current version of the model is opposite to the observed trend.

  11. Application of a process-based shallow landslide hazard model over a broad area in Central Italy

    Science.gov (United States)

    Gioia, Eleonora; Speranza, Gabriella; Ferretti, Maurizio; Godt, Jonathan W.; Baum, Rex L.; Marincioni, Fausto

    2015-01-01

    Process-based models are widely used for rainfall-induced shallow landslide forecasting. Previous studies have successfully applied the U.S. Geological Survey’s Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability (TRIGRS) model (Baum et al. 2002) to compute infiltration-driven changes in the hillslopes’ factor of safety on small scales (i.e., tens of square kilometers). Soil data input for such models are difficult to obtain across larger regions. This work describes a novel methodology for the application of TRIGRS over broad areas with relatively uniform hydrogeological properties. The study area is a 550-km2 region in Central Italy covered by post-orogenic Quaternary sediments. Due to the lack of field data, we assigned mechanical and hydrological property values through a statistical analysis based on literature review of soils matching the local lithologies. We calibrated the model using rainfall data from 25 historical rainfall events that triggered landslides. We compared the variation of pressure head and factor of safety with the landslide occurrence to identify the best fitting input conditions. Using calibrated inputs and a soil depth model, we ran TRIGRS for the study area. Receiver operating characteristic (ROC) analysis, comparing the model’s output with a shallow landslide inventory, shows that TRIGRS effectively simulated the instability conditions in the post-orogenic complex during historical rainfall scenarios. The implication of this work is that rainfall-induced landslides over large regions may be predicted by a deterministic model, even where data on geotechnical and hydraulic properties as well as temporal changes in topography or subsurface conditions are not available.
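    The ROC analysis used above to compare TRIGRS output against the landslide inventory can be sketched with scikit-learn. The factor-of-safety values below are synthetic; in the study they would come from the calibrated model grid:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Synthetic factor-of-safety (FS) values per grid cell. A lower FS means
# a less stable slope, so -FS serves as the instability score.
rng = np.random.default_rng(42)
fs_stable = rng.normal(1.6, 0.3, 200)    # cells with no observed slide
fs_failed = rng.normal(1.0, 0.2, 50)     # cells where slides occurred

fs = np.concatenate([fs_stable, fs_failed])
observed = np.concatenate([np.zeros(200), np.ones(50)])

auc = roc_auc_score(observed, -fs)
print(f"ROC AUC: {auc:.2f}")
```

    An AUC well above 0.5 indicates the model ranks failed cells as less stable than intact ones, which is the sense in which TRIGRS "effectively simulated the instability conditions" during the historical rainfall scenarios.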

  12. Can four-zero-texture mass matrix model reproduce the quark and lepton mixing angles and CP violating phases?

    CERN Document Server

    Matsuda, K; Matsuda, Koichi; Nishiura, Hiroyuki

    2006-01-01

    We reconsider a universal mass matrix model which has a seesaw-invariant structure with four-zero texture common to all quarks and leptons. The CKM quark and MNS lepton mixing matrices of the model are analyzed analytically. We show that the model can be consistent with all the experimental data on neutrino oscillations and quark mixing by tuning the free parameters of the model. It is also shown that the model predicts a relatively large value for the (1,3) element of the MNS lepton mixing matrix, |(U_{MNS})_{13}|^2 \simeq 2.6 \times 10^{-2}. Using the seesaw mechanism, we also discuss the conditions on the components of the Dirac and right-handed Majorana neutrino mass matrices which lead to a neutrino mass matrix consistent with the experimental data.

  13. Modeling neutralization kinetics of HIV by broadly neutralizing monoclonal antibodies in genital secretions coating the cervicovaginal mucosa.

    Directory of Open Access Journals (Sweden)

    Scott A McKinley

    Eliciting broadly neutralizing antibodies (bnAb) in cervicovaginal mucus (CVM) represents a promising "first line of defense" strategy to reduce vaginal HIV transmission. However, it remains unclear what levels of bnAb must be present in CVM to effectively reduce infection. We approached this complex question by modeling the dynamic tally of bnAb coverage on HIV. This analysis introduces a critical, timescale-dependent competition: to protect, bnAb must accumulate at sufficient stoichiometry to neutralize HIV faster than virions penetrate CVM and reach target cells. We developed a model that incorporates concentrations and diffusivities of HIV and bnAb in semen and CVM, kinetic rates for binding (kon) and unbinding (koff) of select bnAb, and physiologically relevant thicknesses of the CVM and semen layers. Comprehensive model simulations lead to robust conclusions about neutralization kinetics in CVM. First, due to the limited time virions in semen need to penetrate CVM, substantially greater bnAb concentrations than in vitro estimates suggest must be present in CVM to neutralize HIV. Second, the model predicts that bnAb with more rapid kon, almost independent of koff, should offer greater neutralization potency in vivo. These findings suggest the fastest-arriving virions at target cells present the greatest likelihood of infection. They also imply that the marked improvements in in vitro neutralization potency of many recently discovered bnAb may not translate to a comparable reduction in the bnAb dose needed to confer protection against initial vaginal infection. Our modeling framework offers a valuable tool for gaining quantitative insights into the dynamics of mucosal immunity against HIV and other infectious diseases.
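    The core timescale competition in the model, neutralization (set by kon and the bnAb concentration) versus diffusion across the mucus layer, can be sketched with order-of-magnitude numbers. All parameter values below are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

# Virions must be neutralized (pseudo-first-order rate kon * [bnAb])
# before diffusing across the CVM layer of thickness L.
D_virion = 1e-9      # cm^2/s, effective HIV diffusivity in mucus
L_cvm = 50e-4        # cm, ~50 um mucus layer thickness
kon = 1e5            # M^-1 s^-1, antibody association rate
ab_conc = 1e-8       # M, bnAb concentration in CVM

t_penetrate = L_cvm ** 2 / (2.0 * D_virion)     # mean 1D diffusion time, s
rate = kon * ab_conc                            # neutralization rate, s^-1
frac_neutralized = 1.0 - np.exp(-rate * t_penetrate)

print(f"Penetration time: {t_penetrate:.0f} s")
print(f"Fraction neutralized before crossing: {frac_neutralized:.6f}")
```

    The mean crossing time understates the risk from the diffusive front: the fastest few virions cross far sooner than average, which is why the full model tracks the distribution of arrival times rather than this single-timescale comparison.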

  14. Improving Students' Understanding of Molecular Structure through Broad-Based Use of Computer Models in the Undergraduate Organic Chemistry Lecture

    Science.gov (United States)

    Springer, Michael T.

    2014-01-01

    Several articles suggest how to incorporate computer models into the organic chemistry laboratory, but relatively few papers discuss how to incorporate these models broadly into the organic chemistry lecture. Previous research has suggested that "manipulating" physical or computer models enhances student understanding; this study…

  16. Assessment of a numerical model to reproduce event‐scale erosion and deposition distributions in a braided river

    Science.gov (United States)

    Measures, R.; Hicks, D. M.; Brasington, J.

    2016-01-01

    Numerical morphological modeling of braided rivers, using a physics-based approach, is increasingly used as a technique to explore controls on river pattern and, from an applied perspective, to simulate the impact of channel modifications. This paper assesses a depth-averaged nonuniform sediment model (Delft3D) to predict the morphodynamics of a 2.5 km long reach of the braided Rees River, New Zealand, during a single high-flow event. Evaluation of model performance primarily focused upon using high-resolution Digital Elevation Models (DEMs) of Difference, derived from a fusion of terrestrial laser scanning and optical empirical bathymetric mapping, to compare observed and predicted patterns of erosion and deposition and reach-scale sediment budgets. For the calibrated model, this was supplemented with planform metrics (e.g., braiding intensity). Extensive sensitivity analysis of model functions and parameters was executed, including consideration of numerical scheme for bed load component calculations, hydraulics, bed composition, bed load transport and bed slope effects, bank erosion, and frequency of calculations. Total predicted volumes of erosion and deposition corresponded well to those observed. The difference between predicted and observed volumes of erosion was less than the factor of two that characterizes the accuracy of the Gaeuman et al. bed load transport formula. Grain size distributions were best represented using two φ intervals. For unsteady flows, results were sensitive to the morphological time scale factor. The approach of comparing observed and predicted morphological sediment budgets shows the value of using natural experiment data sets for model testing. Sensitivity results are transferable to guide Delft3D applications to other rivers. PMID:27708477

  17. Assessment of a numerical model to reproduce event-scale erosion and deposition distributions in a braided river.

    Science.gov (United States)

    Williams, R D; Measures, R; Hicks, D M; Brasington, J

    2016-08-01

    Numerical morphological modeling of braided rivers, using a physics-based approach, is increasingly used as a technique to explore controls on river pattern and, from an applied perspective, to simulate the impact of channel modifications. This paper assesses a depth-averaged nonuniform sediment model (Delft3D) to predict the morphodynamics of a 2.5 km long reach of the braided Rees River, New Zealand, during a single high-flow event. Evaluation of model performance primarily focused upon using high-resolution Digital Elevation Models (DEMs) of Difference, derived from a fusion of terrestrial laser scanning and optical empirical bathymetric mapping, to compare observed and predicted patterns of erosion and deposition and reach-scale sediment budgets. For the calibrated model, this was supplemented with planform metrics (e.g., braiding intensity). Extensive sensitivity analysis of model functions and parameters was executed, including consideration of numerical scheme for bed load component calculations, hydraulics, bed composition, bed load transport and bed slope effects, bank erosion, and frequency of calculations. Total predicted volumes of erosion and deposition corresponded well to those observed. The difference between predicted and observed volumes of erosion was less than the factor of two that characterizes the accuracy of the Gaeuman et al. bed load transport formula. Grain size distributions were best represented using two φ intervals. For unsteady flows, results were sensitive to the morphological time scale factor. The approach of comparing observed and predicted morphological sediment budgets shows the value of using natural experiment data sets for model testing. Sensitivity results are transferable to guide Delft3D applications to other rivers.
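    The acceptance criterion above, that predicted erosion and deposition volumes fall within the factor-of-two accuracy of the Gaeuman et al. transport formula, reduces to a simple ratio check. The volumes below are illustrative, not the surveyed Rees River budgets:

```python
# Factor-of-two check on morphological sediment budgets. A predicted
# volume passes if it lies between half and twice the observed volume.
observed = {"erosion": 12500.0, "deposition": 11800.0}    # m^3, illustrative
predicted = {"erosion": 9800.0, "deposition": 14100.0}    # m^3, illustrative

for term in observed:
    ratio = predicted[term] / observed[term]
    within = 0.5 <= ratio <= 2.0
    print(f"{term}: predicted/observed = {ratio:.2f}, within factor of 2: {within}")
```

    Framing the target this way keeps the evaluation honest about what the transport formula can deliver, rather than demanding cell-by-cell agreement the formula's accuracy cannot support.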

  18. Modeling vegetation heights from high resolution stereo aerial photography: an application for broad-scale rangeland monitoring.

    Science.gov (United States)

    Gillan, Jeffrey K; Karl, Jason W; Duniway, Michael; Elaksher, Ahmed

    2014-11-01

    Vertical vegetation structure in rangeland ecosystems can be a valuable indicator for assessing rangeland health and monitoring riparian areas, post-fire recovery, available forage for livestock, and wildlife habitat. Federal land management agencies are directed to monitor and manage rangelands at landscape scales, but traditional field methods for measuring vegetation heights are often too costly and time-consuming to apply at these broad scales. Most emerging remote sensing techniques capable of measuring surface and vegetation height (e.g., LiDAR or synthetic aperture radar) are often too expensive and require specialized sensors. An alternative remote sensing approach that is potentially more practical for managers is to measure vegetation heights from digital stereo aerial photographs. As aerial photography is already commonly used for rangeland monitoring, acquiring it in stereo enables three-dimensional modeling and estimation of vegetation height. The purpose of this study was to test the feasibility and accuracy of estimating shrub heights from high-resolution (HR, 3-cm ground sampling distance) digital stereo-pair aerial images. Overlapping HR imagery was taken in March 2009 near Lake Mead, Nevada, and 5-cm resolution digital surface models (DSMs) were created by photogrammetric methods (aerial triangulation, digital image matching) for twenty-six test plots. We compared the heights of individual shrubs and plot averages derived from the DSMs to field measurements. We found strong positive correlations between field and image measurements for several metrics. Individual shrub heights tended to be underestimated in the imagery; however, accuracy was higher for dense, compact shrubs compared with shrubs with thin branches. Plot averages of shrub height from DSMs were also strongly correlated to field measurements but consistently underestimated. Grasses and forbs were generally too small to be detected with the resolution of the DSMs. Estimates of

  19. Modeling vegetation heights from high resolution stereo aerial photography: an application for broad-scale rangeland monitoring

    Science.gov (United States)

    Gillan, Jeffrey K.; Karl, Jason W.; Duniway, Michael; Elaksher, Ahmed

    2014-01-01

    Vertical vegetation structure in rangeland ecosystems can be a valuable indicator for assessing rangeland health and monitoring riparian areas, post-fire recovery, available forage for livestock, and wildlife habitat. Federal land management agencies are directed to monitor and manage rangelands at landscape scales, but traditional field methods for measuring vegetation heights are often too costly and time-consuming to apply at these broad scales. Most emerging remote sensing techniques capable of measuring surface and vegetation height (e.g., LiDAR or synthetic aperture radar) are often too expensive and require specialized sensors. An alternative remote sensing approach that is potentially more practical for managers is to measure vegetation heights from digital stereo aerial photographs. As aerial photography is already commonly used for rangeland monitoring, acquiring it in stereo enables three-dimensional modeling and estimation of vegetation height. The purpose of this study was to test the feasibility and accuracy of estimating shrub heights from high-resolution (HR, 3-cm ground sampling distance) digital stereo-pair aerial images. Overlapping HR imagery was taken in March 2009 near Lake Mead, Nevada, and 5-cm resolution digital surface models (DSMs) were created by photogrammetric methods (aerial triangulation, digital image matching) for twenty-six test plots. We compared the heights of individual shrubs and plot averages derived from the DSMs to field measurements. We found strong positive correlations between field and image measurements for several metrics. Individual shrub heights tended to be underestimated in the imagery; however, accuracy was higher for dense, compact shrubs compared with shrubs with thin branches. Plot averages of shrub height from DSMs were also strongly correlated to field measurements but consistently underestimated. Grasses and forbs were generally too small to be detected with the resolution of the DSMs. Estimates of

  20. Attempting to train a digital human model to reproduce human subject reach capabilities in an ejection seat aircraft

    NARCIS (Netherlands)

    Zehner, G.F.; Hudson, J.A.; Oudenhuijzen, A.

    2006-01-01

    From 1997 through 2002, the Air Force Research Lab and TNO Defence, Security and Safety (Business Unit Human Factors) were involved in a series of tests to quantify the accuracy of five Human Modeling Systems (HMSs) in determining accommodation limits of ejection seat aircraft. The results of these

  1. How well can a convection-permitting climate model reproduce decadal statistics of precipitation, temperature and cloud characteristics?

    Science.gov (United States)

    Brisson, Erwan; Van Weverberg, Kwinten; Demuzere, Matthias; Devis, Annemarie; Saeed, Sajjad; Stengel, Martin; van Lipzig, Nicole P. M.

    2016-11-01

    Convection-permitting climate models are promising tools for improved representation of extremes, but the number of regions for which these models have been evaluated is still too limited to draw robust conclusions. In addition, an integrated interpretation of near-surface characteristics (typically temperature and precipitation) together with cloud properties is still limited. The objective of this paper is to comprehensively evaluate the performance of a state-of-the-art regional convection-permitting climate model for a mid-latitude coastal region with little orographic forcing. For this purpose, an 11-year integration with the COSMO-CLM model at Convection-Permitting Scale (CPS), using a grid spacing of 2.8 km, was compared with in-situ and satellite-based observations of precipitation, temperature, cloud properties and radiation (both at the surface and at the top of the atmosphere). CPS clearly improves the representation of precipitation, especially the diurnal cycle, intensity and spatial distribution of hourly precipitation. Improvements in the representation of temperature are less obvious. In fact, the CPS integration overestimates both low and high temperature extremes. The underlying cause of the overestimation of high temperature extremes was attributed to deficiencies in the cloud properties: the modelled cloud fraction is only 46 %, whereas a cloud fraction of 65 % was observed. Surprisingly, the effect of this deficiency was less pronounced in the radiation balance at the top of the atmosphere due to a compensating error, in particular an overestimation of the reflectivity of clouds when they are present. Overall, a better representation of convective precipitation and a very good representation of the daily cycle in different cloud types were demonstrated. However, to overcome the remaining deficiencies, additional efforts are necessary to improve cloud characteristics in CPS. 
This will be a challenging task due to compensating deficiencies that currently

  2. Properties of galaxies reproduced by a hydrodynamic simulation

    CERN Document Server

    Vogelsberger, Mark; Springel, Volker; Torrey, Paul; Sijacki, Debora; Xu, Dandan; Snyder, Gregory F; Bird, Simeon; Nelson, Dylan; Hernquist, Lars

    2014-01-01

    Previous simulations of the growth of cosmic structures have broadly reproduced the 'cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies due to numerical inaccuracies and incomplete physical models. Moreover, because of computational constraints, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a volume of $(106.5\,{\rm Mpc})^3$. It yields a reasonable population of ellipticals and spirals, reproduces the distribution of galaxies in clusters and the statistics of hydrogen on large scales, and at the same time matches the metal and hydrogen content of galaxies on small scales.

  3. X-Ray Emitting GHz-Peaked Spectrum Galaxies: Testing a Dynamical-Radiative Model with Broad-Band Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Ostorero, L.; /Turin U. /INFN, Turin; Moderski, R.; /Warsaw, Copernicus Astron. Ctr. /KIPAC, Menlo Park; Stawarz, L.; /KIPAC, Menlo Park /Jagiellonian U., Astron. Observ.; Diaferio, A.; /Turin U. /INFN, Turin; Kowalska, I.; /Warsaw U. Observ.; Cheung, C.C.; /NASA, Goddard /Naval Research Lab, Wash., D.C.; Kataoka, J.; /Waseda U., RISE; Begelman, M.C.; /JILA, Boulder; Wagner, S.J.; /Heidelberg Observ.

    2010-06-07

    In a dynamical-radiative model we recently developed to describe the physics of compact, GHz-Peaked-Spectrum (GPS) sources, the relativistic jets propagate across the inner, kpc-sized region of the host galaxy, while the electron population of the expanding lobes evolves and emits synchrotron and inverse-Compton (IC) radiation. Interstellar-medium gas clouds engulfed by the expanding lobes, and photoionized by the active nucleus, are responsible for the radio spectral turnover through free-free absorption (FFA) of the synchrotron photons. The model provides a description of the evolution of the GPS spectral energy distribution (SED) with the source expansion, predicting significant and complex high-energy emission, from the X-ray to the {gamma}-ray frequency domain. Here, we test this model with the broad-band SEDs of a sample of eleven X-ray emitting GPS galaxies with Compact-Symmetric-Object (CSO) morphology, and show that: (i) the shape of the radio continuum at frequencies lower than the spectral turnover is indeed well accounted for by the FFA mechanism; (ii) the observed X-ray spectra can be interpreted as non-thermal radiation produced via IC scattering of the local radiation fields off the lobe particles, providing a viable alternative to the thermal, accretion-disk dominated scenario. We also show that the relation between the hydrogen column densities derived from the X-ray (N{sub H}) and radio (N{sub HI}) data of the sources is suggestive of a positive correlation, which, if confirmed by future observations, would provide further support to our scenario of high-energy emitting lobes.

  4. Synchronous Chaos and Broad Band Gamma Rhythm in a Minimal Multi-Layer Model of Primary Visual Cortex

    Science.gov (United States)

    Battaglia, Demian; Hansel, David

    2011-01-01

    Visually induced neuronal activity in V1 displays a marked gamma-band component which is modulated by stimulus properties. It has been argued that synchronized oscillations contribute to this gamma-band activity. However, analysis of Local Field Potentials (LFPs) across different experiments reveals considerable diversity in the degree of oscillatory behavior of this induced activity. Contrast-dependent power enhancements can indeed occur over a broad band in the gamma frequency range, and spectral peaks may not arise at all. Furthermore, even when oscillations are observed, they undergo temporal decorrelation over very few cycles. This is not easily accounted for in previous network models of gamma oscillations. We argue here that interactions between cortical layers can be responsible for this fast decorrelation. We study a model of a V1 hypercolumn, embedding a simplified description of the multi-layered structure of the cortex. When the stimulus contrast is low, the induced activity is only weakly synchronous and the network resonates transiently without developing collective oscillations. When the contrast is high, on the other hand, the induced activity undergoes synchronous oscillations with an irregular spatiotemporal structure expressing a synchronous chaotic state. As a consequence, the population activity undergoes fast temporal decorrelation, with concomitant rapid damping of the oscillations in LFP autocorrelograms and peak broadening in LFP power spectra. We show that the strength of the inter-layer coupling crucially affects this spatiotemporal structure. We predict that layer VI inactivation should induce global changes in the spectral properties of induced LFPs, reflecting their slower temporal decorrelation in the absence of inter-layer feedback. Finally, we argue that the mechanism underlying the emergence of synchronous chaos in our model is in fact very general. It stems from the fact that gamma oscillations induced by local delayed

  5. Synchronous chaos and broad band gamma rhythm in a minimal multi-layer model of primary visual cortex.

    Directory of Open Access Journals (Sweden)

    Demian Battaglia

    2011-10-01

    Full Text Available Visually induced neuronal activity in V1 displays a marked gamma-band component which is modulated by stimulus properties. It has been argued that synchronized oscillations contribute to this gamma-band activity. However, analysis of Local Field Potentials (LFPs) across different experiments reveals considerable diversity in the degree of oscillatory behavior of this induced activity. Contrast-dependent power enhancements can indeed occur over a broad band in the gamma frequency range, and spectral peaks may not arise at all. Furthermore, even when oscillations are observed, they undergo temporal decorrelation over very few cycles. This is not easily accounted for in previous network models of gamma oscillations. We argue here that interactions between cortical layers can be responsible for this fast decorrelation. We study a model of a V1 hypercolumn, embedding a simplified description of the multi-layered structure of the cortex. When the stimulus contrast is low, the induced activity is only weakly synchronous and the network resonates transiently without developing collective oscillations. When the contrast is high, on the other hand, the induced activity undergoes synchronous oscillations with an irregular spatiotemporal structure expressing a synchronous chaotic state. As a consequence, the population activity undergoes fast temporal decorrelation, with concomitant rapid damping of the oscillations in LFP autocorrelograms and peak broadening in LFP power spectra. We show that the strength of the inter-layer coupling crucially affects this spatiotemporal structure. We predict that layer VI inactivation should induce global changes in the spectral properties of induced LFPs, reflecting their slower temporal decorrelation in the absence of inter-layer feedback. Finally, we argue that the mechanism underlying the emergence of synchronous chaos in our model is in fact very general. It stems from the fact that gamma oscillations induced by

  6. Modeling the spectral-energy-distribution of 3C 454.3 in a "flat" broad-line-region scenario

    CERN Document Server

    Lei, Maichang

    2014-01-01

    The broad-line region (BLR) of flat-spectrum radio quasars (FSRQs) could have a "flat" geometrical structure that allows GeV gamma-ray photons to escape and produce the observed gamma-ray flares on short timescales. In this paper, we collect the quasi-simultaneous spectral energy distributions (SEDs) of the FSRQ 3C 454.3 obtained by the multi-wavelength campaigns spanning from 2007 July to 2011 January, and use a model with the "flat" BLR structure, the accretion disc and the dust torus to explain the SEDs of gamma-ray outbursts. We obtain the following results: (i) the jet is almost in equipartition between magnetic and particle energy densities during the outbursts; (ii) when the emitting region is located inside the cavity of the BLR, the covering factor $f_{\rm BLR}$ of the BLR is very small; as the emitting region moves into the BLR structure, $f_{\rm BLR}$ increases; (iii) the aperture angle $\alpha$ describing the BLR structure is about $45^{\circ}$; (iv) the central black hole (BH) mass is about $5\times 10^{...

  7. A two-stage unsupervised learning algorithm reproduces multisensory enhancement in a neural network model of the corticotectal system.

    Science.gov (United States)

    Anastasio, Thomas J; Patton, Paul E

    2003-07-30

    Multisensory enhancement (MSE) is the augmentation of the response to sensory stimulation of one modality by stimulation of a different modality. It has been described for multisensory neurons in the deep superior colliculus (DSC) of mammals, which function to detect, and direct orienting movements toward, the sources of stimulation (targets). MSE would seem to improve the ability of DSC neurons to detect targets, but many mammalian DSC neurons are unimodal. MSE requires descending input to DSC from certain regions of parietal cortex. Paradoxically, the descending projections necessary for MSE originate from unimodal cortical neurons. MSE, and the puzzling findings associated with it, can be simulated using a model of the corticotectal system. In the model, a network of DSC units receives primary sensory input that can be augmented by modulatory cortical input. Connection weights from primary and modulatory inputs are trained in stages one (Hebb) and two (Hebb-anti-Hebb), respectively, of an unsupervised two-stage algorithm. Two-stage training causes DSC units to extract information concerning simulated targets from their inputs. It also causes the DSC to develop a mixture of unimodal and multisensory units. The percentage of DSC multisensory units is determined by the proportion of cross-modal targets and by primary input ambiguity. Multisensory DSC units develop MSE, which depends on unimodal modulatory connections. Removal of the modulatory influence greatly reduces MSE but has little effect on DSC unit responses to stimuli of a single modality. The correspondence between model and data suggests that two-stage training captures important features of self-organization in the real corticotectal system.
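
    The two-stage scheme — a plain Hebb rule for the primary weights in stage one, then a Hebb/anti-Hebb rule for the modulatory weights in stage two — can be sketched as follows. The network size, learning rate, use of Oja-style decay as the anti-Hebbian term, and the toy input pattern are all assumptions for illustration, not the authors' implementation.

    ```python
    import random

    random.seed(0)
    N_IN, N_OUT, LR = 8, 4, 0.05

    # W: primary (driving) weights, trained in stage one with a plain Hebb rule.
    # M: modulatory (cortical) weights, trained in stage two with a Hebb term
    #    plus an anti-Hebbian (Oja-style) decay for uncorrelated inputs.
    W = [[random.gauss(0, 0.1) for _ in range(N_IN)] for _ in range(N_OUT)]
    M = [[random.gauss(0, 0.1) for _ in range(N_IN)] for _ in range(N_OUT)]

    def response(weights, x):
        return [sum(w * xi for w, xi in zip(row, x)) for row in weights]

    def stage_one(x):
        """Stage 1: Hebbian update of W, with row normalization to keep weights bounded."""
        y = response(W, x)
        for i in range(N_OUT):
            row = [w + LR * y[i] * xi for w, xi in zip(W[i], x)]
            norm = sum(w * w for w in row) ** 0.5 or 1.0
            W[i] = [w / norm for w in row]

    def stage_two(x):
        """Stage 2: Hebb/anti-Hebb update of M; W is frozen and only modulates."""
        y = response(W, x)
        for i in range(N_OUT):
            M[i] = [m + LR * (y[i] * xi - y[i] * y[i] * m)
                    for m, xi in zip(M[i], x)]

    # Train on a noisy, repeated toy 'target' pattern (inputs 0, 1, 6 active).
    pattern = [1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0]
    for _ in range(200):
        stage_one([p + random.gauss(0, 0.05) for p in pattern])
    for _ in range(200):
        stage_two([p + random.gauss(0, 0.05) for p in pattern])
    ```

    After training, the modulatory weights for inputs correlated with the primary response grow, while those for uncorrelated inputs decay — the property that, in the paper's model, lets removal of the modulatory input abolish MSE without much affecting unimodal responses.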

  8. Ultra-broad-band electrical spectroscopy of soils and sediments—a combined permittivity and conductivity model

    Science.gov (United States)

    Loewer, M.; Günther, T.; Igel, J.; Kruschwitz, S.; Martin, T.; Wagner, N.

    2017-09-01

    We combined two completely different methods of measuring the frequency-dependent electrical properties of moist porous materials in order to obtain an extraordinarily broad frequency spectrum. In the low-frequency (LF) range, the complex electrical resistivity between 1 mHz and 45 kHz was measured for three different soils and a sandstone, using the spectral induced polarization (SIP) method with a four-electrode cell. In the high-frequency (HF) radio to microwave range, the complex dielectric permittivity was measured between 1 MHz and 10 GHz for the same samples using dielectric spectroscopy by means of the coaxial transmission line technique. The combined data sets cover 13 orders of magnitude and were transferred into their equivalent expressions: the complex effective dielectric permittivity and the complex effective electrical conductivity. We applied the Kramers-Kronig relation in order to justify the validity of the data combination. A new phenomenological model that consists of both dielectric permittivity and electrical conductivity terms in a Debye- and Cole-Cole-type manner was fitted to the spectra. The combined permittivity and conductivity model accounts for the most common representations of the physical quantities with respect to the individual measuring methods. Up to four relaxation processes were identified in the analysed frequency range. Among these are the free water relaxation, different interfacial relaxation processes (the Maxwell-Wagner effect, the counterion relaxation in the electrical double layer) and the direct-current electrical conductivity. There is evidence that the free water relaxation does not affect the electrical response in the SIP range. Moreover, the direct-current conductivity contribution (bulk and interfacial) dominates the losses in the HF range. Interfacial relaxation processes with relaxations in the HF range are broadly distributed down to the LF range. The slowest observed process in the LF range has a minor contribution to the HF
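
    The kind of combined model fitted here can be illustrated with a single Cole-Cole relaxation term plus a direct-current conductivity loss — a one-process sketch of the up-to-four-process model described in the abstract. All parameter values below are placeholders, not fitted values from the study.

    ```python
    import math

    EPS0 = 8.854e-12  # vacuum permittivity, F/m

    def cole_cole_eff(f_hz, eps_inf, d_eps, tau, alpha, sigma_dc):
        """Complex effective relative permittivity: one Cole-Cole relaxation
        (Debye when alpha = 0) plus a direct-current conductivity loss term."""
        w = 2 * math.pi * f_hz
        relax = d_eps / (1 + (1j * w * tau) ** (1 - alpha))
        return eps_inf + relax - 1j * sigma_dc / (w * EPS0)

    # Evaluate over the combined SIP + dielectric-spectroscopy band, 1 mHz .. 10 GHz.
    freqs = [10.0 ** k for k in range(-3, 11)]
    spectrum = [cole_cole_eff(f, eps_inf=5.0, d_eps=20.0, tau=1e-9,
                              alpha=0.1, sigma_dc=0.01) for f in freqs]
    ```

    The real part steps down from eps_inf + d_eps to eps_inf across the relaxation, while the DC term makes the imaginary (loss) part diverge toward low frequencies — the behavior that lets the same expression cover both the SIP and HF data.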

  9. A first attempt to reproduce basaltic soil chronosequences using a process-based soil profile model: implications for our understanding of soil evolution

    Science.gov (United States)

    Johnson, M.; Gloor, M.; Lloyd, J.

    2012-04-01

    Soils are complex systems which hold a wealth of information on both current and past conditions and many biogeochemical processes. The ability to model soil-forming processes and predict soil properties will enable us to quantify such conditions and contribute to our understanding of long-term biogeochemical cycles, particularly the carbon cycle and plant nutrient cycles. However, attempts to confront such soil model predictions with data are rare, although more and more data from chronosequence studies are becoming available for this purpose. Here we present initial results of an attempt to reproduce soil properties with a process-based soil evolution model similar to the model of Kirkby (1985, J. Soil Science). We specifically focus on the basaltic soils of Hawaii and of north Queensland, Australia. These soils are formed on a series of volcanic lava flows which provide sequences of differently aged soils, all with a relatively uniform parent material. These soil chronosequences provide snapshots of a soil profile during different stages of development. Steep rainfall gradients in these regions also provide a system which allows us to test the model's ability to reproduce soil properties under differing climates. The mechanistic soil evolution model presented here includes the major processes of soil formation: i) mineral weathering, ii) percolation of rainfall through the soil, iii) leaching of solutes out of the soil profile, iv) surface erosion and v) vegetation and biotic interactions. The model consists of a vertical profile and assumes a simple geometry with a constantly sloping surface. The timescales of interest are on the order of tens to hundreds of thousands of years. The specific properties the model predicts are soil depth, the proportion of original elemental oxides remaining in each soil layer, pH of the soil solution, organic carbon distribution, and CO2 production and concentration. 
The presentation will focus on a brief introduction of the

  10. Decadal Variability Shown by the Arctic Ocean Hydrochemical Data and Reproduced by an Ice-Ocean Model

    Institute of Scientific and Technical Information of China (English)

    M. Ikeda; R. Colony; H. Yamaguchi; T. Ikeda

    2005-01-01

    The Arctic is experiencing a significant warming trend as well as a decadal oscillation. The atmospheric circulation represented by the Polar Vortex and the sea ice cover show decadal variability, while it has been difficult to reveal the decadal oscillation from the ocean interior. The recently distributed Russian hydrochemical data collected in the Arctic Basin provide useful information on ocean interior variability. Silicate provides the most valuable data for locating the boundary between the silicate-rich Pacific Water and the silicate-poor Atlantic Water; here, it is assumed that the silicate distribution is only weakly influenced by seasonal biological productivity and Siberian river outflow. Silicate shows a clear maximum around 100 m depth in the Canada Basin, along with a vertical gradient below 100 m, which provides information on the vertical motion of the upper boundary of the Atlantic Water at a decadal time scale. The boundary shifts upward (downward), as reflected in a silicate decrease (increase) at a fixed depth, in response to a more intense (weaker) Polar Vortex or a positive (negative) phase of the Arctic Oscillation. A coupled ice-ocean model is employed to reconstruct this decadal oscillation.

  11. Reproducibility of haemodynamical simulations in a subject-specific stented aneurysm model--a report on the Virtual Intracranial Stenting Challenge 2007.

    Science.gov (United States)

    Radaelli, A G; Augsburger, L; Cebral, J R; Ohta, M; Rüfenacht, D A; Balossino, R; Benndorf, G; Hose, D R; Marzo, A; Metcalfe, R; Mortier, P; Mut, F; Reymond, P; Socci, L; Verhegghe, B; Frangi, A F

    2008-07-19

    This paper presents the results of the Virtual Intracranial Stenting Challenge (VISC) 2007, an international initiative whose aim was to establish the reproducibility of state-of-the-art haemodynamical simulation techniques in subject-specific stented models of intracranial aneurysms (IAs). IAs are pathological dilatations of the cerebral artery walls, which are associated with high mortality and morbidity rates due to subarachnoid haemorrhage following rupture. The deployment of a stent as flow diverter has recently been indicated as a promising treatment option, which has the potential to protect the aneurysm by reducing the action of haemodynamical forces and facilitating aneurysm thrombosis. The direct assessment of changes in aneurysm haemodynamics after stent deployment is hampered by limitations in existing imaging techniques and currently requires resorting to numerical simulations. Numerical simulations also have the potential to assist in the personalized selection of an optimal stent design prior to intervention. However, from the current literature it is difficult to assess the level of technological advancement and the reproducibility of haemodynamical predictions in stented patient-specific models. The VISC 2007 initiative engaged in the development of a multicentre-controlled benchmark to analyse differences induced by diverse grid generation and computational fluid dynamics (CFD) technologies. The challenge also represented an opportunity to provide a survey of technologies currently adopted by international teams from both academic and industrial institutions for constructing computational models of stented aneurysms. The results demonstrate the ability of current strategies to consistently quantify the performance of three commercial intracranial stents, and help reinforce confidence in haemodynamical simulation, thus taking a step forward towards the introduction of simulation tools to support diagnostics and

  12. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Directory of Open Access Journals (Sweden)

    Rania M Nada

    Full Text Available Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three-dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration was tested in this study. Sixteen pairs of 3D CBCT models were constructed from pre- and post-treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the two models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged from 0.12 to 0.19 mm at the 4 regions. Voxel based image registration on both zones could be considered an accurate and reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field-of-view CBCT scans where the anterior cranial base is not visible.
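
    The agreement metric used here — the mean absolute distance between two superimposed surface models — can be sketched as a nearest-neighbour average over surface points. This is a brute-force illustration; the study's actual computation over registered CBCT surfaces may differ.

    ```python
    import math

    def mean_absolute_distance(model_a, model_b):
        """Mean, over points of model_a, of the Euclidean distance to the
        nearest point of model_b (brute force; adequate for small meshes)."""
        return sum(min(math.dist(p, q) for q in model_b)
                   for p in model_a) / len(model_a)

    # Two toy 'surfaces': b is a copy of a shifted 0.3 mm along z,
    # mimicking a small residual offset after registration.
    a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
    b = [(x, y, z + 0.3) for x, y, z in a]
    ```

    For real meshes a spatial index (e.g. a k-d tree) replaces the inner `min` loop, but the metric itself is just this average of closest-point distances.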

  13. Adaptation and Validation of QUick, Easy, New, CHEap, and Reproducible (QUENCHER) Antioxidant Capacity Assays in Model Products Obtained from Residual Wine Pomace.

    Science.gov (United States)

    Del Pino-García, Raquel; García-Lomillo, Javier; Rivero-Pérez, María D; González-SanJosé, María L; Muñiz, Pilar

    2015-08-12

    Evaluation of the total antioxidant capacity of solid matrices without extraction steps is a very interesting alternative for food researchers and also for food industries. These methodologies have been denominated QUENCHER, from QUick, Easy, New, CHEap, and Reproducible assays. To demonstrate and highlight the validity of QUENCHER (Q) methods, Q-method validation values were shown for the first time, and they were tested with products of well-known, different chemical properties. Furthermore, new QUENCHER assays to measure scavenging capacity against superoxide, hydroxyl, and lipid peroxyl radicals were developed. Calibration models showed good linearity (R(2) > 0.995), proportionality and precision (CV antioxidant capacity values significantly different from those obtained with water. The dilution of samples with powdered cellulose was discouraged because interferences with some of the matrices analyzed may take place.

  14. Compliant bipedal model with the center of pressure excursion associated with oscillatory behavior of the center of mass reproduces the human gait dynamics.

    Science.gov (United States)

    Jung, Chang Keun; Park, Sukyung

    2014-01-03

    Although the compliant bipedal model could reproduce the qualitative ground reaction force (GRF) of human walking, the model with a fixed pivot overestimated stance leg rotation and the ratio of horizontal to vertical GRF. Human walking data showed a continuous forward progression of the center of pressure (CoP) during the stance phase and the suspension of the CoP near the forefoot before the onset of step transition. To better describe human gait dynamics with a minimal expense of model complexity, we proposed a compliant bipedal model with an accelerated pivot, which associates the CoP excursion with the oscillatory behavior of the center of mass (CoM) using the existing simulation parameters and leg stiffness. Owing to the pivot acceleration, defined to emulate the human CoP profile, the arrival of the CoP at the limit of the stance foot over the single-stance duration initiated the step-to-step transition. The proposed model showed an improved match with walking data. As the forward motion of the CoM during single stance was partly accounted for by forward pivot translation, the previously overestimated rotation of the stance leg was reduced and the corresponding horizontal GRF became closer to human data. The walking solutions of the model ranged over higher speeds (~1.7 m/s) than those of the fixed-pivot compliant bipedal model (~1.5 m/s) and exhibited other gait parameters, such as touchdown angle, step length and step frequency, comparable to experimental observations. The good match between model and experimental GRF data implies that continuous pivot acceleration associated with the CoM oscillatory behavior could serve as a useful framework for bipedal modeling.
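
    A minimal fixed-pivot version of the compliant (spring-loaded inverted pendulum) stance leg that the paper extends can be sketched as below. Unlike the paper's model, the pivot here does not accelerate to emulate the CoP excursion, and the mass, leg stiffness and initial conditions are illustrative assumptions, not values from the study.

    ```python
    import math

    # Illustrative SLIP single-stance phase with a FIXED pivot (foot) at the
    # origin; the paper's model additionally translates the pivot forward to
    # follow the human centre-of-pressure profile.
    m, g = 70.0, 9.81          # body mass (kg), gravity (m/s^2) - assumed
    k, L0 = 15000.0, 1.0       # leg stiffness (N/m), rest leg length (m) - assumed

    def simulate(x0, y0, vx0, vy0, dt=1e-4, t_max=1.0):
        """Explicit-Euler integration of planar CoM dynamics while the leg
        spring is compressed; returns a list of (t, x, y) samples."""
        x, y, vx, vy = x0, y0, vx0, vy0
        traj, t = [], 0.0
        while t < t_max:
            L = math.hypot(x, y)        # current leg length (pivot at origin)
            if L >= L0:                 # leg back at rest length: take-off
                break
            F = k * (L0 - L)            # spring force, directed along the leg
            ax = F * (x / L) / m
            ay = F * (y / L) / m - g
            vx += ax * dt; vy += ay * dt
            x += vx * dt; y += vy * dt
            traj.append((t, x, y))
            t += dt
        return traj

    # Touchdown slightly behind the pivot, walking-like forward speed.
    traj = simulate(x0=-0.3, y0=0.92, vx0=1.3, vy0=0.0)
    ```

    In this fixed-pivot form the CoM vaults over a single contact point; the paper's contribution is precisely to accelerate that pivot during stance so the effective CoP travels toward the forefoot, reducing the stance-leg rotation this sketch would overestimate.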

  15. Reproducible research in palaeomagnetism

    Science.gov (United States)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, each making use of the same core library of palaeomagnetic functions. 
We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  16. Can a Dusty Warm Absorber Model Reproduce the Soft X-ray Spectra of MCG-6-30-15 and Mrk 766?

    CERN Document Server

    Sako, M; Branduardi-Raymont, G; Kaastra, J S; Brinkman, A C; Page, M J; Behar, E; Paerels, F B S; Kinkhabwala, A; Liedahl, D A; Den Herder, J W A

    2003-01-01

    XMM-Newton RGS spectra of MCG-6-30-15 and Mrk 766 exhibit complex discrete structure, which was interpreted in a paper by Branduardi-Raymont et al. (2001) as evidence for the existence of relativistically broadened Lyman alpha emission from carbon, nitrogen, and oxygen, produced in the innermost regions of an accretion disk around a Kerr black hole. This suggestion was subsequently criticized in a paper by Lee et al. (2001), who argued that, for MCG-6-30-15, the Chandra HETG spectrum, which partially overlaps the RGS in spectral coverage, is adequately fit by a dusty warm absorber model, with no relativistic line emission. We present a reanalysis of the original RGS data sets in terms of the Lee et al. (2001) model, and demonstrate that spectral models consisting of a smooth continuum with ionized and dust absorption alone cannot reproduce the RGS spectra of both objects. The original relativistic line model with warm absorption proposed by Branduardi-Raymont et al. (2001) provides a superior fit to the...

  17. Current status of the ability of the GEMS/MACC models to reproduce the tropospheric CO vertical distribution as measured by MOZAIC

    Directory of Open Access Journals (Sweden)

    N. Elguindi

    2010-10-01

    Full Text Available Vertical profiles of CO taken from the MOZAIC aircraft database are used to globally evaluate the performance of the GEMS/MACC models, including the ECMWF Integrated Forecasting System (IFS) coupled to the CTM MOZART-3 with 4DVAR data assimilation, for the year 2004. This study provides a unique opportunity to compare the performance of three offline CTMs (MOZART-3, MOCAGE and TM5) driven by the same meteorology as well as one coupled atmosphere/CTM model run with data assimilation, enabling us to assess the potential gain brought by the combination of online transport and 4DVAR chemical satellite data assimilation.

    First, we present a global analysis of observed CO seasonal averages and interannual variability for the years 2002–2007. Results show that despite the intense boreal forest fires that occurred during the summer in Alaska and Canada, the year 2004 had comparatively low tropospheric CO concentrations. Next, we present a validation of CO estimates produced by the MACC models for 2004, including an assessment of their ability to transport pollutants originating from the Alaskan/Canadian wildfires. In general, all the models tend to underestimate CO. The coupled model and the CTMs perform best in Europe and the US, where biases range from 0 to -25% in the free troposphere and from 0 to -50% at the surface and in the boundary layer (BL). Using the 4DVAR technique to assimilate MOPITT V4 CO significantly reduces biases by up to 50% in most regions. However, none of the models, even the IFS-MOZART-3 coupled model with assimilation, is able to reproduce well the CO plumes originating from the Alaskan/Canadian wildfires at downwind locations in the eastern US and Europe. Sensitivity tests reveal that deficiencies in the fire emissions inventory and injection height play a role.

  18. Opening Reproducible Research

    Science.gov (United States)

    Nüst, Daniel; Konkol, Markus; Pebesma, Edzer; Kray, Christian; Klötgen, Stephanie; Schutzeichel, Marc; Lorenz, Jörg; Przibytzin, Holger; Kussmann, Dirk

    2016-04-01

Open access is not only a form of publishing such that research papers become available to the large public free of charge; it also refers to a trend in science toward the act of doing research becoming more open and transparent. When science transforms to open access, we mean not only access to papers, to research data being collected, or to data being generated, but also access to the data used and the procedures carried out in the research paper. Increasingly, scientific results are generated by numerical manipulation of data that were already collected, and may involve simulation experiments that are carried out completely computationally. Reproducibility of research findings, the ability to repeat experimental procedures and confirm previously found results, is at the heart of the scientific method (Pebesma, Nüst and Bivand, 2012). As opposed to the collection of experimental data in labs or nature, computational experiments lend themselves very well to reproduction. Some of the reasons why scientists do not publish data and computational procedures that allow reproduction will be hard to change, e.g. privacy concerns in the data, or fear of embarrassment or of losing a competitive advantage. Other reasons, however, involve technical aspects, including the lack of standard procedures to publish such information and the lack of benefits after publishing it. We aim to resolve these two technical aspects. We propose a system that supports the evolution of scientific publications from static papers into dynamic, executable research documents. The DFG-funded experimental project Opening Reproducible Research (ORR) aims for the main aspects of open access, by improving the exchange of, by facilitating productive access to, and by simplifying reuse of research results that are published over the Internet. Central to the project is a new form for creating and providing research results, the executable research compendium (ERC), which not only enables third parties to

  19. Reproducing the observed energy-dependent structure of Earth's electron radiation belts during storm recovery with an event-specific diffusion model

    Science.gov (United States)

    Ripoll, J.-F.; Reeves, G. D.; Cunningham, G. S.; Loridan, V.; Denton, M.; Santolík, O.; Kurth, W. S.; Kletzing, C. A.; Turner, D. L.; Henderson, M. G.; Ukhorskiy, A. Y.

    2016-06-01

We present dynamic simulations of energy-dependent losses in the radiation belt "slot region" and the formation of the two-belt structure for the quiet days after the 1 March storm. The simulations combine radial diffusion with a realistic scattering model based on data-driven, spatially and temporally resolved whistler-mode hiss wave observations from the Van Allen Probes satellites. The simulations reproduce Van Allen Probes observations for all energies and L shells (2-6), including (a) the strong energy dependence of the radiation belt dynamics, (b) an energy-dependent outer boundary to the inner zone that extends to higher L shells at lower energies, and (c) an "S-shaped" energy-dependent inner boundary to the outer zone that results from the competition between diffusive radial transport and losses. We find that the characteristic energy-dependent structure of the radiation belts and slot region is dynamic and can be formed gradually in ~15 days, although the "S shape" can also be reproduced by assuming equilibrium conditions. The highest-energy electrons (E > 300 keV) of the inner region of the outer belt (L ~ 4-5) also constantly decay, demonstrating that hiss wave scattering affects the outer belt during times of extended plasmasphere. Through these simulations, we explain the full structure in energy and L shell of the belts and the slot formation by hiss scattering during storm recovery. We show the power and complexity of looking dynamically at the effects over all energies and L shells and the need for using data-driven and event-specific conditions.
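The competition between radial transport and hiss-driven loss described in this record can be illustrated with a minimal one-dimensional radial diffusion sketch. The diffusion coefficient `D_LL`, the loss lifetimes `tau`, and the initial profile below are illustrative placeholders, not the event-specific, data-driven coefficients used in the study:

```python
import numpy as np

# Minimal sketch of 1-D radial diffusion with an L-dependent loss term:
#   df/dt = L^2 d/dL ( D_LL / L^2 df/dL ) - f / tau
# Coefficients here are hypothetical placeholders for illustration only.

L = np.linspace(2.0, 6.0, 81)          # L-shell grid
dL = L[1] - L[0]
f = np.exp(-((L - 5.0) / 0.5) ** 2)    # initial phase-space density (arbitrary units)

D_LL = 1e-3 * (L / 4.0) ** 6           # placeholder diffusion coefficient [1/day]
tau = np.where(L < 3.5, 5.0, 50.0)     # faster hiss-driven loss in the slot [days]

dt = 0.01                               # time step [days]
for _ in range(1500):                   # integrate for ~15 days
    # Flux (D_LL / L^2) * df/dL evaluated at grid midpoints
    df = np.diff(f) / dL
    Dh = 0.5 * (D_LL[1:] + D_LL[:-1]) / (0.5 * (L[1:] + L[:-1])) ** 2
    flux = Dh * df
    # Diffusion term for interior points, plus the loss term
    interior = L[1:-1] ** 2 * np.diff(flux) / dL
    f[1:-1] += dt * (interior - f[1:-1] / tau[1:-1])
    f[0] = f[1]                         # zero-gradient inner boundary; outer value held fixed

print(f.min(), f.max())
```

With these placeholder coefficients the profile decays fastest inside the slot (L < 3.5) while the outer-belt peak both spreads radially and slowly erodes, the qualitative behaviour the abstract attributes to hiss scattering during extended-plasmasphere conditions.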

  20. Current status of the ability of the GEMS/MACC models to reproduce the tropospheric CO vertical distribution as measured by MOZAIC

    Directory of Open Access Journals (Sweden)

    N. Elguindi

    2010-04-01

Vertical profiles of CO taken from the MOZAIC aircraft database are used to present (1) a global analysis of CO seasonal averages and interannual variability for the years 2002–2007 and (2) a global validation of CO estimates produced by the MACC models for 2004, including an assessment of their ability to transport pollutants originating from the Alaskan/Canadian wildfires. Seasonal averages and interannual variability from several MOZAIC sites representing different regions of the world show that CO concentrations are highest and most variable during the winter season. The inter-regional variability is significant, with concentrations increasing eastward from Europe to Japan. The impact of the intense boreal fires, particularly in Russia, during the fall of 2002 on Northern Hemisphere CO concentrations throughout the troposphere is well represented by the MOZAIC data.

A global validation of the GEMS/MACC GRG models, which include three stand-alone CTMs (MOZART, MOCAGE and TM5) and the coupled ECMWF Integrated Forecasting System (IFS)/MOZART model with and without MOPITT CO data assimilation, shows that the models have a tendency to underestimate CO. The models perform best in Europe and the US, where biases range from 0 to –25% in the free troposphere and from 0 to –50% at the surface and in the boundary layer (BL). The biases are largest in the winter and during the daytime, when emissions are highest, indicating that current inventories are too low. Data assimilation is shown to reduce biases by up to 25% in some regions. The models are not able to reproduce well the CO plumes originating from the Alaskan/Canadian wildfires at downwind locations in the eastern US and Europe, not even with assimilation. Sensitivity tests reveal that this is mainly due to deficiencies in the fire emissions inventory and injection height.

  1. Reproducible research in vadose zone sciences

    Science.gov (United States)

    A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...

  2. Multiwavelength campaign on Mrk 509 XV. A global modeling of the broad emission lines in the Optical, UV and X-ray bands

    CERN Document Server

    Costantini, E; Kaastra, J S; Bianchi, S; Branduardi-Raymont, G; Cappi, M; De Marco, B; Ebrero, J; Mehdipour, M; Petrucci, P -O; Paltani, S; Ponti, G; Steenbrugge, K C; Arav, N

    2016-01-01

We model the broad emission lines present in the optical, UV and X-ray spectra of Mrk 509, a bright type 1 Seyfert galaxy. The broad lines were simultaneously observed during a large multiwavelength campaign, using the XMM-Newton OM for the optical lines, HST-COS for the UV lines, and XMM-Newton RGS and EPIC for the X-ray lines. We also used FUSE archival data for the broad lines observed in the far-ultraviolet. The goal is to find a physical connection among the lines measured at different wavelengths and to determine the size and the distance from the central source of the emitting gas components. We used the "locally optimally emitting cloud" (LOC) model, which interprets the emissivity of the broad line region (BLR) as regulated by power-law distributions of both gas density and distance from the central source. We find that one LOC component cannot model all the lines simultaneously. In particular, we find that the X-ray and UV lines likely originate in the more internal part of the AGN, at ...

  3. Competitive exclusion over broad spatial extents is a slow process: Evidence and implications for species distribution modeling

    Science.gov (United States)

    Yackulic, Charles B.

    2016-01-01

There is considerable debate about the role of competition in shaping species distributions over broad spatial extents. This debate has practical implications because predicting changes in species' geographic ranges in response to ongoing environmental change would be simpler if competition could be ignored. While this debate has been the subject of many reviews, recent literature has not addressed the rates of relevant processes. This omission is surprising in that ecologists hypothesized decades ago that regional competitive exclusion is a slow process. The goal of this review is to reassess the debate under the hypothesis that competitive exclusion over broad spatial extents is a slow process. Available evidence, including simulations presented for the first time here, suggests that competitive exclusion over broad spatial extents occurs slowly, over temporal extents of many decades to millennia. Ecologists arguing against an important role for competition frequently study modern patterns and/or range dynamics over periods of decades, while much of the evidence for competition shaping geographic ranges at broad spatial extents comes from paleoecological studies over time scales of centuries or longer. If competition is slow, as evidence suggests, the geographic distributions of some, perhaps many, species would continue to change over time scales of decades to millennia, even if environmental conditions did not continue to change. If the distributions of competing species are at equilibrium, it is possible to predict species distributions based on observed species–environment relationships. However, disequilibrium is widespread as a result of competition and many other processes. Studies whose goal is accurate predictions over intermediate time scales (decades to centuries) should focus on factors associated with range expansion (colonization) and loss (local extinction), as opposed to current patterns. In general, understanding of modern range dynamics would be

  4. Crustal deformation in the south-central Andes backarc terranes as viewed from regional broad-band seismic waveform modelling

    Science.gov (United States)

    Alvarado, Patricia; Beck, Susan; Zandt, George; Araujo, Mario; Triep, Enrique

    2005-11-01

The convergence between the Nazca and South America tectonic plates generates a seismically active backarc region near 31°S. Earthquake locations define the subhorizontal subducted oceanic Nazca plate at depths of 90-120 km. Another seismic region is located within the continental upper plate, with crustal events concentrated in the Sierras Pampeanas; this seismicity is responsible for the large earthquakes that have caused major human and economic losses in Argentina. South of 33°S, the intense shallow continental seismicity is more restricted to the main cordillera, over a region where the subducted Nazca plate starts to incline more steeply and there is an active volcanic arc. We operated a portable broad-band seismic network as part of the Chile-Argentina Geophysical Experiment (CHARGE) from 2000 December to 2002 May. We have studied crustal earthquakes that occurred in the back arc and under the main cordillera in the south-central Andes (29°S-36°S) recorded by the CHARGE network. We obtained the focal mechanisms and source depths for 27 crustal earthquakes (Mw 3.5 and larger). Seismicity in the Sierras Pampeanas, over the flat-slab segment, is dominated by reverse and thrust fault-plane solutions located at an average source depth of 20 km. One moderate-sized earthquake (event 02-117) is very likely related to the northern part of the Precordillera and the Sierras Pampeanas terrane boundary. Another event located near Mendoza at a greater depth (~26 km) (event 02-005) could also be associated with the same ancient suture. We found strike-slip focal mechanisms in the eastern Sierras Pampeanas and under the main cordillera with shallower focal depths of ~5-7 km. Overall, the western part of the entire region is more seismically active than the eastern part. We postulate that this is related to the presence of different pre-Andean geological terranes. We also find evidence for different average crustal models for those terranes. Better-fitting synthetic seismograms result using a higher P-wave velocity, a smaller average S-wave velocity and a

  5. Modelling photometric reverberation data -- a disk-like broad line region and larger black hole mass for 3C120

    CERN Document Server

    Nuñez, F Pozo; Ramolla, M; Westhues, C; Haas, M; Chini, R; Steenbrugge, K; Lemke, R; Murphy, M

    2013-01-01

We consider photometric reverberation mapping, where the nuclear continuum variations are monitored via a broad band filter and the echo of emission line clouds of the broad line region (BLR) is measured with a suitable narrow band (NB) filter. We investigate how an incomplete emission line coverage by the NB filter influences the BLR size determination. This includes two basic cases: 1) a symmetric cut of the blue and red parts of the line wings, and 2) the filter positioned asymmetrically to the line center so that essentially a complete half of the emission line is contained in the NB filter. We find that symmetric cutting of the line wings may lead to an overestimate of the BLR size of less than 5%. The case of asymmetric line coverage, as for our data of the Seyfert 1 galaxy 3C120, yields the BLR size with less than 1% bias. Our results suggest that any BLR size bias due to narrow-band line cuts in photometric reverberation mapping is small and in most cases negligible. We use well sampled photometric re...

  6. A broad model for demand forecasting of gasoline and fuel alcohol; Um modelo abrangente para a projecao das demandas de gasolina e alcool carburante

    Energy Technology Data Exchange (ETDEWEB)

    Buonfiglio, Antonio [PETROBRAS, Paulinia, SP (Brazil). Dept. Industrial; Bajay, Sergio Valdir [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Mecanica

    1991-12-31

Formulating a broad, mixed econometric/end-use demand forecasting model for gasoline and fuel alcohol is the main objective of this work. In the model, the gasoline and hydrated alcohol demands are calculated as the products of the corresponding fleets and the average car mileage, divided by the average specific mileage. Several simulations with the proposed forecasting model are carried out, within the context of alternative scenarios for the development of these competing fuels in the Brazilian market. (author) 4 refs., 1 fig., 3 tabs.
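The accounting identity at the core of the model can be sketched as follows: annual fuel demand equals the fleet size times the average distance driven per car, divided by the average specific mileage (km per litre). The figures used below are hypothetical, not taken from the paper:

```python
def fuel_demand(fleet_size, km_per_car_year, km_per_litre):
    """Annual fuel demand in litres: fleet size * average mileage / specific mileage."""
    return fleet_size * km_per_car_year / km_per_litre

# Illustrative, hypothetical figures only:
gasoline_litres = fuel_demand(fleet_size=10_000_000,
                              km_per_car_year=12_000,
                              km_per_litre=10.0)
print(gasoline_litres)  # 12000000000.0
```

In the full model the fleet sizes themselves would be projected per scenario (the econometric part), with the per-vehicle terms supplying the end-use detail.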

  7. Safety and Reproducibility of a Clinical Trial System Using Induced Blood Stage Plasmodium vivax Infection and Its Potential as a Model to Evaluate Malaria Transmission

    Science.gov (United States)

    Elliott, Suzanne; Sekuloski, Silvana; Sikulu, Maggy; Hugo, Leon; Khoury, David; Cromer, Deborah; Davenport, Miles; Sattabongkot, Jetsumon; Ivinson, Karen; Ockenhouse, Christian; McCarthy, James

    2016-01-01

Background: Interventions to interrupt transmission of malaria from humans to mosquitoes represent an appealing approach to assist malaria elimination. A limitation has been the lack of systems to test the efficacy of such interventions before proceeding to efficacy trials in the field. We have previously demonstrated the feasibility of induced blood stage malaria (IBSM) infection with Plasmodium vivax. In this study, we report further validation of the IBSM model, and its evaluation for assessment of transmission of P. vivax to Anopheles stephensi mosquitoes.

Methods: Six healthy subjects (three cohorts, n = 2 per cohort) were infected with P. vivax by inoculation with parasitized erythrocytes. Parasite growth was monitored by quantitative PCR, and gametocytemia by quantitative reverse transcriptase PCR (qRT-PCR) for the mRNA pvs25. The parasite multiplication rate (PMR) and size of inoculum were calculated by linear regression. Mosquito transmission studies were undertaken by direct and membrane feeding assays over the 3 days prior to commencement of antimalarial treatment, and the midguts of blood-fed mosquitoes were dissected and checked for the presence of oocysts after 7–9 days.

Results: The clinical course and parasitemia were consistent across cohorts, with all subjects developing mild to moderate symptoms of malaria. No serious adverse events were reported. Asymptomatic elevated liver function tests were detected in four of six subjects; these resolved without treatment. Direct feeding of mosquitoes was well tolerated. The estimated PMR was 9.9-fold per cycle. A low prevalence of mosquito infection was observed (1.8%; n = 32/1801) from both direct (4.5%; n = 20/411) and membrane (0.9%; n = 12/1360) feeds.

Conclusion: The P. vivax IBSM model proved safe and reliable. The clinical course and PMR were reproducible when compared with the previous study using this model. The IBSM model presented in this report shows promise as a system to test transmission-blocking interventions
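The PMR estimate by linear regression can be sketched as a fit of log parasitemia against time, converting the slope to growth per asexual cycle (~48 h for P. vivax). The qPCR counts below are hypothetical, not trial data:

```python
import numpy as np

# Hypothetical qPCR parasitemia time series (parasites/mL); not trial data.
t_days = np.array([4.0, 6.0, 8.0, 10.0, 12.0])
parasites = np.array([50.0, 500.0, 5_000.0, 50_000.0, 500_000.0])

# Fit log10(parasitemia) against time; the slope is in decades per day.
slope, intercept = np.polyfit(t_days, np.log10(parasites), 1)

# Assuming a ~48-h asexual cycle, PMR = fold growth over 2 days.
pmr = 10 ** (2 * slope)
print(round(pmr, 1))  # 10.0 for this perfectly log-linear example
```

With real data the regression would be applied to noisy counts per subject, and the intercept extrapolated back to inoculation gives the inoculum size estimate mentioned in the abstract.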

  8. A broad scope knowledge based model for optimization of VMAT in esophageal cancer: validation and assessment of plan quality among different treatment centers.

    Science.gov (United States)

    Fogliata, Antonella; Nicolini, Giorgia; Clivio, Alessandro; Vanetti, Eugenio; Laksar, Sarbani; Tozzi, Angelo; Scorsetti, Marta; Cozzi, Luca

    2015-10-31

To evaluate the performance of a broad-scope, model-based optimisation process for volumetric modulated arc therapy applied to esophageal cancer. A set of 70 previously treated patients from two different institutions was selected to train a model for the prediction of dose-volume constraints. The model was built with a broad-scope purpose, aiming to be effective for different dose prescriptions and tumour localisations. It was validated on three groups of patients, from the same institution and from another clinic that did not provide patients for the training phase. Comparison of the automated plans was done against reference cases given by the clinically accepted plans. Quantitative improvements (statistically significant for the majority of the analysed dose-volume parameters) were observed between the benchmark and the test plans. Of 624 dose-volume objectives assessed for plan evaluation, in 21 cases (3.3%) the reference plans failed to respect the constraints while the model-based plans succeeded. Only in 3 cases (<0.5%) did the reference plans pass the criteria while the model-based plans failed. In 5.3% of the cases both groups of plans failed, and in the remaining cases both passed the tests. Plans were optimised using a broad-scope knowledge-based model to determine the dose-volume constraints. The results showed dosimetric improvements when compared to the benchmark data. In particular, the plans optimised for patients from the third centre, which did not participate in the training, were of superior quality. The data suggest that the new engine is reliable and could encourage its application to clinical practice.

  9. Osteolytica: An automated image analysis software package that rapidly measures cancer-induced osteolytic lesions in in vivo models with greater reproducibility compared to other commonly used methods.

    Science.gov (United States)

    Evans, H R; Karmakharm, T; Lawson, M A; Walker, R E; Harris, W; Fellows, C; Huggins, I D; Richmond, P; Chantry, A D

    2016-02-01

Methods currently used to analyse osteolytic lesions caused by malignancies such as multiple myeloma and metastatic breast cancer vary from basic 2-D X-ray analysis to 2-D images of micro-CT datasets analysed with non-specialised image software such as ImageJ. However, these methods have significant limitations: they do not capture 3-D data, they are time-consuming, and they often suffer from inter-user variability. We therefore sought to develop a rapid and reproducible method to analyse 3-D osteolytic lesions in mice with cancer-induced bone disease. To this end, we have developed Osteolytica, an image analysis software package featuring an easy-to-use, step-by-step interface to measure lytic bone lesions. Osteolytica utilises novel graphics card acceleration (parallel computing) and 3-D rendering to provide rapid reconstruction and analysis of osteolytic lesions. To evaluate the use of Osteolytica we analysed tibial micro-CT datasets from murine models of cancer-induced bone disease and compared the results to those obtained using a standard ImageJ analysis method. Firstly, to assess inter-user variability we deployed four independent researchers to analyse tibial datasets from the U266-NSG murine model of myeloma. Using ImageJ, inter-user variability between the bones was substantial (±19.6%); in contrast, Osteolytica demonstrated minimal variability (±0.5%). Secondly, tibial datasets from U266-bearing NSG mice or BALB/c mice injected with the metastatic breast cancer cell line 4T1 were compared to tibial datasets from age- and sex-matched non-tumour control mice. Analyses by both Osteolytica and ImageJ showed significant increases in bone lesion area in tumour-bearing mice compared to control mice. These results confirm that Osteolytica performs as well as the current 2-D ImageJ osteolytic lesion analysis method. However, Osteolytica is advantageous in that it analyses the entirety of the bone volume (as opposed to selected 2-D images), it

  10. Evaluation of guidewire path reproducibility.

    Science.gov (United States)

    Schafer, Sebastian; Hoffmann, Kenneth R; Noël, Peter B; Ionita, Ciprian N; Dmochowski, Jacek

    2008-05-01

The number of minimally invasive vascular interventions is increasing. In these interventions, a variety of devices are directed to and placed at the site of intervention. The device used in almost all of these interventions is the guidewire, which acts as a monorail for all devices delivered to the intervention site. However, even with the guidewire in place, clinicians still experience difficulties during the interventions. As a first step toward understanding these difficulties and facilitating guidewire and device guidance, we have investigated how the reproducibility of the final guidewire path in vessel phantom models depends on different factors: user, materials and geometry. Three vessel phantoms (vessel diameters approximately 4 mm), with tortuosity similar to the internal carotid artery, were constructed from silicone tubing and encased in Sylgard elastomer. Several trained users repeatedly passed two guidewires of different flexibility through the phantoms under pulsatile flow conditions. After the guidewire had been placed, rotational C-arm image sequences were acquired (9 in. II mode, 0.185 mm pixel size), and the phantom and guidewire were reconstructed (512³ volume, 0.288 mm voxel size). The reconstructed volumes were aligned. The centerlines of the guidewire and the phantom vessel were then determined using region-growing techniques. Guidewire paths appear similar across users but not across materials. The average root-mean-square difference of the repeated placements was 0.17 ± 0.02 mm (plastic-coated guidewire), 0.73 ± 0.55 mm (steel guidewire) and 1.15 ± 0.65 mm (steel versus plastic-coated). For a given guidewire, these results indicate that the guidewire path is relatively reproducible in shape and position.
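The root-mean-square path difference used above can be sketched for two aligned 3-D centerlines, assuming (as a simplification of the actual centerline-matching procedure) that both paths are resampled to the same number of corresponding points. The paths below are hypothetical:

```python
import numpy as np

def rms_difference(path_a, path_b):
    """Root-mean-square distance between two aligned centerlines.

    Each path is an (N, 3) array of 3-D points; for simplicity the paths
    are assumed to be resampled to N corresponding points, which is a
    simplification of the real centerline-matching step.
    """
    d = np.linalg.norm(path_a - path_b, axis=1)   # pointwise distances
    return float(np.sqrt(np.mean(d ** 2)))

# Two hypothetical centerlines offset by 0.2 mm everywhere:
t = np.linspace(0.0, 1.0, 100)
a = np.stack([t, np.sin(t), np.zeros_like(t)], axis=1)
b = a + np.array([0.0, 0.0, 0.2])
print(rms_difference(a, b))  # ≈ 0.2
```

A constant offset gives an RMS equal to that offset; real repeated placements differ non-uniformly along the path, which the same formula summarises in a single value.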

  11. The 2010 Broad Prize

    Science.gov (United States)

    Education Digest: Essential Readings Condensed for Quick Review, 2011

    2011-01-01

A new data analysis, based on data collected as part of The Broad Prize process, provides insights into which large urban school districts in the United States are doing the best job of educating traditionally disadvantaged groups: African-American, Hispanic, and low-income students. Since 2002, The Eli and Edythe Broad Foundation has awarded The…

  13. Broad band shock associated noise predictions in axisymmetric and asymmetric jets using an improved turbulence scale model

    Science.gov (United States)

    Kalyan, Anuroopa; Karabasov, Sergey A.

    2017-04-01

Supersonic jets that are subject to off-design operating conditions are marked by three distinct regions in their far-field spectra: mixing noise, screech and Broadband Shock Associated Noise (BBSAN). BBSAN is conspicuous for its multiple prominent peaks. The Morris and Miller BBSAN model, which is based on an acoustic analogy and offers a straightforward implementation with RANS, forms the foundation of the present work. The acoustic-analogy model robustly captures the peak-frequency noise, which occurs near a Strouhal number of about 1 based on the nozzle exit diameter, but leads to major under-prediction of sound at higher frequencies. In the jet mixing noise literature, it has been shown that including frequency dependence in the characteristic length and temporal scales of the effective noise sources improves far-field noise predictions. In the present paper, several modifications of the original Morris and Miller model are considered that incorporate frequency-dependent scales as recommended in the jet mixing noise literature. In addition, a new mixed-scale model is proposed that incorporates a correlation scale depending both on the mean-flow velocity gradient and on the standard mixing-noise-type scaling based on the dissipation of turbulent kinetic energy. In comparison with the original Morris and Miller model, the mixed-scale model shows considerable improvements in the noise predictions for the benchmark axisymmetric convergent-divergent and convergent jets. Following this validation, the new model has been applied to obtain improved predictions for elliptic jets of various eccentricities. It has been shown that, for the same thrust conditions, the elliptical nozzles lead to noise reduction at the source in comparison with the baseline axisymmetric jets.

  14. Tools and techniques for computational reproducibility.

    Science.gov (United States)

    Piccolo, Stephen R; Frampton, Michael B

    2016-07-11

When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed, and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches.

  15. Production process reproducibility and product quality consistency of transient gene expression in HEK293 cells with anti-PD1 antibody as the model protein.

    Science.gov (United States)

    Ding, Kai; Han, Lei; Zong, Huifang; Chen, Junsheng; Zhang, Baohong; Zhu, Jianwei

    2017-03-01

    Demonstration of reproducibility and consistency of process and product quality is one of the most crucial issues in using transient gene expression (TGE) technology for biopharmaceutical development. In this study, we challenged the production consistency of TGE by expressing nine batches of recombinant IgG antibody in human embryonic kidney 293 cells to evaluate reproducibility including viable cell density, viability, apoptotic status, and antibody yield in cell culture supernatant. Product quality including isoelectric point, binding affinity, secondary structure, and thermal stability was assessed as well. In addition, major glycan forms of antibody from different batches of production were compared to demonstrate glycosylation consistency. Glycan compositions of the antibody harvested at different time periods were also measured to illustrate N-glycan distribution over the culture time. From the results, it has been demonstrated that different TGE batches are reproducible from lot to lot in overall cell growth, product yield, and product qualities including isoelectric point, binding affinity, secondary structure, and thermal stability. Furthermore, major N-glycan compositions are consistent among different TGE batches and conserved during cell culture time.

  16. Contextual sensitivity in scientific reproducibility.

    Science.gov (United States)

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-07

In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale attempt to replicate 100 studies published in top psychology journals, found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researchers' degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility.
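The kind of adjusted analysis described here, replication success regressed on contextual sensitivity while controlling for methodological covariates, can be sketched with simulated data and a plain gradient-descent logistic regression. This is an illustration under invented assumptions, not the authors' data or statistical model:

```python
import numpy as np

# Simulate one row per study (hypothetical data, not the Reproducibility Project).
rng = np.random.default_rng(0)
n = 100
context = rng.uniform(1, 5, n)         # contextual-sensitivity rating
power = rng.uniform(0.2, 0.95, n)      # statistical power
effect = rng.uniform(0.0, 1.0, n)      # original effect size

# Ground truth for the simulation: higher sensitivity lowers replication odds.
logit = 1.0 - 1.2 * (context - 3.0) + 2.0 * (power - 0.5)
replicated = rng.random(n) < 1 / (1 + np.exp(-logit))

# Logistic regression by full-batch gradient descent.
# Centering the covariates improves conditioning of the fit.
X = np.column_stack([np.ones(n),
                     context - context.mean(),
                     power - power.mean(),
                     effect - effect.mean()])
y = replicated.astype(float)
w = np.zeros(4)
for _ in range(5000):
    p = 1 / (1 + np.exp(-X @ w))       # predicted replication probability
    w -= 0.01 * X.T @ (p - y) / n      # gradient step on the log-loss

print(w[1])  # coefficient on contextual sensitivity, adjusted for power and effect size
```

In this simulation the fitted coefficient on contextual sensitivity stays negative even with power and effect size in the model, mirroring the abstract's claim that the association survives adjustment for methodological characteristics.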

  17. Model dielectric function analysis of the critical point features of silicon nanocrystal films in a broad parameter range

    Energy Technology Data Exchange (ETDEWEB)

    Agocs, Emil, E-mail: agocsemil@gmail.com [Doctoral School of Molecular and Nanotechnologies, Faculty of Information Technology, University of Pannonia, Egyetem u.10, Veszprém, H-8200 (Hungary); Research Institute for Technical Physics and Material Science (MFA), Research Centre for Natural Sciences, H-1525 Budapest, POB 49 (Hungary); Nassiopoulou, Androula G. [IMEL/NCSR Demokritos, Aghia Paraskevi, 153 10 Athens (Greece); Milita, Silvia [CNR-IMM Sezione Bologna, Via Gobetti, 40129 Bologna (Italy); Petrik, Peter [Doctoral School of Molecular and Nanotechnologies, Faculty of Information Technology, University of Pannonia, Egyetem u.10, Veszprém, H-8200 (Hungary); Research Institute for Technical Physics and Material Science (MFA), Research Centre for Natural Sciences, H-1525 Budapest, POB 49 (Hungary)

    2013-08-31

    Due to quantum confinement, the band structure of silicon nanocrystals (NCs) differs from that of bulk silicon and depends strongly on the NC size. The samples we investigated were prepared using chemical vapor deposition and annealing, which allow good control of the parameters in terms of both thickness and NC size, making them suitable as model systems. The difficulty in the analysis is that the critical point features of the dielectric function can only be described with acceptable accuracy when using numerous parameters. The majority of the fit parameters describe the oscillators of different line-shapes. In this work we show how the number of fit parameters can be reduced by a systematic analysis that identifies non-sensitive and correlating parameters, so that as many parameters as possible can be fixed or coupled. - Highlights: ► Silicon nanocrystal films were measured by spectroscopic ellipsometry. ► The dielectric functions were modeled with Adachi's model dielectric function. ► We developed a parameter analysis and fitting algorithm. ► The non-sensitive parameters were coupled and neglected. ► The behaviors of key material parameters were determined.

  18. Redesign of genetically encoded biosensors for monitoring mitochondrial redox status in a broad range of model eukaryotes.

    Science.gov (United States)

    Albrecht, Simone C; Sobotta, Mirko C; Bausewein, Daniela; Aller, Isabel; Hell, Rüdiger; Dick, Tobias P; Meyer, Andreas J

    2014-03-01

    The development of genetically encoded redox biosensors has paved the way toward chemically specific, quantitative, dynamic, and compartment-specific redox measurements in cells and organisms. In particular, redox-sensitive green fluorescent proteins (roGFPs) have attracted major interest as tools to monitor biological redox changes in real time and in vivo. Most recently, the engineering of a redox relay that combines glutaredoxin (Grx) with roGFP2 as a translational fusion (Grx1-roGFP2) led to a biosensor for the glutathione redox potential (EGSH ). The expression of this probe in mitochondria is of particular interest as mitochondria are the major source of oxidants, and their redox status is closely connected to cell fate decisions. While Grx1-roGFP2 can be expressed in mammalian mitochondria, it fails to enter mitochondria in various nonmammalian model organisms. Here we report that inversion of domain order from Grx1-roGFP2 to roGFP2-Grx1 yields a biosensor with perfect mitochondrial targeting while fully maintaining its biosensor capabilities. The redesigned probe thus allows extending in vivo observations of mitochondrial redox homeostasis to important nonmammalian model organisms, particularly plants and insects.

  19. Modeling tidal freshwater marsh sustainability in the Sacramento-San Joaquin Delta under a broad suite of potential future scenarios

    Science.gov (United States)

    Swanson, Kathleen M.; Drexler, Judith Z.; Fuller, Christopher C.; Schoellhamer, David H.

    2015-01-01

    In this paper, we report on the adaptation and application of a one-dimensional marsh surface elevation model, the Wetland Accretion Rate Model of Ecosystem Resilience (WARMER), to explore the conditions that lead to sustainable tidal freshwater marshes in the Sacramento–San Joaquin Delta. We defined marsh accretion parameters to encapsulate the range of observed values over historic and modern time-scales based on measurements from four marshes in high- and low-energy fluvial environments, as well as possible future trends in sediment supply and mean sea level. A sensitivity analysis of 450 simulations was conducted encompassing a range of porosity values, initial elevations, organic and inorganic matter accumulation rates, and sea-level rise (SLR) rates. For the range of inputs considered, the magnitude of SLR over the next century was the primary driver of marsh surface elevation change. Sediment supply was the secondary control. More than 84% of the scenarios resulted in sustainable marshes with 88 cm of SLR by 2100, but only 32% and 11% of the scenarios resulted in surviving marshes when SLR was increased to 133 cm and 179 cm, respectively. Marshes situated in high-energy zones were marginally more resilient than those in low-energy zones because of their higher inorganic sediment supply. Overall, the results from this modeling exercise suggest that marshes at the upstream reaches of the Delta (where SLR may be attenuated) and high-energy marshes along major channels with high inorganic sediment accumulation rates will be more resilient to global SLR in excess of 88 cm over the next century than their downstream and low-energy counterparts. However, considerable uncertainties exist in the projected rates of sea-level rise and sediment availability. In addition, more research is needed to constrain future
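
    The elevation balance that drives these scenario outcomes can be illustrated with a toy annual update: accretion increases as the marsh sits lower in the tidal frame, while sea-level rise lowers its relative elevation. This is a minimal sketch, not WARMER itself; the accretion rates, the inundation cap, and the linearized SLR scenarios are invented for illustration.

```python
# Toy one-dimensional marsh-elevation balance, loosely inspired by
# cohort models such as WARMER. All parameter values are illustrative
# assumptions, not calibrated Delta inputs.

def marsh_trajectory(slr_2100_cm, inorganic_rate_cm=0.4, organic_rate_cm=0.2,
                     start_elev_cm=20.0, years=100):
    """Return final elevation (cm) relative to local mean sea level.

    Accretion is assumed to scale with inundation: a marsh sitting
    lower in the tidal frame traps more sediment, up to a simple cap.
    """
    elev = start_elev_cm
    sea_level = 0.0
    slr_per_year = slr_2100_cm / years              # linearized SLR scenario
    for _ in range(years):
        sea_level += slr_per_year
        depth = max(0.0, 50.0 - (elev - sea_level))  # inundation proxy (cm)
        accretion = (inorganic_rate_cm + organic_rate_cm) * min(1.0, depth / 50.0)
        elev += accretion
    return elev - sea_level                          # position in tidal frame

# Higher SLR scenarios leave the marsh lower in the tidal frame:
print(marsh_trajectory(88), marsh_trajectory(133), marsh_trajectory(179))
```

    Under these assumed rates the final elevation falls monotonically as the SLR scenario increases, mirroring the qualitative finding that SLR magnitude is the primary driver of marsh surface elevation change.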

  20. Hyperbolic L2-modules with Reproducing Kernels

    Institute of Scientific and Technical Information of China (English)

    David EELBODE; Frank SOMMEN

    2006-01-01

    In this paper, the Dirac operator on the Klein model for the hyperbolic space is considered. A function space containing L2-functions on the sphere S^(m-1) in R^m, which are boundary values of solutions for this operator, is defined, and it is proved that this gives rise to a Hilbert module with a reproducing kernel.

  1. Characterization of Disopyramide derivative ADD424042 as a non-cardiotoxic neuronal sodium channel blocker with broad-spectrum anticonvulsant activity in rodent seizure models.

    Science.gov (United States)

    Król, Marek; Ufnal, Marcin; Szulczyk, Bartłomiej; Podsadni, Piotr; Drapała, Adrian; Turło, Jadwiga; Dawidowski, Maciej

    2016-01-01

    It has been reported that antiarrhythmic drugs (AADs) can be useful in controlling refractory seizures in humans or in enhancing the action of antiepileptic drugs (AEDs) in animal models. Disopyramide phosphate (DISO) is an AAD that blocks sodium channels in cardiac myocytes. We evaluated a DISO derivative, 2-(2-chlorophenyl)-2-(pyridin-2-yl)acetamide (ADD424042), for its anticonvulsant activity in a battery of rodent models of epileptic seizures. The compound displayed a broad spectrum of activity in the 'classical' models as well as in the models of pharmacoresistant seizures. Furthermore, ADD424042 showed good therapeutic indices between anticonvulsant activity and motor impairment. In contrast, no anticonvulsant effects, but severe lethality, were observed in the primary anticonvulsant testing of the parent DISO. By performing whole-cell voltage-clamp experiments in dispersed cortical neurons, we demonstrated that ADD424042 decreased the maximal amplitude of voltage-gated sodium currents with an IC50 value in the nM range. Moreover, the compound enhanced use-dependent block and decreased excitability in pyramidal neurons in current-clamp experiments in cortical slices. Importantly, we found that ADD424042 had either no cardiotoxic effect or a very small one. In contrast to DISO, ADD424042 did not produce any apparent changes in electrocardiogram (ECG) and arterial blood pressure recordings. ADD424042 had no effect on QT and corrected QT intervals at a dose 15 times higher than the ED50 for the anticonvulsant effect in the MES model. Taken together, these data suggest that ADD424042 has the potential to become a lead structure for novel broadly acting AEDs with a wide margin of cardiac safety.

  2. Pseudomonas fluorescens HK44: Lessons Learned from a Model Whole-Cell Bioreporter with a Broad Application History

    Directory of Open Access Journals (Sweden)

    Gary S. Sayler

    2012-02-01

    Initially described in 1990, Pseudomonas fluorescens HK44 served as the first whole-cell bioreporter genetically endowed with a bioluminescent (luxCDABE) phenotype directly linked to a catabolic (naphthalene degradative) pathway. HK44 was the first genetically engineered microorganism to be released in the field to monitor bioremediation potential. Subsequent to that release, strain HK44 has been introduced into other solids (soils, sands), liquid (water, wastewater), and volatile environments. In these matrices, it has functioned as one of the best characterized chemically-responsive environmental bioreporters and as a model organism for understanding bacterial colonization and transport, cell immobilization strategies, and the kinetics of cellular bioluminescent emission. This review summarizes the characteristics of P. fluorescens HK44 and the extensive range of its applications with special focus on the monitoring of bioremediation processes and biosensing of environmental pollution.

  3. Pseudomonas fluorescens HK44: Lessons Learned from a Model Whole-Cell Bioreporter with a Broad Application History

    Science.gov (United States)

    Trögl, Josef; Chauhan, Archana; Ripp, Steven; Layton, Alice C.; Kuncová, Gabriela; Sayler, Gary S.

    2012-01-01

    Initially described in 1990, Pseudomonas fluorescens HK44 served as the first whole-cell bioreporter genetically endowed with a bioluminescent (luxCDABE) phenotype directly linked to a catabolic (naphthalene degradative) pathway. HK44 was the first genetically engineered microorganism to be released in the field to monitor bioremediation potential. Subsequent to that release, strain HK44 has been introduced into other solids (soils, sands), liquid (water, wastewater), and volatile environments. In these matrices, it has functioned as one of the best characterized chemically-responsive environmental bioreporters and as a model organism for understanding bacterial colonization and transport, cell immobilization strategies, and the kinetics of cellular bioluminescent emission. This review summarizes the characteristics of P. fluorescens HK44 and the extensive range of its applications with special focus on the monitoring of bioremediation processes and biosensing of environmental pollution. PMID:22438725

  4. Model-based drug development: strengths, weaknesses, opportunities, and threats for broad application of pharmacometrics in drug development.

    Science.gov (United States)

    Wetherington, Jeffrey D; Pfister, Marc; Banfield, Christopher; Stone, Julie A; Krishna, Rajesh; Allerheiligen, Sandy; Grasela, Dennis M

    2010-09-01

    Systematic implementation of model-based drug development (MBDD) in drug discovery and development has the potential to significantly increase the rate of medical breakthroughs and make new and better treatments available to patients. An analysis of the strengths, weaknesses, opportunities, and threats (i.e., SWOT) was conducted through focus group discussions that included 24 members representing 8 pharmaceutical companies to systematically assess the challenges to implementing MBDD in the drug development decision-making process. The application of the SWOT analysis to the successful implementation of MBDD yielded 19 strengths, 27 weaknesses, 34 opportunities, and 22 threats, which support the following conclusions. The shift from empirical drug development to MBDD requires a question-based mentality; early, proactive planning; dynamic access to multisource data; quantitative knowledge integration; multidisciplinary collaboration; effective communication and leadership skills; and innovative, impactful application of pharmacometrics focused on enhancing quantitative decision making. The ultimate goal of MBDD is to streamline the discovery and development of innovative medicines to benefit patients.

  5. Uncertainties of isoprene emissions in the MEGAN model estimated for a coniferous and broad-leaved mixed forest in Southern China

    Science.gov (United States)

    Situ, Shuping; Wang, Xuemei; Guenther, Alex; Zhang, Yanli; Wang, Xinming; Huang, Minjuan; Fan, Qi; Xiong, Zhe

    2014-12-01

    Using locally observed emission factors and meteorological data, this study constrained the Model of Emissions of Gases and Aerosols from Nature (MEGAN) v2.1 to estimate isoprene emission from the Dinghushan forest during fall 2008 and to quantify the uncertainties associated with MEGAN parameters using a Monte Carlo approach. Compared with observation-based isoprene emission data from a campaign during this period at this site, the locally constrained MEGAN reproduces the diurnal variations and magnitude of isoprene emission reasonably well, with a correlation coefficient of 0.7 and a mean bias of 47.5%. The results also indicate high uncertainties in the estimated isoprene emission, with the relative error ranging from -89.0% to 111.0% at the 95% confidence interval. The key uncertainty sources include the emission factors, γTLD, photosynthetically active radiation (PAR), and temperature. This implies that accurate inputs of emission factor, PAR, and temperature are key to reducing uncertainties in isoprene emission estimates.
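
    The Monte Carlo uncertainty estimate described above can be sketched as repeated sampling of uncertain inputs pushed through a simplified, Guenther-style light and temperature response. This is not the full MEGAN v2.1 algorithm; the temperature response constant and all uncertainty spreads below are illustrative assumptions (only the light-response form C_L = α·C_L1·L/√(1+α²L²) follows the classic Guenther formulation).

```python
import math
import random

# Simplified isoprene emission response: emission factor times
# temperature and light activity factors (illustrative, not MEGAN v2.1).
def emission(ef, temp_k, par):
    c_t = math.exp(0.09 * (temp_k - 303.0))                      # temperature activity (assumed form)
    c_l = 0.0027 * 1.066 * par / math.sqrt(1 + (0.0027 * par) ** 2)  # Guenther-style light response
    return ef * c_t * c_l

random.seed(42)
base = emission(ef=10.0, temp_k=303.0, par=1000.0)   # central estimate

# Propagate assumed input uncertainties by Monte Carlo sampling.
samples = []
for _ in range(10000):
    ef = max(random.gauss(10.0, 3.0), 0.0)           # emission-factor spread
    temp = random.gauss(303.0, 2.0)
    par = max(random.gauss(1000.0, 200.0), 0.0)
    samples.append(emission(ef, temp, par))

samples.sort()
lo, hi = samples[250], samples[9750]                 # ~95% confidence interval
print((lo / base - 1) * 100, (hi / base - 1) * 100)  # relative error bounds, %
```

    The spread of the sampled emissions around the central estimate plays the role of the -89.0% to 111.0% interval reported in the abstract, with the emission factor dominating here because it carries the largest assumed relative uncertainty.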

  6. Computational modelling of the cerebral cortical microvasculature: effect of x-ray microbeams versus broad beam irradiation

    Science.gov (United States)

    Merrem, A.; Bartzsch, S.; Laissue, J.; Oelfke, U.

    2017-05-01

    Microbeam Radiation Therapy is an innovative pre-clinical strategy which uses arrays of parallel, tens of micrometres wide, kilovoltage photon beams to treat tumours. These x-ray beams are typically generated on a synchrotron source. It was shown that these beam geometries allow exceptional normal tissue sparing from radiation damage while still being effective in tumour ablation. A definitive biological explanation for this enhanced therapeutic ratio has not yet been found; some experimental data support an important role of the vasculature. In this work, the effect of microbeams on a normal microvascular network of the cerebral cortex was assessed in computer simulations and compared to the effect of homogeneous, seamless exposures at equal energy absorption. The anatomy of a cerebral microvascular network and the inflicted radiation damage were simulated to closely mimic experimental data using a novel probabilistic model of radiation damage to blood vessels. It was found that the spatial dose fractionation by microbeam arrays significantly decreased the vascular damage. The higher the peak-to-valley dose ratio, the more pronounced the sparing effect. Simulations of the radiation damage as a function of morphological parameters of the vascular network demonstrated that the distribution of blood vessel radii is a key parameter determining both the overall radiation damage of the vasculature and the dose-dependent differential effect of microbeam irradiation.
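
    The peak-to-valley dose ratio (PVDR) invoked above, and the equal-energy-absorption comparison between a microbeam array and a homogeneous broad beam, reduce to simple arithmetic over one spatial period of the array. The geometry and dose values below are illustrative assumptions, not the paper's simulation parameters.

```python
# Illustrative microbeam array geometry (flat-top peak approximation).
peak_width_um = 50.0     # width of each microbeam peak
spacing_um = 400.0       # centre-to-centre spacing of peaks
peak_dose_gy = 300.0     # dose inside a peak
valley_dose_gy = 5.0     # dose between peaks

# A broad beam delivering the same energy absorption corresponds to the
# mean dose over one period of the array:
mean_dose = (peak_dose_gy * peak_width_um
             + valley_dose_gy * (spacing_um - peak_width_um)) / spacing_um

# Peak-to-valley dose ratio, the parameter the simulations vary:
pvdr = peak_dose_gy / valley_dose_gy

print(mean_dose, pvdr)
```

    With these assumed numbers, a homogeneous exposure matching the array's energy absorption sits far below the peak dose but well above the valley dose, which is the regime in which the simulations find the vascular sparing effect of high PVDR.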

  7. Universal or Specific? A Modeling-Based Comparison of Broad-Spectrum Influenza Vaccines against Conventional, Strain-Matched Vaccines.

    Directory of Open Access Journals (Sweden)

    Rahul Subramanian

    2016-12-01

    Despite the availability of vaccines, influenza remains a major public health challenge. A key reason is the virus capacity for immune escape: ongoing evolution allows the continual circulation of seasonal influenza, while novel influenza viruses invade the human population to cause a pandemic every few decades. Current vaccines have to be updated continually to keep pace with this antigenic change, but emerging 'universal' vaccines, which target more conserved components of the influenza virus, offer the potential to act across all influenza A strains and subtypes. Influenza vaccination programmes around the world are steadily increasing in their population coverage. In future, how might intensive, routine immunization with novel vaccines compare against similar mass programmes utilizing conventional vaccines? Specifically, how might novel and conventional vaccines compare in terms of cumulative incidence and rates of antigenic evolution of seasonal influenza? What are their potential implications for the impact of pandemic emergence? Here we present a new mathematical model, capturing both transmission dynamics and antigenic evolution of influenza in a simple framework, to explore these questions. We find that, even when matched by per-dose efficacy, universal vaccines could dampen population-level transmission over several seasons to a greater extent than conventional vaccines. Moreover, by lowering opportunities for cross-protective immunity in the population, conventional vaccines could allow the increased spread of a novel pandemic strain. Conversely, universal vaccines could mitigate both seasonal and pandemic spread. However, where it is not possible to maintain annual, intensive vaccination coverage, the duration and breadth of immunity raised by universal vaccines are critical determinants of their performance relative to conventional vaccines. In future, conventional and novel vaccines are likely to play complementary roles in

  8. Three-feature model to reproduce the topology of citation networks and the effects from authors' visibility on their h-index

    CERN Document Server

    Amancio, Diego R; Costa, Luciano da F; 10.1016/j.joi.2012.02.005

    2013-01-01

    Various factors are believed to govern the selection of references in citation networks, but a precise, quantitative determination of their importance has remained elusive. In this paper, we show that three factors can account for the referencing pattern of citation networks for two topics, namely "graphenes" and "complex networks", thus allowing one to reproduce the topological features of the networks built with papers being the nodes and the edges established by citations. The most relevant factor was content similarity, while the other two, in-degree (i.e., citation counts) and age of publication, had varying importance depending on the topic studied. This dependence indicates that additional factors could play a role. Indeed, by intuition one should expect the reputation (or visibility) of authors and/or institutions to affect the referencing pattern, and this is only indirectly considered via the in-degree, which should correlate with such reputation. Because information on reputation is not readily avai...
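
    A generative model of this kind can be sketched by scoring every candidate reference on the three named factors (content similarity, in-degree, age) and sampling citations in proportion to their product. The scalar "topic" representation, the recency decay, and the five-references-per-paper rule are illustrative assumptions, not the paper's fitted model.

```python
import random

# Toy growth of a citation network driven by three factors:
# content similarity, in-degree, and age of publication.
random.seed(0)
papers = []   # each paper: dict(topic=float, indegree=int, time=int)
edges = []    # (citing paper's time, cited paper's time)

for t in range(200):
    topic = random.random()                             # assumed 1-D "content"
    # Score every existing paper on the three factors.
    scores = []
    for p in papers:
        similarity = 1.0 - abs(topic - p["topic"])      # content similarity
        popularity = p["indegree"] + 1                  # citation counts
        recency = 0.97 ** (t - p["time"])               # age of publication
        scores.append(similarity * popularity * recency)
    # Cite up to 5 references, sampled in proportion to the scores.
    for _ in range(min(5, len(papers))):
        target = random.choices(papers, weights=scores)[0]
        target["indegree"] += 1
        edges.append((t, target["time"]))
    papers.append({"topic": topic, "indegree": 0, "time": t})

degrees = sorted((p["indegree"] for p in papers), reverse=True)
print(degrees[:5])   # a few highly cited papers emerge
```

    Because popular papers are more likely to attract further citations, the in-degree factor alone produces a heavy-tailed degree distribution; the similarity and recency factors then shape which papers become the hubs.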

  9. The Broad Foundations, 2006

    Science.gov (United States)

    Broad Foundation, 2006

    2006-01-01

    The mission of the Broad Foundations is to transform K-12 urban public education through better governance, management, labor relations and competition; make significant contributions to advance major scientific and medical research; foster public appreciation of contemporary art by increasing access for audiences worldwide; and lead and…

  10. A method to isolate bacterial communities and characterize ecosystems from food products: Validation and utilization in a reproducible chicken meat model.

    Science.gov (United States)

    Rouger, Amélie; Remenant, Benoit; Prévost, Hervé; Zagorec, Monique

    2017-04-17

    Influenced by production and storage processes and by seasonal changes, the diversity of meat product microbiotas can be highly variable. Because microbiotas influence meat quality and safety, characterizing and understanding their dynamics during processing and storage is important for proposing innovative and efficient storage conditions. Challenge tests are usually performed using meat from the same batch, inoculated at high levels with one or a few strains. Such experiments do not reflect the true microbial situation, and the global ecosystem is not taken into account. Our purpose was to constitute live stocks of chicken meat microbiotas to create standard and reproducible ecosystems. We searched for the best method to collect contaminating bacterial communities from chicken cuts to store as frozen aliquots. We tested several methods to extract DNA from these stored communities for subsequent PCR amplification. We determined the best moment to collect bacteria in sufficient amounts during the product shelf life. Results showed that the rinsing method combined with the Mobio DNA extraction kit was the most reliable way to collect bacteria and obtain DNA for subsequent PCR amplification. Then, 23 different chicken meat microbiotas were collected using this procedure. Microbiota aliquots were stored at -80°C without important loss of viability. Their characterization by cultural methods confirmed the large variability (richness and abundance) of bacterial communities present on chicken cuts. Four of these bacterial communities were used to estimate their ability to regrow on meat matrices. Challenge tests performed on sterile matrices showed that these microbiotas were successfully inoculated and could overgrow the natural microbiota of chicken meat. They can therefore be used for performing reproducible challenge tests mimicking a true meat ecosystem, making it possible to test the influence of various processing or storage conditions on complex meat

  11. A comparison of simple and realistic eye models for calculation of fluence to dose conversion coefficients in a broad parallel beam incident of protons

    Science.gov (United States)

    Sakhaee, Mahmoud; Vejdani-Noghreiyan, Alireza; Ebrahimi-Khankook, Atiyeh

    2015-01-01

    Radiation-induced cataract has been demonstrated among people who are exposed to ionizing radiation. To evaluate the deterministic effects of ionizing radiation on the eye lens, several papers dealing with the eye lens dose have been published. ICRP Publication 103 states that the lens of the eye may be more radiosensitive than previously considered. Detailed investigation of the response of the lens showed that there are strong differences in sensitivity to ionizing radiation exposure with respect to cataract induction among the tissues of the lens of the eye. This motivated several groups to look deeper into the issue of the dose to a sensitive cell population within the lens, especially for radiations with low energy penetrability that have steep dose gradients inside the lens. Two sophisticated mathematical models of the eye including the inner structure have been designed for accurate dose estimation in recent years. This study focuses on calculations of the absorbed doses of different parts of the eye using the stylized models located in the UF-ORNL phantom and comparison with the data calculated with the reference computational phantom for a broad parallel beam of incident protons with energies between 20 MeV and 10 GeV. The obtained results indicate that the total lens absorbed dose of the reference phantom agrees well with that of the more sensitive regions of the stylized models. However, the total eye absorbed doses of these models differ greatly from each other at lower energies.

  12. Constraining UV continuum slopes of active galactic nuclei with cloudy models of broad-line region extreme-ultraviolet emission lines

    Energy Technology Data Exchange (ETDEWEB)

    Moloney, Joshua [CASA, Department of Astrophysical and Planetary Sciences, University of Colorado, Boulder, CO 80309 (United States); Michael Shull, J., E-mail: joshua.moloney@colorado.edu, E-mail: michael.shull@colorado.edu [Also at Institute of Astronomy, University of Cambridge, Cambridge CB3 0HA, UK. (United Kingdom)

    2014-10-01

    Understanding the composition and structure of the broad-line region (BLR) of active galactic nuclei (AGNs) is important for answering many outstanding questions in supermassive black hole evolution, galaxy evolution, and ionization of the intergalactic medium. We used single-epoch UV spectra from the Cosmic Origins Spectrograph (COS) on the Hubble Space Telescope to measure EUV emission-line fluxes from four individual AGNs with 0.49 ≤ z ≤ 0.64, two AGNs with 0.32 ≤ z ≤ 0.40, and a composite of 159 AGNs. With the CLOUDY photoionization code, we calculated emission-line fluxes from BLR clouds with a range of density, hydrogen ionizing flux, and incident continuum spectral indices. The photoionization grids were fit to the observations using single-component and locally optimally emitting cloud (LOC) models. The LOC models provide good fits to the measured fluxes, while the single-component models do not. The UV spectral indices preferred by our LOC models are consistent with those measured from COS spectra. EUV emission lines such as N IV λ765, O II λ833, and O III λ834 originate primarily from gas with electron temperatures between 37,000 K and 55,000 K. This gas is found in BLR clouds with high hydrogen densities (n_H ≥ 10^12 cm^-3) and hydrogen ionizing photon fluxes (Φ_H ≥ 10^22 cm^-2 s^-1).

  13. Reproducible Bioinformatics Research for Biologists

    Science.gov (United States)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  14. Evaluation of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian Maria

    2007-01-01

    A study was conducted with the goal of quantifying auditory attributes which underlie listener preference for multichannel reproduced sound. Short musical excerpts were presented in mono, stereo and several multichannel formats to a panel of forty selected listeners. Scaling of auditory attributes...

  15. [Reproducibility of subjective refraction measurement].

    Science.gov (United States)

    Grein, H-J; Schmidt, O; Ritsche, A

    2014-11-01

    Reproducibility of subjective refraction measurement is limited by various factors. The main factors affecting reproducibility include the characteristics of the measurement method and of the subject and the examiner. This article presents the results of a study on this topic, focusing on the reproducibility of subjective refraction measurement in healthy eyes. The results of previous studies are not all presented in the same way by the respective authors and cannot be fully standardized without consulting the original scientific data. To the extent that they are comparable, the results of our study largely correspond with those of previous investigations: during repeated subjective refraction measurement, 95% of the deviation from the mean value was approximately ±0.2 D to ±0.65 D for the spherical equivalent and cylindrical power. The reproducibility of subjective refraction measurement in healthy eyes is limited, even under ideal conditions. Correct assessment of refraction results is only feasible after identifying individual variability. Several measurements are required. Refraction cannot be measured without a tolerance range. The English full-text version of this article is available at SpringerLink (under supplemental).
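
    Limits of the ±0.2 D to ±0.65 D kind quoted above are typically derived from repeated measurements as mean ± 1.96·SD (the interval expected to contain about 95% of repeats). A minimal sketch with invented illustration data, not the study's measurements:

```python
import statistics

# Repeated subjective refractions of one eye (spherical equivalent, in
# dioptres). These values are invented for illustration only.
repeats = [-1.25, -1.50, -1.25, -1.00, -1.25, -1.50]

mean = statistics.mean(repeats)
sd = statistics.stdev(repeats)    # sample standard deviation
limit = 1.96 * sd                 # ~95% of repeats fall within mean +/- limit

print(round(mean, 2), round(limit, 2))
```

    With real data, `limit` is what would be compared against a clinically meaningful tolerance such as a 0.25 D refraction step.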

  16. Reproducible research in computational science.

    Science.gov (United States)

    Peng, Roger D

    2011-12-02

    Computational science has led to exciting new developments, but the nature of the work has exposed limitations in our ability to evaluate published findings. Reproducibility has the potential to serve as a minimum standard for judging scientific claims when full independent replication of a study is not possible.

  17. A First-Principles Spectral Model for Blazar Jet Acceleration and Emission with Klein-Nishina Scattering of Multiple Broad Line Region Emission Lines

    Science.gov (United States)

    Lewis, Tiffany R.; Finke, Justin; Becker, Peter A.

    2017-08-01

    Blazars are a sub-class of active galactic nuclei, with a polar jet aligned along our line of sight. Emission from blazar jets is observed across the electromagnetic spectrum. In our model we assume that the emission emanates from one homogeneous zone in the jet, which is in the process of passing through the Broad Line Region (BLR). We start from first principles to build up a particle transport model, whose solution is the electron distribution, rather than assuming a convenient functional form. Our transport model considers shock acceleration, adiabatic expansion, stochastic acceleration, Bohm diffusion, synchrotron radiation, and Klein-Nishina radiation pulling seed photons from the BLR and dusty torus. We obtain the steady-state electron distribution computationally, and calculate individual spectral contributions due to synchrotron with self-absorption, disk, synchrotron self-Compton, and external-Compton emission, using numerical integration. We compare the resulting radiation spectrum with multi-wavelength data for 3C 279 during quiescence and two flares. Our preliminary results suggest that the jet emission is produced in a region with a sub-equipartition magnetic field, and that the magnetic field in the jet decreases during flaring events, implying that reconnection may play a role in blazar flares.

  18. Efficacy of oral E1210, a new broad-spectrum antifungal with a novel mechanism of action, in murine models of candidiasis, aspergillosis, and fusariosis.

    Science.gov (United States)

    Hata, Katsura; Horii, Takaaki; Miyazaki, Mamiko; Watanabe, Nao-Aki; Okubo, Miyuki; Sonoda, Jiro; Nakamoto, Kazutaka; Tanaka, Keigo; Shirotori, Syuji; Murai, Norio; Inoue, Satoshi; Matsukura, Masayuki; Abe, Shinya; Yoshimatsu, Kentaro; Asada, Makoto

    2011-10-01

    E1210 is a first-in-class, broad-spectrum antifungal with a novel mechanism of action: inhibition of fungal glycosylphosphatidylinositol biosynthesis. In this study, the efficacies of E1210 and reference antifungals were evaluated in murine models of oropharyngeal and disseminated candidiasis, pulmonary aspergillosis, and disseminated fusariosis. Oral E1210 demonstrated dose-dependent efficacy in infections caused by Candida species, Aspergillus spp., and Fusarium solani. In the treatment of oropharyngeal candidiasis, E1210 and fluconazole each caused a significantly greater reduction in the number of oral CFU than the control treatment (P candidiasis model, mice treated with E1210, fluconazole, caspofungin, or liposomal amphotericin B showed significantly higher survival rates than the control mice (P candidiasis caused by azole-resistant Candida albicans or Candida tropicalis. A 24-h delay in treatment onset minimally affected the efficacy outcome of E1210 in the treatment of disseminated candidiasis. In the Aspergillus flavus pulmonary aspergillosis model, mice treated with E1210, voriconazole, or caspofungin showed significantly higher survival rates than the control mice (P candidiasis, pulmonary aspergillosis, and disseminated fusariosis. These data suggest that further studies to determine E1210's potential for the treatment of disseminated fungal infections are indicated.

  19. Broad Diphotons from Narrow States

    CERN Document Server

    An, Haipeng; Zhang, Yue

    2015-01-01

    ATLAS and CMS have each reported a modest diphoton excess consistent with the decay of a broad resonance at ~750 GeV. We show how this signal can arise in a weakly coupled theory composed solely of narrow-width particles. In particular, if the decaying particle is produced off-shell, then the associated diphoton resonance will have a broad, adjustable width. We present simplified models which explain the diphoton excess through the three-body decay of a scalar or fermion. Our minimal ultraviolet completion is a weakly coupled and renormalizable theory of a singlet scalar plus a heavy vector-like quark and lepton. The smoking gun of this mechanism is an asymmetric diphoton peak recoiling against missing transverse energy, jets, or leptons.

  20. Broad-band strong motion simulations coupling k-square kinematic source models with empirical Green's functions: the 2009 L'Aquila earthquake

    Science.gov (United States)

    Del Gaudio, Sergio; Causse, Mathieu; Festa, Gaetano

    2015-10-01

The use of simulated accelerograms may improve the evaluation of the seismic hazard when an accurate modelling of both source and propagation is performed. In this paper, we performed broad-band simulations of the 2009 M 6.3 L'Aquila earthquake, coupling a k-2 kinematic model for the seismic source with empirical Green's functions (EGFs) as propagators. We extracted 10 EGF candidates from a database of aftershocks satisfying quality criteria based on signal-to-noise ratio, fault proximity, small magnitude, similar focal mechanism and stress drop. For comparison with real observations, we also derived a low-frequency kinematic model, based on inversion of ground displacement as integrated from strong motion data. Kinematic properties of the inverted model (rupture velocity, position of the rupture nucleation, low-frequency slip and roughness degree of slip heterogeneity) were used as constraints in the k-2 model, to test the use of a single specific EGF against the use of the whole set of EGFs. Comparison to real observations based on spectral and peak ground acceleration shows that the use of all available EGFs improves the fit of simulations to real data. Moreover, the epistemic variability related to the selection of a specific EGF is significantly larger (two to three times) than recent observations of between-event variability, that is, the variability associated with the randomness of the rupture process. We finally performed `blind' simulations, releasing all the information on source kinematics and only considering the fault geometry and the magnitude of the target event as known features. We computed peak ground acceleration, acceleration Fourier and response spectra. Simulations follow the same trend with distance as real observations. In most cases the latter fall within one sigma from predictions. Predictions with source parameters constrained at low frequency do not perform better than `blind' simulations, showing that extrapolation of the low...

  1. Modelling the variable broad-band optical/UV/X-ray spectrum of PG1211+143: Implications for the ionized outflow

    CERN Document Server

    Papadakis, I E; Panagiotou, C

    2016-01-01

We present the results from a detailed analysis of the 2007 Swift monitoring campaign of the quasar PG1211+143. We constructed broad-band, optical/UV/X-ray spectral energy distributions over three X-ray flux intervals. We fitted them with a model which accounts for the disc and the X-ray coronal emission and the warm absorber (well established in this source). The three flux spectra are well fitted by the model we considered. The disc inner temperature remains constant at ~2 eV, while X-rays are variable both in spectral slope and normalization. The absorber covers almost 90% of the central source. It is outflowing with a velocity less than 2.3*10^4 km/s (3sigma upper limit), and has a column density of ~10^23.2 cm^-2. Its ionization parameter varies by a factor of 1.6, and it is in photo-ionizing equilibrium with the ionizing flux. It is located at a distance of less than 0.35 pc from the central source and its relative thickness, DR/R, is less than 0.1. The absorber's ionization parameter variations can explain t...

  2. Accuracy and reproducibility of patient-specific hemodynamic models of stented intracranial aneurysms: report on the Virtual Intracranial Stenting Challenge 2011.

    Science.gov (United States)

    Cito, S; Geers, A J; Arroyo, M P; Palero, V R; Pallarés, J; Vernet, A; Blasco, J; San Román, L; Fu, W; Qiao, A; Janiga, G; Miura, Y; Ohta, M; Mendina, M; Usera, G; Frangi, A F

    2015-01-01

Validation studies are prerequisites for computational fluid dynamics (CFD) simulations to be accepted as part of clinical decision-making. This paper reports on the 2011 edition of the Virtual Intracranial Stenting Challenge. The challenge aimed to assess the reproducibility with which research groups can simulate the velocity field in an intracranial aneurysm, both untreated and treated with five different configurations of high-porosity stents. Particle image velocimetry (PIV) measurements were obtained to validate the untreated velocity field. Six participants, using three CFD solvers in total, were provided with surface meshes of the vascular geometry and the deployed stent geometries, and flow rate boundary conditions for all inlets and outlets. As output, they were invited to submit an abstract to the 8th International Interdisciplinary Cerebrovascular Symposium 2011 (ICS'11), outlining their methods and giving their interpretation of the performance of each stent configuration. After the challenge, all CFD solutions were collected and analyzed. To quantitatively analyze the data, we calculated the root-mean-square error (RMSE) over uniformly distributed nodes on a plane slicing the main flow jet along its axis and normalized it with the maximum velocity on the slice of the untreated case (NRMSE). Good agreement was found between CFD and PIV, with a NRMSE of 7.28%. Excellent agreement was found between CFD solutions, both untreated and treated. The maximum difference between any two groups (along a line perpendicular to the main flow jet) was 4.0 mm/s, i.e. 4.1% of the maximum velocity of the untreated case, and the average NRMSE was 0.47% (range 0.28-1.03%). In conclusion, given geometry and flow rates, research groups can accurately simulate the velocity field inside an intracranial aneurysm, as assessed by comparison with in vitro measurements, and find excellent agreement on the hemodynamic effect of different stent configurations.
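The NRMSE metric described above is simple to compute: the RMSE over paired node velocities, normalized by the maximum velocity on the slice of the untreated case. A minimal sketch with made-up velocity samples (not the challenge data):

```python
import math

def nrmse(cfd, piv, v_max):
    """Root-mean-square error between paired velocity samples,
    normalized by the maximum velocity of the untreated case."""
    assert len(cfd) == len(piv) and v_max > 0
    mse = sum((c - p) ** 2 for c, p in zip(cfd, piv)) / len(cfd)
    return math.sqrt(mse) / v_max

# Hypothetical velocity samples (m/s) at nodes along the jet axis
cfd = [0.10, 0.22, 0.35, 0.18]
piv = [0.11, 0.20, 0.33, 0.19]
print(round(100 * nrmse(cfd, piv, v_max=0.35), 2))  # NRMSE in percent -> 4.52
```
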

  3. Reproducibility of NIF hohlraum measurements

    Science.gov (United States)

    Moody, J. D.; Ralph, J. E.; Turnbull, D. P.; Casey, D. T.; Albert, F.; Bachmann, B. L.; Doeppner, T.; Divol, L.; Grim, G. P.; Hoover, M.; Landen, O. L.; MacGowan, B. J.; Michel, P. A.; Moore, A. S.; Pino, J. E.; Schneider, M. B.; Tipton, R. E.; Smalyuk, V. A.; Strozzi, D. J.; Widmann, K.; Hohenberger, M.

    2015-11-01

The strategy of experimentally "tuning" the implosion in a NIF hohlraum ignition target towards increasing hot-spot pressure, areal density of compressed fuel, and neutron yield relies on a level of experimental reproducibility. We examine the reproducibility of experimental measurements for a collection of 15 identical NIF hohlraum experiments. The measurements include incident laser power, backscattered optical power, x-ray measurements, hot-electron fraction and energy, and target characteristics. We use exact statistics to set 1-sigma confidence levels on the variations in each of the measurements. Of particular interest is the backscatter and laser-induced hot-spot locations on the hohlraum wall. Hohlraum implosion designs typically include variability specifications [S. W. Haan et al., Phys. Plasmas 18, 051001 (2011)]. We describe our findings and compare with the specifications. This work was performed under the auspices of the U.S. Department of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.

  4. Shortening the learning curve in endoscopic endonasal skull base surgery: a reproducible polymer tumor model for the trans-sphenoidal trans-tubercular approach to retro-infundibular tumors.

    Science.gov (United States)

    Berhouma, Moncef; Baidya, Nishanta B; Ismaïl, Abdelhay A; Zhang, Jun; Ammirati, Mario

    2013-09-01

Endoscopic endonasal skull base surgery attracts an increasing number of young neurosurgeons. This recent technique requires specific technical skills for the approaches to non-pituitary tumors (expanded endoscopic endonasal surgery). Residents' busy schedules risk compromising their laboratory training by significantly limiting the time dedicated to dissections. To enhance training and shorten the learning curve in expanded endoscopic endonasal skull base surgery, we propose a reproducible model based on the implantation of a polymer via an intracranial route to provide a pathological retro-infundibular expansive lesion accessible to a virgin expanded endoscopic endonasal route, avoiding the ethically debatable need to perform hundreds of pituitary cases on live patients before acquiring the desired skills. A polymer-based tumor model was implanted in 6 embalmed human heads via a microsurgical right fronto-temporal approach through the carotido-oculomotor cistern to mimic a retro-infundibular tumor. The tumor's position was verified by CT-scan. An endoscopic endonasal trans-sphenoidal trans-tubercular trans-planum approach was then carried out on a virgin route under neuronavigation tracking. Dissection of the tumor model from displaced surrounding neurovascular structures reproduced live surgery's sensations and challenges. Post-implantation CT-scan allowed the pre-removal assessment of the tumor insertion and its relationships, as well as naso-sphenoidal anatomy, in preparation for the endoscopic approach. Training on easily reproducible retro-infundibular approaches in a context of pathologically distorted anatomy provides a unique opportunity to avoid the need for repetitive live surgeries to acquire skills for this kind of rare tumor, and may shorten the learning curve for endoscopic endonasal surgery. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Characterization and optimization of experimental variables within a reproducible bladder encrustation model and in vitro evaluation of the efficacy of urease inhibitors for the prevention of medical device-related encrustation.

    Science.gov (United States)

    Jones, David S; Djokic, Jasmina; Gorman, Sean P

    2006-01-01

    This study presents a reproducible, cost-effective in vitro encrustation model and, furthermore, describes the effects of components of the artificial urine and the presence of agents that modify the action of urease on encrustation on commercially available ureteral stents. The encrustation model involved the use of small-volume reactors (700 mL) containing artificial urine and employing an orbital incubator (at 37 degrees C) to ensure controlled stirring. The artificial urine contained sources of calcium and magnesium (both as chlorides), albumin and urease. Alteration of the ratio (% w/w) of calcium salt to magnesium salt affected the mass of encrustation, with the greatest encrustation noted whenever magnesium was excluded from the artificial urine. Increasing the concentration of albumin, designed to mimic the presence of protein in urine, significantly decreased the mass of both calcium and magnesium encrustation until a plateau was observed. Finally, exclusion of urease from the artificial urine significantly reduced encrustation due to the indirect effects of this enzyme on pH. Inclusion of the urease inhibitor, acetohydroxamic acid, or urease substrates (methylurea or ethylurea) into the artificial medium markedly reduced encrustation on ureteral stents. In conclusion, this study has described the design of a reproducible, cost-effective in vitro encrustation model. Encrustation was markedly reduced on biomaterials by the inclusion of agents that modify the action of urease. These agents may, therefore, offer a novel clinical approach to the control of encrustation on urological medical devices.

  6. Development of a monoclonal antibody-based broad-specificity ELISA for fluroquinolone antibiotics in foods and molecular modeling studies of cross-reactive compounds

    Science.gov (United States)

    Development of a competitive indirect enzyme-linked immunosorbent assay (ciELISA) with monoclonal antibodies (Mabs) having broad specificity for fluoroquinolone (FQ) antibiotics is described. Four FQs, ciprofloxacin (CIP), norfloxacin (NOR), enrofloxacin (ENR) and ofloxacin (OFL) were conjugated to...

  7. How well do environmental archives of atmospheric mercury deposition in the Arctic reproduce rates and trends depicted by atmospheric models and measurements?

    Science.gov (United States)

    Goodsite, M E; Outridge, P M; Christensen, J H; Dastoor, A; Muir, D; Travnikov, O; Wilson, S

    2013-05-01

This review compares the reconstruction of atmospheric Hg deposition rates and historical trends over recent decades in the Arctic, inferred from Hg profiles in natural archives such as lake and marine sediments, peat bogs and glacial firn (permanent snowpack), against those predicted by three state-of-the-art atmospheric models based on global Hg emission inventories from 1990 onwards. Model veracity was first tested against atmospheric Hg measurements. Most of the natural archive and atmospheric data came from the Canadian-Greenland sectors of the Arctic, whereas spatial coverage was poor in other regions. In general, for the Canadian-Greenland Arctic, models provided good agreement with atmospheric gaseous elemental Hg (GEM) concentrations and trends measured instrumentally. However, there are few instrumented deposition data with which to test the model estimates of Hg deposition, and these data suggest models over-estimated deposition fluxes under Arctic conditions. Reconstructed GEM data from glacial firn on Greenland Summit showed the best agreement with the known decline in global Hg emissions after about 1980, and were corroborated by archived aerosol filter data from Resolute, Nunavut. The relatively stable or slowly declining firn and model GEM trends after 1990 were also corroborated by real-time instrument measurements at Alert, Nunavut, after 1995. However, Hg fluxes and trends in northern Canadian lake sediments and a southern Greenland peat bog did not exhibit good agreement with model predictions of atmospheric deposition since 1990, the Greenland firn GEM record, direct GEM measurements, or trends in global emissions since 1980. Various explanations are proposed to account for these discrepancies between atmosphere and archives, including problems with the accuracy of archive chronologies, climate-driven changes in Hg transfer rates from air to catchments, waters and subsequently into sediments, and post-depositional diagenesis in peat bogs.

  8. A jet-dominated model for a broad-band spectral energy distribution of the nearby low-luminosity active galactic nucleus in M94

    Science.gov (United States)

    van Oers, Pieter; Markoff, Sera; Uttley, Phil; McHardy, Ian; van der Laan, Tessel; Donovan Meyer, Jennifer; Connors, Riley

    2017-06-01

We have compiled a new multiwavelength spectral energy distribution (SED) for the closest obscured low-ionization emission-line region active galactic nucleus (AGN), NGC 4736, also known as M94. The SED comprises mainly high-resolution (mostly sub-arcsecond, or, at the distance to M94, ≲23 pc from the nucleus) observations from the literature, archival data, as well as previously unpublished sub-millimetre data from the Plateau de Bure Interferometer (PdBI) and the Combined Array for Research in Millimeter-wave Astronomy, in conjunction with new electronic MultiElement Radio Interferometric Network (e-MERLIN) L-band (1.5 GHz) observations. Thanks to the e-MERLIN resolution and sensitivity, we resolve for the first time a double structure composed of two radio sources separated by ~1 arcsec, previously observed only at higher frequency. We explore this data set, which further includes non-simultaneous data from the Very Large Array, the Gemini telescope, the Hubble Space Telescope and the Chandra X-ray observatory, in terms of an outflow-dominated model. We compare our results with previous trends found for other AGN using the same model (NGC 4051, M81*, M87 and Sgr A*), as well as hard- and quiescent-state X-ray binaries. We find that the nuclear broad-band spectrum of M94 is consistent with a relativistic outflow of low inclination. The findings in this work add to the growing body of evidence that the physics of weakly accreting black holes scales with mass in a rather straightforward fashion.

  9. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2014-01-01

The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m•min−1 cutting speed and 0...... a built-up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognise systematic error distorting the performance test....

  10. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2012-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m∙min-1 cutting speed and 0...... a built-up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognize systematic error distorting the performance test....

  11. Theoretical Modeling and Computer Simulations for the Origins and Evolution of Reproducing Molecular Systems and Complex Systems with Many Interactive Parts

    Science.gov (United States)

    Liang, Shoudan

    2000-01-01

Our research effort has produced nine publications in peer-reviewed journals, listed at the end of this report. The work reported here is in the following areas: (1) genetic network modeling; (2) autocatalytic model of pre-biotic evolution; (3) theoretical and computational studies of strongly correlated electron systems; (4) reducing thermal oscillations in the atomic force microscope; (5) transcription termination mechanism in prokaryotic cells; and (6) the low glutamine usage in thermophiles obtained by studying completely sequenced genomes. We discuss the main accomplishments of these publications.

  12. Modeling biophysical properties of broad-leaved stands in the hyrcanian forests of Iran using fused airborne laser scanner data and ultraCam-D images

    Science.gov (United States)

    Mohammadi, Jahangir; Shataee, Shaban; Namiranian, Manochehr; Næsset, Erik

    2017-09-01

Inventories of mixed broad-leaved forests of Iran mainly rely on terrestrial measurements. Due to rapid changes and disturbances and great complexity of the silvicultural systems of these multilayer forests, frequent repetition of conventional ground-based plot surveys is often cost prohibitive. Airborne laser scanning (ALS) and multispectral data offer an alternative or supplement to conventional inventories in the Hyrcanian forests of Iran. In this study, the capability of a combination of ALS and UltraCam-D data to model stand volume, tree density, and basal area using the random forest (RF) algorithm was evaluated. Systematic sampling was applied to collect field plot data on a 150 m × 200 m sampling grid within a 1100 ha study area located at 36°38′-36°42′N and 54°24′-54°25′E. A total of 308 circular plots (0.1 ha) were measured for calculation of stand volume, tree density, and basal area per hectare. For each plot, a set of variables was extracted from both ALS and multispectral data. The RF algorithm was used for modeling of the biophysical properties using ALS and UltraCam-D data separately and combined. The results showed that combining the ALS data and UltraCam-D images provided a slight increase in prediction accuracy compared to separate modeling. The RMSE as percentage of the mean, the mean difference between observed and predicted values, and standard deviation of the differences using a combination of ALS data and UltraCam-D images in an independent validation at 0.1-ha plot level were 31.7%, 1.1%, and 84 m3 ha-1 for stand volume; 27.2%, 0.86%, and 6.5 m2 ha-1 for basal area; and 35.8%, -4.6%, and 77.9 n ha-1 for tree density, respectively. Based on the results, we conclude that fusion of ALS and UltraCam-D data may be useful for modeling of stand volume, basal area, and tree density and thus gain insights into structural characteristics in the complex Hyrcanian forests.
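The validation statistics quoted above (RMSE as a percentage of the observed mean, mean difference between observed and predicted values, and standard deviation of the differences) can be computed as in this sketch; the plot-level stand volumes below are invented for illustration:

```python
import math

def validation_stats(observed, predicted):
    """RMSE as a percentage of the observed mean, mean difference
    (observed - predicted), and standard deviation of the differences."""
    n = len(observed)
    diffs = [o - p for o, p in zip(observed, predicted)]
    mean_obs = sum(observed) / n
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    mean_diff = sum(diffs) / n
    sd_diff = math.sqrt(sum((d - mean_diff) ** 2 for d in diffs) / (n - 1))
    return 100 * rmse / mean_obs, mean_diff, sd_diff

# Hypothetical plot-level stand volumes (m^3/ha): field vs. model
obs = [310, 250, 420, 380, 290]
pred = [280, 270, 400, 410, 275]
rmse_pct, bias, sd = validation_stats(obs, pred)
```
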

  13. Simulation of the hydrodynamic conditions of the eye to better reproduce the drug release from hydrogel contact lenses: experiments and modeling.

    Science.gov (United States)

    Pimenta, A F R; Valente, A; Pereira, J M C; Pereira, J C F; Filipe, H P; Mata, J L G; Colaço, R; Saramago, B; Serro, A P

    2016-12-01

Currently, most in vitro drug release studies for ophthalmic applications are carried out in static sink conditions. Although this procedure is simple and useful for comparative studies, it does not describe adequately the drug release kinetics in the eye, considering the small tear volume and flow rates found in vivo. In this work, a microfluidic cell was designed and used to mimic the continuous, volumetric flow rate of tear fluid and its low volume. The suitable operation of the cell, in terms of uniformity and symmetry of flux, was proved using a numerical model based on the Navier-Stokes and continuity equations. The release profile of a model system (a hydroxyethyl methacrylate-based hydrogel (HEMA/PVP) for soft contact lenses (SCLs) loaded with diclofenac) obtained with the microfluidic cell was compared with that obtained in static conditions, showing that the kinetics of release in dynamic conditions is slower. The application of the numerical model demonstrated that the designed cell can be used to simulate the drug release in the whole range of the human eye tear film volume and allowed the estimation of the drug concentration in the volume of liquid in direct contact with the hydrogel. The knowledge of this concentration, which is significantly different from that measured in the experimental tests during the first hours of release, is critical to predict the toxicity of the drug release system and its in vivo efficacy. In conclusion, the use of the microfluidic cell in conjunction with the numerical model shall be a valuable tool to design and optimize new therapeutic drug-loaded SCLs.
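As a rough illustration of why a continuous tear flow changes the picture relative to a static sink, the sketch below integrates a well-mixed tear-film compartment flushed at a constant flow rate, fed by first-order release from the lens. All parameter values (initial load m0, rate constant k, tear volume V, flow rate Q) are assumptions for illustration, not values from the paper:

```python
# Euler integration of drug concentration in a small, well-mixed tear
# volume V that is continuously flushed at flow rate Q, while a lens
# releases its remaining load m_lens at first-order rate k.
def tear_film_concentration(m0, k, V, Q, dt, t_end):
    """Return (time, concentration) lists; m0 in ug, V in uL, Q in uL/min."""
    m_lens, c = m0, 0.0          # drug left in lens; tear concentration (ug/uL)
    times, concs = [0.0], [0.0]
    t = 0.0
    while t < t_end:
        release = k * m_lens                    # ug/min released by the lens
        m_lens -= release * dt
        c += (release / V - (Q / V) * c) * dt   # input minus tear turnover
        t += dt
        times.append(t); concs.append(c)
    return times, concs

# Illustrative parameters: 50 ug load, 7 uL tear film, 1.2 uL/min turnover
t, c = tear_film_concentration(m0=50.0, k=0.01, V=7.0, Q=1.2, dt=0.1, t_end=240)
```

The concentration rises toward a quasi-steady value set by the instantaneous release rate divided by the flow rate, then decays as the lens empties, which is qualitatively the slower, flow-limited behaviour the cell is meant to reproduce.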

  14. Accessibility and Reproducibility of Stable High-qmin Steady-State Scenarios by q-profile+βN Model Predictive Control

    Science.gov (United States)

    Schuster, E.; Wehner, W.; Holcomb, C. T.; Victor, B.; Ferron, J. R.; Luce, T. C.

    2016-10-01

    The capability of combined q-profile and βN control to enable access to and repeatability of steady-state scenarios for qmin > 1.4 discharges has been assessed in DIII-D experiments. To steer the plasma to the desired state, model predictive control (MPC) of both the q-profile and βN numerically solves successive optimization problems in real time over a receding time horizon by exploiting efficient quadratic programming techniques. A key advantage of this control approach is that it allows for explicit incorporation of state/input constraints to prevent the controller from driving the plasma outside of stability/performance limits and obtain, as closely as possible, steady state conditions. The enabler of this feedback-control approach is a control-oriented model capturing the dominant physics of the q-profile and βN responses to the available actuators. Experiments suggest that control-oriented model-based scenario planning in combination with MPC can play a crucial role in exploring stability limits of scenarios of interest. Supported by the US DOE under DE-SC0010661.
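The receding-horizon idea behind MPC (solve a small constrained optimization at each step, apply the first control, repeat) can be shown on a toy scalar tracking problem; the model x_next = x + b*u, the weights, and the input bound below are all invented for illustration and unrelated to the DIII-D controller:

```python
def mpc_step(x, r, b=0.5, lam=0.1, u_max=1.0):
    """One receding-horizon step for x_next = x + b*u tracking r:
    minimize (x + b*u - r)**2 + lam*u**2 subject to |u| <= u_max."""
    u = b * (r - x) / (b * b + lam)    # unconstrained QP minimizer
    return max(-u_max, min(u_max, u))  # project onto the input constraint

# Drive the toy "plasma state" from 0 to a target of 2, re-solving each step
x, r = 0.0, 2.0
traj = [x]
for _ in range(30):
    u = mpc_step(x, r)
    x = x + 0.5 * u
    traj.append(x)
```

The explicit projection step is the toy analogue of the constraint handling described above: the controller never commands an input outside the stated limit, yet still converges to the target.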

  15. Large barrier, highly uniform and reproducible Ni-Si/4H-SiC forward Schottky diode characteristics: testing the limits of Tung's model

    Science.gov (United States)

    Omar, Sabih U.; Sudarshan, Tangali S.; Rana, Tawhid A.; Song, Haizheng; Chandrashekhar, M. V. S.

    2014-07-01

    We report highly ideal (n < 1.1), uniform nickel silicide (Ni-Si)/SiC Schottky barrier (1.60-1.67 eV with a standard deviation <2.8%) diodes, fabricated on 4H-SiC epitaxial layers grown by chemical vapour deposition. The barrier height was constant over a wide epilayer doping range of 1014-1016 cm-3, apart from a slight decrease consistent with image force lowering. This remarkable uniformity was achieved by careful optimization of the annealing of the Schottky interface to minimize non-idealities that could lead to inhomogeneity. Tung's barrier inhomogeneity model was used to quantify the level of inhomogeneity in the optimized annealed diodes. The estimated ‘bulk’ barrier height (1.75 eV) was consistent with the Shockley-Mott limit for the Ni-Si/4H-SiC interface, implying an unpinned Fermi level. But the model was not useful to explain the poor ideality in unoptimized, as-deposited Schottky contacts (n = 1.6 - 2.5). We show analytically and numerically that only idealities n < 1.21 can be explained using Tung's model, irrespective of material system, indicating that the barrier height inhomogeneity is not the only cause of poor ideality in Schottky diodes. For explaining this highly non-ideal behaviour, other factors (e.g. interface traps, morphological defects, extrinsic impurities, etc) need to be considered.

  16. A novel, selective inhibitor of fibroblast growth factor receptors that shows a potent broad spectrum of antitumor activity in several tumor xenograft models.

    Science.gov (United States)

    Zhao, Genshi; Li, Wei-Ying; Chen, Daohong; Henry, James R; Li, Hong-Yu; Chen, Zhaogen; Zia-Ebrahimi, Mohammad; Bloem, Laura; Zhai, Yan; Huss, Karen; Peng, Sheng-Bin; McCann, Denis J

    2011-11-01

    The fibroblast growth factor receptors (FGFR) are tyrosine kinases that are present in many types of endothelial and tumor cells and play an important role in tumor cell growth, survival, and migration as well as in maintaining tumor angiogenesis. Overexpression of FGFRs or aberrant regulation of their activities has been implicated in many forms of human malignancies. Therefore, targeting FGFRs represents an attractive strategy for development of cancer treatment options by simultaneously inhibiting tumor cell growth, survival, and migration as well as tumor angiogenesis. Here, we describe a potent, selective, small-molecule FGFR inhibitor, (R)-(E)-2-(4-(2-(5-(1-(3,5-Dichloropyridin-4-yl)ethoxy)-1H-indazol-3yl)vinyl)-1H-pyrazol-1-yl)ethanol, designated as LY2874455. This molecule is active against all 4 FGFRs, with a similar potency in biochemical assays. It exhibits a potent activity against FGF/FGFR-mediated signaling in several cancer cell lines and shows an excellent broad spectrum of antitumor activity in several tumor xenograft models representing the major FGF/FGFR relevant tumor histologies including lung, gastric, and bladder cancers and multiple myeloma, and with a well-defined pharmacokinetic/pharmacodynamic relationship. LY2874455 also exhibits a 6- to 9-fold in vitro and in vivo selectivity on inhibition of FGF- over VEGF-mediated target signaling in mice. Furthermore, LY2874455 did not show VEGF receptor 2-mediated toxicities such as hypertension at efficacious doses. Currently, this molecule is being evaluated for its potential use in the clinic.

  17. A Mouse Model That Reproduces the Developmental Pathways and Site Specificity of the Cancers Associated With the Human BRCA1 Mutation Carrier State.

    Science.gov (United States)

    Liu, Ying; Yen, Hai-Yun; Austria, Theresa; Pettersson, Jonas; Peti-Peterdi, Janos; Maxson, Robert; Widschwendter, Martin; Dubeau, Louis

    2015-10-01

    Predisposition to breast and extrauterine Müllerian carcinomas in BRCA1 mutation carriers is due to a combination of cell-autonomous consequences of BRCA1 inactivation on cell cycle homeostasis superimposed on cell-nonautonomous hormonal factors magnified by the effects of BRCA1 mutations on hormonal changes associated with the menstrual cycle. We used the Müllerian inhibiting substance type 2 receptor (Mis2r) promoter and a truncated form of the Follicle stimulating hormone receptor (Fshr) promoter to introduce conditional knockouts of Brca1 and p53 not only in mouse mammary and Müllerian epithelia, but also in organs that control the estrous cycle. Sixty percent of the double mutant mice developed invasive Müllerian and mammary carcinomas. Mice carrying heterozygous mutations in Brca1 and p53 also developed invasive tumors, albeit at a lesser (30%) rate, in which the wild type alleles were no longer present due to loss of heterozygosity. While mice carrying heterozygous mutations in both genes developed mammary tumors, none of the mice carrying only a heterozygous p53 mutation developed such tumors (P < 0.0001), attesting to a role for Brca1 mutations in tumor development. This mouse model is attractive to investigate cell-nonautonomous mechanisms associated with cancer predisposition in BRCA1 mutation carriers and to investigate the merit of chemo-preventive drugs targeting such mechanisms.

  18. A Mouse Model That Reproduces the Developmental Pathways and Site Specificity of the Cancers Associated With the Human BRCA1 Mutation Carrier State

    Directory of Open Access Journals (Sweden)

    Ying Liu

    2015-10-01

Full Text Available Predisposition to breast and extrauterine Müllerian carcinomas in BRCA1 mutation carriers is due to a combination of cell-autonomous consequences of BRCA1 inactivation on cell cycle homeostasis superimposed on cell-nonautonomous hormonal factors magnified by the effects of BRCA1 mutations on hormonal changes associated with the menstrual cycle. We used the Müllerian inhibiting substance type 2 receptor (Mis2r) promoter and a truncated form of the Follicle stimulating hormone receptor (Fshr) promoter to introduce conditional knockouts of Brca1 and p53 not only in mouse mammary and Müllerian epithelia, but also in organs that control the estrous cycle. Sixty percent of the double mutant mice developed invasive Müllerian and mammary carcinomas. Mice carrying heterozygous mutations in Brca1 and p53 also developed invasive tumors, albeit at a lesser (30%) rate, in which the wild type alleles were no longer present due to loss of heterozygosity. While mice carrying heterozygous mutations in both genes developed mammary tumors, none of the mice carrying only a heterozygous p53 mutation developed such tumors (P < 0.0001), attesting to a role for Brca1 mutations in tumor development. This mouse model is attractive to investigate cell-nonautonomous mechanisms associated with cancer predisposition in BRCA1 mutation carriers and to investigate the merit of chemo-preventive drugs targeting such mechanisms.

  19. Global, Broad, or Specific Cognitive Differences? Using a MIMIC Model to Examine Differences in CHC Abilities in Children with Learning Disabilities

    Science.gov (United States)

    Niileksela, Christopher R.; Reynolds, Matthew R.

    2014-01-01

    This study was designed to better understand the relations between learning disabilities and different levels of latent cognitive abilities, including general intelligence (g), broad cognitive abilities, and specific abilities based on the Cattell-Horn-Carroll theory of intelligence (CHC theory). Data from the "Differential Ability…

  20. Broad Leaves in Strong Flow

    CERN Document Server

    Miller, Laura

    2013-01-01

    Flexible broad leaves are thought to reconfigure in the wind and water to reduce the drag forces that act upon them. Simple mathematical models of a flexible beam immersed in a two-dimensional flow will also exhibit this behavior. What is less understood is how the mechanical properties of a leaf in a three-dimensional flow will passively allow roll up into a cone shape and reduce both drag and vortex induced oscillations. In this fluid dynamics video, the flows around the leaves are compared with those of simplified sheets using 3D numerical simulations and physical models. For some reconfiguration shapes, large forces and oscillations due to strong vortex shedding are produced. In the actual leaf, a stable recirculation zone is formed within the wake of the reconfigured cone. In physical and numerical models that reconfigure into cones, a similar recirculation zone is observed with both rigid and flexible tethers. These results suggest that the three-dimensional cone structure in addition to flexibility is ...

  1. On The Reproducibility of Seasonal Land-surface Climate

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, T J

    2004-10-22

    The sensitivity of the continental seasonal climate to initial conditions is estimated from an ensemble of decadal simulations of an atmospheric general circulation model with the same specifications of radiative forcings and monthly ocean boundary conditions, but with different initial states of atmosphere and land. As measures of the "reproducibility" of continental climate for different initial conditions, spatio-temporal correlations are computed across paired realizations of eleven model land-surface variables in which the seasonal cycle is either included or excluded--the former case being pertinent to climate simulation, and the latter to seasonal anomaly prediction. It is found that the land-surface variables which include the seasonal cycle are impacted only marginally by changes in initial conditions; moreover, their seasonal climatologies exhibit high spatial reproducibility. In contrast, the reproducibility of a seasonal land-surface anomaly is generally low, although it is substantially higher in the Tropics; its spatial reproducibility also markedly fluctuates in tandem with warm and cold phases of the El Niño/Southern Oscillation. However, the overall degree of reproducibility depends strongly on the particular land-surface anomaly considered. It is also shown that the predictability of a land-surface anomaly implied by its reproducibility statistics is consistent with what is inferred from more conventional predictability metrics. Implications of these results for climate model intercomparison projects and for operational forecasts of seasonal continental climate are also elaborated.

  2. Evaluation of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian Maria

    2007-01-01

    , as well as overall preference, was based on consistency tests of binary paired-comparison judgments and on modeling the choice frequencies using probabilistic choice models. As a result, the preferences of non-expert listeners could be measured reliably at a ratio scale level. Principal components derived from the quantified attributes predict overall preference well. The findings allow for some generalizations within musical program genres regarding the perception of and preference for certain spatial reproduction modes, but for limited generalizations across selections from different musical genres.

  3. Theory of reproducing kernels and applications

    CERN Document Server

    Saitoh, Saburou

    2016-01-01

    This book provides a large extension of the general theory of reproducing kernels published by N. Aronszajn in 1950, with many concrete applications. In Chapter 1, many concrete reproducing kernels are first introduced with detailed information. Chapter 2 presents a general and global theory of reproducing kernels with basic applications in a self-contained way. Many fundamental operations among reproducing kernel Hilbert spaces are dealt with. Chapter 2 is the heart of this book. Chapter 3 is devoted to the Tikhonov regularization using the theory of reproducing kernels with applications to numerical and practical solutions of bounded linear operator equations. In Chapter 4, the numerical real inversion formulas of the Laplace transform are presented by applying the Tikhonov regularization, where the reproducing kernels play a key role in the results. Chapter 5 deals with ordinary differential equations; Chapter 6 includes many concrete results for various fundamental partial differential equations. In Chapt...

  4. Filters, reproducing kernel, and adaptive meshfree method

    Science.gov (United States)

    You, Y.; Chen, J.-S.; Lu, H.

    Reproducing kernel, with its intrinsic feature of moving averaging, can be utilized as a low-pass filter with scale decomposition capability. The discrete convolution of two nth order reproducing kernels with arbitrary support size in each kernel results in a filtered reproducing kernel function that has the same reproducing order. This property is utilized to separate the numerical solution into an unfiltered lower order portion and a filtered higher order portion. As such, the corresponding high-pass filter of this reproducing kernel filter can be used to identify the locations of high gradient, and consequently serves as an operator for error indication in meshfree analysis. In conjunction with the naturally conforming property of the reproducing kernel approximation, a meshfree adaptivity method is also proposed.
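
    The low-pass/high-pass idea in this abstract can be illustrated with a minimal sketch. This is a hypothetical simplification (a 0th-order triangular moving-average kernel on a synthetic 1-D field, not the paper's higher-order reproducing kernel construction): the normalized kernel smooths the signal, and the complementary high-pass residual peaks where the gradient is high, which is the behavior an error indicator exploits.

    ```python
    import numpy as np

    def hat_kernel(support):
        # symmetric triangular weights normalized to sum to 1, so constants
        # are reproduced exactly (0th-order reproduction)
        half = support // 2
        w = np.array([half + 1 - abs(i - half) for i in range(support)], float)
        return w / w.sum()

    def low_pass(signal, kernel):
        # discrete convolution acts as a moving-average low-pass filter
        return np.convolve(signal, kernel, mode="same")

    x = np.linspace(0.0, 1.0, 200)
    u = np.where(x < 0.5, x, x + 1.0)   # smooth field with a jump at x = 0.5

    k = hat_kernel(9)
    u_low = low_pass(u, k)
    u_high = u - u_low                  # high-pass residual = error indicator

    # away from boundary effects, the residual is largest near the jump
    idx = int(np.argmax(np.abs(u_high[5:-5])) + 5)
    print(idx, round(float(np.abs(u_high[idx])), 3))
    ```

    In an actual meshfree analysis the kernels carry higher reproducing orders and variable support, but the role of the high-pass residual as a locator of high-gradient regions is the same.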

  5. Examination of reproducibility in microbiological degradation experiments

    DEFF Research Database (Denmark)

    Sommer, Helle Mølgaard; Spliid, Henrik; Holst, Helle

    1998-01-01

    Experimental data indicate that certain microbiological degradation experiments have a limited reproducibility. Nine identical batch experiments were carried out on 3 different days to examine reproducibility. A pure culture, isolated from soil, grew with toluene as the only carbon and energy source. ... The limited reproducibility may be caused by variability in the preculture, or more precisely, variations in the physiological state of the bacteria in the precultures just before being used as inoculum.

  6. Broad-spectrum antiviral agents

    Directory of Open Access Journals (Sweden)

    Jun-Da eZhu

    2015-05-01

    Full Text Available Development of highly effective, broad-spectrum antiviral agents is a major objective shared by the fields of virology and pharmaceutics. Antiviral drug development has focused on targeting viral entry and replication, as well as on modulating the cellular defense system. High-throughput screening of molecules, genetic engineering of peptides, and functional screening of agents have identified promising candidates for the development of optimal broad-spectrum antiviral agents to intervene in viral infection and control viral epidemics. This review discusses current knowledge, prospective applications, opportunities, and challenges in the development of broad-spectrum antiviral agents.

  7. Upregulated expression of La ribonucleoprotein domain family member 6 and collagen type I gene following water-filtered broad-spectrum near-infrared irradiation in a 3-dimensional human epidermal tissue culture model as revealed by microarray analysis.

    Science.gov (United States)

    Tanaka, Yohei; Nakayama, Jun

    2017-02-27

    Water-filtered broad-spectrum near-infrared irradiation can induce various biological effects, as our previous clinical, histological, and biochemical investigations have shown. However, few studies have examined the changes thus induced in gene expression. The aim was to investigate the changes in gene expression in a 3-dimensional reconstructed epidermal tissue culture exposed to water-filtered broad-spectrum near-infrared irradiation. DNA microarray and quantitative real-time polymerase chain reaction (PCR) analyses were used to assess gene expression levels in a 3-dimensional reconstructed epidermal model composed of normal human epidermal cells exposed to water-filtered broad-spectrum near-infrared irradiation. The water filter allowed 1000-1800 nm wavelengths and excluded 1400-1500 nm wavelengths, and cells were exposed to 5 or 10 rounds of near-infrared irradiation at 10 J/cm². A DNA microarray with over 50 000 different probes showed 18 genes that were upregulated or downregulated by at least twofold after irradiation. Quantitative real-time PCR revealed that, relative to control cells, the gene encoding La ribonucleoprotein domain family member 6 (LARP6), which regulates collagen expression, was significantly and dose-dependently upregulated (P < 0.05) by water-filtered broad-spectrum near-infrared exposure. Gene transcripts of collagen type I were significantly upregulated compared with controls (P < 0.05). This study demonstrates the ability of water-filtered broad-spectrum near-infrared irradiation to stimulate the production of type I collagen. © 2017 The Australasian College of Dermatologists.

  8. Broad resonances in transport theory

    CERN Document Server

    Leupold, S

    2003-01-01

    The extension of the transport theoretical framework to include states with a broad mass distribution is discussed. The proper life-time and cross sections for a state with an arbitrarily given invariant mass is discussed in detail. (author)

  9. Reproducing Kernel Particle Method for Non-Linear Fracture Analysis

    Institute of Scientific and Technical Information of China (English)

    Cao Zhongqing; Zhou Benkuan; Chen Dapeng

    2006-01-01

    To study the non-linear fracture, a non-linear constitutive model for piezoelectric ceramics was proposed, in which the polarization switching and saturation were taken into account. Based on the model, the non-linear fracture analysis was implemented using reproducing kernel particle method (RKPM). Using local J-integral as a fracture criterion, a relation curve of fracture loads against electric fields was obtained. Qualitatively, the curve is in agreement with the experimental observations reported in literature. The reproducing equation, the shape function of RKPM, and the transformation method to impose essential boundary conditions for meshless methods were also introduced. The computation was implemented using object-oriented programming method.

  10. Reproducibility principles, problems, practices, and prospects

    CERN Document Server

    Maasen, Sabine

    2016-01-01

    Featuring peer-reviewed contributions from noted experts in their fields of research, Reproducibility: Principles, Problems, Practices, and Prospects presents state-of-the-art approaches to reproducibility, the gold standard of sound science, from multi- and interdisciplinary perspectives. Including comprehensive coverage for implementing and reflecting the norm of reproducibility in various pertinent fields of research, the book focuses on how the reproducibility of results is applied, how it may be limited, and how such limitations can be understood or even controlled in the natural sciences, computational sciences, life sciences, social sciences, and studies of science and technology. The book presents many chapters devoted to a variety of methods and techniques, as well as their epistemic and ontological underpinnings, which have been developed to safeguard reproducible research and curtail deficits and failures. The book also investigates the political, historical, and social practices that underlie repro...

  11. Explorations in statistics: statistical facets of reproducibility.

    Science.gov (United States)

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.

  12. Conserved synthetic peptides from the hemagglutinin of influenza viruses induce broad humoral and T-cell responses in a pig model.

    Directory of Open Access Journals (Sweden)

    Júlia Vergara-Alert

    Full Text Available Outbreaks involving either H5N1 or H1N1 influenza viruses (IV) have recently become an increasing threat to cause potential pandemics. Pigs have an important role in this aspect. As reflected in the 2009 human H1N1 pandemic, they may act as a vehicle for mixing and generating new assortments of viruses potentially pathogenic to animals and humans. Lack of universal vaccines against the highly variable influenza virus forces scientists to continuously design vaccines à la carte, which is an expensive and risky practice, above all when dealing with virulent strains. Therefore, we focused our efforts on developing a broadly protective influenza vaccine based on the Informational Spectrum Method (ISM). This theoretical prediction allows the selection of highly conserved peptide sequences from within the hemagglutinin subunit 1 protein (HA1) from either H5 or H1 viruses which are located in the flanking region of the HA binding site and have the potential to elicit broader immune responses than conventional vaccines. Confirming the theoretical predictions, immunization of conventional farm pigs with the synthetic peptides induced humoral responses in every single pig. The fact that the induced antibodies were able to recognize in vitro heterologous influenza viruses such as the pandemic H1N1 virus (pH1N1), two swine influenza field isolates (SwH1N1 and SwH3N2), and an H5N1 highly pathogenic avian virus confirms the broad recognition of the antibodies induced. Unexpectedly, all pigs also showed T-cell responses that recognized not only the specific peptides, but also the pH1N1 virus. Finally, a partial effect on the kinetics of virus clearance was observed after the intranasal infection with the pH1N1 virus, setting forth the groundwork for the design of peptide-based vaccines against influenza viruses. Further insights into the understanding of the mechanisms involved in the protection afforded will be necessary to optimize future vaccine formulations.

  13. Dual action of phosphonate herbicides in plants affected by herbivore--model study on black bean aphid Aphis fabae rearing on broad bean Vicia faba plants.

    Science.gov (United States)

    Lipok, Jacek

    2009-09-01

    The interactions between plants, herbicides and herbivore insects were studied as an aspect of the possible side effects of using phosphonate herbicides. The experimental system was composed of phosphonate herbicides, broad bean Vicia faba (L.) plants and black bean aphid Aphis fabae (Scopoli). Two means of herbicide application, namely standard spraying and direct introduction of the herbicide into the stem via glass capillary, were examined. The results obtained for N-2-pyridylaminomethylene bisphosphonic acid and its derivatives show 10 times higher inhibition of plant growth when the glass capillary mode was used. When plants were infested by aphids 24 h after the use of herbicide, a significant decrease in plant growth rate was observed in relation to plants treated with herbicides alone. Moreover, the sensitivity of aphids towards glyphosate, N-2-pyridylaminomethylene bisphosphonic acid and its 3-methyl derivative introduced into an artificial diet indicated that these herbicidal phosphonates also possessed insecticidal activity if applied in a systemic manner. Additionally, olfactometer measurements revealed that aphids preferred intact V. faba leaves over those that had been treated with sublethal doses of herbicides. The results achieved in these experiments indicate that the use of phosphonate herbicides decreases plant resistance and influences the number of aphids associated with treated plants. Regarding these facts, it can be concluded that the combined effect of herbicide-induced stress and insect herbivory reduced plant fitness, and thus should also be considered a factor enabling the reduction of herbicide doses.

  14. Giant Broad Line Regions in Dwarf Seyferts

    Indian Academy of Sciences (India)

    Nick Devereux

    2015-12-01

    High angular resolution spectroscopy obtained with the Hubble Space Telescope (HST) has revealed a remarkable population of galaxies hosting dwarf Seyfert nuclei with an unusually large broad-line region (BLR). These objects are remarkable for two reasons. Firstly, the size of the BLR can, in some cases, rival those seen in the most luminous quasars. Secondly, the size of the BLR is not correlated with the central continuum luminosity, an observation that distinguishes them from their reverberating counterparts. Collectively, these early results suggest that non-reverberating dwarf Seyferts are a heterogeneous group, and not simply scaled versions of each other. Careful inspection reveals broad H Balmer emission lines with single peaks, double peaks, and a combination of the two, suggesting that the broad emission lines are produced in kinematically distinct regions centered on the black hole (BH). Because the gravitational field strength is already known for these objects, by virtue of knowing their BH mass, the relationship between velocity and radius may be established, given a kinematic model for the BLR gas. In this way, one can determine the inner and outer radii of the BLRs by modeling the shape of their broad emission line profiles. In the present contribution, high quality spectra obtained with the Space Telescope Imaging Spectrograph (STIS) are used to constrain the size of the BLR in the dwarf Seyfert nuclei of M81, NGC 3998, NGC 4203, NGC 3227, NGC 4051 and NGC 3516.
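
    The velocity-radius relationship mentioned above can be sketched for the simplest kinematic assumption, a circular Keplerian orbit: r = GM/v², so a measured line-of-sight velocity and a known black hole mass yield a radius. The numbers below (a 7×10⁷ solar-mass black hole, a 5000 km/s line wing) are illustrative, not values from the paper.

    ```python
    import math

    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    M_SUN = 1.989e30     # solar mass, kg
    C = 2.998e8          # speed of light, m/s

    def blr_radius_m(m_bh_solar, v_kms):
        """Keplerian radius for gas orbiting at speed v around mass M."""
        v = v_kms * 1.0e3
        return G * m_bh_solar * M_SUN / v ** 2

    # hypothetical example: 7e7 solar masses, 5000 km/s broad-line wing
    r = blr_radius_m(7e7, 5000.0)
    light_days = r / (C * 86400.0)
    print(round(light_days, 1))   # radius in light-days
    ```

    Faster-moving gas maps to smaller radii, which is why the wings of a broad line profile probe the inner edge of the BLR while the core probes the outer edge.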

  15. Learning Reproducibility with a Yearly Networking Contest

    KAUST Repository

    Canini, Marco

    2017-08-10

    Better reproducibility of networking research results is currently a major goal that the academic community is striving towards. This position paper makes the case that improving the extent and pervasiveness of reproducible research can be greatly fostered by organizing a yearly international contest. We argue that holding a contest undertaken by a plurality of students will have benefits that are two-fold. First, it will promote hands-on learning of skills that are helpful in producing artifacts at the replicable-research level. Second, it will advance the best practices regarding environments, testbeds, and tools that will aid the tasks of reproducibility evaluation committees by and large.

  16. Thou Shalt Be Reproducible! A Technology Perspective

    Directory of Open Access Journals (Sweden)

    Patrick Mair

    2016-07-01

    Full Text Available This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies.

  17. The Economics of Reproducibility in Preclinical Research.

    Directory of Open Access Journals (Sweden)

    Leonard P Freedman

    2015-06-01

    Full Text Available Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B) per year spent on preclinical research that is not reproducible, in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures.

  18. A Model of Compound Heterozygous, Loss-of-Function Alleles Is Broadly Consistent with Observations from Complex-Disease GWAS Datasets.

    Directory of Open Access Journals (Sweden)

    Jaleal S Sanjak

    2017-01-01

    Full Text Available The genetic component of complex disease risk in humans remains largely unexplained. A corollary is that the allelic spectrum of genetic variants contributing to complex disease risk is unknown. Theoretical models that relate population genetic processes to the maintenance of genetic variation for quantitative traits may suggest profitable avenues for future experimental design. Here we use forward simulation to model a genomic region evolving under a balance between recurrent deleterious mutation and Gaussian stabilizing selection. We consider multiple genetic and demographic models, and several different methods for identifying genomic regions harboring variants associated with complex disease risk. We demonstrate that the model of gene action, relating genotype to phenotype, has a qualitative effect on several relevant aspects of the population genetic architecture of a complex trait. In particular, the genetic model impacts genetic variance component partitioning across the allele frequency spectrum and the power of statistical tests. Models with partial recessivity closely match the minor allele frequency distribution of significant hits from empirical genome-wide association studies without requiring homozygous effect sizes to be small. We highlight a particular gene-based model of incomplete recessivity that is appealing from first principles. Under that model, deleterious mutations in a genomic region partially fail to complement one another. This model of gene-based recessivity predicts the empirically observed inconsistency between twin- and SNP-based estimates of dominance heritability. Furthermore, this model predicts considerable levels of unexplained variance associated with intralocus epistasis. Our results suggest a need for improved statistical tools for region-based genetic association and heritability estimation.

  19. Reproducibility Experiment of OSL and TL Dosimeter

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Reproducibility is an important property of a personal dosimeter. It not only can indicate the stability of the dosimeter and appraise the precision and accuracy of measured values, but also can evaluate the

  20. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making reproducible statistical analyses that differs from other systems for reproducible analysis in several ways. The two main differences are: (1) Several statistics programs can be used in the same document. (2) Documents can be prepared using OpenOffice or \LaTeX. The main part of this paper is an example showing how to use these together in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.

  1. Reproducibility of AMPLICOR enterovirus PCR test results.

    OpenAIRE

    1997-01-01

    The reproducibility of AMPLICOR enterovirus PCR test results was determined with clinical samples of cerebrospinal fluid, serum, urine, and throat and rectal swabs. Among 608 samples from which duplicate aliquots were run simultaneously, only seven pairs gave discordant results. Among 104 samples from which duplicate aliquots were run in separate assays, no discordance was seen. Overall, the reproducibility of test kit results was 99% (705 of 712).

  2. Relevant principal factors affecting the reproducibility of insect primary culture.

    Science.gov (United States)

    Ogata, Norichika; Iwabuchi, Kikuo

    2017-06-01

    The primary culture of insect cells often suffers from problems with poor reproducibility in the quality of the final cell preparations. The cellular composition of the explants (cell number and cell types), surgical methods (surgical duration and surgical isolation), and physiological and genetic differences between donors may be critical factors affecting the reproducibility of culture. However, little is known about where biological variation (interindividual differences between donors) ends and technical variation (variance in replication of culture conditions) begins. In this study, we cultured larval fat bodies from the Japanese rhinoceros beetle, Allomyrina dichotoma, and evaluated, using linear mixed models, the effect of interindividual variation between donors on the reproducibility of the culture. We also performed transcriptome analysis of the hemocyte-like cells mainly seen in the cultures using RNA sequencing and ultrastructural analyses of hemocytes using a transmission electron microscope, revealing that the cultured cells have many characteristics of insect hemocytes.

  3. The Broad Autism Phenotype Questionnaire

    Science.gov (United States)

    Hurley, Robert S. E.; Losh, Molly; Parlier, Morgan; Reznick, J. Steven; Piven, Joseph

    2007-01-01

    The broad autism phenotype (BAP) is a set of personality and language characteristics that reflect the phenotypic expression of the genetic liability to autism, in non-autistic relatives of autistic individuals. These characteristics are milder but qualitatively similar to the defining features of autism. A new instrument designed to measure the…

  4. Analysis of shallow-water experimental acoustic data including a comparison with a broad-band normal-mode-propagation model

    NARCIS (Netherlands)

    Simons, D.G.; McHugh, R.; Snellen, M.; McCormick, N.H.; Lawson, E.A.

    2001-01-01

    Channel temporal variability, resulting from fluctuations in oceanographic parameters, is an important issue for reliable communications in shallow-water-long-range acoustic propagation. As part of an acoustic model validation exercise, audio-band acoustic data and oceanographic data were collected

  5. A gap-filling model for eddy covariance latent heat flux: Estimating evapotranspiration of a subtropical seasonal evergreen broad-leaved forest as an example

    Science.gov (United States)

    Chen, Yi-Ying; Chu, Chia-Ren; Li, Ming-Hsu

    2012-10-01

    In this paper we present a semi-parametric multivariate gap-filling model for tower-based measurement of latent heat flux (LE). Two statistical techniques, principal component analysis (PCA) and a nonlinear interpolation approach, were integrated into this LE gap-filling model. The PCA was first used to resolve the multicollinearity relationships among various environmental variables, including radiation, soil moisture deficit, leaf area index, wind speed, etc. Two nonlinear interpolation methods, multiple regressions (MRS) and the K-nearest neighbors (KNN), were examined with randomly selected flux gaps for both clear-sky and nighttime/cloudy data to incorporate into this LE gap-filling model. Experimental results indicated that the KNN interpolation approach is able to provide consistent LE estimations, while MRS overestimates during nighttime/cloudy conditions. Rather than using empirical regression parameters, the KNN approach resolves the nonlinear relationship between the gap-filled LE flux and the principal components with adaptive K values under different atmospheric states. The developed LE gap-filling model (PCA with KNN) works with an RMSE of 2.4 W m⁻² (~0.09 mm day⁻¹) at a weekly time scale when 40% artificial flux gaps are added to the original dataset. Annual evapotranspiration at this study site was estimated at 736 mm (1803 MJ) and 728 mm (1785 MJ) for the years 2008 and 2009, respectively.
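
    The PCA-with-KNN scheme can be sketched in a few lines. Everything below is illustrative (synthetic drivers and fluxes, a fixed K rather than the paper's adaptive K): project the standardized environmental drivers onto principal components, then fill each LE gap with the mean flux of its K nearest observed neighbors in PC space.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    n = 500
    # hypothetical drivers, e.g. radiation, soil moisture deficit, LAI, wind
    drivers = rng.normal(size=(n, 4))
    le = 100.0 + 30.0 * drivers[:, 0] - 10.0 * drivers[:, 1] + rng.normal(0, 2, n)

    gap = rng.random(n) < 0.2                     # mark 20% of fluxes as gaps

    # PCA on standardized drivers via SVD of the centered matrix
    z = (drivers - drivers.mean(0)) / drivers.std(0)
    _, _, vt = np.linalg.svd(z, full_matrices=False)
    pcs = z @ vt.T                                # scores on all components

    def knn_fill(pcs, le, gap, k=5):
        # fill each gap with the mean LE of its k nearest observed neighbors
        filled = le.copy()
        obs = np.flatnonzero(~gap)
        for i in np.flatnonzero(gap):
            d = np.linalg.norm(pcs[obs] - pcs[i], axis=1)
            filled[i] = le[obs[np.argsort(d)[:k]]].mean()
        return filled

    filled = knn_fill(pcs, le, gap)
    rmse = float(np.sqrt(np.mean((filled[gap] - le[gap]) ** 2)))
    print(round(rmse, 1))
    ```

    The nonparametric neighbor average is what lets the method track nonlinear driver-flux relationships without fitting regression coefficients; the adaptive choice of K under different atmospheric states is the paper's refinement and is not reproduced here.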

  6. Superheating and melting within aluminum core-oxide shell nanoparticles for a broad range of heating rates: multiphysics phase field modeling.

    Science.gov (United States)

    Hwang, Yong Seok; Levitas, Valery I

    2016-10-19

    The external surface of metallic particles is usually covered by a thin and strong oxide shell, which significantly affects superheating and melting of the particles. The effects of geometric parameters and heating rate on characteristic melting and superheating temperatures and on the melting behavior of aluminum nanoparticles covered by an oxide shell were studied numerically. For this purpose, a multiphysics model was formulated that includes the phase field model for surface melting, a dynamic equation of motion, a mechanical model for stress and strain simulations, interface and surface stresses, and the thermal conduction model including thermoelastic and thermo-phase transformation coupling as well as the transformation dissipation rate. Several nontrivial phenomena were revealed. In comparison with a bare particle, the pressure generated in the core, due to different thermal expansions of the core and shell and transformation volumetric expansion during melting, increases melting temperatures with a Clausius-Clapeyron factor of 60 K GPa⁻¹. For heating rates Q ≤ 10⁹ K s⁻¹, melting temperatures (surface and bulk start and finish melting temperatures, and maximum superheating temperature) are independent of Q. For Q ≥ 10¹² K s⁻¹, increasing Q generally increases melting temperatures and the temperature for shell fracture. Unconventional effects start for Q ≥ 10¹² K s⁻¹ due to kinetic superheating combined with heterogeneous melting and geometry. The obtained results are applied to shed light on the initial stage of the melt-dispersion mechanism of the reaction of Al nanoparticles. Various physical phenomena that promote or suppress melting and affect melting temperatures and the temperature of shell fracture for different heating-rate ranges are summarized in the corresponding schemes.

  7. Broad bandwidth or high fidelity? Evidence from the structure of genetic and environmental effects on the facets of the five factor model.

    Science.gov (United States)

    Briley, Daniel A; Tucker-Drob, Elliot M

    2012-09-01

    The Five Factor Model of personality is well-established at the phenotypic level, but much less is known about the coherence of the genetic and environmental influences within each personality domain. Univariate behavioral genetic analyses have consistently found the influence of additive genes and nonshared environment on multiple personality facets, but the extent to which genetic and environmental influences on specific facets reflect more general influences on higher order factors is less clear. We applied a multivariate quantitative-genetic approach to scores on the CPI-Big Five facets for 490 monozygotic and 317 dizygotic twins who took part in the National Merit Twin Study. Our results revealed a complex genetic structure for facets composing all five factors, with both domain-general and facet-specific genetic and environmental influences. For three of the Big Five domains, models that required common genetic and environmental influences on each facet to occur by way of effects on a higher order trait did not fit as well as models allowing for common genetic and environmental effects to act directly on the facets. These results add to the growing body of literature indicating that important variation in personality occurs at the facet level which may be overshadowed by aggregating to the trait level. Research at the facet level, rather than the factor level, is likely to have pragmatic advantages in future research on the genetics of personality.

  8. Reproducibility in Data-Scarce Environments

    Science.gov (United States)

    Darch, P. T.

    2016-12-01

    Among the usual requirements for reproducibility are large volumes of data and computationally intensive methods. Many fields within the earth sciences, however, do not meet these requirements: data are scarce and data-intensive methods are not well established. How can science be reproducible under these conditions? What changes, both infrastructural and cultural, are needed to advance reproducibility? This paper presents findings from a long-term social scientific case study of an emergent and data-scarce field, the deep subseafloor biosphere. This field studies interactions between microbial communities living in the seafloor and the physical environments they inhabit. Several factors make reproducibility seem a distant goal for this community: - The relative newness of the field (serious study began in the late 1990s); - The highly multidisciplinary nature of the field (researchers come from a range of physical and life science backgrounds); - Data scarcity (domain researchers produce much of these data in their own onshore laboratories by analyzing cores from international ocean drilling expeditions, and allocation of cores is negotiated between researchers from many fields). These factors interact in multiple ways to inhibit reproducibility: - Incentive structures emphasize producing new data and new knowledge rather than reanalysing extant data; - Due to the scarcity of cores, only a few steps of laboratory analyses can be reproduced (such as analysis of DNA sequences, but not extraction of DNA from cores); - Methodological heterogeneity is a consequence of multidisciplinarity, as researchers bring different techniques from diverse fields; - Few standards for data collection or analysis are available at this early stage of the field; - While datasets from multiple biological and physical phenomena can be integrated into a single workflow, curation tends to be divergent. 
Each type of dataset may be subject to different disparate policies and contributed to different

  9. Data Science Innovations That Streamline Development, Documentation, Reproducibility, and Dissemination of Models in Computational Thermodynamics: An Application of Image Processing Techniques for Rapid Computation, Parameterization and Modeling of Phase Diagrams

    Science.gov (United States)

    Ghiorso, M. S.

    2014-12-01

    Computational thermodynamics (CT) represents a collection of numerical techniques that are used to calculate quantitative results from thermodynamic theory. In the Earth sciences, CT is most often applied to estimate the equilibrium properties of solutions, to calculate phase equilibria from models of the thermodynamic properties of materials, and to approximate irreversible reaction pathways by modeling these as a series of local equilibrium steps. The thermodynamic models that underlie CT calculations relate the energy of a phase to temperature, pressure and composition. These relationships are not intuitive and they are seldom well constrained by experimental data; often, intuition must be applied to generate a robust model that satisfies the expectations of use. As a consequence of this situation, the models and databases that support CT applications in geochemistry and petrology are tedious to maintain as new data and observations arise. What is required to make the process more streamlined and responsive is a computational framework that permits the rapid generation of observable outcomes from the underlying data/model collections, and importantly, the ability to update and re-parameterize the constitutive models through direct manipulation of those outcomes. CT procedures that take models/data to the experiential reference frame of phase equilibria involve function minimization, gradient evaluation, the calculation of implicit lines, curves and surfaces, contour extraction, and other related geometrical measures. All these procedures are the mainstay of image processing analysis. Since the commercial escalation of video game technology, open source image processing libraries have emerged (e.g., VTK) that permit real time manipulation and analysis of images. These tools find immediate application to CT calculations of phase equilibria by permitting rapid calculation and real time feedback between model outcome and the underlying model parameters.
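
    One of the geometric procedures listed above, tracing an implicit phase boundary, reduces to root finding on an energy difference. A minimal sketch, assuming two toy linear Gibbs-energy functions (purely hypothetical, not a calibrated thermodynamic model):

```python
# Locating the implicit curve G_alpha(T, P) = G_beta(T, P) by bisection in T
# at a grid of pressures -- one geometric primitive behind phase-diagram
# computation. Both free-energy functions below are toy illustrations.

def g_alpha(T, P):
    return -2.0 * T + 0.5 * P      # toy linear free energy, phase alpha

def g_beta(T, P):
    return -2.5 * T + 1.5 * P      # toy linear free energy, phase beta

def boundary_T(P, lo=0.0, hi=100.0, tol=1e-8):
    """Bisect on f(T) = G_alpha - G_beta to find the equilibrium temperature."""
    f = lambda T: g_alpha(T, P) - g_beta(T, P)
    assert f(lo) * f(hi) <= 0, "boundary not bracketed by [lo, hi]"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    # trace the boundary curve T(P) at a few pressures
    print([(P, round(boundary_T(P), 4)) for P in (1.0, 2.0, 4.0)])
```

    Real CT codes replace the toy energies with fitted models and the scalar bisection with contouring of 2D energy-difference fields, but the geometric task is the same.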

  10. Uncertainties of isoprene emissions in the MEGAN model estimated for a coniferous and broad-leaved mixed forest in Southern China

    Energy Technology Data Exchange (ETDEWEB)

    Situ, S.; Wang, Xuemei; Guenther, Alex B.; Zhang, Yanli; Wang, Xinming; Huang, Minjuan; Fan, Qi; Xiong, Zhe

    2014-12-01

    Using locally observed emission factors, meteorological data, vegetation information and dynamic MODIS LAI, MEGANv2.1 was constrained to predict the isoprene emission from the Dinghushan forest in the Pearl River Delta region during a field campaign in November 2008, and the uncertainties in the isoprene emission estimates were quantified by a Monte Carlo approach. The results indicate that MEGAN can predict the isoprene emission reasonably well during the campaign; the mean daytime isoprene emission is 2.35 mg m-2 h-1. There are high uncertainties associated with the MEGAN inputs and calculated parameters, and the relative error can be as high as -89 to 111% for a 95% confidence interval. The emission factor of broadleaf trees and the activity factor accounting for light and temperature dependence are the most important contributors to the uncertainties in the isoprene emission estimated for the Dinghushan forest during the campaign. The results also emphasize the importance of accurately observed PAR and temperature for reducing the uncertainties in the isoprene emission estimated by the model, because the MEGAN activity factor accounting for light and temperature dependence is highly sensitive to PAR and temperature.
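
    The Monte Carlo uncertainty propagation described above can be sketched as follows, assuming a simplified Guenther-style emission equation E = EF · γ_T · γ_PAR with illustrative parameter distributions (not the MEGANv2.1 configuration used in the study):

```python
# Monte Carlo propagation of input uncertainty through a simplified
# Guenther-style emission equation E = EF * gamma_T * gamma_PAR. All
# distributions and parameter values below are illustrative assumptions.
import math
import random

random.seed(0)  # reproducible draws

def gamma_temperature(T, T_ref=303.0, beta=0.09):
    """Simplified exponential temperature activity factor."""
    return math.exp(beta * (T - T_ref))

def gamma_light(par, alpha=0.0027, c1=1.066):
    """Guenther-type light-dependence activity factor."""
    return alpha * c1 * par / math.sqrt(1.0 + alpha**2 * par**2)

def simulate(n=10000, ef_mean=10.0, ef_sd=3.0):
    """Sample inputs; return n Monte Carlo emission estimates (mg m-2 h-1)."""
    emissions = []
    for _ in range(n):
        ef = max(0.0, random.gauss(ef_mean, ef_sd))  # emission factor
        T = random.gauss(298.0, 2.0)                 # leaf temperature, K
        par = max(0.0, random.gauss(1000.0, 200.0))  # PAR, umol m-2 s-1
        emissions.append(ef * gamma_temperature(T) * gamma_light(par))
    return emissions

if __name__ == "__main__":
    e = sorted(simulate())
    mean = sum(e) / len(e)
    lo, hi = e[int(0.025 * len(e))], e[int(0.975 * len(e))]
    print(f"mean = {mean:.2f} mg m-2 h-1, 95% interval = [{lo:.2f}, {hi:.2f}]")
```

    The 2.5th and 97.5th percentiles of the sampled emissions give the kind of asymmetric 95% interval reported in the abstract.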

  11. Reproducibility of psychophysics and electroencephalography during offset analgesia.

    Science.gov (United States)

    Nilsson, M; Piasco, A; Nissen, T D; Graversen, C; Gazerani, P; Lucas, M-F; Dahan, A; Drewes, A M; Brock, C

    2014-07-01

    Offset analgesia (OA) is a pain-inhibiting mechanism, defined as a disproportionately large decrease in pain perception in response to a discrete decrease in noxious stimulus intensity. Hence, the aims were (1) to investigate whether psychophysics and electroencephalography (EEG) can be assessed simultaneously during OA and (2) to assess whether OA is reproducible within the same day as well as between different days. Two separate studies investigated OA: Study I (13 healthy volunteers; seven men; 25.5 ± 0.65 years) aimed at determining the feasibility of recording psychophysics and EEG simultaneously during OA. Study II (18 healthy volunteers; 12 men; 34 ± 3.15 years) assessed the reproducibility of OA in terms of psychophysics and EEG. Subjects were presented with a 30-s OA heat stimulus paradigm on the volar forearm, and psychophysics and EEG recordings were obtained throughout the procedure. Reproducibility was assessed within the same day and between different days, using intraclass correlation coefficients (ICCs). Additionally, the reproducible psychophysical parameters were correlated to relevant EEG frequency bands. Simultaneous recording of psychophysics and EEG affects the frequency distribution in terms of alpha suppression. Reproducibility was proven for the psychophysics and EEG frequency bands both within the same day (all ICCs > 0.62) and between different days (all ICCs > 0.66, except for the delta band). Correlations between psychophysics and EEG were found in the theta (4-8 Hz), alpha (8-12 Hz) and gamma (32-80 Hz) bands (all p < 0.01). OA is a robust and reproducible model for experimental pain research, making it suitable for future research. © 2013 European Pain Federation - EFIC®
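
    The ICC used above to judge test-retest reproducibility can be sketched as a one-way random-effects intraclass correlation, ICC(1,1). The data layout (rows = subjects, columns = repeated sessions) and values below are illustrative:

```python
# One-way random-effects intraclass correlation, ICC(1,1): the ratio of
# between-subject variance to total variance, estimated from ANOVA mean
# squares. The example scores are illustrative.

def icc_oneway(data):
    """ICC(1,1) from a subjects-by-sessions table of scores."""
    n = len(data)            # subjects
    k = len(data[0])         # repeated measurements per subject
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    ms_between = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    ms_within = sum((x - m) ** 2
                    for row, m in zip(data, row_means)
                    for x in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

if __name__ == "__main__":
    # two sessions of a psychophysical score for five subjects
    scores = [[4.1, 4.3], [5.0, 4.8], [3.2, 3.5], [6.1, 6.0], [2.9, 3.1]]
    print(f"ICC(1,1) = {icc_oneway(scores):.3f}")
```

    Published reproducibility work often uses the two-way ICC(2,1) or ICC(3,1) variants instead; the one-way form above is the simplest member of the family.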

  12. Predicting land use change on a broad area: Dyna-CLUE model application to the Litorale Domizio-Agro Aversano (Campania, South Italy)

    Directory of Open Access Journals (Sweden)

    Stefania Pindozzi

    2017-06-01

    Full Text Available The long-standing awareness of the environmental impact of land-use change (LUC) has led the scientific community to develop tools able to predict its amount and to evaluate its effect on the environment, with the aim of supporting policy makers in their planning activities. This paper proposes an implementation of the Dyna-CLUE (Dynamic Conversion of Land Use and its Effects) model applied to the Litorale Domizio-Agro Aversano, an area of the Campania region which needs interventions for environmental remediation. Future land use changes were simulated in two different scenarios developed under alternative strategies of land management: scenario 1 is a simple projection of the recent LUC trend, while scenario 2 hypothesises the introduction of no-food crops, such as poplar (Populus nigra L.) and giant reed (Arundo donax L.), in addition to a less impactful urban sprawl, which is one of the main issues in the study area. The overall duration of the simulations was 13 years, subdivided into yearly time steps. The CORINE land cover map of 2006 was used as the baseline for land use change detection in the study area. Competition between different land use types is taken into account by setting the conversion elasticity, a parameter ranging from 0 to 1, according to their capital investment level. Location suitability for each land use type is based on a logit model. Since no actual land use already exists for the alternative crops investigated in scenario 2, a suitability map realised through a spatial multicriteria decision analysis was used as a proxy for their land use pattern. The comparison of the land use in 2012 and scenario 1, evaluated through the application of Kappa statistics, showed a general tendency to expansion of built-up areas, with an increase of about 2400 ha (1.5% of the total surface), at the expense of agricultural land and areas covered by natural vegetation. 
The comparison of the land use in 2012 and scenario 2 showed a less significant spread of built
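
    The Kappa statistic used above to compare the 2012 map with the simulated scenarios can be sketched as Cohen's kappa over two flattened categorical rasters; the class labels below are illustrative:

```python
# Cohen's kappa: chance-corrected agreement between two equal-length
# sequences of class labels (e.g., two land-use maps flattened cell by
# cell). The example maps are illustrative.
from collections import Counter

def cohens_kappa(map_a, map_b):
    """Cohen's kappa between two equal-length label sequences."""
    assert len(map_a) == len(map_b)
    n = len(map_a)
    observed = sum(a == b for a, b in zip(map_a, map_b)) / n
    freq_a, freq_b = Counter(map_a), Counter(map_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (observed - expected) / (1.0 - expected)

if __name__ == "__main__":
    m2012 = ["urban", "crop", "crop", "forest", "crop", "urban"]
    scen1 = ["urban", "urban", "crop", "forest", "crop", "urban"]
    print(f"kappa = {cohens_kappa(m2012, scen1):.3f}")
```

    Kappa is 1 for identical maps and near 0 when agreement is at chance level given the class frequencies.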

  13. Microsatellite diversity and broad scale geographic structure in a model legume: building a set of nested core collection for studying naturally occurring variation in Medicago truncatula

    DEFF Research Database (Denmark)

    Ronfort, Joelle; Bataillon, Thomas; Santoni, Sylvain

    2006-01-01

    Background: Exploiting genetic diversity requires previous knowledge of the extent and structure of the variation occurring in a species. Such knowledge can in turn be used to build a core-collection, i.e. a subset of accessions that aims... at representing the genetic diversity of this species with a minimum of repetitiveness. We investigate the patterns of genetic diversity and population structure in a collection of 346 inbred lines representing the breadth of naturally occurring diversity in the legume plant model Medicago truncatula using 13... scheme. Conclusion: The stratification inferred is discussed considering potential historical events like expansion, refuge history and admixture between neighbouring groups. Information on the allelic richness and the inferred population structure is used to build a nested core-collection. The set...

  14. Reproducing Kernel for D2(Ω, ρ) and Metric Induced by Reproducing Kernel

    Institute of Scientific and Technical Information of China (English)

    ZHAO Zhen Gang

    2009-01-01

    An important property of the reproducing kernel of D2(Ω, ρ) is obtained and the reproducing kernels for D2(Ω, ρ) are calculated when Ω = Bn × Bn and ρ are some special functions. A reproducing kernel is used to construct a semi-positive definite matrix and a distance function defined on Ω×Ω. An inequality is obtained about the distance function and the pseudodistance induced by the matrix.
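
    The objects in this record can be stated compactly. As a hedged sketch, these are the standard reproducing-kernel identities; the particular weights ρ treated in the paper are not reproduced here:

```latex
% Reproducing property of the kernel K of D^2(\Omega, \rho): point
% evaluation is an inner product against the kernel section K(\cdot, w).
f(w) = \bigl\langle f,\; K(\cdot, w) \bigr\rangle, \qquad f \in D^2(\Omega, \rho),\ w \in \Omega .
% For any points z_1, \dots, z_m \in \Omega the Gram matrix is positive
% semi-definite, which is what makes a kernel-induced (pseudo)distance on
% \Omega \times \Omega well defined:
\bigl[\, K(z_i, z_j) \,\bigr]_{i,j=1}^{m} \succeq 0 .
```

    The semi-positive definiteness of the Gram matrix is exactly the property the record exploits to construct its distance function on Ω×Ω.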

  15. Reproducibility, controllability, and optimization of LENR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Nagel, David J. [The George Washington University, Washington DC 20052 (United States)

    2006-07-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that it is now clear that demands for reproducible experiments in the early years of LENR research were premature. In fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper, and as has often happened in the history of science, experimental and theoretical progress can take decades. It is likely to be many years before investments in LENR experiments will yield significant returns, even for successful research programs. However, it is clear that a fundamental understanding of the anomalous effects observed in numerous experiments would significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR.

  16. Reproducibility of surface roughness in reaming

    DEFF Research Database (Denmark)

    Müller, Pavel; De Chiffre, Leonardo

    An investigation on the reproducibility of surface roughness in reaming was performed to document the applicability of this approach for testing cutting fluids. Austenitic stainless steel was used as the workpiece material and HSS reamers as cutting tools. Reproducibility of the results was evaluated... ...concentration of the oil in water-based cutting fluid (or when using a straight mineral oil) results in surface profiles that are more reproducible at higher cutting speed. Moreover, it can be seen that three cutting fluids (two water-based cutting fluids with different oil concentration and a straight mineral oil) used in connection with a low cutting speed result in "identical" surface profiles. The biggest uncertainty contributors were due to the process repeatability and repeatability around the hole circumference. This was however only the case at high cutting speeds and low degree of oil concentration...

  17. Estimation of the effect of the degree of sewage treatment on the status of pollution along the coastline of the Mediterranean Sea using broad scale modelling.

    Science.gov (United States)

    Stamou, Anastasios I; Kamizoulis, George

    2009-02-01

    A preliminary investigation was performed to estimate the effect of the degree of treatment in Sewage Treatment Plants (STPs) on the status of pollution along the coastlines of the Mediterranean Sea. Data from questionnaires and the literature were collected and processed (a) to identify 18 approximate 1D surface coastal currents, (b) to estimate their prevailing direction and average flow velocity and (c) to estimate the water pollution loads and identify their locations of discharge. Then, a simplified 1D water quality model was formulated and applied for the existing conditions and two hypothetical scenarios: (1) all coastal cities have STPs with secondary treatment and (2) all coastal cities have STPs with tertiary treatment, to determine BOD(5), TN and TP concentrations in the 18 surface coastal currents. Calculated concentrations were compared and discussed. A main conclusion is that, to reduce pollution in the Mediterranean Sea, measures should be adopted for upgrading the water quality of the rivers discharging into the Mediterranean Sea, along with the construction of STPs for all the coastal cities.
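
    The simplified 1D water quality model described above can be sketched as first-order decay of point-source loads advected by a mean coastal current. All numbers below (loads, current speed, flow, decay rate) are illustrative assumptions, not values from the study:

```python
# A minimal steady-state 1D advection-decay model: point loads of BOD5
# discharged into a coastal current of mean velocity u decay first-order
# downstream. All parameter values are illustrative.
import math

def bod_profile(x_km, sources, u_m_s=0.1, k_per_day=0.3, flow_m3_s=500.0):
    """BOD5 concentration (mg/L) at position x_km along the current.

    sources: list of (location_km, load_kg_per_day) outfalls.
    """
    u_km_day = u_m_s * 86400.0 / 1000.0          # current speed, km/day
    flow_L_day = flow_m3_s * 86400.0 * 1000.0    # current flow, L/day
    conc = 0.0
    for x0, load_kg_day in sources:
        if x0 <= x_km:                           # only upstream outfalls count
            travel_days = (x_km - x0) / u_km_day
            c0 = load_kg_day * 1e6 / flow_L_day  # mg/L fully mixed at outfall
            conc += c0 * math.exp(-k_per_day * travel_days)
    return conc

if __name__ == "__main__":
    outfalls = [(0.0, 2000.0), (20.0, 5000.0)]   # (km, kg BOD5/day)
    for x in (10.0, 30.0, 60.0):
        print(f"x = {x:5.1f} km: BOD5 ~ {bod_profile(x, outfalls):.4f} mg/L")
```

    Upgrading an STP from secondary to tertiary treatment enters such a model simply as a smaller load term at the corresponding outfall.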

  18. Vaccines directed against microorganisms or their products present during biofilm lifestyle: can we make a translation as a broad biological model to tuberculosis?

    Directory of Open Access Journals (Sweden)

    Mario Alberto eFlores-Valdez

    2016-01-01

    Full Text Available Tuberculosis (TB) remains a global public health problem. In recent years, experimental evidence has started to accumulate suggesting the relevance of in vitro pellicle production (a type of biofilm formed at the air-liquid interface) as a phenotype mimicking aspects found in M. tuberculosis-complex bacteria during in vivo infection. There are still opportunities for better diagnostic tools, therapeutic molecules and new vaccine candidates to assist TB control programs worldwide, particularly in less developed nations. Regarding vaccines, despite the availability of a live, attenuated strain (M. bovis BCG) for almost a century, its variable efficacy and lack of protection against pulmonary and latent disease have prompted basic and applied research leading to preclinical and clinical evaluation of up to 15 new candidates. In this work, I present examples of vaccines based on whole cells grown as biofilms, or on specific proteins expressed under such conditions, and the effect they have shown in relevant animal models or directly in the natural host. I also discuss why it might be worthwhile to explore these approaches for constructing and developing new vaccine candidates and testing their efficacy against TB.

  19. Reproducibility of operator processing for radiation dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Sui Shen; DeNardo, Gerald L.; DeNardo, Sally J.; Aina, Yuan; DeNardo, Diane A.; Lamborn, Kathleen R

    1997-01-01

    Reproducibility of operator processing for radiation dose and biological half-life was assessed for radioimmunotherapy. The mean coefficient of variation for intra-operator consecutive processing and for inter-operator processing was less than 15% for all tissues. The mean coefficient of variation for intra-operator processing over 2 wk, or for inter-operator processing comparing an experienced and a less experienced operator, was generally greater, particularly for tumors. Satisfactory reproducibility was achievable using visual determination of regions of interest after 80 h of training.

  20. Archiving Reproducible Research with R and Dataverse

    DEFF Research Database (Denmark)

    Leeper, Thomas

    2014-01-01

    Reproducible research and data archiving are increasingly important issues in research involving statistical analyses of quantitative data. This article introduces the dvn package, which allows R users to publicly archive datasets, analysis files, codebooks, and associated metadata in Dataverse Network online repositories, an open-source data archiving project sponsored by Harvard University. In this article I review the importance of data archiving in the context of reproducible research, introduce the Dataverse Network, explain the implementation of the dvn package, and provide example code for archiving and releasing data using the package.

  2. A Mixed-Methods Trial of Broad Band Noise and Nature Sounds for Tinnitus Therapy: Group and Individual Responses Modeled under the Adaptation Level Theory of Tinnitus.

    Science.gov (United States)

    Durai, Mithila; Searchfield, Grant D

    2017-01-01

    Objectives: A randomized cross-over trial in 18 participants tested the hypothesis that nature sounds, with unpredictable temporal characteristics and high valence, would yield greater improvement in tinnitus than constant, emotionally neutral broadband noise. Study Design: The primary outcome measure was the Tinnitus Functional Index (TFI). Secondary measures were: loudness and annoyance ratings, loudness level matches, minimum masking levels, positive and negative emotionality, attention reaction and discrimination time, anxiety, depression and stress. Each sound was administered using MP3 players with earbuds for 8 continuous weeks, with a 3-week wash-out period before crossing over to the other treatment sound. Measurements were undertaken for each arm at sound fitting, 4 and 8 weeks after administration. Qualitative interviews were conducted at each of these appointments. Results: From a baseline TFI score of 41.3, sound therapy resulted in TFI scores at 8 weeks of 35.6; broadband noise resulted in significantly greater reduction (8.2 points) after 8 weeks of sound therapy use than nature sounds (3.2 points). The positive effect of sound on tinnitus was supported by secondary outcome measures of tinnitus, emotion, attention, and psychological state, but not interviews. Tinnitus loudness level match was higher for BBN at 8 weeks, while there was little change in loudness level matches for nature sounds. There was no change in minimum masking levels following sound therapy administration. Self-reported preference for one sound over another did not correlate with changes in tinnitus. Conclusions: Modeled under an adaptation level theory framework of tinnitus perception, the results indicate that the introduction of broadband noise shifts internal adaptation level weighting away from the tinnitus signal, reducing tinnitus magnitude. 
Nature sounds may modify the affective components of tinnitus via a secondary, residual pathway, but this appears to be less important

  4. Development of forest biodiversity evaluation index system for conifer and broad leaf mixed forest and model construction

    Institute of Scientific and Technical Information of China (English)

    吴金卓; 彭萱亦; 林文树

    2015-01-01

    The conifer and broad leaf mixed forests at different succession stages (middle-aged, near-mature, mature, and old-growth forest) in the Changbai Mountains and Zhangguangcai ridge of Jilin Province were studied in order to investigate biodiversity-related characteristics such as species composition, spatial relationships, dead wood, and invasive species. Combined with the state of biodiversity evaluation research in China and abroad, an initial three-level biodiversity evaluation index system was constructed for conifer and broad leaf mixed forest, and a total of 25 representative third-level indices were selected. Principal component analysis was then used to screen the indices, and the final evaluation index system with 17 third-level evaluation indices was obtained. The Analytic Hierarchy Process together with expert scoring was applied to determine the weights of the evaluation indices, and a biodiversity evaluation model for mixed conifer and broad leaf forest was finally constructed, which can provide a basis for the evaluation of biodiversity in conifer and broad leaf mixed forest and for the determination of related forest biodiversity protection policies.
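
    The weighting step described above can be sketched with the geometric-mean approximation to Analytic Hierarchy Process priorities, derived from a reciprocal pairwise-comparison matrix. The 3x3 matrix below (three hypothetical third-level indices on the Saaty 1-9 scale) is illustrative:

```python
# Geometric-mean approximation to AHP priority weights from a reciprocal
# pairwise-comparison matrix. The comparison values are illustrative expert
# judgments, not weights from the study.

def ahp_weights(pairwise):
    """Normalized priority weights from an n-by-n reciprocal matrix."""
    n = len(pairwise)
    geo_means = []
    for row in pairwise:
        product = 1.0
        for value in row:
            product *= value
        geo_means.append(product ** (1.0 / n))   # row geometric mean
    total = sum(geo_means)
    return [g / total for g in geo_means]

if __name__ == "__main__":
    # index A moderately preferred over B, strongly over C
    matrix = [[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]]
    print([round(w, 3) for w in ahp_weights(matrix)])
```

    The full AHP additionally checks the consistency ratio of each comparison matrix before accepting the weights; that step is omitted here.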

  5. Advances in Global Magnetosphere Modeling at the Community Coordinated Modeling Center.

    Science.gov (United States)

    Kuznetsova, Maria

    2016-07-01

    The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art global magnetosphere models that are capable of reproducing a broad range of physical phenomena in Earth's magnetosphere. We will discuss successes and challenges in global magnetosphere modeling and the role of non-MHD effects on global dynamics.

  6. Measurement of Liver Iron Concentration by MRI Is Reproducible

    Directory of Open Access Journals (Sweden)

    José María Alústiza

    2015-01-01

    Full Text Available Purpose. The objectives were (i) construction of a phantom to reproduce the behavior of iron overload in the liver by MRI and (ii) assessment of the variability of a previously validated method to quantify liver iron concentration between different MRI devices, using the phantom and patients. Materials and Methods. A phantom was constructed to reproduce the liver/muscle ratios of two patients with intermediate and high iron overload. Nine patients with different levels of iron overload were studied in 4 multivendor devices, and 8 of them were studied twice in the machine where the model was developed. The phantom was analysed in the same equipment and 14 times in the reference machine. Results. FeCl3 solutions containing 0.3, 0.5, 0.6, and 1.2 mg Fe/mL were chosen to generate the phantom. The average intramachine variability for patients was 10% and the average intermachine variability 8%. For the phantom, the intramachine coefficient of variation was always below 0.1 and the average intermachine variability was 10% for moderate and 5% for high iron overload. Conclusion. The phantom reproduces the behavior of patients with moderate or high iron overload. The proposed method of calculating liver iron concentration is reproducible in several different 1.5 T systems.
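
    The variability metric used above is the coefficient of variation (CV) of repeated liver-iron estimates, computed within one scanner and across scanners. The measurement values below are illustrative, not data from the study:

```python
# Coefficient of variation of repeated measurements, in percent: the
# intra-/inter-machine variability metric used in the record above.
# The example estimates are illustrative.
import statistics

def cv_percent(values):
    """Sample CV (%) of a list of repeated measurements."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

if __name__ == "__main__":
    same_scanner = [62.0, 58.5, 60.2, 61.1]    # repeated estimates, one machine
    four_scanners = [60.0, 55.0, 64.0, 57.5]   # one estimate per machine
    print(f"intra-machine CV = {cv_percent(same_scanner):.1f}%")
    print(f"inter-machine CV = {cv_percent(four_scanners):.1f}%")
```

    A CV around 10%, as reported for patients, means the standard deviation of repeated estimates is about a tenth of their mean.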

  7. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Anderson, Joanna E.; Aarts, Alexander A.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahník, Štěpán; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Brüning, Jovita; Calhoun-Sauls, Ann; Callahan, Shannon P.; Chagnon, Elizabeth; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Christopherson, Cody D.; Cillessen, Linda; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Conn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Alvarez, Leslie D Cramblet; Cremata, Ed; Crusius, Jan; DeCoster, Jamie; DeGaetano, Michelle A.; Penna, Nicolás Delia; Den Bezemer, Bobby; Deserno, Marie K.; Devitt, Olivia; Dewitte, Laura; Dobolyi, David G.; Dodson, Geneva T.; Donnellan, M. Brent; Donohue, Ryan; Dore, Rebecca A.; Dorrough, Angela; Dreber, Anna; Dugas, Michelle; Dunn, Elizabeth W.; Easey, Kayleigh; Eboigbe, Sylvia; Eggleston, Casey; Embley, Jo; Epskamp, Sacha; Errington, Timothy M.; Estel, Vivien; Farach, Frank J.; Feather, Jenelle; Fedor, Anna; Fernández-Castilla, Belén; Fiedler, Susann; Field, James G.; Fitneva, Stanka A.; Flagan, Taru; Forest, Amanda L.; Forsell, Eskil; Foster, Joshua D.; Frank, Michael C.; Frazier, Rebecca S.; Fuchs, Heather; Gable, Philip; Galak, Jeff; Galliani, Elisa Maria; Gampa, Anup; Garcia, Sara; Gazarian, Douglas; Gilbert, Elizabeth; Giner-Sorolla, Roger; Glöckner, Andreas; Goellner, Lars; Goh, Jin X.; Goldberg, Rebecca; Goodbourn, Patrick T.; Gordon-McKeon, Shauna; Gorges, Bryan; Gorges, Jessie; Goss, Justin; Graham, Jesse; Grange, James A.; Gray, Jeremy; Hartgerink, Chris; Hartshorne, Joshua; Hasselman, Fred; Hayes, Timothy; Heikensten, Emma; Henninger, Felix; Hodsoll, John; Holubar, Taylor; Hoogendoorn, Gea; Humphries, Denise J.; 
Hung, Cathy O Y; Immelman, Nathali; Irsik, Vanessa C.; Jahn, Georg; Jäkel, Frank; Jekel, Marc; Johannesson, Magnus; Johnson, Larissa G.; Johnson, David J.; Johnson, Kate M.; Johnston, William J.; Jonas, Kai; Joy-Gaba, Jennifer A.; Kappes, Heather Barry; Kelso, Kim; Kidwell, Mallory C.; Kim, Seung Kyung; Kirkhart, Matthew; Kleinberg, Bennett; Knežević, Goran; Kolorz, Franziska Maria; Kossakowski, Jolanda J.; Krause, Robert Wilhelm; Krijnen, Job; Kuhlmann, Tim; Kunkels, Yoram K.; Kyc, Megan M.; Lai, Calvin K.; Laique, Aamir; Lakens, Daniël; Lane, Kristin A.; Lassetter, Bethany; Lazarević, Ljiljana B.; Le Bel, Etienne P.; Lee, Key Jung; Lee, Minha; Lemm, Kristi; Levitan, Carmel A.; Lewis, Melissa; Lin, Lin; Lin, Stephanie; Lippold, Matthias; Loureiro, Darren; Luteijn, Ilse; MacKinnon, Sean; Mainard, Heather N.; Marigold, Denise C.; Martin, Daniel P.; Martinez, Tylar; Masicampo, E. J.; Matacotta, Josh; Mathur, Maya; May, Michael; Mechin, Nicole; Mehta, Pranjal; Meixner, Johannes; Melinger, Alissa; Miller, Jeremy K.; Miller, Mallorie; Moore, Katherine; Möschl, Marcus; Motyl, Matt; Müller, Stephanie M.; Munafo, Marcus; Neijenhuijs, Koen I.; Nervi, Taylor; Nicolas, Gandalf; Nilsonne, Gustav; Nosek, Brian A.; Nuijten, Michèle B.; Olsson, Catherine; Osborne, Colleen; Ostkamp, Lutz; Pavel, Misha; Penton-Voak, Ian S.; Perna, Olivia; Pernet, Cyril; Perugini, Marco; Pipitone, R. Nathan; Pitts, Michael; Plessow, Franziska; Prenoveau, Jason M.; Rahal, Rima Maria; Ratliff, Kate A.; Reinhard, David; Renkewitz, Frank; Ricker, Ashley A.; Rigney, Anastasia; Rivers, Andrew M.; Roebke, Mark; Rutchick, Abraham M.; Ryan, Robert S.; Sahin, Onur; Saide, Anondah; Sandstrom, Gillian M.; Santos, David; Saxe, Rebecca; Schlegelmilch, René; Schmidt, Kathleen; Scholz, Sabine; Seibel, Larissa; Selterman, Dylan Faulkner; Shaki, Samuel; Simpson, William B.; Sinclair, H. 
Colleen; Skorinko, Jeanine L M; Slowik, Agnieszka; Snyder, Joel S.; Soderberg, Courtney; Sonnleitner, Carina; Spencer, Nick; Spies, Jeffrey R.; Steegen, Sara; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B.; Talhelm, Thomas; Tapia, Megan; Te Dorsthorst, Anniek; Thomae, Manuela; Thomas, Sarah L.; Tio, Pia; Traets, Frits; Tsang, Steve; Tuerlinckx, Francis; Turchan, Paul; Valášek, Milan; Van't Veer, Anna E.; Van Aert, Robbie; Van Assen, Marcel; Van Bork, Riet; Van De Ven, Mathijs; Van Den Bergh, Don; Van Der Hulst, Marije; Van Dooren, Roel; Van Doorn, Johnny; Van Renswoude, Daan R.; Van Rijn, Hedderik; Vanpaemel, Wolf; Echeverría, Alejandro Vásquez; Vazquez, Melissa; Velez, Natalia; Vermue, Marieke; Verschoor, Mark; Vianello, Michelangelo; Voracek, Martin; Vuu, Gina; Wagenmakers, Eric Jan; Weerdmeester, Joanneke; Welsh, Ashlee; Westgate, Erin C.; Wissink, Joeri; Wood, Michael; Woods, Andy; Wright, Emily; Wu, Sining; Zeelenberg, Marcel; Zuni, Kellylynn

    2015-01-01

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Rep

  8. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Anderson, Joanna E.; Aarts, Alexander A.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahník, Štěpán; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Brüning, Jovita; Calhoun-Sauls, Ann; Callahan, Shannon P.; Chagnon, Elizabeth; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Christopherson, Cody D.; Cillessen, Linda; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Conn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Alvarez, Leslie D Cramblet; Cremata, Ed; Crusius, Jan; DeCoster, Jamie; DeGaetano, Michelle A.; Penna, Nicolás Delia; Den Bezemer, Bobby; Deserno, Marie K.; Devitt, Olivia; Dewitte, Laura; Dobolyi, David G.; Dodson, Geneva T.; Donnellan, M. Brent; Donohue, Ryan; Dore, Rebecca A.; Dorrough, Angela; Dreber, Anna; Dugas, Michelle; Dunn, Elizabeth W.; Easey, Kayleigh; Eboigbe, Sylvia; Eggleston, Casey; Embley, Jo; Epskamp, Sacha; Errington, Timothy M.; Estel, Vivien; Farach, Frank J.; Feather, Jenelle; Fedor, Anna; Fernández-Castilla, Belén; Fiedler, Susann; Field, James G.; Fitneva, Stanka A.; Flagan, Taru; Forest, Amanda L.; Forsell, Eskil; Foster, Joshua D.; Frank, Michael C.; Frazier, Rebecca S.; Fuchs, Heather; Gable, Philip; Galak, Jeff; Galliani, Elisa Maria; Gampa, Anup; Garcia, Sara; Gazarian, Douglas; Gilbert, Elizabeth; Giner-Sorolla, Roger; Glöckner, Andreas; Goellner, Lars; Goh, Jin X.; Goldberg, Rebecca; Goodbourn, Patrick T.; Gordon-McKeon, Shauna; Gorges, Bryan; Gorges, Jessie; Goss, Justin; Graham, Jesse; Grange, James A.; Gray, Jeremy; Hartgerink, Chris; Hartshorne, Joshua; Hasselman, Fred; Hayes, Timothy; Heikensten, Emma; Henninger, Felix; Hodsoll, John; Holubar, Taylor; Hoogendoorn, Gea; Humphries, Denise J.; 
Hung, Cathy O Y; Immelman, Nathali; Irsik, Vanessa C.; Jahn, Georg; Jäkel, Frank; Jekel, Marc; Johannesson, Magnus; Johnson, Larissa G.; Johnson, David J.; Johnson, Kate M.; Johnston, William J.; Jonas, Kai; Joy-Gaba, Jennifer A.; Kappes, Heather Barry; Kelso, Kim; Kidwell, Mallory C.; Kim, Seung Kyung; Kirkhart, Matthew; Kleinberg, Bennett; Knežević, Goran; Kolorz, Franziska Maria; Kossakowski, Jolanda J.; Krause, Robert Wilhelm; Krijnen, Job; Kuhlmann, Tim; Kunkels, Yoram K.; Kyc, Megan M.; Lai, Calvin K.; Laique, Aamir; Lakens, Daniël|info:eu-repo/dai/nl/298811855; Lane, Kristin A.; Lassetter, Bethany; Lazarević, Ljiljana B.; Le Bel, Etienne P.; Lee, Key Jung; Lee, Minha; Lemm, Kristi; Levitan, Carmel A.; Lewis, Melissa; Lin, Lin; Lin, Stephanie; Lippold, Matthias; Loureiro, Darren; Luteijn, Ilse; MacKinnon, Sean; Mainard, Heather N.; Marigold, Denise C.; Martin, Daniel P.; Martinez, Tylar; Masicampo, E. J.; Matacotta, Josh; Mathur, Maya; May, Michael; Mechin, Nicole; Mehta, Pranjal; Meixner, Johannes; Melinger, Alissa; Miller, Jeremy K.; Miller, Mallorie; Moore, Katherine; Möschl, Marcus; Motyl, Matt; Müller, Stephanie M.; Munafo, Marcus; Neijenhuijs, Koen I.; Nervi, Taylor; Nicolas, Gandalf; Nilsonne, Gustav; Nosek, Brian A.; Nuijten, Michèle B.; Olsson, Catherine; Osborne, Colleen; Ostkamp, Lutz; Pavel, Misha; Penton-Voak, Ian S.; Perna, Olivia; Pernet, Cyril; Perugini, Marco; Pipitone, R. Nathan; Pitts, Michael; Plessow, Franziska; Prenoveau, Jason M.; Rahal, Rima Maria; Ratliff, Kate A.; Reinhard, David; Renkewitz, Frank; Ricker, Ashley A.; Rigney, Anastasia; Rivers, Andrew M.; Roebke, Mark; Rutchick, Abraham M.; Ryan, Robert S.; Sahin, Onur; Saide, Anondah; Sandstrom, Gillian M.; Santos, David; Saxe, Rebecca; Schlegelmilch, René; Schmidt, Kathleen; Scholz, Sabine; Seibel, Larissa; Selterman, Dylan Faulkner; Shaki, Samuel; Simpson, William B.; Sinclair, H. 
Colleen; Skorinko, Jeanine L M; Slowik, Agnieszka; Snyder, Joel S.; Soderberg, Courtney; Sonnleitner, Carina; Spencer, Nick; Spies, Jeffrey R.; Steegen, Sara; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B.; Talhelm, Thomas; Tapia, Megan; Te Dorsthorst, Anniek; Thomae, Manuela; Thomas, Sarah L.; Tio, Pia; Traets, Frits; Tsang, Steve; Tuerlinckx, Francis; Turchan, Paul; Valášek, Milan; Van't Veer, Anna E.; Van Aert, Robbie; Van Assen, Marcel|info:eu-repo/dai/nl/407629971; Van Bork, Riet; Van De Ven, Mathijs; Van Den Bergh, Don; Van Der Hulst, Marije; Van Dooren, Roel; Van Doorn, Johnny; Van Renswoude, Daan R.; Van Rijn, Hedderik; Vanpaemel, Wolf

    2015-01-01

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available.

  9. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Aarts, Alexander A.; Anderson, Joanna E.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahnik, Stepan; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Bruening, Jovita; Calhoun-Sauls, Ann; Chagnon, Elizabeth; Callahan, Shannon P.; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Cillessen, Linda; Christopherson, Cody D.; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Cohn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Alvarez, Leslie D. Cramblet; Cremata, Ed; Crusius, Jan; DeCoster, Jamie; DeGaetano, Michelle A.; Della Penna, Nicolas; den Bezemer, Bobby; Deserno, Marie K.; Devitt, Olivia; Dewitte, Laura; Dobolyi, David G.; Dodson, Geneva T.; Donnellan, M. Brent; Donohue, Ryan; Dore, Rebecca A.; Dorrough, Angela; Dreber, Anna; Dugas, Michelle; Dunn, Elizabeth W.; Easey, Kayleigh; Eboigbe, Sylvia; Eggleston, Casey; Embley, Jo; Epskamp, Sacha; Errington, Timothy M.; Estel, Vivien; Farach, Frank J.; Feather, Jenelle; Fedor, Anna; Fernandez-Castilla, Belen; Fiedler, Susann; Field, James G.; Fitneva, Stanka A.; Flagan, Taru; Forest, Amanda L.; Forsell, Eskil; Foster, Joshua D.; Frank, Michael C.; Frazier, Rebecca S.; Fuchs, Heather; Gable, Philip; Galak, Jeff; Galliani, Elisa Maria; Gampa, Anup; Garcia, Sara; Gazarian, Douglas; Gilbert, Elizabeth; Giner-Sorolla, Roger; Gloeckner, Andreas; Goellner, Lars; Goh, Jin X.; Goldberg, Rebecca; Goodbourn, Patrick T.; Gordon-McKeon, Shauna; Gorges, Bryan; Gorges, Jessie; Goss, Justin; Graham, Jesse; Grange, James A.; Gray, Jeremy; Hartgerink, Chris; Hartshorne, Joshua; Hasselman, Fred; Hayes, Timothy; Heikensten, Emma; Henninger, Felix; Hodsoll, John; Holubar, Taylor; Hoogendoorn, Gea; Humphries, Denise 
J.; Hung, Cathy O. -Y.; Immelman, Nathali; Irsik, Vanessa C.; Jahn, Georg; Jaekel, Frank; Jekel, Marc; Johannesson, Magnus; Johnson, Larissa G.; Johnson, David J.; Johnson, Kate M.; Johnston, William J.; Jonas, Kai; Joy-Gaba, Jennifer A.; Kappes, Heather Barry; Kelso, Kim; Kidwell, Mallory C.; Kim, Seung Kyung; Kirkhart, Matthew; Kleinberg, Bennett; Knezevic, Goran; Kolorz, Franziska Maria; Kossakowski, Jolanda J.; Krause, Robert Wilhelm; Krijnen, Job; Kuhlmann, Tim; Kunkels, Yoram K.; Kyc, Megan M.; Lai, Calvin K.; Laique, Aamir; Lakens, Daniel; Lane, Kristin A.; Lassetter, Bethany; Lazarevic, Ljiljana B.; LeBel, Etienne P.; Lee, Key Jung; Lee, Minha; Lemm, Kristi; Levitan, Carmel A.; Lewis, M.; Lin, Lin; Lin, Stephanie; Lippold, Matthias; Loureiro, Darren; Luteijn, Ilse; Mackinnon, Sean; Mainard, Heather N.; Marigold, Denise C.; Martin, Daniel P.; Martinez, Tylar; Masicampo, E. J.; Matacotta, Josh; Mathur, Maya; May, Michael; Mechin, Nicole; Mehta, Pranjal; Meixner, Johannes; Melinger, Alissa; Miller, Jeremy K.; Miller, Mallorie; Moore, Katherine; Moeschl, Marcus; Motyl, Matt; Mueller, Stephanie M.; Munafo, Marcus; Neijenhuijs, Koen I.; Nervi, Taylor; Nicolas, Gandalf; Nilsonne, Gustav; Nosek, Brian A.; Nuijten, Michele B.; Olsson, Catherine; Osborne, Colleen; Ostkamp, Lutz; Pavel, Misha; Penton-Voak, Ian S.; Perna, Olivia; Pernet, Cyril; Perugini, Marco; Pipitone, R. Nathan; Pitts, Michael; Plessow, Franziska; Prenoveau, Jason M.; Rahal, Rima-Maria; Ratliff, Kate A.; Reinhard, David; Renkewitz, Frank; Ricker, Ashley A.; Rigney, Anastasia; Rivers, Andrew M.; Roebke, Mark; Rutchick, Abraham M.; Ryan, Robert S.; Sahin, Onur; Saide, Anondah; Sandstrom, Gillian M.; Santos, David; Saxe, Rebecca; Schmidt, Kathleen; Schlegelmilch, Rene; Seibel, Larissa; Scholz, Sabine; Selterman, Dylan Faulkner; Shaki, Samuel; Simpson, William B.; Sinclair, H. Colleen; Skorinko, Jeanine L. 
M.; Slowik, Agnieszka; Snyder, Joel S.; Soderberg, Courtney; Sonnleitner, Carina; Spencer, Nick; Spies, Jeffrey R.; Steegen, Sara; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B.; Talhelm, Thomas; Tapia, Megan; te Dorsthorst, Anniek; Thomae, Manuela; Thomas, Sarah L.; Tio, Pia; Traets, Frits; Tsang, Steve; Tuerlinckx, Francis; Turchan, Paul; Valasek, Milan; van 't Veer, Anna E.; Van Aert, Robbie; van Assen, M.A.L.M.; van Bork, Riet; van de Ven, Mathijs; van den Bergh, Don; van der Hulst, Marije; van Dooren, Roel; van Doorn, Johnny; van Renswoude, Daan R.; van Rijn, Hedderik; Vanpaemel, Wolf; Echeverria, Alejandro Vasquez; Vazquez, Melissa; Velez, Natalia; Vermue, Marieke; Verschoor, Mark

    2015-01-01

    INTRODUCTION Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. Scientific claims should not gain credence because of the status or authority of their originator but by the replicability of their supporting evidence. Even research

  12. Multi-laboratory assessment of reproducibility, qualitative and quantitative performance of SWATH-mass spectrometry.

    Science.gov (United States)

    Collins, Ben C; Hunter, Christie L; Liu, Yansheng; Schilling, Birgit; Rosenberger, George; Bader, Samuel L; Chan, Daniel W; Gibson, Bradford W; Gingras, Anne-Claude; Held, Jason M; Hirayama-Kurogi, Mio; Hou, Guixue; Krisp, Christoph; Larsen, Brett; Lin, Liang; Liu, Siqi; Molloy, Mark P; Moritz, Robert L; Ohtsuki, Sumio; Schlapbach, Ralph; Selevsek, Nathalie; Thomas, Stefani N; Tzeng, Shin-Cheng; Zhang, Hui; Aebersold, Ruedi

    2017-08-21

    Quantitative proteomics employing mass spectrometry is an indispensable tool in life science research. Targeted proteomics has emerged as a powerful approach for reproducible quantification but is limited in the number of proteins quantified. SWATH-mass spectrometry consists of data-independent acquisition and a targeted data analysis strategy that aims to maintain the favorable quantitative characteristics (accuracy, sensitivity, and selectivity) of targeted proteomics at large scale. While previous SWATH-mass spectrometry studies have shown high intra-lab reproducibility, this has not been evaluated between labs. In this multi-laboratory evaluation study including 11 sites worldwide, we demonstrate that using SWATH-mass spectrometry data acquisition we can consistently detect and reproducibly quantify >4000 proteins from HEK293 cells. Using synthetic peptide dilution series, we show that the sensitivity, dynamic range and reproducibility established with SWATH-mass spectrometry are uniformly achieved. This study demonstrates that the acquisition of reproducible quantitative proteomics data by multiple labs is achievable, and broadly serves to increase confidence in SWATH-mass spectrometry data acquisition as a reproducible method for large-scale protein quantification. SWATH-mass spectrometry consists of a data-independent acquisition and a targeted data analysis strategy that aims to maintain the favorable quantitative characteristics on the scale of thousands of proteins. Here, using data generated by eleven groups worldwide, the authors show that SWATH-MS is capable of generating highly reproducible data across different laboratories.

  13. A PHYSICAL ACTIVITY QUESTIONNAIRE: REPRODUCIBILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Nicolas Barbosa

    2007-12-01

    This study evaluates the reproducibility and validity of the Quantification de l'Activité Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire for estimating the mean daily energy expenditure (DEE) of Bogotá's schoolchildren. Comprehension was assessed on 324 students, whereas reproducibility was studied on a different random sample of 162 who were exposed to the questionnaire twice. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). Validity was studied in a randomly selected sample of 18 girls and 18 boys, who completed the test-retest study. The DEE derived from the questionnaire was compared with the laboratory measurements of peak oxygen uptake (Peak VO2) from ergo-spirometry and the Léger test. The reproducibility ICC was 0.96 (95% C.I. 0.95-0.97); by age categories: 8-10, 0.94 (0.89-0.97); 11-13, 0.98 (0.96-0.99); 14-16, 0.95 (0.91-0.98). The ICC between the mean DEE as estimated by the questionnaire and the direct and indirect Peak VO2 was 0.76 (0.66) (p<0.01); by age categories 8-10, 11-13, and 14-16 it was 0.89 (0.87), 0.76 (0.78), and 0.88 (0.80), respectively. The QAPACE questionnaire is reproducible and valid for estimating PA and showed a high correlation with Peak VO2 uptake.
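The test-retest agreement in record 13 is summarized with the intra-class correlation coefficient. As a rough illustrative sketch (not the study's code; the data below are made up), a two-way random-effects, single-measure ICC(2,1) for a two-session design can be computed as:

```python
# Hypothetical sketch of ICC(2,1) (two-way random effects, single measure)
# for a test-retest design; not the QAPACE study's actual code or data.
def icc_2_1(test, retest):
    n, k = len(test), 2
    rows = list(zip(test, retest))
    grand = sum(test + retest) / (n * k)
    row_means = [sum(r) / k for r in rows]          # per-subject means
    col_means = [sum(test) / n, sum(retest) / n]    # per-session means
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for r in rows for x in r)
    ms_r = ss_rows / (n - 1)                                   # between subjects
    ms_c = ss_cols / (k - 1)                                   # between sessions
    ms_e = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))  # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Perfect repetition gives ICC close to 1; measurement noise lowers it.
print(icc_2_1([8.1, 9.3, 7.6, 10.2, 8.8], [8.1, 9.3, 7.6, 10.2, 8.8]))
```

A retest that exactly reproduces the first session yields an ICC of 1; values such as the reported 0.96 indicate that nearly all observed variance is between subjects rather than between sessions.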

  14. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, is an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitated its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated, and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, at 0.46.

  15. The case for inflow of the broad-line region of active galactic nuclei

    CERN Document Server

    Gaskell, C Martin

    2015-01-01

    The high-ionization lines of the broad-line region (BLR) of thermal active galactic nuclei (AGNs) show blueshifts of a few hundred km/s to several thousand km/s with respect to the low-ionization lines. This has long been thought to be due to the high-ionization lines of the BLR arising in a wind of which the far side of the outflow is blocked from our view by the accretion disc. Evidence for and against the disc-wind model is discussed. The biggest problem for the model is that velocity-resolved reverberation mapping repeatedly fails to show the expected kinematic signature of outflow of the BLR. The disc-wind model also cannot readily reproduce the red side of the line profiles of high-ionization lines. The rapidly falling density in an outflow makes it difficult to obtain high equivalent widths. We point out a number of major problems with associating the BLR with the outflows producing broad absorption lines. An explanation which avoids all these problems and satisfies the constraints of both the line p...

  16. Reproducing entanglement through local classical resources with no communication

    CERN Document Server

    Di Lorenzo, Antonio

    2011-01-01

    Entanglement is one of the most intriguing features of quantum mechanics. It gives rise to peculiar correlations which cannot be reproduced by a large class of alternative theories, the so-called hidden-variable models, that use parameters in addition to the wave-function. This incompatibility was quantified through the celebrated Bell inequalities, and more recently through new inequalities due to Leggett. Experiments confirm the predictions of quantum mechanics. However, this does not imply that quantum mechanics is the ultimate theory, not susceptible to improvement, nor that quantum mechanics is essentially non-local. The theories ruled out by the Bell and Leggett inequalities are required to satisfy some hypotheses, none of which is implied by locality alone. By dropping one or more hypotheses, it is possible not only to violate said inequalities, but to reproduce the quantum mechanical predictions altogether. So far, the models proposed have been only mathematical constructs. In this paper we provide a classical r...
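Record 16 turns on the Bell inequalities that hidden-variable models must satisfy. As a quick numerical illustration (standard textbook material, not taken from the paper), the quantum singlet-state correlation E(a,b) = -cos(a-b) violates the CHSH form of Bell's inequality, |S| <= 2, reaching 2*sqrt(2) at the usual analyzer angles:

```python
import math

# Quantum two-spin singlet correlation for analyzer angles a, b (radians).
def E(a, b):
    return -math.cos(a - b)

# Standard CHSH angle choices (illustrative; any hidden-variable model
# obeying the Bell hypotheses must keep |S| <= 2).
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # reaches 2*sqrt(2) ≈ 2.828, exceeding the classical bound 2
```

The gap between 2.828 and the classical bound 2 is exactly the incompatibility the abstract refers to; models that drop one of the Bell hypotheses can close it.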

  17. Broad-spectrum antiviral therapeutics.

    Directory of Open Access Journals (Sweden)

    Todd H Rider

    Full Text Available Currently there are relatively few antiviral therapeutics, and most which do exist are highly pathogen-specific or have other disadvantages. We have developed a new broad-spectrum antiviral approach, dubbed Double-stranded RNA (dsRNA Activated Caspase Oligomerizer (DRACO that selectively induces apoptosis in cells containing viral dsRNA, rapidly killing infected cells without harming uninfected cells. We have created DRACOs and shown that they are nontoxic in 11 mammalian cell types and effective against 15 different viruses, including dengue flavivirus, Amapari and Tacaribe arenaviruses, Guama bunyavirus, and H1N1 influenza. We have also demonstrated that DRACOs can rescue mice challenged with H1N1 influenza. DRACOs have the potential to be effective therapeutics or prophylactics for numerous clinical and priority viruses, due to the broad-spectrum sensitivity of the dsRNA detection domain, the potent activity of the apoptosis induction domain, and the novel direct linkage between the two which viruses have never encountered.

  18. Reply to the comment of S. Rayne on "QSAR model reproducibility and applicability: A case study of rate constants of hydroxyl radical reaction models applied to polybrominated diphenyl ethers and (benzo-)triazoles".

    Science.gov (United States)

    Gramatica, Paola; Kovarich, Simona; Roy, Partha Pratim

    2013-07-30

    We appreciate the interest of Dr. Rayne on our article and we completely agree that the dataset of (benzo-)triazoles, which were screened by the hydroxyl radical reaction quantitative structure-activity relationship (QSAR) model, was not only composed of benzo-triazoles but also included some simpler triazoles (without the condensed benzene ring), such as the chemicals listed by Dr. Rayne, as well as some related heterocycles (also few not aromatic). We want to clarify that in this article (as well as in other articles in which the same dataset was screened), for conciseness, the abbreviations (B)TAZs and BTAZs were used as general (and certainly too simplified) notations meaning an extended dataset of benzo-triazoles, triazoles, and related compounds. Copyright © 2013 Wiley Periodicals, Inc.

  19. Ghost imaging with broad distance

    Institute of Scientific and Technical Information of China (English)

    段德洋; 张路; 杜少将; 夏云杰

    2015-01-01

    We present a scheme that is able to achieve ghost imaging over a broad distance. The physical essence of our scheme is that beams of different wavelengths are separated in free space by an optical medium according to the slow-light or dispersion principle, while the equality of the optical path lengths of the two light arms is not violated. The photon correlation is achieved by a rotating ground glass plate (RGGP) or a spatial light modulator (SLM), respectively. Our work shows that a monochromatic ghost image can be obtained in the case of the RGGP. More importantly, the position (or distance) of the object can be ascertained from the color of the image. Thus, the imaging and ranging processes are combined into one process for the first time, to the best of our knowledge. In the case of the SLM, we can obtain a colored image regardless of where the object is.

  20. Additive Manufacturing: Reproducibility of Metallic Parts

    Directory of Open Access Journals (Sweden)

    Konda Gokuldoss Prashanth

    2017-02-01

    Full Text Available The present study deals with the properties of five different metals/alloys fabricated by selective laser melting: Al-12Si, Cu-10Sn, and 316L (face-centered cubic structure), and CoCrMo and commercially pure Ti (CP-Ti) (hexagonal close-packed structure). The room-temperature tensile properties of Al-12Si samples show good consistency within the experimental errors. Similarly reproducible results were observed for sliding-wear and corrosion experiments. The other metal/alloy systems also show repeatable tensile properties, with the tensile curves overlapping until the yield point. The curves may then follow the same path or show a marginal deviation (~10 MPa) until they reach the ultimate tensile strength, and a negligible difference in ductility (~0.3%) is observed between the samples. The results show that selective laser melting is a reliable fabrication method for producing metallic materials with consistent and reproducible properties.

  1. Reproducibility of scoring emphysema by HRCT

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R. [Kuopio Univ. Hospital (Finland). Dept. of Clinical Radiology; Erkinjuntti-Pekkanen, R. [Kuopio Univ. Hospital (Finland). Dept. of Pulmonary Diseases

    2002-04-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver kappa values for all three methods were good (0.51-0.74). Interobserver kappa values varied from 0.35 to 0.72. The Sanders' and severity methods correlated strongly with the pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved reliable for evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.

  2. Reproducibility of electroretinograms recorded with DTL electrodes.

    Science.gov (United States)

    Hébert, M; Lachapelle, P; Dumont, M

    The purpose of this study was to examine whether the use of the DTL fiber electrode yields stable and reproducible electroretinographic recordings. To do so, the luminance-response function, derived from dark-adapted electroretinograms, was obtained from both eyes of 10 normal subjects at two recording sessions spaced 7-14 days apart. The data thus generated were used to calculate the Naka-Rushton Vmax and k parameters, and the values obtained at the two recording sessions were compared. Our results showed no significant difference in the values of Vmax and k calculated from the data generated at the two sessions. This clearly demonstrates that the use of the DTL fiber electrode does not jeopardize, in any way, the stability and reproducibility of ERG responses.
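
    For context, the Naka-Rushton parameters mentioned above come from fitting the dark-adapted luminance-response function V(I) = Vmax·I/(I + k), here with the exponent fixed at 1 (a common choice). A sketch of such a fit on hypothetical amplitude data, using scipy:

```python
import numpy as np
from scipy.optimize import curve_fit

def naka_rushton(I, Vmax, k):
    # Hyperbolic luminance-response function: V = Vmax * I / (I + k).
    return Vmax * I / (I + k)

# Hypothetical b-wave amplitudes (µV) over a range of flash luminances.
I = np.logspace(-3, 1, 9)
rng = np.random.default_rng(1)
V = naka_rushton(I, 400.0, 0.05) + rng.normal(0, 5, I.size)

# Fit Vmax (saturated amplitude) and k (semi-saturation luminance).
popt, _ = curve_fit(naka_rushton, I, V, p0=[300.0, 0.1])
Vmax_hat, k_hat = popt
```

    Comparing Vmax_hat and k_hat across two sessions is exactly the test-retest comparison the abstract describes.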

  3. Repeatability and Reproducibility of Virtual Subjective Refraction.

    Science.gov (United States)

    Perches, Sara; Collados, M Victoria; Ares, Jorge

    2016-10-01

    To establish the repeatability and reproducibility of a virtual refraction process using simulated retinal images. With simulation software, aberrated images corresponding to each step of the refraction process were calculated following the typical protocol of conventional subjective refraction. Fifty external examiners judged simulated retinal images until the best sphero-cylindrical refraction and the best visual acuity were achieved, starting from the aberrometry data of three patients. Data analyses were performed to assess the repeatability and reproducibility of virtual refraction as a function of pupil size and the aberrometric profile of the patients. SD values achieved in the three components of refraction (M, J0, and J45) were lower than 0.25 D in the repeatability analysis. Regarding reproducibility, we found SD values lower than 0.25 D in most cases. When the results of virtual refraction with different pupil diameters (4 and 6 mm) were compared, the mean of differences (MoD) obtained was not clinically significant (less than 0.25 D). Only one of the aberrometric profiles, with high uncorrected astigmatism, showed poor results for the M component in the reproducibility and pupil-size-dependence analyses. In all cases, the vision achieved was better than 0 logMAR. A comparison between the compensation obtained with virtual and conventional subjective refraction was made as an example of this application, showing good-quality retinal images in both processes. The present study shows that virtual refraction has similar levels of precision to conventional subjective refraction. Moreover, virtual refraction has also shown that when high low-order astigmatism is present, the refraction result is less precise and highly dependent on pupil size.
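
    The M, J0, and J45 components referred to above are the standard Thibos power-vector representation of a sphero-cylindrical refraction. A small helper (hypothetical function name) makes the conversion explicit:

```python
import math

def power_vector(sphere, cyl, axis_deg):
    """Convert a sphero-cylindrical prescription (S, C, axis) to
    Thibos power-vector components (M, J0, J45), in diopters."""
    a = math.radians(axis_deg)
    M = sphere + cyl / 2.0                # spherical equivalent
    J0 = -(cyl / 2.0) * math.cos(2 * a)   # with/against-the-rule astigmatism
    J45 = -(cyl / 2.0) * math.sin(2 * a)  # oblique astigmatism
    return M, J0, J45
```

    For example, a prescription of -2.00 -1.00 x 180 maps to M = -2.50 D, J0 = +0.50 D, J45 = 0 D; SD values of these three components across examiners are what the repeatability analysis summarizes.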

  4. Data Identifiers and Citations Enable Reproducible Science

    Science.gov (United States)

    Tilmes, C.

    2011-12-01

    Modern science often involves data processing with tremendous volumes of data. Keeping track of that data has been a growing challenge for data centers. Researchers who access and use that data don't always reference and cite their data sources adequately for consumers of their research to follow their methodology or reproduce their analyses or experiments. Recent research has led to recommendations for good identifiers and citations that can help address this problem. This paper will describe some of the best practices in data identifiers, reference and citation. Using a simplified example scenario based on a long-term remote sensing satellite mission, it will explore issues in identifying dynamic data sets and the importance of good data citations for reproducibility. It will describe the difference between granule- and collection-level identifiers, using UUIDs and DOIs to illustrate some recommendations for developing identifiers and assigning them during data processing. As data processors create data products, the provenance of the input products and the precise steps that led to their creation are recorded and published for users of the data to see. As researchers access the data from an archive, they can use the provenance to help understand the genesis of the data, which could affect their usage of the data. By citing the data when publishing their research, researchers enable others to retrieve the precise data used and to reproduce the analyses and experiments to confirm the results. Describing the experiment in sufficient detail to reproduce the research enforces a formal approach that lends credibility to the results and, ultimately, to the policies of decision makers depending on that research.
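
    The granule- versus collection-level distinction can be made concrete with a short sketch (the DOI, dataset name, and citation format below are hypothetical, for illustration only):

```python
import uuid

# Granule level: every processed file gets its own random UUID, assigned
# at data-processing time so provenance records can point at exact inputs.
granule_id = uuid.uuid4()

# Collection level: the dataset as a whole is referenced by a single DOI.
collection_doi = "10.5067/EXAMPLE/DATASET_V001"  # hypothetical DOI

# A citation combines the resolvable collection DOI with the granule UUID,
# so readers can retrieve the precise files used in an analysis.
citation = (f"Mission Science Team (2011): Example Dataset, Version 001. "
            f"doi:{collection_doi} (granule {granule_id})")
```

    The UUID pins down one immutable file even if the collection is later reprocessed, while the DOI gives readers a stable, resolvable entry point.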

  5. Tissue Doppler imaging reproducibility during exercise.

    Science.gov (United States)

    Bougault, V; Nottin, S; Doucende, G; Obert, P

    2008-05-01

    Tissue Doppler imaging (TDI) is an echocardiographic technique used during exercise to improve the accuracy of cardiovascular diagnosis. The validity of TDI requires its reproducibility, which had never been assessed during moderate- to maximal-intensity exercise. The present study was specifically designed to assess the reproducibility of transmitral Doppler and pulsed TDI in 19 healthy men who underwent two identical semi-supine maximal exercise tests on a cycle ergometer. Systolic (S') and diastolic (E') tissue velocities at the septal and lateral walls, as well as early transmitral velocities (E), were assessed during exercise up to maximal effort. The data were compared between the two tests at 40%, 60%, 80% and 100% of maximal aerobic power. Despite upper-body movements and hyperventilation, good-quality echocardiographic images were obtained in each case. Regardless of exercise intensity, no differences were noticed between the two tests for any measurement. The coefficients of variation for Doppler variables ranged from 3% to 9% over the transition from rest to maximal exercise. The random measurement error was, on average, 5.8 cm/s for E' and 4.4 cm/s for S'. Overall, the reproducibility of TDI was acceptable. Tissue Doppler imaging can be used to accurately evaluate LV diastolic and/or systolic function over this range of exercise intensities.

  6. How to Write a Reproducible Paper

    Science.gov (United States)

    Irving, D. B.

    2016-12-01

    The geosciences have undergone a computational revolution in recent decades, to the point where almost all modern research relies heavily on software and code. Despite this profound change in the research methods employed by geoscientists, the reporting of computational results has changed very little in academic journals. This lag has led to something of a reproducibility crisis, whereby it is impossible to replicate and verify most of today's published computational results. While it is tempting to decry the slow response of journals and funding agencies in the face of this crisis, there are very few examples of reproducible research upon which to base new communication standards. In an attempt to address this deficiency, this presentation will describe a procedure for reporting computational results that was employed in a recent Journal of Climate paper. The procedure was developed to be consistent with recommended computational best practices and seeks to minimize the time burden on authors, which has been identified as the most important barrier to publishing code. It should provide a starting point for geoscientists looking to publish reproducible research, and could be adopted by journals as a formal minimum communication standard.

  7. Reproducibility of magnetic resonance perfusion imaging.

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    Full Text Available Dynamic MR biomarkers (T2*-weighted/susceptibility-based and T1-weighted/relaxivity-enhanced) have been applied to assess tumor perfusion and its response to therapies. A significant challenge in the development of reliable biomarkers is a rigorous assessment and optimization of reproducibility. The purpose of this study was to determine the measurement reproducibility of T1-weighted dynamic contrast-enhanced (DCE) MRI and T2*-weighted dynamic susceptibility contrast (DSC) MRI with two contrast agents (CA) of different molecular weight (MW): gadopentetate (Gd-DTPA, 0.5 kDa) and Gadomelitol (P792, 6.5 kDa). Each contrast agent was tested with eight mice that had subcutaneous MDA-MB-231 breast xenograft tumors. Each mouse was imaged with a combined DSC-DCE protocol three times within one week to obtain measures of reproducibility. DSC-MRI results were evaluated with a contrast-to-noise ratio (CNR) efficiency threshold. There was a clear signal drop (>95% probability threshold) in the DSC of normal tissue, while signal changes were minimal or non-existent (<95% probability threshold) in tumors. The mean within-subject coefficient of variation (wCV) of relative blood volume (rBV) in normal tissue was 11.78% for Gd-DTPA and 6.64% for P792. The intra-class correlation coefficient (ICC) of rBV in normal tissue was 0.940 for Gd-DTPA and 0.978 for P792. The inter-subject correlation coefficient was 0.092. Ktrans calculated from DCE-MRI showed comparable reproducibility (mean wCV: 5.13% for Gd-DTPA, 8.06% for P792). The ICC of Ktrans showed high intra-subject reproducibility (ICC = 0.999/0.995) and inter-subject heterogeneity (ICC = 0.774). Histograms of Ktrans distributions for the three measurements had high degrees of overlap (sum of difference of the normalized histograms <0.01). These results represent homogeneous intra-subject measurement and the heterogeneous inter-subject character of a biological population, suggesting that perfusion MRI could be an imaging biomarker to
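
    The two reproducibility statistics used in the study, within-subject coefficient of variation (wCV) and intraclass correlation coefficient (ICC), can be computed from a subjects-by-replicates table. A sketch on synthetic data, using one plausible convention for each statistic (root-mean-square wCV and one-way random-effects ICC(1,1)):

```python
import numpy as np

def wcv_and_icc(x):
    """x: (subjects, replicates) array of a perfusion parameter (e.g. rBV).
    Returns the within-subject CV and the one-way random-effects ICC(1,1)."""
    n, k = x.shape
    subj_mean = x.mean(axis=1)
    grand = x.mean()
    # Root-mean-square within-subject CV across subjects.
    wcv = np.sqrt(np.mean((x.std(axis=1, ddof=1) / subj_mean) ** 2))
    # One-way ANOVA mean squares: between-subject and within-subject.
    msb = k * np.sum((subj_mean - grand) ** 2) / (n - 1)
    msw = np.sum((x - subj_mean[:, None]) ** 2) / (n * (k - 1))
    icc = (msb - msw) / (msb + (k - 1) * msw)
    return wcv, icc

# Hypothetical rBV values for 8 mice, each imaged 3 times within a week.
rng = np.random.default_rng(2)
true = rng.uniform(1.0, 3.0, 8)                     # heterogeneous subjects
data = true[:, None] + rng.normal(0, 0.05, (8, 3))  # reproducible replicates
wcv, icc = wcv_and_icc(data)
```

    A low wCV with a high ICC is exactly the pattern reported above: replicate scans agree tightly within each animal while the animals themselves differ.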

  8. Broad-band near-field ground motion simulations in 3-dimensional scattering media

    Science.gov (United States)

    Imperatori, W.; Mai, P. M.

    2013-02-01

    The heterogeneous nature of Earth's crust is manifested in the scattering of propagating seismic waves. In recent years, different techniques have been developed to include this phenomenon in broad-band ground-motion calculations, either considering scattering as a semi-stochastic or as a purely stochastic process. In this study, we simulate broad-band (0-10 Hz) ground motions with a 3-D finite-difference wave propagation solver, using several 3-D media characterized by von Karman correlation functions with different correlation lengths and standard deviation values. Our goal is to investigate the characteristics of scattering and its influence on the seismic wavefield at short and intermediate distances from the source, in terms of ground-motion parameters. We also examine scattering-related phenomena, such as the loss of the radiation pattern and the breakdown of directivity. We first simulate broad-band ground motions for a point source characterized by a classic ω² spectrum model. Fault finiteness is then introduced by means of a Haskell-type source model presenting both sub-shear and super-shear rupture speed. Results indicate that scattering plays an important role in ground motion even at short distances from the source, where source effects are thought to be dominant. In particular, peak ground-motion parameters can be affected even at relatively low frequencies, implying that earthquake ground-motion simulations should include scattering also for peak ground velocity (PGV) calculations. At the same time, we find a gradual loss of the source signature in the 2-5 Hz frequency range, together with a distortion of the Mach cones in the case of super-shear rupture. For more complex source models and a truly heterogeneous Earth, these effects may occur even at lower frequencies. Our simulations suggest that von Karman correlation functions with correlation lengths between several hundred metres and a few kilometres, Hurst exponent around 0.3 and standard deviation in the 5-10 per cent range
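
    For reference, the von Karman autocorrelation used to characterize such random media is commonly written C(r) = σ² · 2^(1−H)/Γ(H) · (r/a)^H · K_H(r/a), where a is the correlation length, H the Hurst exponent, and K_H a modified Bessel function of the second kind. A sketch with parameter values in the range favoured by the study:

```python
import numpy as np
from scipy.special import kv, gamma

def von_karman_corr(r, a=2000.0, H=0.3, sigma=0.08):
    """Von Karman autocorrelation of fractional velocity perturbations.
    a: correlation length (m), H: Hurst exponent, sigma: standard deviation
    of the perturbations (here 8%, inside the 5-10 per cent range)."""
    r = np.asarray(r, dtype=float)
    x = np.where(r > 0, r / a, 1e-12)  # avoid the singular point r = 0
    c = sigma**2 * 2.0**(1 - H) / gamma(H) * x**H * kv(H, x)
    return np.where(r > 0, c, sigma**2)  # C(0) equals the variance

r = np.array([0.0, 500.0, 2000.0, 8000.0])
c = von_karman_corr(r)
```

    The correlation decays monotonically with lag, and H < 0.5 yields the rough, scattering-rich medium the simulations point to.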

  9. Broad-band near-field ground motion simulations in 3-dimensional scattering media

    KAUST Repository

    Imperatori, W.

    2012-12-06

    The heterogeneous nature of Earth's crust is manifested in the scattering of propagating seismic waves. In recent years, different techniques have been developed to include this phenomenon in broad-band ground-motion calculations, either considering scattering as a semi-stochastic or purely stochastic process. In this study, we simulate broad-band (0–10 Hz) ground motions with a 3-D finite-difference wave propagation solver using several 3-D media characterized by von Karman correlation functions with different correlation lengths and standard deviation values. Our goal is to investigate scattering characteristics and its influence on the seismic wavefield at short and intermediate distances from the source in terms of ground motion parameters. We also examine scattering phenomena, related to the loss of radiation pattern and the directivity breakdown. We first simulate broad-band ground motions for a point-source characterized by a classic ω² spectrum model. Fault finiteness is then introduced by means of a Haskell-type source model presenting both subshear and super-shear rupture speed. Results indicate that scattering plays an important role in ground motion even at short distances from the source, where source effects are thought to be dominating. In particular, peak ground motion parameters can be affected even at relatively low frequencies, implying that earthquake ground-motion simulations should include scattering also for peak ground velocity (PGV) calculations. At the same time, we find a gradual loss of the source signature in the 2–5 Hz frequency range, together with a distortion of the Mach cones in case of super-shear rupture. For more complex source models and truly heterogeneous Earth, these effects may occur even at lower frequencies. Our simulations suggest that von Karman correlation functions with correlation length between several hundred metres and few kilometres, Hurst exponent around 0.3 and standard deviation in the 5–10 per cent

  10. Sharing meanings: developing interoperable semantic technologies to enhance reproducibility in earth and environmental science research

    Science.gov (United States)

    Schildhauer, M.

    2015-12-01

    Earth and environmental scientists are familiar with the entities, processes, and theories germane to their field of study, and comfortable collecting and analyzing data in their area of interest. Yet, while there appears to be consistency and agreement as to the scientific "terms" used to describe features in their data and analyses, aside from a few fundamental physical characteristics (such as mass or velocity), there can be broad tolerances, if not considerable ambiguity, in how many earth-science "terms" map to the underlying "concepts" that they actually represent. This ambiguity in meanings, or "semantics", creates major problems for scientific reproducibility. It greatly impedes the ability to replicate results, by making it difficult to determine the specifics of the intended meanings of terms such as "deforestation" or "carbon flux" as to scope, composition, magnitude, etc. In addition, semantic ambiguity complicates the assemblage of comparable data for reproducing results, due to ambiguous or idiosyncratic labels for measurements, such as "percent cover of forest", where the term "forest" is undefined; or where a reported output of "total carbon emissions" might include only CO2 emissions but not methane emissions. In this talk, we describe how the NSF-funded DataONE repository for earth and environmental science data (http://dataone.org) is using W3C-standard languages (RDF/OWL) to build an ontology for clarifying concepts embodied in heterogeneous data and model outputs. With an initial focus on carbon-cycling concepts, using terrestrial biospheric model outputs and LTER productivity data, we describe how we are achieving interoperability with "semantic vocabularies" (or ontologies) from aligned earth- and life-science domains, including OBO Foundry ontologies such as ENVO and BCO, the ISO/OGC O&M standard, and the NSF EarthCube GeoLink project. Our talk will also discuss best practices that may be helpful for other groups interested in constructing their own

  11. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns.

    Directory of Open Access Journals (Sweden)

    Andrea Maesani

    2015-11-01

    Full Text Available The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action-selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs (locomotor bouts) matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic-resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action-selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior.
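
    The core mechanism described above (fluctuations letting activity escape a stable equilibrium and cross a locomotion threshold, in a stochastic-resonance-like manner) can be caricatured with a one-variable double-well system. This is a toy sketch under assumed dynamics, not the paper's network model:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate(noise_sigma, steps=200000, dt=0.001):
    """Double-well rate model: dx = (x - x^3) dt + sigma dW.
    x near -1 is the stable 'rest' equilibrium, x near +1 'walking';
    fluctuations let the state escape the rest well and start a bout."""
    x = -1.0          # start at the rest equilibrium
    bouts = 0
    walking = False
    noise = rng.normal(size=steps) * noise_sigma * np.sqrt(dt)
    for n in noise:
        x += (x - x**3) * dt + n
        if not walking and x > 0.5:   # threshold crossing starts a bout
            walking, bouts = True, bouts + 1
        elif walking and x < -0.5:    # falling back ends the bout
            walking = False
    return bouts

quiet = simulate(noise_sigma=0.0)   # no fluctuations: the state never escapes
noisy = simulate(noise_sigma=0.6)   # fluctuations drive spontaneous bouts
```

    Without noise the state sits at the rest equilibrium forever; with noise, barrier crossings occur at a rate set by the fluctuation amplitude, which is the qualitative point of the models in the abstract.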

  12. Open and reproducible global land use classification

    Science.gov (United States)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental Research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate, and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open-source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as geo-enabled OGC Web Processing Service (WPS) processes. The interoperable interface for calling the geoprocess allows both reproducibility of the analysis and integration of user data without requiring knowledge of web services or classification algorithms, and the open platform allows everybody to replicate the analysis in their own environment. The LSA WPS process has several input parameters, which can be changed via a simple web interface and which configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time publication requires well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open-source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and
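
    The classification step rests on a self-organizing map. A toy version of the classic online SOM rule, on hypothetical two-regime indicator data (not the GLUES dataset, and in Python rather than the project's R scripts), looks like this:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical indicator table: 200 land units described by 3 intensity
# indicators, drawn from two distinct regimes (e.g. intensive vs extensive).
low = rng.normal(0.2, 0.05, (100, 3))
high = rng.normal(0.8, 0.05, (100, 3))
data = np.vstack([low, high])

# A 1-D self-organizing map with 4 nodes, trained by the online rule.
weights = rng.random((4, 3))
for t in range(2000):
    x = data[rng.integers(len(data))]
    bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
    lr = 0.5 * (1 - t / 2000)                          # decaying learning rate
    for j in range(4):
        h = np.exp(-((j - bmu) ** 2) / 2.0)            # neighbourhood kernel
        weights[j] += lr * h * (x - weights[j])

# Each land unit is assigned the archetype (node) of its best-matching unit.
labels = np.array([np.argmin(((weights - x) ** 2).sum(axis=1)) for x in data])
```

    After training, units from the two regimes map predominantly to different nodes, which is the archetype assignment the LSA analysis scales up to twelve classes and thirty indicators.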

  13. Nonlinear sequential laminates reproducing hollow sphere assemblages

    Science.gov (United States)

    Idiart, Martín I.

    2007-07-01

    A special class of nonlinear porous materials with isotropic 'sequentially laminated' microstructures is found to reproduce exactly the hydrostatic behavior of 'hollow sphere assemblages'. It is then argued that this result supports the conjecture that Gurson's approximate criterion for plastic porous materials, and its viscoplastic extension of Leblond et al. (1994), may actually yield rigorous upper bounds for the hydrostatic flow stress of porous materials containing an isotropic, but otherwise arbitrary, distribution of porosity. To cite this article: M.I. Idiart, C. R. Mecanique 335 (2007).

  14. Response to Comment on "Estimating the reproducibility of psychological science".

    Science.gov (United States)

    Anderson, Christopher J; Bahník, Štěpán; Barnett-Cowan, Michael; Bosco, Frank A; Chandler, Jesse; Chartier, Christopher R; Cheung, Felix; Christopherson, Cody D; Cordes, Andreas; Cremata, Edward J; Della Penna, Nicolas; Estel, Vivien; Fedor, Anna; Fitneva, Stanka A; Frank, Michael C; Grange, James A; Hartshorne, Joshua K; Hasselman, Fred; Henninger, Felix; van der Hulst, Marije; Jonas, Kai J; Lai, Calvin K; Levitan, Carmel A; Miller, Jeremy K; Moore, Katherine S; Meixner, Johannes M; Munafò, Marcus R; Neijenhuijs, Koen I; Nilsonne, Gustav; Nosek, Brian A; Plessow, Franziska; Prenoveau, Jason M; Ricker, Ashley A; Schmidt, Kathleen; Spies, Jeffrey R; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B; van Aert, Robbie C M; van Assen, Marcel A L M; Vanpaemel, Wolf; Vianello, Michelangelo; Voracek, Martin; Zuni, Kellylynn

    2016-03-01

    Gilbert et al. conclude that evidence from the Open Science Collaboration's Reproducibility Project: Psychology indicates high reproducibility, given the study methodology. Their very optimistic assessment is limited by statistical misconceptions and by causal inferences drawn from selectively interpreted, correlational data. Using the Reproducibility Project: Psychology data, both optimistic and pessimistic conclusions about reproducibility are possible, and neither is yet warranted.

  15. CRKSPH - A Conservative Reproducing Kernel Smoothed Particle Hydrodynamics Scheme

    Science.gov (United States)

    Frontiere, Nicholas; Raskin, Cody D.; Owen, J. Michael

    2017-03-01

    We present a formulation of smoothed particle hydrodynamics (SPH) that utilizes a first-order consistent reproducing kernel, a smoothing function that exactly interpolates linear fields with particle tracers. Previous formulations using reproducing kernel (RK) interpolation have had difficulties maintaining conservation of momentum due to the fact that RK kernels are not, in general, spatially symmetric. Here, we utilize a reformulation of the fluid equations such that mass, linear momentum, and energy are all rigorously conserved without any assumption about kernel symmetries, while additionally maintaining approximate angular momentum conservation. Our approach starts from a rigorously consistent interpolation theory, from which we derive the evolution equations that enforce the appropriate conservation properties, at the sacrifice of full consistency in the momentum equation. Additionally, by exploiting the increased accuracy of the RK method's gradient, we formulate a simple limiter for the artificial viscosity that reduces the excess diffusion normally incurred by the ordinary SPH artificial viscosity. Collectively, we call our suite of modifications to the traditional SPH scheme Conservative Reproducing Kernel SPH, or CRKSPH. CRKSPH retains many benefits of traditional SPH methods (such as preserving Galilean invariance and manifest conservation of mass, momentum, and energy) while improving on many of the shortcomings of SPH, particularly the overly aggressive artificial viscosity and zeroth-order inaccuracy. We compare CRKSPH to two different modern SPH formulations (pressure-based SPH and compatibly-differenced SPH), demonstrating the advantages of our new formulation when modeling fluid mixing, strong shocks, and adiabatic phenomena.
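
    The "first-order consistent reproducing kernel" idea is easy to demonstrate in one dimension: the base SPH kernel is multiplied by a linear correction whose coefficients are solved from local moments, so that constant and linear fields are interpolated exactly even on irregular particle distributions. This is a generic RK sketch, not the CRKSPH scheme itself:

```python
import numpy as np

def cubic_spline_W(q):
    # Standard cubic B-spline SPH kernel shape (1-D, normalization is
    # irrelevant here because the RK correction absorbs it).
    q = np.abs(q)
    return np.where(q < 1, 1 - 1.5 * q**2 + 0.75 * q**3,
           np.where(q < 2, 0.25 * (2 - q)**3, 0.0))

def rk_interpolate(x_eval, x_i, V_i, f_i, h=0.15):
    """First-order reproducing-kernel interpolation at x_eval from particle
    positions x_i, volumes V_i, and field samples f_i."""
    W = cubic_spline_W((x_eval - x_i) / h) / h
    dx = x_i - x_eval
    # Kernel moments; the correction [beta0, beta1] enforces
    # sum(V W~) = 1 and sum(V W~ dx) = 0 (exact linear reproduction).
    m0 = np.sum(V_i * W)
    m1 = np.sum(V_i * dx * W)
    m2 = np.sum(V_i * dx**2 * W)
    beta = np.linalg.solve(np.array([[m0, m1], [m1, m2]]),
                           np.array([1.0, 0.0]))
    W_rk = (beta[0] + beta[1] * dx) * W   # corrected (asymmetric) kernel
    return np.sum(V_i * W_rk * f_i)

# Irregular particle positions and a linear field f(x) = 2 + 3x.
rng = np.random.default_rng(5)
x_i = np.sort(rng.uniform(0.0, 1.0, 80))
V_i = np.full(80, 1.0 / 80)
f_i = 2.0 + 3.0 * x_i

val = rk_interpolate(0.5, x_i, V_i, f_i)
```

    The corrected kernel is no longer symmetric between particle pairs, which is exactly why a naive momentum equation built on it loses conservation, the problem CRKSPH's reformulated evolution equations address.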

  16. Indomethacin reproducibly induces metamorphosis in Cassiopea xamachana scyphistomae

    Science.gov (United States)

    Cabrales-Arellano, Patricia; Islas-Flores, Tania; Thomé, Patricia E.

    2017-01-01

    Cassiopea xamachana jellyfish are an attractive model system for studying metamorphosis and/or cnidarian–dinoflagellate symbiosis due to the ease of cultivating their planula larvae and scyphistomae through their asexual cycle, in which the latter can bud new larvae and continue the cycle without differentiating into ephyrae. A subsequent induction of metamorphosis and full differentiation into ephyrae is believed to occur when the symbionts are acquired by the scyphistomae. Although strobilation induction and differentiation into ephyrae can be accomplished in various ways, a controlled, reproducible induction of metamorphosis has not been reported. Such controlled induction is necessary to ensure the synchronicity and reproducibility of biological, biochemical, and molecular analyses. To this end, we tested whether differentiation could be pharmacologically stimulated, as in Aurelia aurita, by the metamorphic inducers thyroxine, KI, NaI, Lugol's iodine, H2O2, indomethacin, or retinol. We found that 50 μM indomethacin reproducibly induced strobilation after six days of exposure, and 10–25 μM after seven days. Strobilation under optimal conditions reached 80–100%, with subsequent release of ephyrae. Thyroxine yielded inconsistent results, causing strobilation only occasionally, while all other chemicals had no effect. Thus, indomethacin can be used as a convenient tool for the assessment of biological phenomena through a controlled metamorphic process in C. xamachana scyphistomae. PMID:28265497

  17. Indomethacin reproducibly induces metamorphosis in Cassiopea xamachana scyphistomae

    Directory of Open Access Journals (Sweden)

    Patricia Cabrales-Arellano

    2017-03-01

    Full Text Available Cassiopea xamachana jellyfish are an attractive model system for studying metamorphosis and/or cnidarian–dinoflagellate symbiosis due to the ease of cultivating their planula larvae and scyphistomae through their asexual cycle, in which the latter can bud new larvae and continue the cycle without differentiating into ephyrae. A subsequent induction of metamorphosis and full differentiation into ephyrae is believed to occur when the symbionts are acquired by the scyphistomae. Although strobilation induction and differentiation into ephyrae can be accomplished in various ways, a controlled, reproducible induction of metamorphosis has not been reported. Such controlled induction is necessary to ensure the synchronicity and reproducibility of biological, biochemical, and molecular analyses. To this end, we tested whether differentiation could be pharmacologically stimulated, as in Aurelia aurita, by the metamorphic inducers thyroxine, KI, NaI, Lugol's iodine, H2O2, indomethacin, or retinol. We found that 50 μM indomethacin reproducibly induced strobilation after six days of exposure, and 10–25 μM after seven days. Strobilation under optimal conditions reached 80–100%, with subsequent release of ephyrae. Thyroxine yielded inconsistent results, causing strobilation only occasionally, while all other chemicals had no effect. Thus, indomethacin can be used as a convenient tool for the assessment of biological phenomena through a controlled metamorphic process in C. xamachana scyphistomae.

  18. Locally Optimally Emitting Clouds and the Variable Broad Emission Line Spectrum of NGC 5548

    Science.gov (United States)

    Korista, Kirk T.; Goad, Michael R.

    2000-06-01

    In recent work, Baldwin et al. proposed that in the geometrically extended broad-line regions (BLRs) of quasars and active galactic nuclei, a range of line-emitting gas properties (e.g., density, column density) might exist at each radius, and showed that under these conditions the broad emission-line spectra of these objects may be dominated by selection effects introduced by the atomic physics and general radiative transfer within the large pool of line-emitting entities. In this picture, the light we see originates in a vast amalgam of emitters but is dominated by those emitters best able to reprocess the incident continuum into a particular emission line. We test this "locally optimally emitting clouds" (LOC) model against the extensive spectroscopic database of the Seyfert 1 galaxy NGC 5548. The time-averaged, integrated-light UV broad emission-line spectrum from the 1993 Hubble Space Telescope (HST) monitoring campaign is reproduced via the optimization of three global geometric parameters: the outer radius, the index controlling the radial cloud covering fraction of the continuum source, and the integrated cloud covering fraction. We make an ad hoc selection from the range of successful models, and for a simple spherical BLR geometry we simulate the emission-line light curves for the 1989 IUE and 1993 HST campaigns, using the respective observed UV continuum light curves as drivers. We find good agreement between the predicted and observed light curves and lags, a demonstration of the LOC picture's viability as a means to understanding the BLR environment. Finally, we discuss the next step in developing the LOC picture, which involves the marriage of echo-mapping techniques with spectral simulation grids such as those presented here, using the constraints provided by a high-quality, temporally well-sampled spectroscopic data set.

  19. 76 FR 34087 - Broad Stakeholder Survey

    Science.gov (United States)

    2011-06-10

    ... SECURITY Broad Stakeholder Survey AGENCY: National Protection and Programs Directorate, DHS. ACTION: 60-day... comments concerning the Broad Stakeholder Survey. DATES: Comments are encouraged and will be accepted until.... The Broad Stakeholder Survey is designed to gather stakeholder feedback on the effectiveness of...

  20. 78 FR 20119 - Broad Stakeholder Survey

    Science.gov (United States)

    2013-04-03

    ... SECURITY Broad Stakeholder Survey AGENCY: National Protection and Programs Directorate, DHS. ACTION: 30-day... soliciting comments concerning the Broad Stakeholder Survey. DHS previously published this ICR in the Federal... responders across the Nation. The Broad Stakeholder Survey is designed to gather stakeholder feedback on...

  1. 77 FR 50144 - Broad Stakeholder Survey

    Science.gov (United States)

    2012-08-20

    ... SECURITY Broad Stakeholder Survey AGENCY: National Protection and Programs Directorate, DHS. ACTION: 60-day... comments concerning the Broad Stakeholder Survey. DATES: Comments are encouraged and will be accepted until... across the Nation. The Broad Stakeholder Survey is designed to gather stakeholder feedback on...

  2. Heart rate variability reproducibility during exercise.

    Science.gov (United States)

    McNarry, Melitta A; Lewis, Michael J

    2012-07-01

    The use of heart rate variability (HRV) parameters during exercise is not supported by appropriate reliability studies. In 80 healthy adults, ECG was recorded during three 6 min bouts of exercise, separated by 6 min of unloaded cycling. Two bouts were at a moderate intensity while the final bout was at a heavy exercise intensity. This protocol was repeated under the same conditions on three occasions, with a controlled start time (pre-determined at the first visit). Standard time and frequency domain indices of HRV were derived. Reliability was assessed by Bland–Altman plots, 95% limits of agreement and intraclass correlation coefficients (ICC). The sample size required to detect a mean difference ≥30% of the between-subject standard deviation was also estimated. There was no systematic change between days. All HRV parameters demonstrated a high degree of reproducibility during baseline (ICC range: 0.58–0.75), moderate (ICC: 0.58–0.85) and heavy intensity exercise (ICC range: 0.40–0.76). The reproducibility was slightly diminished during heavy intensity exercise relative to both unloaded baseline cycling and moderate exercise. This study indicates that HRV parameters can be reliably determined during exercise, and it underlines the importance of standardizing exercise intensity with regard to fitness levels if HRV is to be reliably determined.
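    The intraclass correlation coefficient used as the reliability index above can be sketched as a one-way random-effects ICC(1,1); this is an illustrative implementation under that assumption, not the authors' code, and the RMSSD values below are invented:

```python
def icc_oneway(data):
    """One-way random-effects ICC(1,1).
    data: one list per subject, each holding k repeated measurements."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    subj_means = [sum(row) / k for row in data]
    # Between-subject and within-subject mean squares from one-way ANOVA.
    ms_between = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)
    ms_within = sum((x - m) ** 2
                    for row, m in zip(data, subj_means)
                    for x in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical RMSSD values (ms) for three subjects across three visits:
hrv = [[42.1, 40.8, 43.0], [28.5, 30.2, 29.1], [55.3, 53.9, 56.4]]
print(round(icc_oneway(hrv), 2))  # → 0.99: within-subject spread is small
```

    High between-subject variance relative to visit-to-visit variance drives the ICC toward 1, which is exactly what "reproducible during exercise" means in this design.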

  3. Novel Clostridium difficile Anti-Toxin (TcdA and TcdB) Humanized Monoclonal Antibodies Demonstrate In Vitro Neutralization across a Broad Spectrum of Clinical Strains and In Vivo Potency in a Hamster Spore Challenge Model.

    Directory of Open Access Journals (Sweden)

    Hongyu Qiu

    Full Text Available Clostridium difficile (C. difficile) infection (CDI) is the main cause of nosocomial antibiotic-associated colitis and increased incidence of community-associated diarrhea in industrialized countries. At present, the primary treatment of CDI is antibiotic administration, which is effective but often associated with recurrence, especially in the elderly. Pathogenic strains produce enterotoxin, toxin A (TcdA), and cytotoxin, toxin B (TcdB), which are necessary for C. difficile induced diarrhea and gut pathological changes. Administration of anti-toxin antibodies provides an alternative approach to treat CDI, and has shown promising results in preclinical and clinical studies. In the current study, several humanized anti-TcdA and anti-TcdB monoclonal antibodies were generated and their protective potency was characterized in a hamster infection model. The humanized anti-TcdA (CANmAbA4) and anti-TcdB (CANmAbB4 and CANmAbB1) antibodies showed broad spectrum in vitro neutralization of toxins from clinical strains and neutralization in a mouse toxin challenge model. Moreover, co-administration of a humanized antibody cocktail (CANmAbA4 and CANmAbB4) provided a high level of protection in a dose-dependent manner (85% versus 57% survival at day 22 for 50 mg/kg and 20 mg/kg doses, respectively) in a hamster gastrointestinal (GI) infection model. This study describes the protective effects conferred by novel neutralizing anti-toxin monoclonal antibodies against C. difficile toxins and their potential as therapeutic agents in treating CDI.

  4. A reproducible number-based sizing method for pigment-grade titanium dioxide.

    Science.gov (United States)

    Theissmann, Ralf; Kluwig, Manfred; Koch, Thomas

    2014-01-01

    A strong demand for reliable characterization methods of particulate materials is triggered by the prospect of forthcoming national and international regulations concerning the classification of nanomaterials. Scientific efforts towards standardized number-based sizing methods have so far been concentrated on model systems, such as spherical gold or silica nanoparticles. However, for industrial particulate materials, which are typically targets of regulatory efforts, characterisation is in most cases complicated by irregular particle shapes, broad size distributions and a strong tendency to agglomeration. Reliable sizing methods that overcome these obstacles, and are practical for industrial use, are still lacking. By using the example of titanium dioxide, this paper shows that both necessities are well met by the sophisticated counting algorithm presented here, which is based on the imaging of polished sections of embedded particles and subsequent automated image analysis. The data presented demonstrate that the typical difficulties of sizing processes are overcome by the proposed method of sample preparation and image analysis. In other words, a robust, reproducible and statistically reliable method is presented, which leads to a number-based size distribution of pigment-grade titanium dioxide, for example, and therefore allows reliable classification of this material according to forthcoming regulations.
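    A number-based size distribution of the kind this method produces can be sketched as follows; the equivalent-circle-diameter conversion and the 50 nm bin width are illustrative assumptions, not the authors' counting algorithm:

```python
import math

def number_based_distribution(areas_nm2, bin_width=50.0):
    """Number-weighted size histogram from measured cross-section areas (nm^2).
    Each area is converted to an equivalent-circle diameter before binning."""
    diameters = [2.0 * math.sqrt(a / math.pi) for a in areas_nm2]
    hist = {}
    for d in diameters:
        edge = int(d // bin_width) * bin_width  # lower bin edge in nm
        hist[edge] = hist.get(edge, 0) + 1
    n = len(diameters)
    return {edge: count / n for edge, count in sorted(hist.items())}

# Hypothetical section areas for a pigment with a median diameter near 250 nm:
areas = [math.pi * (d / 2) ** 2 for d in (180, 220, 240, 260, 310, 410)]
print(number_based_distribution(areas))
```

    Because every particle contributes one count regardless of its volume, a few large particles do not dominate the result, which is the point of a number-based (rather than mass-based) classification.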

  5. A reproducible number-based sizing method for pigment-grade titanium dioxide

    Directory of Open Access Journals (Sweden)

    Ralf Theissmann

    2014-10-01

    Full Text Available A strong demand for reliable characterization methods of particulate materials is triggered by the prospect of forthcoming national and international regulations concerning the classification of nanomaterials. Scientific efforts towards standardized number-based sizing methods have so far been concentrated on model systems, such as spherical gold or silica nanoparticles. However, for industrial particulate materials, which are typically targets of regulatory efforts, characterisation is in most cases complicated by irregular particle shapes, broad size distributions and a strong tendency to agglomeration. Reliable sizing methods that overcome these obstacles, and are practical for industrial use, are still lacking. By using the example of titanium dioxide, this paper shows that both necessities are well met by the sophisticated counting algorithm presented here, which is based on the imaging of polished sections of embedded particles and subsequent automated image analysis. The data presented demonstrate that the typical difficulties of sizing processes are overcome by the proposed method of sample preparation and image analysis. In other words, a robust, reproducible and statistically reliable method is presented, which leads to a number-based size distribution of pigment-grade titanium dioxide, for example, and therefore allows reliable classification of this material according to forthcoming regulations.

  6. Efficient and reproducible identification of mismatch repair deficient colon cancer

    DEFF Research Database (Denmark)

    Joost, Patrick; Bendahl, Pär-Ola; Halvarsson, Britta;

    2013-01-01

    BACKGROUND: The identification of mismatch-repair (MMR) defective colon cancer is clinically relevant for diagnostic, prognostic and potentially also for treatment predictive purposes. Preselection of tumors for MMR analysis can be obtained with predictive models, which need to demonstrate ease...... of application and favorable reproducibility. METHODS: We validated the MMR index for the identification of prognostically favorable MMR deficient colon cancers and compared performance to 5 other prediction models. In total, 474 colon cancers diagnosed ≥ age 50 were evaluated with correlation between...... and efficiently identifies MMR defective colon cancers with high sensitivity and specificity. The model shows stable performance with low inter-observer variability and favorable performance when compared to other MMR predictive models....

  7. Iterative Multistep Reproducing Kernel Hilbert Space Method for Solving Strongly Nonlinear Oscillators

    Directory of Open Access Journals (Sweden)

    Banan Maayah

    2014-01-01

    Full Text Available A new algorithm called the multistep reproducing kernel Hilbert space method is presented to solve nonlinear oscillator models. The proposed scheme is a modification of the reproducing kernel Hilbert space method, which increases the intervals of convergence for the series solution. The numerical results demonstrate the validity and the applicability of the new technique. A very good agreement was found between the results obtained using the presented algorithm and the Runge-Kutta method, which shows that the multistep reproducing kernel Hilbert space method is very efficient and convenient for solving nonlinear oscillator models.
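    The Runge-Kutta reference solution mentioned in the abstract can be sketched for one representative nonlinear oscillator; the undamped Duffing equation and step size below are illustrative choices, not the models treated in the paper:

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

# Hypothetical nonlinear oscillator: u'' + u + u**3 = 0 (undamped Duffing),
# written as the first-order system y = [u, v], u' = v, v' = -u - u**3.
def duffing(t, y):
    u, v = y
    return [v, -u - u ** 3]

y = [1.0, 0.0]
t, h = 0.0, 0.01
for _ in range(1000):
    y = rk4_step(duffing, t, y, h)
    t += h

u, v = y
energy = v ** 2 / 2 + u ** 2 / 2 + u ** 4 / 4  # conserved for this oscillator
print(round(energy, 4))  # → 0.75, the initial energy, to within RK4 error
```

    Checking a conserved quantity like the energy is a simple way to judge whether a series or multistep solution tracks the reference integrator.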

  8. Reproducibility and reusability of scientific software

    Science.gov (United States)

    Shamir, Lior

    2017-01-01

    Information science and technology has become an integral part of astronomy research, and due to the consistent growth in the size and impact of astronomical databases, that trend is bound to continue. While software is a vital part of information systems and data analysis processes, in many cases the importance of the software and the standards for reporting the use of source code have not yet been elevated in the scientific communication process to the same level as other parts of the research. The purpose of the discussion is to examine the role of software in the scientific communication process in the light of transparency, reproducibility, and reusability of the research, as well as to discuss software in astronomy in comparison to other disciplines.

  9. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    Science.gov (United States)

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.
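    One replication criterion above, whether the original effect size falls inside the 95% confidence interval of the replication effect, can be sketched for correlation effect sizes using the Fisher z-transform; the effect sizes and sample size below are hypothetical, not values from the project:

```python
import math

def fisher_ci(r, n, z_crit=1.96):
    """95% confidence interval for a correlation via the Fisher z-transform."""
    z = math.atanh(r)
    se = 1 / math.sqrt(n - 3)  # standard error of z for sample size n
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)

def original_in_replication_ci(r_orig, r_rep, n_rep):
    """Does the original effect lie inside the replication's 95% CI?"""
    lo, hi = fisher_ci(r_rep, n_rep)
    return lo <= r_orig <= hi

# Hypothetical pair: original r = 0.45, replication r = 0.21 with n = 120.
print(original_in_replication_ci(0.45, 0.21, 120))  # → False
```

    With replication effects typically half the original magnitude, many original estimates land outside the replication interval exactly as in this check.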

  10. Poor reproducibility of allergic rhinitis SNP associations.

    Directory of Open Access Journals (Sweden)

    Daniel Nilsson

    Full Text Available Replication of reported associations is crucial to the investigation of complex disease. More than 100 SNPs have previously been reported as associated with allergic rhinitis (AR), but few of these have been replicated successfully. To investigate the general reproducibility of reported AR-associations in candidate gene studies, one Swedish (352 AR-cases, 709 controls) and one Singapore Chinese population (948 AR-cases, 580 controls) were analyzed using 49 AR-associated SNPs. The overall pattern of P-values indicated that very few of the investigated SNPs were associated with AR. Given published odds ratios (ORs), most SNPs showed high power to detect an association, but no correlations were found between the ORs of the two study populations or with published ORs. None of the association signals were in common to the two genome-wide association studies published in AR, indicating that the associations represent false positives or have much lower effect-sizes than reported.

  11. Magnetohydrodynamic stability of broad line region clouds

    CERN Document Server

    Krause, Martin; Burkert, Andreas

    2012-01-01

    Hydrodynamic stability has been a longstanding issue for the cloud model of the broad line region in active galactic nuclei. We argue that the clouds may be gravitationally bound to the supermassive black hole. If true, stabilisation by thermal pressure alone becomes even more difficult. We further argue that if magnetic fields should be present in such clouds at a level that could affect the stability properties, they need to be strong enough to compete with the radiation pressure on the cloud. This would imply magnetic field values of a few Gauss for a sample of Active Galactic Nuclei we draw from the literature. We then investigate the effect of several magnetic configurations on cloud stability in axi-symmetric magnetohydrodynamic simulations. For a purely azimuthal magnetic field which provides the dominant pressure support, the cloud first gets compressed by the opposing radiative and gravitational forces. The pressure inside the cloud then increases, and it expands vertically. Kelvin-Helmholtz and colu...

  12. Is Grannum grading of the placenta reproducible?

    Science.gov (United States)

    Moran, Mary; Ryan, John; Brennan, Patrick C.; Higgins, Mary; McAuliffe, Fionnuala M.

    2009-02-01

    Current ultrasound assessment of placental calcification relies on Grannum grading. The aim of this study was to assess if this method is reproducible by measuring inter- and intra-observer variation in grading placental images, under strictly controlled viewing conditions. Thirty placental images were acquired and digitally saved. Five experienced sonographers independently graded the images on two separate occasions. In order to eliminate any technological factors which could affect data reliability and consistency all observers reviewed images at the same time. To optimise viewing conditions ambient lighting was maintained between 25-40 lux, with monitors calibrated to the GSDF standard to ensure consistent brightness and contrast. Kappa (κ) analysis of the grades assigned was used to measure inter- and intra-observer reliability. Intra-observer agreement had a moderate mean κ-value of 0.55, with individual comparisons ranging from 0.30 to 0.86. Two images saved from the same patient, during the same scan, were each graded as I, II and III by the same observer. A mean κ-value of 0.30 (range from 0.13 to 0.55) indicated fair inter-observer agreement over the two occasions and only one image was graded consistently the same by all five observers. The study findings confirmed the lack of reproducibility associated with Grannum grading of the placenta despite optimal viewing conditions and highlight the need for new methods of assessing placental health in order to improve neonatal outcomes. Alternative methods for quantifying placental calcification such as a software based technique and 3D ultrasound assessment need to be explored.
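    The κ statistic used to quantify agreement can be sketched as Cohen's kappa for a pair of observers; the Grannum grade sequences below are invented for illustration, not the study data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: product of the two raters' marginal proportions.
    expected = sum(counts_a[c] * counts_b[c]
                   for c in set(rater_a) | set(rater_b)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical Grannum grades (0-III) assigned by two observers:
a = [0, 1, 1, 2, 3, 2, 1, 0, 2, 3]
b = [0, 1, 2, 2, 3, 1, 1, 0, 2, 2]
print(round(cohens_kappa(a, b), 2))  # → 0.59, "moderate" agreement
```

    A raw agreement of 70% shrinks to κ ≈ 0.59 once chance matches are discounted, which is why κ rather than percent agreement is the standard reliability measure for gradings like this.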

  13. Tectonic and Kinematic Regime along the Northern Caribbean Plate Boundary: New Insights from Broad-band Modeling of the May 25, 1992, Ms = 6.9 Cabo Cruz, Cuba, Earthquake

    Science.gov (United States)

    Perrot, J.; Calais, E.; Mercier de Lépinay, B.

    On May 25th, 1992, an Ms = 6.9 earthquake occurred off the southwestern tip of Cuba, along the boundary between the Caribbean and North American plates. This earthquake was the largest to strike southern Cuba since 1917 and the largest ever recorded in that region by global seismic networks. It is therefore a key element for our understanding of the tectonic and kinematic regime along the northern Caribbean plate boundary. In order to test the previously proposed source parameters of the Cabo Cruz earthquake and to better constrain its focal mechanism, we derived a new set of source parameters from unfiltered broad-band teleseismic records. We used a hybrid ray tracing method that allows us to take into account propagation effects of seismic waves in a realistic crustal model around the source. Our solution is consistent with the long-period focal mechanism solution of Virieux et al. (1992). Our solution also models the higher frequency crustal and water layer phases. The primarily strike-slip focal mechanism has a small thrust component. It shows an east-west trending nodal plane dipping 55° to the north that we interpret as the rupture plane, since it corresponds to the geometry of the major active fault in that area. The displacement on this plane is left-lateral strike-slip combined with a small amount of southward thrust. The result is in good agreement with the active tectonic structures observed along the Oriente fault south of Cuba. The small thrust component demonstrates that, contrary to prior belief, the transpressive regime extends along this whole segment of the Caribbean/North American plate boundary. Together with historical seismicity, it suggests that most of the stress accumulated by the Caribbean/North American plate motion is released seismically along the southern Cuban margin during relatively few but large earthquakes.

  14. A Study of Long-Term fMRI Reproducibility Using Data-Driven Analysis Methods.

    Science.gov (United States)

    Song, Xiaomu; Panych, Lawrence P; Chou, Ying-Hui; Chen, Nan-Kuei

    2014-12-01

    The reproducibility of functional magnetic resonance imaging (fMRI) is important for fMRI-based neuroscience research and clinical applications. Previous studies show considerable variation in amplitude and spatial extent of fMRI activation across repeated sessions on individual subjects, even using identical experimental paradigms and imaging conditions. Most existing fMRI reproducibility studies were typically limited by time duration and data analysis techniques. In particular, the assessment of reproducibility is complicated by the fact that fMRI results may depend on the data analysis techniques used in reproducibility studies. In this work, the long-term fMRI reproducibility was investigated with a focus on the data analysis methods. Two spatial smoothing techniques, a wavelet-domain Bayesian method and Gaussian smoothing, were evaluated in terms of their effects on the long-term reproducibility. A multivariate support vector machine (SVM)-based method was used to identify active voxels, and compared to a widely used general linear model (GLM)-based method at the group level. The reproducibility study was performed using multisession fMRI data acquired from eight healthy adults over a period of 1.5 years. Three regions-of-interest (ROI) related to a motor task were defined, based upon which the long-term reproducibility was examined. Experimental results indicate that different spatial smoothing techniques may lead to different reproducibility measures, and that wavelet-based spatial smoothing combined with SVM-based activation detection is a good combination for reproducibility studies. On the basis of the ROIs and multiple numerical criteria, we observed a moderate to substantial within-subject long-term reproducibility. A reasonable long-term reproducibility was also observed in the inter-subject study. It was found that the short-term reproducibility is usually higher than the long-term reproducibility. Furthermore, the results indicate that brain

  15. DETECTION OF EXTREMELY BROAD WATER EMISSION FROM THE MOLECULAR CLOUD INTERACTING SUPERNOVA REMNANT G349.7+0.2

    Energy Technology Data Exchange (ETDEWEB)

    Rho, J. [SETI Institute, 189 N. Bernardo Avenue, Mountain View, CA 94043 (United States); Hewitt, J. W. [CRESST/University of Maryland, Baltimore County, Baltimore, MD 21250 (United States); Boogert, A. [SOFIA Science Center, NASA Ames Research Center, MS 232-11, Moffett Field, CA 94035 (United States); Kaufman, M. [Department of Physics and Astronomy, San Jose State University, San Jose, CA 95192-0106 (United States); Gusdorf, A., E-mail: jrho@seti.org, E-mail: john.w.hewitt@nasa.gov, E-mail: aboogert@sofia.usra.edu, E-mail: michael.kaufman@sjsu.edu, E-mail: antoine.gusdorf@lra.ens.fr [LERMA, UMR 8112 du CNRS, Observatoire de Paris, École Normale Supérieure, 24 rue Lhomond, F-75231 Paris Cedex 05 (France)

    2015-10-10

    We performed Herschel HIFI, PACS, and SPIRE observations toward the molecular cloud interacting supernova remnant G349.7+0.2. An extremely broad emission line was detected at 557 GHz from the ground state transition 1_10-1_01 of ortho-water. This water line can be separated into three velocity components with widths of 144, 27, and 4 km s⁻¹. The 144 km s⁻¹ component is the broadest water line detected to date in the literature. This extremely broad line width shows the importance of probing shock dynamics. PACS observations revealed three additional ortho-water lines, as well as numerous high-J carbon monoxide (CO) lines. No para-water lines were detected. The extremely broad water line is indicative of a high velocity shock, which is supported by the observed CO rotational diagram that was reproduced with a J-shock model with a density of 10⁴ cm⁻³ and a shock velocity of 80 km s⁻¹. Two far-infrared fine-structure lines, [O i] at 145 μm and [C ii] at 157 μm, are also consistent with the high velocity J-shock model. The extremely broad water line could simply be from short-lived molecules that have not been destroyed in high velocity J-shocks; however, it may be from more complicated geometry such as high-velocity water bullets or a shell expanding at high velocity. We estimate the CO and H₂O densities, column densities, and temperatures by comparison with RADEX and detailed shock models.

  16. Reproducing an extreme flood with uncertain post-event information

    Science.gov (United States)

    Fuentes-Andino, Diana; Beven, Keith; Halldin, Sven; Xu, Chong-Yu; Reynolds, José Eduardo; Di Baldassarre, Giuliano

    2017-07-01

    Studies for the prevention and mitigation of floods require information on discharge and extent of inundation, commonly unavailable or uncertain, especially during extreme events. This study was initiated by the devastating flood in Tegucigalpa, the capital of Honduras, when Hurricane Mitch struck the city. In this study we hypothesized that it is possible to estimate, in a trustworthy way considering large data uncertainties, this extreme 1998 flood discharge and the extent of the inundations that followed from a combination of models and post-event measured data. Post-event data collected in 2000 and 2001 were used to estimate discharge peaks, times of peak, and high-water marks. These data were used in combination with rain data from two gauges to drive and constrain a combination of well-known modelling tools: TOPMODEL, Muskingum-Cunge-Todini routing, and the LISFLOOD-FP hydraulic model. Simulations were performed within the generalized likelihood uncertainty estimation (GLUE) uncertainty-analysis framework. The model combination predicted peak discharge, times of peaks, and more than 90 % of the observed high-water marks within the uncertainty bounds of the evaluation data. This allowed an inundation likelihood map to be produced. Observed high-water marks could not be reproduced at a few locations on the floodplain. Identifications of these locations are useful to improve model set-up, model structure, or post-event data-estimation methods. Rainfall data were of central importance in simulating the times of peak and results would be improved by a better spatial assessment of rainfall, e.g. from radar data or a denser rain-gauge network. Our study demonstrated that it was possible, considering the uncertainty in the post-event data, to reasonably reproduce the extreme Mitch flood in Tegucigalpa in spite of no hydrometric gauging during the event. The method proposed here can be part of a Bayesian framework in which more events can be added into the analysis as

  17. Surface Roughness Effects on Discharge Coefficient of Broad Crested Weir

    Directory of Open Access Journals (Sweden)

    Shaker A. Jalil

    2014-06-01

    Full Text Available The aim of this study is to investigate the effect of surface roughness size on the discharge coefficient of broad crested weirs. For this purpose, three models having different lengths of broad crested weir were tested in a horizontal flume. In each model, the surface was roughened four times. Experimental results for all models showed the expected negative effect of increased roughness on the discharge (Q) for the different lengths. The performance of the broad crested weir improved as the ratio of roughness to weir height (Ks/P) decreased and as the ratio of total head to length (H/L) increased. An empirical equation was obtained to estimate the variation of the discharge coefficient Cd in terms of the total head to length ratio and the total head to roughness ratio.
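    The discharge coefficient discussed here is conventionally back-calculated from the textbook broad-crested weir relation Q = Cd · (2/3) · sqrt(2g/3) · b · H^1.5; the sketch below applies that standard relation to hypothetical flume readings, not the paper's data or its empirical equation:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def discharge_coefficient(Q, b, H):
    """Cd from the standard broad-crested weir relation (SI units):
    Q = Cd * (2/3) * sqrt(2*G/3) * b * H**1.5
    Q: discharge (m^3/s), b: weir width (m), H: total head (m)."""
    return Q / ((2 / 3) * math.sqrt(2 * G / 3) * b * H ** 1.5)

# Hypothetical lab reading: Q = 0.010 m^3/s over a 0.30 m wide crest at H = 0.08 m.
Cd = discharge_coefficient(0.010, 0.30, 0.08)
print(round(Cd, 2))  # → 0.86, a typical value for a smooth broad-crested weir
```

    Repeating this back-calculation for each roughness size Ks yields the Cd versus H/Ks trend that the abstract's empirical equation summarizes.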

  18. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    Directory of Open Access Journals (Sweden)

    Spjuth Ola

    2010-06-01

    Full Text Available Abstract Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constrain collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join

  19. Are classifications of proximal radius fractures reproducible?

    Directory of Open Access Journals (Sweden)

    dos Santos João BG

    2009-10-01

    Full Text Available Abstract Background Fractures of the proximal radius need to be classified in an appropriate and reproducible manner. The aim of this study was to assess the reliability of the three most widely used classification systems. Methods Elbow radiograph images of patients with proximal radius fractures were classified according to the Mason, Morrey, and Arbeitsgemeinschaft für Osteosynthesefragen/Association for the Study of Internal Fixation (AO/ASIF) classifications by four observers with different levels of experience with this subject, to assess their intra- and inter-observer agreement. Each observer analyzed the images on three different occasions on a computer, with the numerical sequence randomly altered. Results We found that intra-observer agreement of the Mason and Morrey classifications was satisfactory (κ = 0.582 and 0.554, respectively), while the AO/ASIF classification had poor intra-observer agreement (κ = 0.483). Inter-observer agreement was higher in the Mason (κ = 0.429-0.560) and Morrey (κ = 0.319-0.487) classifications than in the AO/ASIF classification (κ = 0.250-0.478), which showed poor reliability. Conclusion Inter- and intra-observer agreement of the Mason and Morrey classifications showed overall satisfactory reliability when compared to the AO/ASIF system. The Mason classification is the most reliable system.

  20. The reproducible radio outbursts of SS Cygni

    Science.gov (United States)

    Russell, T. D.; Miller-Jones, J. C. A.; Sivakoff, G. R.; Altamirano, D.; O'Brien, T. J.; Page, K. L.; Templeton, M. R.; Körding, E. G.; Knigge, C.; Rupen, M. P.; Fender, R. P.; Heinz, S.; Maitra, D.; Markoff, S.; Migliari, S.; Remillard, R. A.; Russell, D. M.; Sarazin, C. L.; Waagen, E. O.

    2016-08-01

    We present the results of our intensive radio observing campaign of the dwarf nova SS Cyg during its 2010 April outburst. We argue that the observed radio emission was produced by synchrotron emission from a transient radio jet. Comparing the radio light curves from previous and subsequent outbursts of this system (including high-resolution observations from outbursts in 2011 and 2012) shows that the typical long and short outbursts of this system exhibit reproducible radio outbursts that do not vary significantly between outbursts, which is consistent with the similarity of the observed optical, ultraviolet and X-ray light curves. Contemporaneous optical and X-ray observations show that the radio emission appears to have been triggered at the same time as the initial X-ray flare, which occurs as disc material first reaches the boundary layer. This raises the possibility that the boundary region may be involved in jet production in accreting white dwarf systems. Our high spatial resolution monitoring shows that the compact jet remained active throughout the outburst with no radio quenching.

  1. The reproducible radio outbursts of SS Cygni

    CERN Document Server

    Russell, T D; Sivakoff, G R; Altamirano, D; O'Brien, T J; Page, K L; Templeton, M R; Koerding, E G; Knigge, C; Rupen, M P; Fender, R P; Heinz, S; Maitra, D; Markoff, S; Migliari, S; Remillard, R A; Russell, D M; Sarazin, C L; Waagen, E O

    2016-01-01

    We present the results of our intensive radio observing campaign of the dwarf nova SS Cyg during its 2010 April outburst. We argue that the observed radio emission was produced by synchrotron emission from a transient radio jet. Comparing the radio light curves from previous and subsequent outbursts of this system (including high-resolution observations from outbursts in 2011 and 2012) shows that the typical long and short outbursts of this system exhibit reproducible radio outbursts that do not vary significantly between outbursts, which is consistent with the similarity of the observed optical, ultraviolet and X-ray light curves. Contemporaneous optical and X-ray observations show that the radio emission appears to have been triggered at the same time as the initial X-ray flare, which occurs as disk material first reaches the boundary layer. This raises the possibility that the boundary region may be involved in jet production in accreting white dwarf systems. Our high spatial resolution monitoring shows th...

  2. Reproducing the entropy structure in galaxy groups

    CERN Document Server

    Finoguenov, A; Tornatore, L; Böhringer, H

    2003-01-01

    We carry out a comparison between observations and hydrodynamic simulations of entropy profiles of groups and clusters of galaxies. We use the Tree+SPH GADGET code to simulate four halos with masses in the range M_500 = (1.0-16) × 10^13 h^-1 Msun, corresponding to poor groups up to Virgo-like clusters. We concentrate on the effect of introducing radiative cooling, star formation, and a variety of non-gravitational heating schemes on the entropy structure and the stellar fraction. We show that all the simulations result in a correct entropy profile for the Virgo-like cluster. With a heating energy budget of ~0.7 keV/particle injected at z_h=3, we are also able to reproduce the entropy profiles of groups. We obtain the flat entropy cores as a combined effect of preheating and cooling, while we achieve the high entropy at outskirts by preheating. The resulting baryon fraction locked into stars is in the 25-30% range, compared to 35-40% in the case of no preheating. Heating at higher redshift, z_h=9, strongly delays t...

  3. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
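
    The Dice coefficients cited above quantify the overlap between segmentations obtained on different operating systems. A minimal sketch of the computation; the two masks below are invented for illustration, not data from the study:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks:
    2*|A intersect B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # two empty masks agree perfectly by convention
    return 2.0 * np.logical_and(a, b).sum() / denom

# Two hypothetical subcortical segmentations of the same scan,
# produced by the same pipeline on different operating systems:
seg_linux = np.array([[1, 1, 0], [1, 0, 0]])
seg_other = np.array([[1, 1, 0], [0, 0, 0]])
print(dice(seg_linux, seg_other))  # 0.8
```

    A value of 1.0 indicates identical segmentations; the 0.59 reported in the abstract corresponds to substantial disagreement.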

  4. Reproducibility of corpus cavernosum electromyography in healthy young man

    NARCIS (Netherlands)

    Jiang, X.; Frantzen, J.; Holsheimer, J.; Meuleman, E.

    2005-01-01

    Research on reproducibility of corpus cavernosum electromyography (CC-EMG) is relevant because reproducible signals indicate a biological phenomenon and not an artefact. Reproducible signals are also required to use CC-EMG as a diagnostic tool for erectile dysfunction. The aim of this study was to a

  5. Systematic Methodology for Reproducible Optimizing Batch Operation

    DEFF Research Database (Denmark)

    Bonné, Dennis; Jørgensen, Sten Bay

    2006-01-01

    This contribution presents a systematic methodology for rapid acquisition of discrete-time state space model representations of batch processes based on their historical operation data. These state space models are parsimoniously parameterized as a set of local, interdependent models. … This controller may also be used for optimizing control. The modeling and control performance is demonstrated on a fed-batch protein cultivation example. The presented methodologies lend themselves directly to application as Process Analytical Technologies (PAT).

  6. Ratio-scaling of listener preference of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian

    2005-01-01

    …a non-trivial assumption in the case of complex spatial sounds. In the present study the Bradley-Terry-Luce (BTL) model was employed to investigate the unidimensionality of preference judgments made by 40 listeners on multichannel reproduced sound. Short musical excerpts played back in eight reproduction modes (mono, stereo and various multichannel formats) served as stimuli. On each trial, the task of the subjects was to choose the format they preferred, proceeding through all the possible pairs of the eight reproduction modes. This experiment was replicated with four types of programme material (pop and classical music). As a main result, the BTL model was found to predict the choice frequencies well. This implies that listeners were able to integrate the complex nature of the sounds into a unidimensional preference judgment. It further implies the existence of a preference scale on which the reproduction modes…
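
    The BTL model posits P(i preferred over j) = v_i / (v_i + v_j) for latent "worths" v_i; the worths can be estimated from pairwise choice counts with the classic Zermelo/MM fixed-point iteration. A sketch on invented counts for three reproduction modes, not the study's data:

```python
import numpy as np

def fit_btl(wins, n_iter=200):
    """Estimate Bradley-Terry-Luce worths from a matrix of pairwise
    choice counts (wins[i, j] = times i was preferred over j) using
    the Zermelo/MM fixed-point update."""
    n = wins.shape[0]
    v = np.ones(n)
    for _ in range(n_iter):
        new_v = np.empty(n)
        for i in range(n):
            denom = sum((wins[i, j] + wins[j, i]) / (v[i] + v[j])
                        for j in range(n) if j != i)
            new_v[i] = wins[i].sum() / denom
        v = new_v / new_v.sum()  # fix the scale (worths are ratio-scaled)
    return v

# Hypothetical choice counts for three modes (e.g. mono, stereo, 5.0):
wins = np.array([[0, 2, 1],
                 [8, 0, 4],
                 [9, 6, 0]])
v = fit_btl(wins)
print(v.argsort()[::-1])  # modes from most to least preferred: [2 1 0]
```

    The fitted worths form exactly the kind of ratio preference scale the abstract refers to.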

  7. The Vienna LTE simulators - Enabling reproducibility in wireless communications research

    Directory of Open Access Journals (Sweden)

    Mehlführer Christian

    2011-01-01

    In this article, we introduce MATLAB-based link and system level simulation environments for UMTS Long-Term Evolution (LTE). The source codes of both simulators are available under an academic non-commercial use license, allowing researchers full access to standard-compliant simulation environments. Owing to the open source availability, the simulators enable reproducible research in wireless communications and comparison of novel algorithms. In this study, we explain how link and system level simulations are connected and show how the link level simulator serves as a reference to design the system level simulator. We compare the accuracy of the PHY modeling at system level by means of simulations performed both with bit-accurate link level simulations and PHY-model-based system level simulations. We highlight some of the currently most interesting research questions for LTE, and explain by some research examples how our simulators can be applied.

  8. Broad Prize: Do the Successes Spread?

    Science.gov (United States)

    Samuels, Christina A.

    2011-01-01

    When the Broad Prize for Urban Education was created in 2002, billionaire philanthropist Eli Broad said he hoped the awards, in addition to rewarding high-performing school districts, would foster healthy competition; boost the prestige of urban education, long viewed as dysfunctional; and showcase best practices. Over the 10 years the prize has…

  10. Reversible and Reproducible Giant Universal Electroresistance Effect

    Institute of Scientific and Technical Information of China (English)

    SYED Rizwan; ZHANG Sen; YU Tian; ZHAO Yong-Gang; ZHANG Shu-Feng; HAN Xiu-Feng

    2011-01-01

    After the prediction of the giant electroresistance effect, much work has been carried out to find this effect in practical devices. We demonstrate a novel way to obtain a large electroresistance (ER) effect in a multilayer system at room temperature. The current-in-plane (CIP) electric transport measurement is performed on the multilayer structure consisting of (011)-Pb(Mg1/3Nb2/3)O3-PbTiO3 (PMN-PT)/Ta/Al-O/metal. It is found that the resistance of the top metallic layer shows a hysteretic behavior as a function of electric field, which corresponds well with the substrate polarization versus electric field (P-E) loop. This reversible hysteretic R-E behavior is independent of the applied magnetic field as well as the magnetic structure of the top metallic layer and keeps its memory state. This novel memory effect is attributed to the polarization-reversal-induced electrostatic potential, which is felt throughout the multilayer stack and is enhanced by the dielectric Al-O layer, producing unique hysteretic, reversible, and reproducible resistance switching behavior. This novel universal electroresistance effect will open a new gateway to the development of future multiferroic memory devices operating at room temperature.

  11. Prion pathogenesis is faithfully reproduced in cerebellar organotypic slice cultures.

    Directory of Open Access Journals (Sweden)

    Jeppe Falsig

    Prions cause neurodegeneration in vivo, yet prion-infected cultured cells do not show cytotoxicity. This has hampered mechanistic studies of prion-induced neurodegeneration. Here we report that prion-infected cultured organotypic cerebellar slices (COCS) experienced progressive spongiform neurodegeneration closely reproducing prion disease, with three different prion strains giving rise to three distinct patterns of prion protein deposition. Neurodegeneration did not occur when PrP was genetically removed from neurons, and a comprehensive pharmacological screen indicated that neurodegeneration was abrogated by compounds known to antagonize prion replication. Prion infection of COCS and mice led to enhanced fodrin cleavage, suggesting the involvement of calpains or caspases in pathogenesis. Accordingly, neurotoxicity and fodrin cleavage were prevented by calpain inhibitors but not by caspase inhibitors, whereas prion replication proceeded unimpeded. Hence calpain inhibition can uncouple prion replication from its neurotoxic sequelae. These data validate COCS as a powerful model system that faithfully reproduces most morphological hallmarks of prion infections. The exquisite accessibility of COCS to pharmacological manipulations was instrumental in recognizing the role of calpains in neurotoxicity, and significantly extends the collection of tools necessary for rigorously dissecting prion pathogenesis.

  12. Faster, More Reproducible DESI-MS for Biological Tissue Imaging

    Science.gov (United States)

    Tillner, Jocelyn; Wu, Vincen; Jones, Emrys A.; Pringle, Steven D.; Karancsi, Tamas; Dannhorn, Andreas; Veselkov, Kirill; McKenzie, James S.; Takats, Zoltan

    2017-10-01

    A new, more robust sprayer for desorption electrospray ionization (DESI) mass spectrometry imaging is presented. The main source of variability in DESI is thought to be the uncontrolled variability of various geometric parameters of the sprayer, primarily the position of the solvent capillary, or more specifically, its positioning within the gas capillary or nozzle. If the solvent capillary is off-center, the sprayer becomes asymmetrical, making the geometry difficult to control and compromising reproducibility. If the stiffness, tip quality, and positioning of the capillary are improved, sprayer reproducibility can be improved by an order of magnitude. The quality of the improved sprayer and its potential for high spatial resolution imaging are demonstrated on human colorectal tissue samples by acquisition of images at pixel sizes of 100, 50, and 20 μm, which corresponds to a lateral resolution of 40-60 μm, similar to the best values published in the literature. The high sensitivity of the sprayer also allows combination with a fast scanning quadrupole time-of-flight mass spectrometer. This provides up to 30 times faster DESI acquisition, reducing the overall acquisition time for a 10 mm × 10 mm rat brain sample to approximately 1 h. Although some spectral information is lost with increasing analysis speed, the resulting data can still be used to classify tissue types on the basis of a previously constructed model. This is particularly interesting for clinical applications, where fast, reliable diagnosis is required.

  13. Research Reproducibility in Geosciences: Current Landscape, Practices and Perspectives

    Science.gov (United States)

    Yan, An

    2016-04-01

    Reproducibility of research can gauge the validity of its findings. Yet we currently lack an understanding of how much of a problem research reproducibility is in the geosciences. We developed an online survey of faculty and graduate students in geosciences, and received 136 responses from research institutions and universities in the Americas, Asia, Europe and other parts of the world. This survey examined (1) the current state of research reproducibility in geosciences, by asking about researchers' experiences with unsuccessful replication work and the obstacles that led to their replication failures; (2) current reproducibility practices in the community, by asking what efforts researchers made to try to reproduce others' work and make their own work reproducible, and what underlying factors contribute to irreproducibility; (3) perspectives on reproducibility, by collecting researchers' thoughts and opinions on this issue. The survey results indicated that nearly 80% of respondents who had ever reproduced a published study had failed at least once. Only one third of the respondents received helpful feedback when they contacted the authors of a published study for data, code, or other information. The primary factors that lead to unsuccessful replication attempts are insufficient detail in published literature, and inaccessibility of the data, code and tools needed in the study. Our findings suggest a remarkable lack of research reproducibility in geoscience. Changing the incentive mechanism in academia, as well as developing policies and tools that facilitate open data and code sharing, are promising ways for the geosciences community to alleviate this reproducibility problem.

  14. Bid Optimization in Broad-Match Ad auctions

    CERN Document Server

    Even-dar, Eyal; Mirrokni, Vahab; Muthukrishnan, S; Nadav, Uri

    2009-01-01

    Ad auctions in sponsored search support "broad match", which allows an advertiser to target a large number of queries while bidding only on a limited number. While giving more expressiveness to advertisers, this feature makes it challenging to optimize bids to maximize their returns: choosing to bid on a query as a broad match because it provides high profit results in bidding on related queries which may yield low or even negative profits. We abstract and study the complexity of the bid optimization problem, which is to determine an advertiser's bids on a subset of keywords (possibly using broad match) so that her profit is maximized. In the query language model, where the advertiser is allowed to bid on all queries as broad match, we present a linear programming (LP)-based polynomial-time algorithm that obtains the optimal profit. In the model in which an advertiser can only bid on keywords, i.e., a subset of keywords as an exact or broad match, we show that this problem is not approximable within any ...
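
    The LP formulation itself is beyond the scope of an abstract, but in the simplified case of a single budget constraint the fractional relaxation reduces to a greedy profit-per-cost rule (the fractional knapsack). A toy sketch with invented query names, values, and costs; this is an illustration of that special case, not the paper's algorithm:

```python
def optimise_bids(queries, budget):
    """Fractional-knapsack solution of a single-budget bid problem:
    target queries in decreasing order of profit per unit cost.
    queries: list of (name, expected_value, expected_cost) tuples."""
    profitable = [(name, value - cost, cost)
                  for name, value, cost in queries if value > cost]
    # Highest profit-per-cost first.
    profitable.sort(key=lambda q: q[1] / q[2], reverse=True)
    plan, total_profit = {}, 0.0
    for name, profit, cost in profitable:
        frac = min(1.0, budget / cost)  # fraction of traffic to target
        if frac <= 0:
            break
        plan[name] = frac
        total_profit += frac * profit
        budget -= frac * cost
    return plan, total_profit

# Hypothetical queries: (name, expected value, expected cost per unit).
plan, profit = optimise_bids(
    [("shoes", 5.0, 2.0), ("red shoes", 3.0, 2.5), ("laces", 1.0, 1.5)],
    budget=3.0)
print(plan, profit)  # {'shoes': 1.0, 'red shoes': 0.4} 3.2
```

    Note how the negative-profit query ("laces") is dropped entirely, mirroring the abstract's point that broad match can pull in loss-making queries.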

  15. Building Consensus on Community Standards for Reproducible Science

    Science.gov (United States)

    Lehnert, K. A.; Nielsen, R. L.

    2015-12-01

    As geochemists, the traditional model by which standard methods for generating, presenting, and using data have been developed relied on input from the community, the results of seminal studies, and a variety of authoritative bodies, and has required a great deal of time. The rate of technological and related policy change has accelerated to the point that this historical model no longer satisfies the needs of the community, publishers, or funders. The development of a new mechanism for building consensus raises a number of questions: Which aspects of our data are the focus of reproducibility standards? Who sets the standards? How do we subdivide the development of the consensus? We propose an open, transparent, and inclusive approach to the development of data and reproducibility standards that is organized around specific sub-disciplines and driven by the community of practitioners in those sub-disciplines. It should involve editors, program managers, and representatives of domain data facilities as well as professional societies, but avoid making any single group the final authority. A successful example of this model is the Editors Roundtable, a cross-section of editors, funders, and data facility managers that discussed and agreed on leading practices for the reporting of geochemical data in publications, including accessibility and format of the data, data quality information, and metadata and identifiers for samples (Goldstein et al., 2014). We argue that the development of data and reproducibility standards needs to rely heavily on representatives from the community of practitioners to set priorities and provide perspective. Groups of editors, practicing scientists, and other stakeholders would be assigned the task of reviewing existing practices and recommending changes as deemed necessary. They would weigh the costs and benefits of changing the standards for that community, propose appropriate tools to facilitate those changes, and work through the professional societies…

  16. Precision and reproducibility in AMS radiocarbon measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Hotchkis, M.A.; Fink, D.; Hua, Q.; Jacobsen, G.E.; Lawson, E. M.; Smith, A.M.; Tuniz, C. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Accelerator Mass Spectrometry (AMS) is a technique by which rare radioisotopes such as 14C can be measured at environmental levels with high efficiency. Instead of detecting radioactivity, which is very weak for long-lived environmental radioisotopes, atoms are counted directly. The sample is placed in an ion source, from which a negative ion beam of the atoms of interest is extracted, mass analysed, and injected into a tandem accelerator. After stripping to positive charge states in the accelerator HV terminal, the ions are further accelerated, analysed with magnetic and electrostatic devices and counted in a detector. An isotopic ratio is derived from the number of radioisotope atoms counted in a given time and the beam current of a stable isotope of the same element, measured after the accelerator. For radiocarbon, 14C/13C ratios are usually measured, and the ratio of an unknown sample is compared to that of a standard. The achievable precision for such ratio measurements is limited primarily by 14C counting statistics and also by a variety of factors related to accelerator and ion source stability. At the ANTARES AMS facility at Lucas Heights Research Laboratories we are currently able to measure 14C with 0.5% precision. In the two years since becoming operational, more than 1000 14C samples have been measured. Recent improvements in precision for 14C have been achieved with the commissioning of a 59-sample ion source. The measurement system, from sample changing to data acquisition, is under common computer control. These developments have allowed a new regime of automated multi-sample processing which has impacted both on the system throughput and the measurement precision. We have developed data evaluation methods at ANTARES which cross-check the self-consistency of the statistical analysis of our data. Rigorous data evaluation is invaluable in assessing the true reproducibility of the measurement system and aids in…
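
    The counting-statistics limit mentioned above is Poisson: the relative uncertainty on N counted atoms scales as 1/sqrt(N), so 0.5% precision requires at least 40,000 counted 14C atoms. A quick sketch, ignoring the accelerator- and source-stability terms the abstract also mentions:

```python
import math

def relative_precision(counts):
    """Poisson counting statistics: sigma_N / N = 1 / sqrt(N)."""
    return 1.0 / math.sqrt(counts)

def counts_needed(precision):
    """Counts required to reach a given relative precision."""
    return math.ceil(1.0 / precision ** 2)

print(counts_needed(0.005))       # 40000 atoms for 0.5% precision
print(relative_precision(40000))  # 0.005
```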

  17. Influenza virus antigenicity and broadly neutralizing epitopes.

    Science.gov (United States)

    Air, Gillian M

    2015-04-01

    A vaccine formulation that would be effective against all strains of influenza virus has long been a goal of vaccine developers, but antibodies after infection or vaccination were seen to be strain specific and there was little evidence of cross-reactive antibodies that neutralized across subtypes. Recently a number of broadly neutralizing monoclonal antibodies have been characterized. This review describes the different classes of broadly neutralizing antibodies and discusses the potential of their therapeutic use or for design of immunogens that induce a high proportion of broadly neutralizing antibodies.

  18. Mathematical Development: The Role of Broad Cognitive Processes

    Science.gov (United States)

    Calderón-Tena, Carlos O.

    2016-01-01

    This study investigated the role of broad cognitive processes in the development of mathematics skills among children and adolescents. Four hundred and forty-seven students (age mean [M] = 10.23 years, 73% boys and 27% girls) from an elementary school district in the US southwest participated. Structural equation modelling tests indicated that…

  20. Can atmospheric reanalysis datasets be used to reproduce flood characteristics?

    Science.gov (United States)

    Andreadis, K.; Schumann, G.; Stampoulis, D.

    2014-12-01

    Floods are one of the costliest natural disasters and the ability to understand their characteristics and their interactions with population, land cover and climate changes is of paramount importance. In order to accurately reproduce flood characteristics such as water inundation and heights both in the river channels and floodplains, hydrodynamic models are required. Most of these models operate at very high resolutions and are computationally very expensive, making their application over large areas very difficult. However, a need exists for such models to be applied at regional to global scales so that the effects of climate change with regards to flood risk can be examined. We use the LISFLOOD-FP hydrodynamic model to simulate a 40-year history of flood characteristics at the continental scale, particularly over Australia. LISFLOOD-FP is a 2-D hydrodynamic model that solves the approximate Saint-Venant equations at large scales (on the order of 1 km) using a sub-grid representation of the river channel. This implementation is part of an effort towards a global 1-km flood modeling framework that will allow the reconstruction of a long-term flood climatology. The components of this framework include a hydrologic model (the widely-used Variable Infiltration Capacity model) and a meteorological dataset that forces it. In order to extend the simulated flood climatology to 50-100 years in a consistent manner, reanalysis datasets have to be used. The objective of this study is the evaluation of multiple atmospheric reanalysis datasets (ERA, NCEP, MERRA, JRA) as inputs to the VIC/LISFLOOD-FP model. Comparisons of the simulated flood characteristics are made with both satellite observations of inundation and a benchmark simulation of LISFLOOD-FP being forced by observed flows. Finally, the implications of the availability of a global flood modeling framework for producing flood hazard maps and disseminating disaster information are discussed.

  1. Broad spectrum microarray for fingerprint-based bacterial species identification

    Directory of Open Access Journals (Sweden)

    Frey Jürg E

    2010-02-01

    Background: Microarrays are powerful tools for DNA-based molecular diagnostics and identification of pathogens. Most target a limited range of organisms and are based on only one or a very few genes for specific identification. Such microarrays are limited to organisms for which specific probes are available, and often have difficulty discriminating closely related taxa. We have developed an alternative broad-spectrum microarray that employs hybridisation fingerprints generated by high-density anonymous markers distributed over the entire genome for identification based on comparison to a reference database. Results: A high-density microarray carrying 95,000 unique 13-mer probes was designed. Optimized methods were developed to deliver reproducible hybridisation patterns that enabled confident discrimination of bacteria at the species, subspecies, and strain levels. High correlation coefficients were achieved between replicates. A sub-selection of 12,071 probes, determined by ANOVA and class prediction analysis, enabled the discrimination of all samples in our panel. Mismatch probe hybridisation was observed but was found to have no effect on the discriminatory capacity of our system. Conclusions: These results indicate the potential of our genome chip for reliable identification of a wide range of bacterial taxa at the subspecies level without laborious prior sequencing and probe design. With its high resolution capacity, our proof-of-principle chip demonstrates great potential as a tool for molecular diagnostics of broad taxonomic groups.

  2. Reproducing the Kinematics of Damped Lyman-alpha Systems

    CERN Document Server

    Bird, Simeon; Neeleman, Marcel; Genel, Shy; Vogelsberger, Mark; Hernquist, Lars

    2014-01-01

    We examine the kinematic structure of Damped Lyman-alpha Systems (DLAs) in a series of cosmological hydrodynamic simulations using the AREPO code. We are able to match the distribution of velocity widths of associated low ionisation metal absorbers substantially better than earlier work. Our simulations produce a population of DLAs dominated by halos with virial velocities around 70 km/s, consistent with a picture of relatively small, faint objects. In addition, we reproduce the observed correlation between velocity width and metallicity and the equivalent width distribution of SiII. Some discrepancies of moderate statistical significance remain; too many of our spectra show absorption concentrated at the edge of the profile and there are slight differences in the exact shape of the velocity width distribution. We show that the improvement over previous work is mostly due to our strong feedback from star formation and our detailed modelling of the metal ionisation state.

  3. Quark/gluon jet discrimination: a reproducible analysis using R

    CERN Document Server

    CERN. Geneva

    2017-01-01

    The power to discriminate between light-quark jets and gluon jets would have a huge impact on many searches for new physics at CERN and beyond. This talk will present a walk-through of the development of a prototype machine learning classifier for differentiating between quark and gluon jets at experiments like those at the Large Hadron Collider at CERN. A new fast feature selection method that combines information theory and graph analytics will be outlined. This method has found new variables that promise significant improvements in discrimination power. The prototype jet tagger is simple, interpretable, parsimonious, and computationally extremely cheap, and therefore might be suitable for use in trigger systems for real-time data processing. Nested stratified k-fold cross validation was used to generate robust estimates of model performance. The data analysis was performed entirely in the R statistical programming language, and is fully reproducible. The entire analysis workflow is data-driven, automated a...

  4. Synthetic Stellar Photometry. I-General considerations and new transformations for broad-band systems

    CERN Document Server

    Casagrande, Luca

    2014-01-01

    After a pedagogical introduction to the main concepts of synthetic photometry, colours and bolometric corrections in the Johnson-Cousins, 2MASS, and HST-ACS/WFC3 photometric systems are generated from MARCS synthetic fluxes for various [Fe/H] and [alpha/Fe] combinations, and virtually any value of reddening E(B-V) < 0.7. The successes and failures of model fluxes in reproducing the observed magnitudes are highlighted. Overall, extant synthetic fluxes predict quite realistic broad-band colours and bolometric corrections, especially at optical and longer wavelengths: further improvements of the predictions for the blue and ultraviolet spectral regions await the use of hydrodynamic models where the microturbulent velocity is not treated as a free parameter. We show how the morphology of the colour-magnitude diagram (CMD) changes for different values of [Fe/H] and [alpha/Fe]; in particular, how suitable colour combinations can easily discriminate between red giant branch and lower main sequence populations wit...

  5. Measuring Prevention More Broadly, An Empirical...

    Data.gov (United States)

    U.S. Department of Health & Human Services — Measuring Prevention More Broadly, An Empirical Assessment of CHIPRA Core Measures Differences in CHIP design and structure, across states and over time, may limit...

  6. Estimation of contrast agent bolus arrival delays for improved reproducibility of liver DCE MRI

    Science.gov (United States)

    Chouhan, Manil D.; Bainbridge, Alan; Atkinson, David; Punwani, Shonit; Mookerjee, Rajeshwar P.; Lythgoe, Mark F.; Taylor, Stuart A.

    2016-10-01

    Delays between contrast agent (CA) arrival at the site of vascular input function (VIF) sampling and the tissue of interest affect dynamic contrast enhanced (DCE) MRI pharmacokinetic modelling. We investigate effects of altering VIF CA bolus arrival delays on liver DCE MRI perfusion parameters, propose an alternative approach to estimating delays and evaluate reproducibility. Thirteen healthy volunteers (28.7 ± 1.9 years, seven males) underwent liver DCE MRI using dual-input single compartment modelling, with reproducibility (n = 9) measured at 7 days. Effects of VIF CA bolus arrival delays were assessed for arterial and portal venous input functions. Delays were pre-estimated using linear regression, with restricted free modelling around the pre-estimated delay. Perfusion parameters and 7-day reproducibility were compared using this method, freely modelled delays, and no delays, using one-way ANOVA. Reproducibility was assessed using Bland-Altman analysis of agreement. Maximum percent change relative to parameters obtained using zero delays was -31% for portal venous (PV) perfusion, +43% for total liver blood flow (TLBF), +3247% for hepatic arterial (HA) fraction, +150% for mean transit time and -10% for distribution volume. Differences were demonstrated between the three methods for PV perfusion (p = 0.0085) and HA fraction (p …) … liver DCE MRI quantification. Pre-estimation of delays with constrained free modelling improved 7-day reproducibility of perfusion parameters in volunteers.
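
    The study pre-estimates bolus arrival delays with linear regression; as a simpler stand-in for illustration, the delay between an input function and a tissue curve can also be located by the peak of their cross-correlation. The curves below are synthetic, not study data:

```python
import numpy as np

def estimate_delay(vif, tissue, dt=1.0):
    """Estimate the arrival delay of the tissue curve relative to the
    vascular input function from the peak of their cross-correlation
    (an illustrative stand-in for the regression-based pre-estimation
    used in the study). dt is the sampling interval in seconds."""
    vif = vif - vif.mean()
    tissue = tissue - tissue.mean()
    corr = np.correlate(tissue, vif, mode="full")
    lag = corr.argmax() - (len(vif) - 1)
    return lag * dt

# Synthetic gamma-variate-like bolus sampled once per second,
# with the tissue curve delayed by 4 s:
t = np.arange(60.0)
vif = t ** 2 * np.exp(-t / 4.0)
tissue = np.roll(vif, 4)
tissue[:4] = 0.0
print(estimate_delay(vif, tissue))  # 4.0
```

    The estimated delay would then seed a constrained fit of the pharmacokinetic model around that value, as the abstract describes.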

  7. Reproducing an extreme flood with uncertain post-event information

    Directory of Open Access Journals (Sweden)

    D. Fuentes-Andino

    2017-07-01

    Studies for the prevention and mitigation of floods require information on discharge and extent of inundation, which is commonly unavailable or uncertain, especially during extreme events. This study was initiated by the devastating flood in Tegucigalpa, the capital of Honduras, when Hurricane Mitch struck the city. In this study we hypothesized that it is possible to estimate, in a trustworthy way that accounts for large data uncertainties, this extreme 1998 flood discharge and the extent of the inundations that followed, from a combination of models and post-event measured data. Post-event data collected in 2000 and 2001 were used to estimate discharge peaks, times of peak, and high-water marks. These data were used in combination with rain data from two gauges to drive and constrain a combination of well-known modelling tools: TOPMODEL, Muskingum–Cunge–Todini routing, and the LISFLOOD-FP hydraulic model. Simulations were performed within the generalized likelihood uncertainty estimation (GLUE) uncertainty-analysis framework. The model combination predicted peak discharge, times of peaks, and more than 90 % of the observed high-water marks within the uncertainty bounds of the evaluation data. This allowed an inundation likelihood map to be produced. Observed high-water marks could not be reproduced at a few locations on the floodplain. Identifying these locations is useful for improving model set-up, model structure, or post-event data-estimation methods. Rainfall data were of central importance in simulating the times of peak, and results would be improved by a better spatial assessment of rainfall, e.g. from radar data or a denser rain-gauge network. Our study demonstrated that it was possible, considering the uncertainty in the post-event data, to reasonably reproduce the extreme Mitch flood in Tegucigalpa in spite of no hydrometric gauging during the event. The method proposed here can be part of a Bayesian framework in which more events
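
    The GLUE procedure named above can be sketched generically. This is a minimal sketch: the toy exponential model stands in for the TOPMODEL/routing/LISFLOOD-FP chain, and the likelihood measure and behavioural threshold are illustrative assumptions, not the study's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_model(params, t):
    # Stand-in for the real simulator: maps a parameter set
    # to a predicted time series (e.g. a hydrograph).
    a, b = params
    return a * np.exp(-b * t)

t = np.linspace(0.0, 10.0, 50)
observed = 5.0 * np.exp(-0.5 * t) + rng.normal(0.0, 0.2, t.size)

# 1. Monte Carlo sampling of parameter sets from broad uniform priors
samples = rng.uniform([1.0, 0.1], [10.0, 1.0], size=(5000, 2))

# 2. Informal likelihood measure for each run (here: inverse error variance)
sims = np.array([toy_model(p, t) for p in samples])
likelihood = 1.0 / np.mean((sims - observed) ** 2, axis=1)

# 3. Retain 'behavioural' parameter sets above a threshold; rescale weights
keep = likelihood > np.quantile(likelihood, 0.90)
weights = likelihood[keep] / likelihood[keep].sum()

# 4. Likelihood-weighted uncertainty bounds at each time step
def weighted_quantile(values, w, q):
    order = np.argsort(values)
    cdf = np.cumsum(w[order])
    return np.interp(q, cdf / cdf[-1], values[order])

beh = sims[keep]
lower = np.array([weighted_quantile(beh[:, j], weights, 0.05) for j in range(t.size)])
upper = np.array([weighted_quantile(beh[:, j], weights, 0.95) for j in range(t.size)])
coverage = np.mean((observed >= lower) & (observed <= upper))
print(f"{coverage:.0%} of observations fall inside the 5-95% GLUE bounds")
```

    The key GLUE idea is that all behavioural parameter sets are retained and weighted, so the output is a likelihood-weighted uncertainty envelope rather than a single calibrated prediction.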

  8. Theoretical comments on reproducibility and normalization of TWA measures.

    Science.gov (United States)

    Sassi, Roberto; Mainardi, Luca T

    2013-01-01

    Using a simple stochastic model of ventricular repolarization and the equivalent surface source (ESS) model, an electrophysiological formulation relating the surface ECG to variations at the myocytes' level, we recently pointed out a few theoretical results regarding T-wave alternans (TWA). In this paper, stimulated by the comments of John E. Madias on our paper (J Electrocardiol, 2012), we further explored the consequences implied by the theoretical model. First, we verified the reproducibility of TWA measures in clinically stable patients repeatedly tested. The sensitivity to displacement was evaluated by simulating lead mislocations of up to 20 mm. The numerical simulations were performed on data obtained by solving the inverse electrocardiographic problem for three subjects (ECGSIM). The results showed that TWA sensitivity varies across leads, being maximal in V1 and decreasing towards V6. Globally, the maximal percent error found was 6.1%. Thus, TWA measures do not seem to impose more stringent requirements on the precision of lead placement than usual diagnostic practice. Finally, we further discussed the implications of normalizing TWA measures. While clinical studies are necessary to sort out the issue, the theoretical model suggests that normalization might be appropriate only in certain cases.

  9. Workflow to numerically reproduce laboratory ultrasonic datasets

    Institute of Scientific and Technical Information of China (English)

    A. Biryukov; N. Tisato; G. Grasselli

    2014-01-01

    The risks and uncertainties related to the storage of high-level radioactive waste (HLRW) can be reduced thanks to focused studies and investigations. HLRWs are going to be placed in deep geological repositories, enveloped in an engineered bentonite barrier, whose physical conditions are subjected to change throughout the lifespan of the infrastructure. Seismic tomography can be employed to monitor its physical state and integrity. The design of the seismic monitoring system can be optimized via conducting and analyzing numerical simulations of wave propagation in representative repository geometry. However, the quality of the numerical results relies on their initial calibration. The main aim of this paper is to provide a workflow to calibrate numerical tools employing laboratory ultrasonic datasets. The finite difference code SOFI2D was employed to model ultrasonic waves propagating through a laboratory sample. Specifically, the input velocity model was calibrated to achieve a best match between experimental and numerical ultrasonic traces. Likely due to the imperfections of the contact surfaces, the resultant velocities of P- and S-wave propagation tend to be noticeably lower than those a priori assigned. Then, the calibrated model was employed to estimate the attenuation in a montmorillonite sample. The obtained low quality factors (Q) suggest that pronounced inelastic behavior of the clay has to be taken into account in geophysical modeling and analysis. Consequently, this contribution should be considered as a first step towards the creation of a numerical tool to evaluate wave propagation in nuclear waste repositories.

  10. On the Inclusion Relation of Reproducing Kernel Hilbert Spaces

    OpenAIRE

    Zhang, Haizhang; Zhao, Liang

    2011-01-01

    To help understand various reproducing kernels used in applied sciences, we investigate the inclusion relation of two reproducing kernel Hilbert spaces. Characterizations in terms of feature maps of the corresponding reproducing kernels are established. A full table of inclusion relations among widely-used translation invariant kernels is given. Concrete examples for Hilbert-Schmidt kernels are presented as well. We also discuss the preservation of such a relation under various operations of ...
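
    For background, the classical inclusion criterion of Aronszajn, which feature-map characterizations of this kind refine, can be stated as follows (notation assumed: $\mathcal{H}(K)$ denotes the RKHS of the kernel $K$):

```latex
% Aronszajn's inclusion criterion for reproducing kernel Hilbert spaces
\mathcal{H}(K_1) \subseteq \mathcal{H}(K_2)
\quad\Longleftrightarrow\quad
\exists\, \lambda > 0 :\ \lambda^2 K_2 - K_1 \ \text{is a positive semidefinite kernel.}
```

    Intuitively, every function reproducible by $K_1$ must be reproducible by $K_2$ with a norm inflated by at most the factor $\lambda$.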

  11. ReproPhylo: An Environment for Reproducible Phylogenomics

    OpenAIRE

    2015-01-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built in to the ReproPhylo system and does not require user interve...

  12. Flow structure in front of the broad-crested weir

    Directory of Open Access Journals (Sweden)

    Zachoval Zbyněk

    2015-01-01

    The paper deals with research focused on the description of the flow structure in front of a broad-crested weir. Based on experimental measurements, the flow structure in front of the weir (the recirculation zone of flow and tornado vortices) and the flow structure on the weir crest have been described. The determined flow character has been simulated using a numerical model, and based on a comparison of the results a suitable turbulence model has been recommended.

  13. Participant Nonnaiveté and the reproducibility of cognitive psychology.

    Science.gov (United States)

    Zwaan, Rolf A; Pecher, Diane; Paolacci, Gabriele; Bouwmeester, Samantha; Verkoeijen, Peter; Dijkstra, Katinka; Zeelenberg, René

    2017-07-25

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature (three each from the domains of perception/action, memory, and language) and found that they are highly reproducible. Not only can they be reproduced in online environments, but they also can be reproduced with nonnaïve participants with no reduction of effect size. Apparently, some cognitive tasks are so constraining that they encapsulate behavior from external influences, such as the testing situation and prior recent experience with the experiment, yielding highly robust effects.

  14. Virtual Reference Environments: a simple way to make research reproducible.

    Science.gov (United States)

    Hurley, Daniel G; Budden, David M; Crampin, Edmund J

    2015-09-01

    'Reproducible research' has received increasing attention over the past few years as bioinformatics and computational biology methodologies become more complex. Although reproducible research is progressing in several valuable ways, we suggest that recent increases in internet bandwidth and disk space, along with the availability of open-source and free-software licences for tools, enable another simple step to make research reproducible. In this article, we urge the creation of minimal virtual reference environments implementing all the tools necessary to reproduce a result, as a standard part of publication. We address potential problems with this approach, and show an example environment from our own work.
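
    The article's proposal concerns whole virtual reference environments, but the first step it implies, capturing exactly which software produced a result, can be sketched in Python (the manifest format and the package list are illustrative assumptions, not the authors' specification):

```python
import importlib
import json
import platform
import sys

def environment_manifest(packages):
    """Record interpreter, platform, and package versions so a reader
    can rebuild a comparable environment (e.g. inside a virtual machine)."""
    versions = {}
    for name in packages:
        try:
            mod = importlib.import_module(name)
            versions[name] = getattr(mod, "__version__", "unknown")
        except ImportError:
            versions[name] = "not installed"
    return {
        "python": sys.version.split()[0],
        "platform": platform.platform(),
        "packages": versions,
    }

# emit the manifest alongside the analysis results
manifest = environment_manifest(["json", "numpy"])
print(json.dumps(manifest, indent=2))
```

    Shipping such a manifest with each publication is a small complement to, not a replacement for, the full virtual reference environments the authors urge.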

  15. A workflow for reproducing mean benthic gas fluxes

    Science.gov (United States)

    Fulweiler, Robinson W.; Emery, Hollie E.; Maguire, Timothy J.

    2016-08-01

    Long-term data sets provide unique opportunities to examine temporal variability of key ecosystem processes. The need for such data sets is becoming increasingly important as we try to quantify the impact of human activities across various scales and in some cases, as we try to determine the success of management interventions. Unfortunately, long-term benthic flux data sets for coastal ecosystems are rare and curating them is a challenge. If we wish to make our data available to others now and into the future, however, then we need to provide mechanisms that allow others to understand our methods, access the data, reproduce the results, and see updates as they become available. Here we use techniques, learned through the EarthCube Ontosoft Geoscience Paper of the Future project, to develop best practices to allow us to share a long-term data set of directly measured net sediment N2 fluxes and sediment oxygen demand at two sites in Narragansett Bay, Rhode Island (USA). This technical report describes the process we used, the challenges we faced, and the steps we will take in the future to ensure transparency and reproducibility. By developing these data and software sharing tools we hope to help disseminate well-curated data with provenance as well as products from these data, so that the community can better assess how this temperate estuary has changed over time. We also hope to provide a data sharing model for others to follow so that long-term estuarine data are more easily shared and not lost over time.

  16. Organ-on-a-Chip Technology for Reproducing Multiorgan Physiology.

    Science.gov (United States)

    Lee, Seung Hwan; Sung, Jong Hwan

    2017-09-25

    In the drug development process, the accurate prediction of drug efficacy and toxicity is important in order to reduce the cost, labor, and effort involved. For this purpose, conventional 2D cell culture models are used in the early phase of drug development. However, the differences between the in vitro and the in vivo systems have caused the failure of drugs in the later phase of the drug-development process. Therefore, there is a need for a novel in vitro model system that can provide accurate information for evaluating the drug efficacy and toxicity through a closer recapitulation of the in vivo system. Recently, the idea of using microtechnology for mimicking the microscale tissue environment has become widespread, leading to the development of "organ-on-a-chip." Furthermore, the system is further developed for realizing a multiorgan model for mimicking interactions between multiple organs. These advancements are still ongoing and are aimed at ultimately developing "body-on-a-chip" or "human-on-a-chip" devices for predicting the response of the whole body. This review summarizes recently developed organ-on-a-chip technologies, and their applications for reproducing multiorgan functions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Broad-Band Spectroscopy of Hercules X-1 with Suzaku

    Science.gov (United States)

    Asami, Fumi; Enoto, Teruaki; Iwakiri, Wataru; Yamada, Shin'ya; Tamagawa, Toru; Mihara, Tatehiro; Nagase, Fumiaki

    2014-01-01

    Hercules X-1 was observed with Suzaku in the main-on state from 2005 to 2010. The 0.4–100 keV wide-band spectra obtained in four observations showed a broad hump around 4–9 keV in addition to narrow Fe lines at 6.4 and 6.7 keV. The hump was seen in all four observations regardless of the selection of the continuum models. Thus it is considered a stable and intrinsic spectral feature in Her X-1. The broad hump lacked a sharp structure like an absorption edge. Thus it was represented by two different spectral models: an ionized partial covering or an additional broad line at 6.5 keV. The former required a persistently existing ionized absorber, whose origin was unclear. In the latter case, the Gaussian fitting of the 6.5-keV line needs a large width of sigma = 1.0–1.5 keV and a large equivalent width of 400–900 eV. If the broad line originates from Fe fluorescence of accreting matter, its large width may be explained by the Doppler broadening in the accretion flow. However, the large equivalent width may be inconsistent with a simple accretion geometry.

  19. Broad-band photometric evolution of star clusters

    OpenAIRE

    Girardi, Leo

    2001-01-01

    I briefly introduce a database of models that describe the evolution of star clusters in several broad-band photometric systems. Models are based on the latest Padova stellar evolutionary tracks - now including the alpha-enhanced case and improved AGB models - and a revised library of synthetic spectra from model atmospheres. As of today, we have revised isochrones in Johnson-Cousins-Glass, HST/WFPC2, HST/NICMOS, Thuan-Gunn, and Washington systems. Several other filter sets are included in a ...

  20. 46 CFR 56.30-3 - Piping joints (reproduces 110).

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Piping joints (reproduces 110). 56.30-3 Section 56.30-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PIPING SYSTEMS AND APPURTENANCES Selection and Limitations of Piping Joints § 56.30-3 Piping joints (reproduces 110). The type...

  1. Reproducibility along a 10 cm vertical visual analogue scale.

    OpenAIRE

    Dixon, J. S.; Bird, H A

    1981-01-01

    Reproducibility along a vertical 10 cm visual analogue scale (VAS) was investigated. Eight normal volunteers attempted to duplicate a set of marked VAS. There was a tendency to estimate too high on the scale, and reproducibility was found to be variable along its length. This indicates that the error involved in the use of VASs is even more complex than previously thought.

  2. Completely reproducible description of digital sound data with cellular automata

    Energy Technology Data Exchange (ETDEWEB)

    Wada, Masato; Kuroiwa, Jousuke; Nara, Shigetoshi

    2002-12-30

    A novel method for a compressive and completely reproducible description of digital sound data by means of the rule dynamics of CA (cellular automata) is proposed. The digital data of spoken words and music recorded in the standard format of a compact disk are reproduced completely by this method with the use of only two rules in a one-dimensional CA, without loss of information.
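
    For background, the rule dynamics of a one-dimensional binary CA of the kind the method employs can be sketched as follows (the Wolfram rule number and initial state are illustrative; the paper's specific two-rule encoding of sound data is not reproduced here):

```python
def ca_step(cells, rule):
    """One synchronous update of an elementary (radius-1, binary) CA
    with periodic boundaries; `rule` is the Wolfram rule number 0-255."""
    n = len(cells)
    out = []
    for i in range(n):
        # pack the 3-cell neighbourhood into a value 0-7
        v = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        # bit v of the rule number gives the next state
        out.append((rule >> v) & 1)
    return out

# evolve a single seed under rule 110 for three steps
cells = [0] * 7 + [1] + [0] * 7
for _ in range(3):
    cells = ca_step(cells, 110)
print("".join(map(str, cells)))  # → 000011010000000
```

    The entire trajectory is determined by the rule number and the initial row, which is what makes CA rule dynamics attractive as a compact, exactly reproducible description of a data stream.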

  3. Development of Reproducing Alumina-Magnesia-Carbon Bricks

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The reproducing alumina-magnesia-carbon bricks were prepared with dumped bricks as the starting materials. The bulk density, apparent porosity, crushing strength, modulus of rupture and slag resistance of the specimens were analyzed. The results show that used refractories can be reused and recycled with the right method. Reproducing alumina-magnesia-carbon bricks with good properties were prepared.

  4. Giant Broad Line Regions in Dwarf Seyferts

    CERN Document Server

    Devereux, Nick

    2015-01-01

    High angular resolution spectroscopy obtained with the Hubble Space Telescope (HST) has revealed a remarkable population of galaxies hosting dwarf Seyfert nuclei with an unusually large broad-line region (BLR). These objects are remarkable for two reasons. Firstly, the size of the BLR can, in some cases, rival those seen in the most luminous quasars. Secondly, the size of the BLR is not correlated with the central continuum luminosity, an observation that distinguishes them from their reverberating counterparts. Collectively, these early results suggest that non-reverberating dwarf Seyferts are a heterogeneous group and not simply scaled versions of each other. Careful inspection reveals broad H Balmer emission lines with single peaks, double peaks, and a combination of the two, suggesting that the broad emission lines are produced in kinematically distinct regions centered on the black hole (BH). Because the gravitational field strength is already known for these objects, by virtue of knowing their BH mass, ...

  5. Repeatability and reproducibility of Population Viability Analysis (PVA) and the implications for threatened species management

    Directory of Open Access Journals (Sweden)

    Clare Morrison

    2016-08-01

    Conservation triage focuses on prioritizing species, populations or habitats based on urgency, biodiversity benefits, recovery potential as well as cost. Population Viability Analysis (PVA) is frequently used in population-focused conservation prioritizations. The critical nature of many of these management decisions requires that PVA models are repeatable and reproducible to reliably rank species and/or populations quantitatively. This paper assessed the repeatability and reproducibility of a subset of previously published PVA models. We attempted to rerun baseline models from 90 publicly available PVA studies published between 2000 and 2012 using the two most common PVA modelling software programs, VORTEX and RAMAS-GIS. Forty percent (n = 36) failed, 50% (n = 45) were both repeatable and reproducible, and 10% (n = 9) had missing baseline models. Repeatability was not linked to taxa, IUCN category, PVA program version used, year published or the quality of the publication outlet, suggesting that the problem is systemic within the discipline. Complete and systematic presentation of PVA parameters and results is needed to ensure that the scientific input into conservation planning is both robust and reliable, thereby increasing the chances of making decisions that are both beneficial and defensible. The implications for conservation triage may be far reaching if population viability models cannot be reproduced with confidence, thus undermining their intended value.

  6. Teaching the Broad, Interdisciplinary Impact of Evolution

    Science.gov (United States)

    Benson, David; Atlas, Pierre; Haberski, Raymond; Higgs, Jamie; Kiley, Patrick; Maxwell, Michael, Jr.; Mirola, William; Norton, Jamey

    2009-01-01

    As perhaps the most encompassing idea in biology, evolution has impacted not only science, but other academic disciplines as well. The broad, interdisciplinary impact of evolution was the theme of a course taught at Marian College, Indianapolis, Indiana in 2002, 2004, and 2006. Using a strategy that could be readily adopted at other institutions,…

  7. The GREGOR Broad-Band Imager

    Science.gov (United States)

    von der Lühe, O.; Volkmer, R.; Kentischer, T. J.; Geißler, R.

    2012-11-01

    The design and characteristics of the Broad-Band Imager (BBI) of GREGOR are described. BBI covers the visible spectral range with two cameras simultaneously for a large field and with critical sampling at 390 nm, and it includes a mode for observing the pupil in a Foucault configuration. Samples of first-light observations are shown.

  8. Electrospraying, a Reproducible Method for Production of Polymeric Microspheres for Biomedical Applications

    Directory of Open Access Journals (Sweden)

    Tim R. Dargaville

    2011-01-01

    The ability to reproducibly load bioactive molecules into polymeric microspheres is a challenge. Traditional microsphere fabrication methods typically provide inhomogeneous release profiles and suffer from lack of batch to batch reproducibility, hindering their potential to up-scale and their translation to the clinic. This deficit in homogeneity is in part attributed to broad size distributions and variability in the morphology of particles. It is thus desirable to control morphology and size of non-loaded particles in the first instance, in preparation for obtaining desired release profiles of loaded particles in the later stage. This is achieved by identifying the key parameters involved in particle production and understanding how adapting these parameters affects the final characteristics of particles. In this study, electrospraying was presented as a promising technique for generating reproducible particles made of polycaprolactone, a biodegradable, FDA-approved polymer. Narrow size distributions were obtained by the control of electrospraying flow rate and polymer concentration, with average particle sizes ranging from 10 to 20 µm. Particles were shown to be spherical with a homogeneous embossed texture, determined by the polymer entanglement regime taking place during electrospraying. No toxic residue was detected by this process based on preliminary cell work using DNA quantification assays, validating this method as suitable for further loading of bioactive components.

  9. Broad-Band Molecular Polarization in White Dwarfs

    Science.gov (United States)

    Berdyugina, S. V.; Berdyugin, A. V.; Piirola, V.; Shapiro, A.

    2007-09-01

    We present novel calculations of broad-band polarization due to the molecular Paschen–Back effect in a strong magnetic field. Based on that, we analyze new spectropolarimetric observations of the cool magnetic helium-rich white dwarf G 99-37 which shows strongly polarized molecular bands in its spectrum. Combining the polarimetric observations with our model calculations for the CH bands at 4300 Å, we deduce a magnetic field of 8 MG on this unique magnetic white dwarf.

  10. Attempts to reproduce vacuolar myelinopathy in domestic swine and chickens.

    Science.gov (United States)

    Lewis-Weis, Lynn A; Gerhold, Richard W; Fischer, John R

    2004-07-01

    Avian vacuolar myelinopathy (AVM) was first recognized as a cause of bald eagle (Haliaeetus leucocephalus) mortality in 1994 in Arkansas (USA) and has since caused over 90 bald eagle and numerous American coot (Fulica americana) mortalities in five southeastern states. The cause of AVM remains undetermined but is suspected to be a biotoxin. Naturally occurring AVM has been limited to wild waterbirds, raptors, and one species of shorebird, and has been reproduced experimentally in red-tailed hawks (Buteo jamaicensis). In this study, chickens and swine were evaluated for susceptibility to vacuolar myelinopathy with the intent of developing animal models for research and to identify specific tissues in affected coots that contain the causative agent. Additionally, submerged aquatic vegetation, primarily hydrilla (Hydrilla verticillata), and associated material collected from a reservoir during an AVM outbreak was fed to chickens in an effort to reproduce the disease. In two separate experiments, six 4-wk-old leghorn chickens and ten 5-wk-old leghorn chickens were fed coot tissues. In a third experiment, five 3-mo-old domestic swine and one red-tailed hawk, serving as a positive control, were fed coot tissues. In these experiments, treatment animals received tissues (brain, fat, intestinal tract, kidney, liver, and/or muscle) from coots with AVM lesions collected at a lake during an AVM outbreak. Negative control chickens and one pig received tissues from coots without AVM lesions that had been collected at a lake where AVM has never been documented. In a fourth experiment, eight 3-wk-old leghorn chickens were fed aquatic vegetation material. Four chickens received material from the same lake from which coots with AVM lesions were collected for the previous experiments, and four control chickens were fed material from the lake where AVM has never been documented. Blood was collected and physical and neurologic exams were conducted on animals before and once per week

  11. Reproducible and controllable induction voltage adder for scaled beam experiments

    Science.gov (United States)

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko

    2016-08-01

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments.

  12. In-vitro accuracy and reproducibility evaluation of probing depth measurements of selected periodontal probes

    Directory of Open Access Journals (Sweden)

    K.N. Al Shayeb

    2014-01-01

    Conclusion: Depth measurements with the Chapple UB-CF-15 probe were more accurate and reproducible compared to measurements with the Vivacare TPS and Williams 14 W probes. This in vitro model may be useful for intra-examiner calibration or clinician training prior to the clinical evaluation of patients or in longitudinal studies involving periodontal evaluation.

  13. Lack of reproducibility of linkage results in serially measured blood pressure data

    NARCIS (Netherlands)

    Patel, [No Value; Celedon, JC; Weiss, ST; Palmer, LJ

    2003-01-01

    Background: Using the longitudinal Framingham Heart Study data on blood pressure, we analyzed the reproducibility of linkage measures from serial cross-sectional surveys of a defined population by performing genome-wide model-free linkage analyses to systolic blood pressure (SBP) and history of hypertension

  14. Reproducible in vitro regeneration system for purifying sugarcane ...

    African Journals Online (AJOL)

    Tansgenic Lab

    2012-05-24

    May 24, 2012 ... Full Length Research Paper. Reproducible in ... regeneration was varied from basal to top sections. Nevertheless ... 2005). The application of biotechnology to sugarcane is comparatively ..... Damon RA, Harvey WR (1987).

  15. Transition questions in clinical practice - validity and reproducibility

    DEFF Research Database (Denmark)

    Lauridsen, Henrik Hein

    2008-01-01

    … of construct validity and reproducibility of a TQ, and to make proposals for standardised use. One-hundred-and-ninety-one patients with low back pain and/or leg pain were followed over an 8-week period, receiving 3 disability and 2 pain questionnaires together with a 7-point TQ. Reproducibility was determined using… Transition questions are reproducible in patients with low back pain and/or leg pain. Despite critique of several biases, our results have reinforced the construct validity of TQs as an outcome measure, since only 1 hypothesis was rejected. On the basis of our findings we have outlined a proposal for a standardised use of transition questions.

  16. Reproducibility of ERG responses obtained with the DTL electrode.

    Science.gov (United States)

    Hébert, M; Vaegan; Lachapelle, P

    1999-03-01

    Previous investigators have suggested that the DTL fibre electrode might not be suitable for the recording of replicable electroretinograms. We present experimental evidence that when used adequately, this electrode does permit the recording of highly reproducible retinal potentials.

  17. Broadly Neutralizing Antibodies for HIV Eradication.

    Science.gov (United States)

    Stephenson, Kathryn E; Barouch, Dan H

    2016-02-01

    Passive transfer of antibodies has long been considered a potential treatment modality for infectious diseases, including HIV. Early efforts to use antibodies to suppress HIV replication, however, were largely unsuccessful, as the antibodies that were studied neutralized only a relatively narrow spectrum of viral strains and were not very potent. Recent advances have led to the discovery of a large portfolio of human monoclonal antibodies that are broadly neutralizing across many HIV-1 subtypes and are also substantially more potent. These antibodies target multiple different epitopes on the HIV envelope, thus allowing for the development of antibody combinations. In this review, we discuss the application of broadly neutralizing antibodies (bNAbs) for HIV treatment and HIV eradication strategies. We highlight bNAbs that target key epitopes, such as the CD4 binding site and the V2/V3-glycan-dependent sites, and we discuss several bNAbs that are currently in the clinical development pipeline.

  18. Making neurophysiological data analysis reproducible. Why and how?

    OpenAIRE

    Delescluse, Matthieu; Franconville, Romain; Joucla, Sébastien; Lieury, Tiffany; Pouzat, Christophe

    2011-01-01

    Manuscript submitted to "The Journal of Physiology (Paris)". Second version. Reproducible data analysis is an approach aiming at complementing classical printed scientific articles with everything required to independently reproduce the results they present. "Everything" covers here: the data, the computer codes and a precise description of how the code was applied to the data. A brief history of this approach is presented first, starting with what economists have been calling replication ...

  19. Comment on "Estimating the reproducibility of psychological science".

    Science.gov (United States)

    Gilbert, Daniel T; King, Gary; Pettigrew, Stephen; Wilson, Timothy D

    2016-03-04

    A paper from the Open Science Collaboration (Research Articles, 28 August 2015, aac4716) attempting to replicate 100 published studies suggests that the reproducibility of psychological science is surprisingly low. We show that this article contains three statistical errors and provides no support for such a conclusion. Indeed, the data are consistent with the opposite conclusion, namely, that the reproducibility of psychological science is quite high.

  20. ON APPROXIMATION BY REPRODUCING KERNEL SPACES IN WEIGHTED Lp SPACES

    Institute of Scientific and Technical Information of China (English)

    Baohuai SHENG

    2007-01-01

    In this paper, we investigate the order of approximation by reproducing kernel spaces on (-1, 1) in weighted Lp spaces. We first restate the translation network from the viewpoint of reproducing kernel spaces and then construct a sequence of approximating operators with the help of Jacobi orthogonal polynomials, with which we establish a kind of Jackson inequality to describe the error estimate. Finally, the results are used to discuss an approximation problem arising from learning theory.

  1. Multiple and broad frequency response Gunn diodes

    Science.gov (United States)

    Pilgrim, N. J.; Macpherson, R. F.; Khalid, A.; Dunn, G. M.; Cumming, D. R. S.

    2009-10-01

    Gunn diodes, operating in transit time mode, are usually thought of as incapable of generating power at multiple frequencies or over a broad frequency range. In this paper, we report experimental results showing that these diodes can generate power at several frequencies and, using Monte Carlo simulations of both planar and vertical devices, we offer an explanation of how this unusual behaviour may come into being and suggest possible applications for this novel device.

  2. Durations of extended mental rehearsals are remarkably reproducible in higher level human performances.

    Science.gov (United States)

    Brothers, L; Shaw, G L; Wright, E L

    1993-12-01

    It has been extremely difficult to quantify temporal aspects of higher level human brain function. We have found that mental rehearsals of musical performance of several minutes duration provide such a measure in that they can be highly reproducible, varying by less than 1%. These remarkable results pose fundamental neurophysiological problems. It is necessary to understand the underlying neuronal bases for this accuracy in the spatial-temporal activity of billions of neurons over minutes without sensory input. Further, they present a powerful constraint on neuronal models of brain function. Such highly reproducible (in duration) mental rehearsals might be used in conjunction with multielectrode EEG recordings to look for reproducible spatial-temporal patterns. Further, we suggest that our results may provide an extremely useful behavioural correlate for high level performance.

  3. A Low-Cost Anthropometric Walking Robot for Reproducing Gait Lab Data

    Directory of Open Access Journals (Sweden)

    Rogério Eduardo da Silva Santana

    2008-01-01

    Full Text Available Human gait analysis is one of the resources that may be used in the study and treatment of pathologies of the locomotive system. This paper deals with the modelling and control aspects of the design, construction and testing of a biped walking robot conceived to reproduce, to a limited extent, the human gait. Robot dimensions were chosen in order to guarantee anthropomorphic proportions and thus to help health professionals in gait studies. The robot was assembled with low-cost components and can reproduce, in an assisted way, real gait patterns generated from data previously acquired in gait laboratories. Simulated and experimental results are presented to demonstrate the ability of the biped robot to reproduce normal and pathological human gait.

  4. Flow characteristics at trapezoidal broad-crested side weir

    Directory of Open Access Journals (Sweden)

    Říha Jaromír

    2015-06-01

    Full Text Available Broad-crested side weirs have been the subject of numerous hydraulic studies; however, the flow field at the weir crest and in front of the weir in the approach channel has still not been fully described. Also, the discharge coefficient of broad-crested side weirs, whether lateral or slightly inclined towards the stream, has yet to be clearly determined. Experimental research was carried out to describe the flow characteristics at low Froude numbers in the approach flow channel for various combinations of in- and overflow discharges. Three side weir types with different oblique angles were studied. Their flow characteristics and discharge coefficients were analyzed and assessed based on the results obtained from extensive measurements performed on a hydraulic model. The empirical relation between the angle of side weir obliqueness, the Froude numbers in the up- and downstream channels, and the coefficient of obliqueness was derived.
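
    As background for the quantities measured in this study, the approach-flow Froude number and a textbook broad-crested weir rating can be sketched as follows. This is an illustrative assumption, not the paper's fitted relation for oblique side weirs, and all numeric values are hypothetical.

```python
import math

def froude(v, h, g=9.81):
    """Froude number of open-channel flow: Fr = v / sqrt(g * h)."""
    return v / math.sqrt(g * h)

def broad_crested_weir_q(cd, b, head, g=9.81):
    """Textbook broad-crested weir rating (assumed form, SI units):
    Q = Cd * b * sqrt(g) * (2/3)**1.5 * H**1.5  [m^3/s]."""
    return cd * b * math.sqrt(g) * (2.0 / 3.0) ** 1.5 * head ** 1.5

# Hypothetical values: Cd = 0.85, crest length b = 0.5 m, head H = 0.2 m
q = broad_crested_weir_q(0.85, 0.5, 0.2)
fr = froude(v=0.3, h=0.4)  # low-Froude (subcritical) approach flow
```

    For Cd = b = H = 1 the rating reduces to the familiar coefficient of about 1.705 in SI units.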

  5. To Broad-Match or Not to Broad-Match : An Auctioneer's Dilemma ?

    CERN Document Server

    Singh, Sudhir Kumar

    2008-01-01

    We initiate the study of an interesting aspect of sponsored search advertising, namely the consequences of broad match, a feature where an ad of an advertiser can be mapped to a broader range of relevant queries, and not necessarily to the particular keyword(s) that ad is associated with. Starting with a very natural setting for strategies available to the advertisers, and via a careful look through algorithmic and complexity-theoretic glasses, we first propose a solution concept called broad match equilibrium (BME) for the game originating from the strategic behavior of advertisers as they try to optimize their budget allocation across various keywords. Next, we consider two broad match scenarios based on factors such as information asymmetry between advertisers and the auctioneer, and the extent of the auctioneer's control on the budget splitting. In the first scenario, the advertisers have full information about broad match and relevant parameters, and can reapportion their own budgets to utilize the extra i...

  6. Reproducing the observed Cosmic microwave background anisotropies with causal scaling seeds

    OpenAIRE

    Durrer, R.; Kunz, M.; Melchiorri, A.

    2000-01-01

    During the last years it has become clear that global O(N) defects and U(1) cosmic strings do not lead to the pronounced first acoustic peak in the power spectrum of anisotropies of the cosmic microwave background which has recently been observed to high accuracy. Inflationary models cannot easily accommodate the low second peak indicated by the data. Here we construct causal scaling seed models which reproduce the first and second peak. Future, more precise CMB anisotropy and polarization ex...

  7. Broad spectrum antibiotic compounds and use thereof

    Energy Technology Data Exchange (ETDEWEB)

    Koglin, Alexander; Strieker, Matthias

    2016-07-05

    The discovery of a non-ribosomal peptide synthetase (NRPS) gene cluster in the genome of Clostridium thermocellum that produces a secondary metabolite that is assembled outside of the host membrane is described. Also described is the identification of homologous NRPS gene clusters from several additional microorganisms. The secondary metabolites produced by the NRPS gene clusters exhibit broad spectrum antibiotic activity. Thus, antibiotic compounds produced by the NRPS gene clusters, and analogs thereof, their use for inhibiting bacterial growth, and methods of making the antibiotic compounds are described.

  8. Crx broadly modulates the pineal transcriptome

    DEFF Research Database (Denmark)

    Rovsing, Louise; Clokie, Samuel; Bustos, Diego M;

    2011-01-01

    Cone-rod homeobox (Crx) encodes Crx, a transcription factor expressed selectively in retinal photoreceptors and pinealocytes, the major cell type of the pineal gland. In this study, the influence of Crx on the mammalian pineal gland was studied by light and electron microscopy and by use ... regulation of 745 genes (p ... pineal glands of wild-type animals; only eight of these were also day/night expressed in the Crx-/- pineal gland. However, in the Crx-/- pineal gland 41 genes exhibited differential night/day expression that was not seen in wild-type animals. These findings indicate that Crx broadly modulates the pineal transcriptome and also ...

  9. Reproducibility of thalamic segmentation based on probabilistic tractography.

    Science.gov (United States)

    Traynor, Catherine; Heckemann, Rolf A; Hammers, Alexander; O'Muircheartaigh, Jonathan; Crum, William R; Barker, Gareth J; Richardson, Mark P

    2010-08-01

    Reliable identification of thalamic nuclei is required to improve targeting of electrodes used in Deep Brain Stimulation (DBS), and for exploring the role of thalamus in health and disease. A previously described method using probabilistic tractography to segment the thalamus based on connections to cortical target regions was implemented. Both within- and between-subject reproducibility were quantitatively assessed by the overlap of the resulting segmentations; the effect of two different numbers of target regions (6 and 31) on reproducibility of the segmentation results was also investigated. Very high reproducibility was observed when a single dataset was processed multiple times using different starting conditions. Thalamic segmentation was also very reproducible when multiple datasets from the same subject were processed using six cortical target regions. Within-subject reproducibility was reduced when the number of target regions was increased, particularly in medial and posterior regions of the thalamus. A large degree of overlap in segmentation results from different subjects was obtained, particularly in thalamic regions classified as connecting to frontal, parietal, temporal and pre-central cortical target regions.
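​
    Segmentation overlap of the kind used here to quantify reproducibility is commonly scored with the Dice coefficient. A minimal sketch follows; the voxel sets are fabricated for illustration, and the paper's exact overlap measure may differ.

```python
def dice_overlap(a, b):
    """Dice similarity coefficient between two voxel label sets:
    2*|A ∩ B| / (|A| + |B|).  1.0 means identical segmentations."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0
    return 2.0 * len(a & b) / (len(a) + len(b))

# Two hypothetical thalamic segmentations as sets of voxel indices
run1 = {(10, 20, 5), (10, 21, 5), (11, 20, 5), (11, 21, 5)}
run2 = {(10, 20, 5), (10, 21, 5), (11, 20, 5), (12, 22, 5)}
d = dice_overlap(run1, run2)  # 2*3 / (4 + 4) = 0.75
```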

  10. Using prediction markets to estimate the reproducibility of scientific research.

    Science.gov (United States)

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.
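​
    The abstract's point that a hypothesis with a low prior probability needs a well-powered replication to become credible can be illustrated with a simple Bayes'-rule calculation. The 9% prior is taken from the abstract; the power (0.8) and significance level (0.05) are conventional assumptions, not the paper's estimates.

```python
def posterior_true_given_sig(prior, power=0.8, alpha=0.05):
    """P(hypothesis true | significant result) via Bayes' rule:
    power*prior / (power*prior + alpha*(1 - prior))."""
    return (power * prior) / (power * prior + alpha * (1.0 - prior))

p1 = posterior_true_given_sig(0.09)  # after the original significant study
p2 = posterior_true_given_sig(p1)    # after a well-powered replication
```

    A single significant result lifts the probability to roughly 0.6; only the confirming replication pushes it above 0.9, matching the abstract's argument.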

  11. Reproducibility of regional brain metabolic responses to lorazepam

    Energy Technology Data Exchange (ETDEWEB)

    Wang, G.J.; Volkow, N.D.; Overall, J. [Brookhaven National Lab., Upton, NY (United States); SUNY, Stony Brook, NY (United States)]; and others

    1996-10-01

    Changes in regional brain glucose metabolism in response to benzodiazepine agonists have been used as indicators of benzodiazepine-GABA receptor function. The purpose of this study was to assess the reproducibility of these responses. Sixteen healthy right-handed men underwent scanning with PET and [¹⁸F]fluorodeoxyglucose (FDG) twice: before placebo and before lorazepam (30 μg/kg). The same double FDG procedure was repeated 6-8 wk later on the men to assess test-retest reproducibility. The regional absolute brain metabolic values obtained during the second evaluation were significantly lower than those obtained from the first evaluation regardless of condition (p ≤ 0.001). Lorazepam significantly and consistently decreased whole-brain metabolism; the magnitude and regional pattern of the changes were comparable for both studies (12.3% ± 6.9% and 13.7% ± 7.4%). Lorazepam effects were the largest in the thalamus (22.2% ± 8.6% and 22.4% ± 6.9%) and occipital cortex (19% ± 8.9% and 21.8% ± 8.9%). Relative metabolic measures were highly reproducible for both the pharmacologic and replication conditions. This study measured the test-retest reproducibility in regional brain metabolic responses, and although the global and regional metabolic values were significantly lower for the repeated evaluation, the response to lorazepam was highly reproducible. 1613 refs., 3 figs., 3 tabs.

  12. Validation and reproducibility of an Australian caffeine food frequency questionnaire.

    Science.gov (United States)

    Watson, E J; Kohler, M; Banks, S; Coates, A M

    2017-08-01

    The aim of this study was to measure the validity and reproducibility of a caffeine food frequency questionnaire (C-FFQ) developed for the Australian population. The C-FFQ was designed to assess average daily caffeine consumption using four categories of food and beverages: energy drinks; soft drinks/soda; coffee and tea; and chocolate (food and drink). Participants completed a seven-day food diary immediately followed by the C-FFQ on two consecutive days. The questionnaire was first piloted in 20 adults, and then a validity/reproducibility study was conducted (n = 90 adults). The C-FFQ showed moderate correlations (r = .60), fair agreement (mean difference 63 mg) and reasonable quintile rankings, indicating fair to moderate agreement with the seven-day food diary. To test reproducibility, the C-FFQ was compared to itself and showed strong correlations (r = .90), good quintile rankings and strong kappa values (κ = 0.65), indicating strong reproducibility. The C-FFQ shows adequate validity and reproducibility and will aid researchers in Australia to quantify caffeine consumption.

  13. Reproducing American Sign Language Sentences: Cognitive Scaffolding in Working Memory

    Directory of Open Access Journals (Sweden)

    Ted eSupalla

    2014-08-01

    Full Text Available The American Sign Language Sentence Reproduction Test (ASL-SRT) requires the precise reproduction of a series of ASL sentences increasing in complexity and length. Error analyses of such tasks provide insight into working memory and scaffolding processes. Data were collected from three groups expected to differ in fluency: deaf children, deaf adults and hearing adults, all users of ASL. Quantitative (correct/incorrect recall) and qualitative error analyses were performed. Percent correct on the reproduction task supports its sensitivity to fluency, as test performance clearly differed across the three groups studied. A linguistic analysis of errors further documented differing strategies and bias across groups. Subjects' recall projected the affordances and constraints of deep linguistic representations to differing degrees, with subjects resorting to alternate processing strategies in the absence of linguistic knowledge. A qualitative error analysis allows us to capture generalizations about the relationship between error pattern and the cognitive scaffolding which governs the sentence reproduction process. Highly fluent signers and less-fluent signers share common chokepoints on particular words in sentences. However, they diverge in heuristic strategy. Fluent signers, when they make an error, tend to preserve semantic details while altering morpho-syntactic domains. They produce syntactically correct sentences with meaning equivalent to the to-be-reproduced one, but these are not verbatim reproductions of the original sentence. In contrast, less-fluent signers tend to use a more linear strategy, preserving lexical status and word ordering while omitting local inflections, and occasionally resorting to visuo-motoric imitation. Thus, whereas fluent signers readily use top-down scaffolding in their working memory, less-fluent signers fail to do so. Implications for current models of working memory across spoken and signed modalities are ...

  14. Broad spectrum antiangiogenic treatment for ocular neovascular diseases.

    Directory of Open Access Journals (Sweden)

    Ofra Benny

    Full Text Available UNLABELLED: Pathological neovascularization is a hallmark of late-stage neovascular (wet) age-related macular degeneration (AMD) and the leading cause of blindness in people over the age of 50 in the western world. Treatments focus on suppression of choroidal neovascularization (CNV), while currently approved therapies are limited to inhibiting vascular endothelial growth factor (VEGF) exclusively. However, this treatment does not address the underlying cause of AMD, and the loss of VEGF's neuroprotective effects can be a potential side effect. Therapy which targets the key processes in AMD, the pathological neovascularization, vessel leakage and inflammation, could bring a major shift in the approach to disease treatment and prevention. In this study we have demonstrated the efficacy of such broad-spectrum antiangiogenic therapy in a mouse model of AMD. METHODS AND FINDINGS: Lodamin, a polymeric formulation of TNP-470, is a potent broad-spectrum antiangiogenic drug. Lodamin significantly reduced key processes involved in AMD progression as demonstrated in mice and rats. Its suppressive effects on angiogenesis, vascular leakage and inflammation were studied in a wide array of assays including Matrigel, delayed-type hypersensitivity (DTH), Miles assay, laser-induced CNV and corneal micropocket assay. Lodamin significantly suppressed the secretion of various pro-inflammatory cytokines in the CNV lesion, including monocyte chemotactic protein-1 (MCP-1/Ccl2). Importantly, Lodamin was found to regress established CNV lesions, unlike soluble fms-like tyrosine kinase-1 (sFlk-1). The drug was found to be safe in mice, with little toxicity as demonstrated by electroretinography (ERG) assessing retinal function and by histology. CONCLUSIONS: Lodamin, a polymer formulation of TNP-470, was identified as a first-in-class, broad-spectrum antiangiogenic drug that can be administered orally or locally to treat corneal and retinal neovascularization. Several unique properties ...

  15. On the reproducibility of science: unique identification of research resources in the biomedical literature

    Directory of Open Access Journals (Sweden)

    Nicole A. Vasilevsky

    2013-09-01

    Full Text Available Scientific reproducibility has been at the forefront of many news stories and there exist numerous initiatives to help address this problem. We posit that one contributor is simply a lack of the specificity required to enable adequate research reproducibility. In particular, the inability to uniquely identify research resources, such as antibodies and model organisms, makes it difficult or impossible to reproduce experiments even where the science is otherwise sound. In order to better understand the magnitude of this problem, we designed an experiment to ascertain the “identifiability” of research resources in the biomedical literature. We evaluated recent journal articles in the fields of Neuroscience, Developmental Biology, Immunology, Cell and Molecular Biology and General Biology, selected randomly based on a diversity of impact factors for the journals, publishers, and experimental method reporting guidelines. We attempted to uniquely identify model organisms (mouse, rat, zebrafish, worm, fly and yeast), antibodies, knockdown reagents (morpholinos or RNAi), constructs, and cell lines. Specific criteria were developed to determine if a resource was uniquely identifiable, and included examining relevant repositories (such as model organism databases and the Antibody Registry) as well as vendor sites. The results of this experiment show that 54% of resources are not uniquely identifiable in publications, regardless of domain, journal impact factor, or reporting requirements. For example, in many cases the organism strain in which the experiment was performed or the antibody that was used could not be identified. Our results show that identifiability is a serious problem for reproducibility. Based on these results, we provide recommendations to authors, reviewers, journal editors, vendors, and publishers. Scientific efficiency and reproducibility depend upon a research-wide improvement of this substantial problem in science today.

  16. Scientific Reproducibility in Biomedical Research: Provenance Metadata Ontology for Semantic Annotation of Study Description.

    Science.gov (United States)

    Sahoo, Satya S; Valdez, Joshua; Rueschman, Michael

    2016-01-01

    Scientific reproducibility is key to scientific progress as it allows the research community to build on validated results, protect patients from potentially harmful trial drugs derived from incorrect results, and reduce wastage of valuable resources. The National Institutes of Health (NIH) recently published a systematic guideline titled "Rigor and Reproducibility" for supporting reproducible research studies, which has also been accepted by several scientific journals. These journals will require published articles to conform to these new guidelines. Provenance metadata describes the history or origin of data and it has long been used in computer science to capture metadata information for ensuring data quality and supporting scientific reproducibility. In this paper, we describe the development of the Provenance for Clinical and healthcare Research (ProvCaRe) framework together with a provenance ontology to support scientific reproducibility by formally modeling a core set of data elements representing details of research studies. We extend the PROV Ontology (PROV-O), which has been recommended as the provenance representation model by the World Wide Web Consortium (W3C), to represent both: (a) data provenance, and (b) process provenance. We use 124 study variables from 6 clinical research studies from the National Sleep Research Resource (NSRR) to evaluate the coverage of the provenance ontology. NSRR is the largest repository of NIH-funded sleep datasets with 50,000 studies from 36,000 participants. The provenance ontology reuses ontology concepts from existing biomedical ontologies, for example the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), to model the provenance information of research studies. The ProvCaRe framework is being developed as part of the Big Data to Knowledge (BD2K) data provenance project.

  17. XMM-Newton and Broad Iron Lines

    CERN Document Server

    Fabian, A C

    2007-01-01

    Iron line emission is common in the X-ray spectra of accreting black holes. When the line emission is broad or variable then it is likely to originate from close to the black hole. X-ray irradiation of the accretion flow by the power-law X-ray continuum produces the X-ray 'reflection' spectrum which includes the iron line. The shape and variability of the iron lines and reflection can be used as a diagnostic of the radius, velocity and nature of the flow. The inner radius of the dense flow corresponds to the innermost stable circular orbit and thus can be used to determine the spin of the black hole. Studies of broad iron lines and reflection spectra offer much promise for understanding how the inner parts of accretion flows (and outflows) around black holes operate. There remains great potential for XMM-Newton to continue to make significant progress in this work. The need for high quality spectra and thus for long exposure times is paramount.

  18. Metaresearch for Evaluating Reproducibility in Ecology and Evolution

    Science.gov (United States)

    Fidler, Fiona; Chee, Yung En; Wintle, Bonnie C.; Burgman, Mark A.; McCarthy, Michael A.; Gordon, Ascelin

    2017-01-01

    Abstract Recent replication projects in other disciplines have uncovered disturbingly low levels of reproducibility, suggesting that those research literatures may contain unverifiable claims. The conditions contributing to irreproducibility in other disciplines are also present in ecology. These include a large discrepancy between the proportion of “positive” or “significant” results and the average statistical power of empirical research, incomplete reporting of sampling stopping rules and results, journal policies that discourage replication studies, and a prevailing publish-or-perish research culture that encourages questionable research practices. We argue that these conditions constitute sufficient reason to systematically evaluate the reproducibility of the evidence base in ecology and evolution. In some cases, the direct replication of ecological research is difficult because of strong temporal and spatial dependencies, so here, we propose metaresearch projects that will provide proxy measures of reproducibility. PMID:28596617

  19. Reproducibility of an organoleptic method for halitosis assessment

    Directory of Open Access Journals (Sweden)

    Késsia Suênia Fidelis de Mesquita-Guimarães

    2017-01-01

    Full Text Available Background: The organoleptic evaluation is considered the gold standard among evaluation methods for halitosis, but its main drawback is the difficulty of reproducibility. Purpose: The aim of this study was to evaluate the reproducibility of an organoleptic evaluation method using three levels of scores (0 = no odor, 1 = moderate odor, and 2 = strong odor) to increase reliability between researchers and clinicians. Methods: The evaluation was blindly conducted by two examiners previously calibrated by the Smell Identification Test and compliance in clinical trials. Statistical calculations were done with STATA® software version 9.0. Results: The degree of agreement between examiners was 82.5%, with estimated kappa (κ = 0.69), indicating substantial agreement. Conclusion: The scale used in this study with the organoleptic method was effective and reproducible but must be repeated and compared to other methods for better consistency of results.
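​
    Inter-examiner agreement of the kind reported here (percent agreement plus Cohen's kappa) is straightforward to compute. In this sketch the two examiners' scores are fabricated for illustration and merely chosen to land near the reported values; they are not the study's data.

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters assigning categorical scores:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(r1) == len(r2) and r1
    n = len(r1)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    p_exp = sum(c1[c] * c2[c] for c in set(c1) | set(c2)) / (n * n)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical halitosis scores (0/1/2) from two examiners
ex1 = [0, 0, 1, 2, 1, 0, 2, 1, 0, 1]
ex2 = [0, 0, 1, 2, 0, 0, 2, 1, 0, 2]
agreement = sum(a == b for a, b in zip(ex1, ex2)) / len(ex1)  # 0.8
k = cohens_kappa(ex1, ex2)  # ≈ 0.70, "substantial" on the Landis-Koch scale
```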

  20. CRKSPH - A Conservative Reproducing Kernel Smoothed Particle Hydrodynamics Scheme

    CERN Document Server

    Frontiere, Nicholas; Owen, J Michael

    2016-01-01

    We present a formulation of smoothed particle hydrodynamics (SPH) that employs a first-order consistent reproducing kernel function, exactly interpolating linear fields with particle tracers. Previous formulations using reproducing kernel (RK) interpolation have had difficulties maintaining conservation of momentum due to the fact that RK kernels are not, in general, spatially symmetric. Here, we utilize a reformulation of the fluid equations such that mass, momentum, and energy are all manifestly conserved without any assumption about kernel symmetries. Additionally, by exploiting the increased accuracy of the RK method's gradient, we formulate a simple limiter for the artificial viscosity that reduces the excess diffusion normally incurred by the ordinary SPH artificial viscosity. Collectively, we call our suite of modifications to the traditional SPH scheme Conservative Reproducing Kernel SPH, or CRKSPH. CRKSPH retains the benefits of traditional SPH methods (such as preserving Galilean invariance and manif...
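​
    The first-order consistency that CRKSPH builds on can be illustrated in one dimension: per evaluation point, correction coefficients A and B are solved for so that the corrected kernel (A + B·dx)·W reproduces constant and linear fields exactly, even near a truncated boundary. This is a minimal interpolation sketch only, not the conservative CRKSPH scheme; the kernel choice and all values are illustrative.

```python
def cubic_spline_w(r, h):
    """Standard 1D cubic-spline SPH kernel with support radius 2h."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q * q + 0.75 * q ** 3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

def rk_interpolate(x_eval, xs, fs, vol, h):
    """First-order reproducing-kernel corrected SPH interpolation:
    choose A, B from the kernel moments so that the corrected kernel
    sums to 1 and has zero first moment."""
    w = [cubic_spline_w(x - x_eval, h) for x in xs]
    dx = [x - x_eval for x in xs]
    m0 = sum(vol * wi for wi in w)                      # zeroth moment
    m1 = sum(vol * d * wi for d, wi in zip(dx, w))      # first moment
    m2 = sum(vol * d * d * wi for d, wi in zip(dx, w))  # second moment
    a = 1.0 / (m0 - m1 * m1 / m2)
    b = -a * m1 / m2
    return sum(vol * f * (a + b * d) * wi for f, d, wi in zip(fs, dx, w))

# Linear field f(x) = 2 + 3x on a truncated particle set: plain SPH is
# biased near the boundary, while the RK correction recovers f exactly.
xs = [0.1 * i for i in range(21)]
fs = [2.0 + 3.0 * x for x in xs]
val = rk_interpolate(0.05, xs, fs, vol=0.1, h=0.15)  # exact value: 2.15
```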

  1. Effective Form of Reproducing the Total Financial Potential of Ukraine

    Directory of Open Access Journals (Sweden)

    Portna Oksana V.

    2015-03-01

    Full Text Available Development of scientific principles for reproducing the total financial potential of the country in its effective form is an urgent problem in both the theoretical and practical aspects of the study, the solution of which is intended to ensure the active mobilization and effective use of the total financial potential of Ukraine and, as a result, its expanded reproduction, which would contribute to realizing the internal capacities for stabilization of the national economy. The purpose of the article is to disclose the essence of the effective form of reproducing the total financial potential of the country and to analyze the results of reproducing the total financial potential of Ukraine. It has been proved that the basis for the effective form of reproducing the total financial potential of the country is the volume and flow of resources associated with the «real» economy, which affect and define the dynamics of GDP, i.e. the resource and process forms of reproducing the total financial potential of Ukraine (which precede the effective one). The analysis of reproducing the total financial potential of Ukraine has shown that in the analyzed period there was an increase in the financial possibilities of the country, but a steady dynamic of reduction of the total financial potential was observed. Considering the amount of resources involved in production and creating net value added and GDP, reproduction occurs on a restricted basis. Growth of the total financial potential of Ukraine is connected only with extensive quantitative factors rather than intensive qualitative changes.

  2. Reproducibility of Circulating MicroRNAs in Stored Plasma Samples.

    Directory of Open Access Journals (Sweden)

    Monica L Bertoia

    Full Text Available Most studies of microRNA (miRNA) and disease have examined tissue-specific expression in limited numbers of samples. The presence of circulating miRNAs in plasma samples provides the opportunity to examine prospective associations between miRNA expression and disease in initially healthy individuals. However, little data exist on the reproducibility of miRNAs in stored plasma. We used Real-Time PCR to measure 61 pre-selected microRNA candidates in stored plasma. Coefficients of variation (CVs) were used to assess inter-assay reliability (n = 15) and within-person stability over one year (n = 80). Intraclass correlation coefficients (ICCs) and polychoric correlation coefficients were used to assess within-person stability and delayed processing reproducibility (whole blood stored at 4°C for 0, 24 and 48 hours; n = 12 samples). Of the 61 selected miRNAs, 23 were detected in at least 50% of samples and had average CVs below 20% for inter-assay reproducibility, and 31 for delayed processing reproducibility. Ten miRNAs were detected in at least 50% of samples, had average CVs below 20% and had ICCs above 0.4 for within-person stability over 1-2 years; six of these satisfied criteria for both inter-assay reproducibility and short-term within-person stability (miR-17-5p, -191-5p, -26a-5p, -27b-3p, -320a, and -375), and two satisfied all three types of reproducibility (miR-27b-3p and -26a-5p). However, many miRNAs with acceptable average CVs had high maximum CVs, most had low expression levels, and several had low ICCs with delayed processing. About a tenth of miRNAs plausibly related to chronic disease were reliably detected in stored samples of healthy adults.
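​
    The reliability statistics used here, the coefficient of variation and a one-way random-effects ICC, can be computed directly from replicate measurements. A minimal sketch with fabricated expression values (the study's data and exact ICC variant may differ):

```python
from statistics import mean, stdev

def cv_percent(replicates):
    """Coefficient of variation (%) of replicate measurements."""
    return 100.0 * stdev(replicates) / mean(replicates)

def icc_1_1(subjects):
    """One-way random-effects ICC(1,1); subjects is a list of
    per-subject replicate lists, all of equal length k."""
    n, k = len(subjects), len(subjects[0])
    grand = mean(v for s in subjects for v in s)
    msb = k * sum((mean(s) - grand) ** 2 for s in subjects) / (n - 1)
    msw = sum((v - mean(s)) ** 2 for s in subjects for v in s) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical expression levels, two assays per subject: small
# within-subject scatter against large between-subject spread
levels = [[20.1, 20.4], [25.0, 24.6], [30.2, 29.8], [18.9, 19.3]]
cv = cv_percent(levels[0])  # ~1% inter-assay variation
icc = icc_1_1(levels)       # close to 1: highly reproducible
```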

  3. Reproducibility of graph metrics in fMRI networks

    Directory of Open Access Journals (Sweden)

    Qawi K Telesford

    2010-12-01

    Full Text Available The reliability of graph metrics calculated in network analysis is essential to the interpretation of complex network organization. These graph metrics are used to deduce the small-world properties of networks. In this study, we investigated the test-retest reliability of graph metrics from functional magnetic resonance imaging (fMRI) data collected for two runs in 45 healthy older adults. Graph metrics were calculated on data for both runs and compared using intraclass correlation coefficient (ICC) statistics and Bland-Altman (BA) plots. ICC scores describe the level of absolute agreement between two measurements and provide a measure of reproducibility. For mean graph metrics, ICC scores were high for clustering coefficient (ICC = 0.86), global efficiency (ICC = 0.83), path length (ICC = 0.79), and local efficiency (ICC = 0.75); the ICC score for degree was found to be low (ICC = 0.29). ICC scores were also used to generate reproducibility maps in brain space to test voxel-wise reproducibility for unsmoothed and smoothed data. Reproducibility was uniform across the brain for global efficiency and path length, but was only high in network hubs for clustering coefficient, local efficiency and degree. BA plots were used to test the measurement repeatability of all graph metrics. All graph metrics fell within the limits for repeatability. Together, these results suggest that, with the exception of degree, mean graph metrics are reproducible and suitable for clinical studies. Further exploration is warranted to better understand reproducibility across the brain on a voxel-wise basis.
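​
    Bland-Altman limits of agreement, used here alongside ICC, are computed from the paired run-to-run differences. A minimal sketch with hypothetical graph-metric values (not the study's data):

```python
from statistics import mean, stdev

def bland_altman_limits(run1, run2):
    """Bland-Altman bias (mean difference) and 95% limits of
    agreement (bias ± 1.96 * SD of differences) for paired runs."""
    diffs = [a - b for a, b in zip(run1, run2)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical clustering coefficients from two fMRI runs
run1 = [0.52, 0.48, 0.55, 0.50, 0.47, 0.53]
run2 = [0.51, 0.49, 0.54, 0.52, 0.46, 0.53]
bias, (low, high) = bland_altman_limits(run1, run2)
# A metric is "repeatable" if the run differences stay within (low, high)
```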

  4. Benchmarking contactless acquisition sensor reproducibility for latent fingerprint trace evidence

    Science.gov (United States)

    Hildebrandt, Mario; Dittmann, Jana

    2015-03-01

    Optical, nanometer-range, contactless, non-destructive sensor devices are promising acquisition techniques in crime scene trace forensics, e.g. for digitizing latent fingerprint traces. Before new approaches are introduced in crime investigations, innovations need to be positively tested and their quality assured. In this paper we investigate sensor reproducibility by studying different scans from four sensors: two chromatic white light sensors (CWL600/CWL1mm), one confocal laser scanning microscope, and one NIR/VIS/UV reflection spectrometer. Firstly, we perform intra-sensor reproducibility testing for the CWL600 with a privacy-conform test set of artificial-sweat-printed, computer-generated fingerprints. We use 24 different fingerprint patterns as original samples (printing samples/templates) for printing with artificial sweat (physical trace samples) and their acquisition with the contactless sensor, resulting in 96 sensor images, called scan or acquired samples. The second test set, for inter-sensor reproducibility assessment, consists of the first three patterns from the first test set, acquired in two consecutive scans using each device. We suggest using a simple set of features in the spatial and frequency domains, known from signal processing, and test its suitability for six different classifiers classifying scan data into small differences (reproducible) and large differences (non-reproducible). Furthermore, we suggest comparing the classification results with biometric verification scores (calculated with NBIS, with a threshold of 40) as a biometric reproducibility score. The Bagging classifier is in nearly all cases the most reliable classifier in our experiments, and the results are also confirmed by the biometric matching rates.

  5. Reproducing cosmic evolution of galaxy population from $z = 4$ to $0$

    CERN Document Server

    Okamoto, Takashi; Yoshida, Naoki

    2014-01-01

    We present cosmological hydrodynamic simulations performed to study the evolution of the galaxy population. The simulations follow the timed release of mass, energy, and metals by stellar evolution and employ phenomenological treatments of supernova feedback, pre-supernova feedback modeled as feedback by radiation pressure from massive stars, and quenching of gas cooling in large halos. We construct the fiducial model so that it reproduces the observationally estimated galaxy stellar mass functions and the relationships between galaxy stellar mass and host halo mass from $z = 4$ to 0. We find that the fiducial model constructed this way naturally explains the cosmic star formation history, galaxy downsizing, and the star formation rates and metallicities of star-forming galaxies. The simulations without the quenching of gas cooling in large halos overproduce massive galaxies at $z < 2$ and fail to reproduce galaxy downsizing. The simulations that do not employ the radiation pressure feedback from youn...

  6. Reproducibility of esophageal scintigraphy using semi-solid yoghurt

    Energy Technology Data Exchange (ETDEWEB)

    Imai, Yukinori; Kinoshita, Manabu; Asakura, Yasushi; Kakinuma, Tohru; Shimoji, Katsunori; Fujiwara, Kenji; Suzuki, Kenji; Miyamae, Tatsuya [Saitama Medical School, Moroyama (Japan)

    1999-10-01

    Esophageal scintigraphy is a non-invasive method that evaluates esophageal function quantitatively. We applied a new technique using semi-solid yoghurt, which makes it possible to evaluate esophageal function in a sitting position. To evaluate the reproducibility of this method, scintigraphy was performed in 16 healthy volunteers. From the results of four swallows, excluding the first one, the mean coefficients of variation in esophageal transit time and esophageal emptying time were 12.8% and 13.4%, respectively (intraday variation). As regards the interday variation, this method also showed good reproducibility based on the results from the 2 separate days. (author)

  7. Some remarks about interpolating sequences in reproducing kernel Hilbert spaces

    CERN Document Server

    Raghupathi, Mrinal

    2011-01-01

    In this paper we study two separate problems on interpolation. We first give a new proof of Stout's Theorem on necessary and sufficient conditions for a sequence of points to be an interpolating sequence for the multiplier algebra and for an associated Hilbert space. We then turn our attention to the question of interpolation for reproducing kernel Hilbert spaces on the polydisc and provide a collection of equivalent statements about when it is possible to interpolate in the Schur-Agler class of the associated reproducing kernel Hilbert space.

  8. VLBA imaging of radio-loud Broad Absorption Line QSOs

    CERN Document Server

    Montenegro-Montes, F M; Benn, C R; Carballo, R; Dallacasa, D; González-Serrano, J I; Holt, J; Jiménez-Luján, F

    2009-01-01

    Broad Absorption Line Quasars (BAL QSOs) have been found to be associated with extremely compact radio sources. These reduced dimensions could be due to projection effects, or these objects might actually be intrinsically small. Exploring these two hypotheses is important for understanding the nature and origin of the BAL phenomenon, because orientation effects are an important discriminant between the different models proposed to explain it. In this work we present VLBA observations of 5 BAL QSOs and discuss their pc-scale morphology.

  9. Modelling changes of a deciduous broad-leaved forest in the warm temperate zone of China

    Institute of Scientific and Technical Information of China (English)

    桑卫国

    2004-01-01

    The long-term dynamics of a warm temperate deciduous broad-leaved forest in Dongling Mountain, Beijing, were simulated with the forest gap dynamics model FORET1 in order to predict its future changes. The model parameters were derived from historical data of long-term forest research and management in the warm temperate zone; parameters missing from those data were measured in the field, and the model was tested against observed data. Results showed that the model simulates the long-term dynamics of this forest well: its predictions of species composition, biomass, and production matched the observed data. The simulations indicate that net primary production shows no clear regular pattern and is extremely unstable, with a peak at around 30 years, similar to the dynamic patterns of forests in other regions of the world, while biomass changes cyclically with a period of roughly 110 years.

  10. SNaPP: Simplified Nanoproteomics Platform for Reproducible Global Proteomic Analysis of Nanogram Protein Quantities

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Eric L.; Piehowski, Paul D.; Orton, Daniel J.; Moore, Ronald J.; Qian, Wei-Jun; Casey, Cameron P.; Sun, Xiaofei; Dey, Sudhansu K.; Burnum-Johnson, Kristin E.; Smith, Richard D.

    2016-03-01

    Global proteomic analyses are now widely applied across biological research to study changes in an organism's proteome correlated with a perturbation, phenotype, and/or time series of interest.[1-3] Further, it has been broadly established that efficient and reproducible sample preparation workflows are crucial to successful quantitative proteome comparisons, especially when applying label-free methods.[4-8] However, clinical samples are often severely limited in quantity and can preclude the application of more robust bulk sample processing workflows due to, e.g., contamination, carry-over, or sample losses.[9] This has limited the effective application of global proteomics for many sample types of great interest, e.g. LCM-dissected tissues, FACS-sorted cells, circulating tumor cells (CTCs), and early embryos. In a typical proteomics experiment, bulk homogenization is applied to generate sufficient protein for processing (> 10 µg protein), which can blend the proteomes of many different cell types and disparate tissue regions. The resulting “average” proteome can render proteome changes of interest unobservable and preclude important applications. Global proteomic analyses of complex protein samples in nanogram quantities require a fastidious approach to achieve in-depth protein coverage and quantitative reproducibility.

  11. Ptychography with broad-bandwidth radiation

    Energy Technology Data Exchange (ETDEWEB)

    Enders, B., E-mail: bjoern.enders@ph.tum.de; Dierolf, M.; Stockmar, M.; Pfeiffer, F. [Lehrstuhl für Biomedizinische Physik, Physik-Department and Institut für Medizintechnik, Technische Universität München, 85747 Garching (Germany); Cloetens, P. [European Synchrotron Radiation Facility, 38043 Grenoble (France); Thibault, P. [Department of Physics and Astronomy, University College London, London (United Kingdom)

    2014-04-28

    Ptychography, a scanning Coherent Diffractive Imaging (CDI) technique, has quickly gained momentum as a robust method to deliver quantitative images of extended specimens. A current conundrum for the development of X-ray CDI is the conflict between a need for higher flux to reach higher resolutions and the requirement to strongly filter the incident beam to satisfy the tight coherence prerequisite of the technique. Latest developments in algorithmic treatment of ptychographic data indicate that the technique is more robust than initially assumed, so that some experimental limitations can be substantially relaxed. Here, we demonstrate that ptychography can be conducted in conditions that were up to now considered insufficient, using a broad-bandwidth X-ray beam and an integrating scintillator-based detector. Our work shows the wide applicability of ptychography and paves the way to high-throughput, high-flux diffractive imaging.

  12. Broad-band acoustic hyperbolic metamaterial

    CERN Document Server

    Shen, Chen; Sui, Ni; Wang, Wenqi; Cummer, Steven A; Jing, Yun

    2015-01-01

    Acoustic metamaterials (AMMs) are engineered materials, made from subwavelength structures, that exhibit useful or unusual constitutive properties. There has been intense research interest in AMMs since their first realization in 2000 by Liu et al. A number of functionalities and applications have been proposed and achieved using AMMs. Hyperbolic metamaterials are one of the most important types of metamaterials due to their extreme anisotropy and numerous possible applications, including negative refraction, backward waves, spatial filtering, and subwavelength imaging. Although the importance of acoustic hyperbolic metamaterials (AHMMs) as a tool for achieving full control of acoustic waves is substantial, the realization of a broad-band and truly hyperbolic AMM has not been reported so far. Here, we demonstrate the design and experimental characterization of a broad-band AHMM that operates between 1.0 kHz and 2.5 kHz.

  13. Spectrophotometry of six broad absorption line QSOs

    Science.gov (United States)

    Junkkarinen, Vesa T.; Burbidge, E. Margaret; Smith, Harding E.

    1987-01-01

    Spectrophotometric observations of six broad absorption-line QSOs (BALQSOs) are presented. The continua and emission lines are compared with those in the spectra of QSOs without BALs. A statistically significant difference is found in the emission-line intensity ratio for (N V 1240-A)/(C IV 1549-A). The median value of (N V)/(C IV) for the BALQSOs is two to three times the median for QSOs without BALs. The absorption features of the BALQSOs are described, and the column densities and limits on the ionization structure of the BAL region are discussed. If the dominant ionization mechanism is photoionization, then it is likely that either the ionizing spectrum is steep or the abundances are considerably different from solar. Collisional ionization may be a significant factor, but it cannot totally dominate the ionization rate.

  14. Random sampling causes the low reproducibility of rare eukaryotic OTUs in Illumina COI metabarcoding

    Science.gov (United States)

    Knowlton, Nancy

    2017-01-01

    DNA metabarcoding, the PCR-based profiling of natural communities, is becoming the method of choice for biodiversity monitoring because it circumvents some of the limitations inherent to traditional ecological surveys. However, potential sources of bias that can affect the reproducibility of this method remain to be quantified. The interpretation of differences in patterns of sequence abundance and the ecological relevance of rare sequences remain particularly uncertain. Here we used one artificial mock community to explore the significance of abundance patterns and disentangle the effects of two potential biases on data reproducibility: indexed PCR primers and random sampling during Illumina MiSeq sequencing. We amplified a short fragment of the mitochondrial Cytochrome c Oxidase Subunit I (COI) gene for a single mock sample containing equimolar amounts of total genomic DNA from 34 marine invertebrates belonging to six phyla. We used seven indexed broad-range primers and sequenced the resulting library on two consecutive Illumina MiSeq runs. The total number of Operational Taxonomic Units (OTUs) was ∼4 times higher than expected based on the composition of the mock sample. Moreover, the total number of reads for the 34 components of the mock sample differed by up to three orders of magnitude. However, 79 out of 86 of the unexpected OTUs were represented by very low read counts. Our results further reinforce the need for technical replicates (parallel PCR and sequencing from the same sample) in metabarcoding experimental designs. Data reproducibility should be determined empirically, as it will depend upon the sequencing depth, the type of sample, the sequence analysis pipeline, and the number of replicates. Moreover, estimating relative biomasses or abundances based on read counts remains elusive at the OTU level.
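The random-sampling effect the record describes (abundant OTUs detected in every run while rare OTUs drop in and out) is easy to reproduce in simulation; the community composition and read depth below are invented for illustration:

```python
import random

def detected_otus(community, depth, seed):
    """Simulate one sequencing run as drawing `depth` reads without
    replacement from a pooled read library; return the OTUs observed."""
    rng = random.Random(seed)
    pool = [otu for otu, reads in community.items() for _ in range(reads)]
    return set(rng.sample(pool, depth))

# Hypothetical mock community: 2 abundant OTUs and 8 rare ones.
community = {"A": 5000, "B": 5000, **{f"rare{i}": 2 for i in range(8)}}
run1 = detected_otus(community, depth=1000, seed=1)
run2 = detected_otus(community, depth=1000, seed=2)
# Abundant OTUs appear in both runs; rare OTUs come and go between runs,
# mirroring the low reproducibility of rare sequences reported above.
```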

  15. Spatial aspects of reproduced sound in small rooms

    DEFF Research Database (Denmark)

    Bech, Søren

    1998-01-01

    This paper reports on the influence of individual reflections on the auditory spatial aspects of reproduced sound. The sound field produced by a single loudspeaker positioned in a normal listening room has been simulated using an electroacoustical synthesis of the direct sound, 17 individual refl...

  16. ReproPhylo: An Environment for Reproducible Phylogenomics.

    Science.gov (United States)

    Szitenberg, Amir; John, Max; Blaxter, Mark L; Lunt, David H

    2015-09-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built in to the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, along with a Git repository, are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution.

  17. ReproPhylo: An Environment for Reproducible Phylogenomics.

    Directory of Open Access Journals (Sweden)

    Amir Szitenberg

    2015-09-01

    Full Text Available The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built in to the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, along with a Git repository, are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution.

  18. On the reproducibility of meta-analyses : six practical recommendations

    NARCIS (Netherlands)

    Lakens, D.; Hilgard, J.; Staaks, J.

    2016-01-01

    Meta-analyses play an important role in cumulative science by combining information across multiple studies and attempting to provide effect size estimates corrected for publication bias. Research on the reproducibility of meta-analyses reveals that errors are common, and the percentage of effect si

  19. ON APPROXIMATION BY SPHERICAL REPRODUCING KERNEL HILBERT SPACES

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The spherical approximation between two nested reproducing kernel Hilbert spaces generated from different smooth kernels is investigated. It is shown that the functions of a space can be approximated by those of the subspace with better smoothness. Furthermore, an upper bound on the approximation error is given.

  20. The United States Today: An Atlas of Reproducible Pages.

    Science.gov (United States)

    World Eagle, Inc., Wellesley, MA.

    Black and white maps, graphs and tables that may be reproduced are presented in this volume focusing on the United States. Some of the features of the United States depicted are: size, population, agriculture and resources, manufactures, trade, citizenship, employment, income, poverty, the federal budget, energy, health, education, crime, and the…

  1. Collecting highly reproducible images to support dermatological medical diagnosis

    DEFF Research Database (Denmark)

    Gomez, David Delgado; Carstensen, Jens Michael; Ersbøll, Bjarne Kjær

    2006-01-01

    In this article, an integrated imaging system for acquisition of accurate standardized images is proposed. The system also aims at making highly reproducible images over time, so images taken at different times can be compared. The system is made up of an integrating intensity sphere illumination...

  2. Reproducibility of Manual Platelet Estimation Following Automated Low Platelet Counts

    Directory of Open Access Journals (Sweden)

    Zainab S Al-Hosni

    2016-11-01

    Full Text Available Objectives: Manual platelet estimation is one of the methods used when automated platelet estimates are very low. However, the reproducibility of manual platelet estimation has not been adequately studied. We sought to assess the reproducibility of manual platelet estimation following automated low platelet counts and to evaluate the impact of the level of experience of the person counting on the reproducibility of manual platelet estimates. Methods: In this cross-sectional study, peripheral blood films of patients with platelet counts less than 100 × 109/L were retrieved and given to four raters to perform manual platelet estimation independently using a predefined method (the average of platelet counts in 10 fields using a 100× objective, multiplied by 20). Data were analyzed using the intraclass correlation coefficient (ICC) as a method of reproducibility assessment. Results: The ICC across the four raters was 0.840, indicating excellent agreement. The median difference of the two most experienced raters was 0 (range: -64 to 78). The level of platelet estimate by the least-experienced rater predicted the disagreement (p = 0.037). When assessing the difference between pairs of raters, there was no significant difference in the ICC (p = 0.420). Conclusions: The agreement between different raters using manual platelet estimation was excellent. Further confirmation is necessary, with a prospective study using a gold standard method of platelet counts.
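The predefined method in the record (the average platelet count over 10 fields at 100× magnification, multiplied by 20) amounts to a one-line calculation. The helper below is a hypothetical illustration of it, with the result in units of 10^9/L:

```python
def manual_platelet_estimate(field_counts, factor=20):
    """Manual platelet estimate per the record's predefined method:
    average the platelet counts over the 10 oil-immersion (100x)
    fields, then multiply by 20 to get an estimate in 10^9/L."""
    if len(field_counts) != 10:
        raise ValueError("method is defined on exactly 10 fields")
    return sum(field_counts) / len(field_counts) * factor
```

For example, counting an average of 3 platelets per field yields an estimate of 60 × 10^9/L, below the study's 100 × 10^9/L inclusion threshold.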

  3. Calderon type reproducing formula on spaces of homogeneous type

    Institute of Scientific and Technical Information of China (English)

    邓东皋; 韩永生

    1995-01-01

    By using the Calderon-Zygmund operator theory, a continuous version of the Calderon type reproducing formula associated to a para-accretive function on spaces of homogeneous type is proved. A new characterization of the Besov and Triebel-Lizorkin spaces on spaces of homogeneous type is also obtained.

  4. Reproducibility of Tactile Assessments for Children with Unilateral Cerebral Palsy

    Science.gov (United States)

    Auld, Megan Louise; Ware, Robert S.; Boyd, Roslyn Nancy; Moseley, G. Lorimer; Johnston, Leanne Marie

    2012-01-01

    A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were…

  5. Reproducible and replicable CFD: it's harder than you think

    CERN Document Server

    Mesnard, Olivier

    2016-01-01

    Completing a full replication study of our previously published findings on bluff-body aerodynamics was harder than we thought, despite the fact that we follow good reproducible-research practices and share our code and data openly. Here's what we learned from three years, four CFD codes, and hundreds of runs.

  6. Reproducibility of Tactile Assessments for Children with Unilateral Cerebral Palsy

    Science.gov (United States)

    Auld, Megan Louise; Ware, Robert S.; Boyd, Roslyn Nancy; Moseley, G. Lorimer; Johnston, Leanne Marie

    2012-01-01

    A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were…

  7. Timbral aspects of reproduced sound in small rooms. I

    DEFF Research Database (Denmark)

    Bech, Søren

    1995-01-01

    This paper reports some of the influences of individual reflections on the timbre of reproduced sound. A single loudspeaker with frequency-independent directivity characteristics, positioned in a listening room of normal size with frequency-independent absorption coefficients of the room surfaces...

  8. Scaled–up sonochemical microreactor with increased efficiency and reproducibility

    NARCIS (Netherlands)

    Verhaagen, Bram; Liu, Youlin; Galdames Perez, Andres; Castro-Hernandez, Elena; Fernandez Rivas, David

    2016-01-01

    Bubbles created with ultrasound from artificial microscopic crevices can improve energy efficiency values for the creation of radicals; nevertheless it has been conducted so far only under special laboratory conditions. Limited reproducibility of results and poor energy efficiency are constraints fo

  9. Exploring the Coming Repositories of Reproducible Experiments: Challenges and Opportunities

    DEFF Research Database (Denmark)

    Freire, Juliana; Bonnet, Philippe; Shasha, Dennis

    2011-01-01

    Computational reproducibility efforts in many communities will soon give rise to validated software and data repositories of high quality. A scientist in a field may want to query the components of such repositories to build new software workflows, perhaps after adding the scientist’s own algorithm...

  10. Reproducibility in density functional theory calculations of solids

    DEFF Research Database (Denmark)

    Lejaeghere, Kurt; Bihlmayer, Gustav; Björkman, Torbjörn

    2016-01-01

    The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We r...

  11. Reproducibility of Electrical Caries Measurements : A Technical Problem?

    NARCIS (Netherlands)

    Huysmans, M. C. D. N. J. M.; Kühnisch, J.; Bosch, J. J. ten

    2005-01-01

    The currently available instrument for lectrical detection of occlusal caries lesions [Electronic Caries Monitor (ECM)] uses a site-specific measurement with co-axial air drying. The reproducibility of this method has been reported to be fair to good. It was noticed that the measurement variation of

  12. Reproducibility of electrical caries measurements : A technical problem?

    NARCIS (Netherlands)

    Huysmans, M.C.D.N.J.M.; Kuhnisch, J.; ten Bosch, J.

    2005-01-01

    The currently available instrument for electrical detection of occlusal caries lesions [Electronic Caries Monitor (ECM)] uses a site-specific measurement with co-axial air drying. The reproducibility of this method has been reported to be fair to good. It was noticed that the measurement variation

  13. Audiovisual biofeedback improves diaphragm motion reproducibility in MRI

    Science.gov (United States)

    Kim, Taeho; Pollock, Sean; Lee, Danny; O’Brien, Ricky; Keall, Paul

    2012-01-01

    Purpose: In lung radiotherapy, variations in cycle-to-cycle breathing result in four-dimensional computed tomography imaging artifacts, leading to inaccurate beam coverage and tumor targeting. In previous studies, the effect of audiovisual (AV) biofeedback on external respiratory signal reproducibility has been investigated, but the motion of the internal anatomy has not been fully studied. The aim of this study is to test the hypothesis that AV biofeedback improves the diaphragm motion reproducibility of internal anatomy using magnetic resonance imaging (MRI). Methods: To test the hypothesis, 15 healthy human subjects were enrolled in an ethics-approved AV biofeedback study consisting of two imaging sessions spaced ∼1 week apart. Within each session, MR images were acquired under free breathing and AV biofeedback conditions. The respiratory signal to the AV biofeedback system utilized optical monitoring of an external marker placed on the abdomen. Synchronously, serial thoracic 2D MR images were obtained to measure the diaphragm motion, using a fast gradient-recalled-echo MR pulse sequence in both coronal and sagittal planes. The improvement in diaphragm motion reproducibility with the AV biofeedback system was quantified by comparing cycle-to-cycle variability in displacement, respiratory period, and baseline drift. Additionally, the variation in improvement between the two sessions was also quantified. Results: The average root mean square error (RMSE) of diaphragm cycle-to-cycle displacement was reduced from 2.6 mm with free breathing to 1.6 mm (38% reduction) with the implementation of AV biofeedback.
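One way to quantify the cycle-to-cycle variability reported above is an RMSE of per-cycle displacements about their mean; the definition below is an assumption for illustration and may differ from the paper's exact metric:

```python
import math

def cycle_to_cycle_rmse(displacements):
    """RMSE of per-breathing-cycle displacements about their mean, a
    sketch of one possible cycle-to-cycle variability measure (the
    paper's precise definition is not given in the record)."""
    mean = sum(displacements) / len(displacements)
    return math.sqrt(
        sum((d - mean) ** 2 for d in displacements) / len(displacements)
    )
```

Perfectly regular breathing (identical displacement every cycle) gives an RMSE of 0 mm; larger values indicate less reproducible diaphragm motion.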